Published by libraryipptar, 2023-01-29 21:55:04

Animation Magazine - February 2023

Online magazine

feb 23 47 www.animationmagazine.net

OPPORTUNITIES | Autonomous Animator

Is It Really Necessary to Go Back to the Office?

Now that the world has more or less settled from the major shockwaves caused by the pandemic, the global workforce is starting to realize that working from home may be permanent and that things may never go back to the way they were. My question is: Even if they could, should they?

Other than missing out on in-person camaraderie, staff and business owners have certainly realized by now that paying for a giant building and requiring hundreds of people to commute to and from work is not only unnecessary but, as it turns out, almost universally counterproductive and less profitable than an autonomous approach.

And it only makes sense. It’s hard to fathom the amount of time wasted by pre-pandemic commuting, only to sit in a cubicle and use a phone and/or computer quite possibly inferior to the one you have at home or even carry around in your pocket. Sadly, commuting one to three hours each way is still a reality for some, even in today’s work environment. That’s up to six hours per day that could be recaptured for more productive work and happier lives if those commuters worked from home instead.

In addition to reducing or even eliminating the time wasted by unnecessary commuting, an enormous amount of money has been saved by easing the burdens that over-reliance on transportation causes. And let us not forget Mother Nature in this equation: The copious amounts of fuel required to transport the working population of an entire country across town, sometimes even across borders, just for the sake of clocking in at a building far from home — and then back again — produces Earth-choking amounts of toxic fumes and a host of other sinister pollutants. Now that commuting has been greatly reduced, there is room for Mother Nature to breathe just a little easier.
Trepidation, Meet Results

One of the biggest fears for upper management when staff were suddenly required to work from home was a loss of productivity. However, it turns out that in many (if not most) cases, productivity actually improved. One example is a Stanford study which showed that working from home increased productivity by 13%. Improvements like this contribute directly to a company’s bottom line, which means things will probably never go back to the way they were. What business owner, seeing a company-wide boost in productivity and the resulting profit, would ever want to return to operating the way things were pre-pandemic?

Trust was the other concern. Supervisors who were used to monitoring staff by peering over their shoulders worried they would lose control of their workforce and have difficulty communicating. However, with unlimited free email, group chat rooms, remote logins, electronic timecards and plenty of phone and video conferencing options available, trepidation about the loss of management’s ability to supervise staff members at home was swiftly quelled.

With almost every economy in the world connected by an internet that travels near the speed of light, it just makes sense that every person who can work from home should work from home. There will always be those who hold onto pre-pandemic beliefs about how businesses should operate, but they may soon become the minority. Today, when someone tells me they just moved into a 40,000 sq. ft. building, my initial reaction is no longer, “Wow, that’s impressive, you must be successful!” but something more along the lines of, “Is that really necessary?”

Even Uncle Sam has joined the party. High-level government agencies have already had great success with staff members being allowed to work from home. Productivity has increased as it has in the private sector, morale has improved, and indirect costs and expenditures have decreased.
Some companies certainly need large spaces to operate, but for every person who can perform their duties remotely from a home office, the entire world stands to benefit. A mass reduction in the number of people required to commute to and from work on a daily basis has significantly reduced the burden on our economy, our environment and our personal lives. What say we continue down this path to free up even more resources, so we can reallocate them to what really matters: improving the quality of life?

- by Martin Grebing -

Martin Grebing is the president of Funnybone Animation Studios. He can be reached at funnyboneanimation.com.


Three years ago, David Steward II formed Lion Forge Animation when he and his family recognized a huge need in the industry for a platform for authentic, diverse voices. Drawing on his experience in publishing (graphic novels and comics) and informed by market trends, he launched Lion Forge Animation, one of the only Black-owned animation studios in the world. Within months, Lion Forge Animation had an Oscar win as the studio behind the animated short Hair Love (directed by the enormously talented trio of Matthew A. Cherry, Everett Downing Jr. and Bruce W. Smith). The company has been developing and producing world-class IP, building global partnerships, and focusing on team recruitment and talent development. Today, Lion Forge Animation is well-positioned and uniquely equipped to create, drive and deliver genuine, diverse stories to meet increasing audience demand.

“Lion Forge Animation’s overarching goal is to create a broad content slate, ranging from children’s and family entertainment to adult animation, that is rooted in authenticity and speaks to audiences all over the globe,” explains Steward II. “Our content is wildly immersive and anchored by groundbreaking characters, which allows us to explore new ways of telling stories. We build deep and layered franchises that advance the diversity of content in the media landscape in meaningful and powerful ways, and in turn, if we do it right, drive change.”

He adds, “A fundamental part of achieving our goals is fostering a talented, collaborative group of professionals whose lived experiences and voices make these creative visions a reality. We are particularly focused on developing and training the next generation of diverse talent for the animation industry.”

According to the company’s VP of production, Saxton Moore (Sugar and Toys, Iyanu: Child of Wonder), Lion Forge has a robust content pipeline.
“Several current projects at Lion Forge Animation include a highly anticipated adaptation of the popular graphic novel series Iyanu: Child of Wonder, which our studio is making into a children’s animated series for HBO Max,” says Moore. “Created by Nigerian author Roye Okupe, Iyanu: Child of Wonder is an epic superhero tale steeped in Nigeria’s rich culture, music and mythology.”

Other projects include a Hair Love spinoff series (Young Love) for HBO Max; Marley and the Family Band, an animated series based on Cedella Marley’s picture book, in association with Universal Music Group; Rise Up, Sing Out, executive produced by Tariq “Black Thought” Trotter and Ahmir “Questlove” Thompson, for Disney; an animated series based on the multiple award-winning Excerpts from an Unknown Guidebook book series; a slate of projects introducing the groundbreaking life and legacy of Wendell Scott, the first Black race car driver and team owner to win a race at NASCAR’s highest level; and Drawn In, a new multimedia initiative in partnership with St. Louis-based Nine PBS designed to positively represent Black and brown kids and help close the literacy gap.

Drawn to Raising Awareness

Steward II believes that great storytelling is authentic and grounded in experience. “We believe that by embracing diversity and amplifying previously unheard voices, animation can be a powerful driver for awareness and understanding,” he notes. “We are advancing the narrative of authentic content and see animation playing a pivotal role in reaching and engaging previously ignored audiences. We not only make content for the marketplace – we make content that creates and shapes new markets.
We want to show the industry that going beyond on-camera diversity and working with writers, producers and crews that are representative of the stories being told benefits everyone.”

Both Steward II and Moore believe it’s imperative that the industry move past the idea that there can only be one of any particular genre of show or movie with a diverse or under-represented lead character. As Moore points out, “The animation industry’s key players have gone through multiple cycles of growth and subsequent reduction, a trend which has been exacerbated by the streaming platforms’ entry, expansion and prominence in the global market.”

“This impacts content output, fuels uncertainty around studio stability, and creates demand for more diverse content,” he adds. “We see this as a great opportunity for authentic, minority voices to contribute and make new content. The bottom line is that there remains a lack of minority talent in the industry, and Lion Forge is here to help address that issue.”

Steward II wants the animation community to know that Lion Forge tells stories that others won’t or can’t. “We are home to diverse creators and professionals, and a robust IP portfolio,” he says. “We have developed collaborative industry partnerships that are changing how content gets made. And we have proven that we make and deliver dynamic, award-winning content.” ◆

For more info, visit lionforgeanimation.com

[Pictured: David Steward II; Saxton Moore; Dawud Anyabwile and Darnell Johnson]

Lion Forge: A New Creative Home for Authentic, Diverse Voices - Advertorial




HOME ENTERTAINMENT

Rediscovering a Romanian Space Odyssey

Rare animated sci-fi flick The Son of the Stars scores a new 4K restoration. - By Jeff Spry -

Every so often, the cosmic tumblers align and the sci-fi universe gifts us a vintage jewel in the animation arena: some long-forgotten project now cast in the golden glow of modern accolades and given a generous dose of tender loving care by restoration specialists. Deaf Crocodile Films, a major player in the discovery and revival of previously unknown movies deserving of historical respect, has taken the 1985 Romanian animated science fiction saga The Son of the Stars and given it a sparkling 4K restoration, to be presented in North America as a first-ever special edition Blu-ray in early 2023.

Back in the days of hair metal bands and Jazzercise classes, the animation industry was in a bit of a funk. At a time when Michael Eisner’s House of Mouse renaissance was still in its infancy, this sequel to the 1984 space opera spectacle Delta Space Mission was unleashed in Europe to great fanfare.

Surreal Space Adventure

Reuniting Delta Space Mission’s creative team of directors Călin Cazan and the late Mircea Toia, The Son of the Stars was never officially released theatrically in the United States. Its plot line is best described as an ambitious fusion of Alien, Star Wars: The Empire Strikes Back and Edgar Rice Burroughs’ Tarzan: a trippy, 80-minute mash-up that delights with old-school charm and hallucinatory design aesthetics.

In partnership with the Romanian National Film Archive and Cinematheque and the Romanian Film Centre, Deaf Crocodile has freshened up this impressive project with a new ultra-high-definition transfer scanned from the original 35mm camera negative and soundtrack, supervised by Deaf Crocodile co-founder and in-house restoration wizard Craig Rogers.

“Everything you loved about Delta Space Mission, only even better!” Rogers says.
“Space dragons, pterodactyls, amorphous blob creatures, amulets of power … Seriously, what’s not to love?”

Strap in for the official synopsis: “In the year 6470, a husband-and-wife team of explorers receive a mysterious distress signal from a female astronaut who disappeared decades earlier. They leave their son on board their ship while they go searching for the missing woman — but fate intervenes, crash-landing the ship on a jungle-like planet populated by bulbous, telekinetic aliens and eerie stone gardens of frozen space creatures.”

The genesis of the narrative was a combination of many influences beyond Star Wars and Alien, and it carried certain phantasmagoric origins that seemed in step with that awesome decade. “Tarzan, Robinson Crusoe, the Brothers Grimm stories, Moby Dick ... these are childhood books for everyone,” Cazan tells Animation Magazine in an exclusive interview. “Of course, we also saw the movies — The Jungle Book (the Disney animated version from 1967, and the version with actors from 1942) at the Cinematheque — a special cinema in Bucharest.”

And if eerie disembodied eyeballs, cosmic sorcery, decomposing Cubist structures, levitating purple tentacles and a crusading medieval space knight weren’t enough insanity, The Son of the Stars also showcased a sensational psychedelic score by synth-rock pioneer Ștefan Elefteriu.

“We ended up working with Ștefan Elefteriu because he was already known at the time for piano and synthesizer music,” Cazan recalls. “It was not difficult to get in touch with him through the Union of Composers, via the relationships of our colleagues on the sound team. He is such an open man that he was able to work without money, just [out of] curiosity.”

Like Delta Space Mission, The Son of the Stars was first produced as a series of short episodes which originally aired on Romanian television and were then combined into a feature. Plans for a second series began immediately after production ended on Delta Space Mission, and the distinctive style was partly inspired by American Hanna-Barbera cartoons.

“We didn’t really have the Cold War here. The Flintstones, Yogi Bear, Tom & Jerry and The Jetsons were all on Romanian TV,” explains Cazan. “Hanna-Barbera’s style was perfect — simple characters in structure, very funny in appearance, without unnecessary additions, very expressive. Each character has his own personality, his own way of moving and his own expressions. And the stories are well balanced. There were wonderful people at Hanna-Barbera Studios.”

Behind the Iron Curtain

Dennis Bartok, Deaf Crocodile co-founder and head of acquisitions and distribution, remembers his ideal working relationship with Cazan on Delta Space Mission’s restoration. “We discovered that he and Mircea Toia had directed another feature right after, just as surreal and psychedelic — and if anything, even more criminally unknown here in the U.S.,” Bartok says. “Their ingenuity, working behind the Iron Curtain in the mid-1980s, incorporating elements of Western sci-fi and Japanese anime with Eastern European music and mythology and even politics, was just astonishing.
I just wish they’d been able to make a dozen more features — but we’re incredibly proud that we’ve been able to restore their two animated features together.” ◆

The Son of the Stars is scheduled to shine on 4K Blu-ray sometime in 2023.

BACK TO THE ’80S: Working with the Romanian National Film Archive and Cinematheque and the Romanian Film Centre, Deaf Crocodile has restored The Son of the Stars in 4K.


VFX & TECH

Diving into Pandora’s Magic

Director James Cameron and his VFX team discuss the stunning visual effects achievements of Avatar: The Way of Water. - By Trevor Hogg -

Watching the Hollywood blockbusters of James Cameron, it is not hard to spot the influence of his deep fascination with marine life, whether it be an underwater alien encounter in The Abyss, a love story aboard a doomed ocean liner in Titanic or a coastal clan in Avatar: The Way of Water, the first of four proposed sequels set on the exomoon Pandora. Cameron and his VFX team discussed the eagerly anticipated feature, its state-of-the-art visual effects and their inspirations in a recent interview with Animation Magazine.

“I would say that the first Avatar borrowed from the colors and patterns and the variety and the shape language, like the Wood Sprites being jellyfish-like, from shallow-water marine life,” he explains. “All we were doing was taking the amazing aliens that exist right here on planet Earth and taking them out of context, changing their scale, because some of them are quite tiny. The interesting truth of it is, we can’t beat nature’s imagination.”

Inspired by Natural Wonders

He adds, “But when tasked to tell a story, we have to come up with something interesting. The [film’s] ilu, for example, has the neck of a Plesiosaurus and the body of a manta ray that was turned into a biplane, so it has lower and upper wings. It’s a beautiful creature and looks completely plausible. It’s based on touchstones in reality, but is its own thing.”

The director acknowledges the evolution of VFX technology and its important role in the filmmaking process. “Anybody from my generation was going to go out and do our physical production,” notes Cameron.
“Then we’re going to shoot plates for some small subset of the movie, and those are going to be spectacular sequences that would be finished later during a long post-production. However, for me, visual effects are the fabric of my craft as a filmmaker. There are days when I’m just hand-holding the camera and running around with the actors.”

He adds, “We’re just doing it within a space that allows the camera to be tracked so that we can set-extend using CG. That’s the only difference from a straight live-action movie, but I get to do all of the stuff I used to love back in the day. The point is that my attitude towards visual effects is that it’s the heart and soul of how we create this other world and these otherworldly characters.”

Overseeing the digital animation and augmentation were production VFX supervisor Richard Baneham and the team at Wētā FX, led by Joe Letteri and Eric Saindon, with additional support provided by ILM. “We basically gave Wētā FX a big tranche of money to develop how to do water,” explains Cameron. “Everything from how water drains out of hair, how water sheds off of skin, how a person emerges from water, how their costume would take the weight of the water and then drain out, and how the water itself would react as it’s displaced around the body. They were working on that piece of the puzzle, and we were working on the physical aspects of how to capture [everything] in water, which proved to be quite a daunting thing.”

Performance capture had to take place within three different situations and be stitched together in real time. “They were in an infrared volume above the water; when they went into the water and were swimming on the surface or diving down, they would go into an ultraviolet volume underwater,” he points out.

Selected performance capture moments were assembled to produce what is referred to as a template. “We have a lab in Los Angeles consisting of 160 people who build these files, which are a slightly cruder representation of the movie, but they describe exactly what it is we want to achieve,” states Richard Baneham. “There is a huge investment of time and energy in making that happen. The technology that we’re using onstage is part of the Wētā FX pipeline, so we are truly integrated with them from the inception of the shots all the way through to final execution. Jim [Cameron] is able to take the cameras that he or I did on set and literally repeat them on set with the live-action crew.”

Baneham adds, “We’re making a movie about an alien world, so we try to ground ourselves in a film language that is familiar. Jim would literally go, ‘What piece of equipment are we using to do that?’ He’s always thinking of it from a practical standpoint.”

Simulcam was an indispensable tool throughout the production.
“We knew that the plates and live action would fit together because it was all captured in the right depth and lighting,” says Eric Saindon, senior VFX supervisor at Wētā FX. “[Cinematographer] Russell Carpenter was able to light it in a way that we don’t have to manipulate these plates like you typically have to. The plates drop into the CG environment, and you know that they’re going to work because you see it in rough format early on when you’re shooting it.”

Saindon points out that the depth compositing and Simulcam worked even with the water. “We were able to see where the other characters were going to be in the water and waves as we shot it,” he notes. “Because of this, we knew that there was enough space for them later on.”

One of the film’s many dramatic sequences involves a whale-like creature. “The tulkun hunt is a great action sequence, but is also terribly sad,” says Saindon. “All of it happens at sea, and we never went out on the water for any of the shots! Because of the way we shot the boats and gimbals and have them move properly in the water, it feels like it was shot out on the water, and all we’re adding is a few things in the background.”

BUILDING A WATER WORLD: Led by James Cameron and Joe Letteri, the visual effects team tested the capabilities of the medium by constructing a performance-capture tank at Lightstorm Entertainment’s facility in Manhattan Beach, Calif. According to Wētā FX, 57 new species of sea creatures were eventually created for the film.

For senior VFX supervisor Joe Letteri, one of the most important aspects of the visuals was the advances made in facial animation. “There are two aspects to it,” he notes. “One is capturing the data, making sure that you understand what the actors are doing in the moment, and the second aspect involves translating that into unique characters that are based on the actors, which have to hold their own on the screen. That’s where we have put the bulk of the effort, trying to get that realism not only in the technical visual sense but in the emotional sense. Basically, understanding what the actors are doing on the stage should translate to what the characters are doing on the screen.”

An important factor to consider is that the facial rig above ground does not hold up underwater. “The face is floating a little bit, and you’ll tend to squint more underwater,” says four-time Oscar winner Letteri, whose many credits include Avatar, King Kong and The Lord of the Rings and The Hobbit movies. “Things like that may work for a human actor but are not what you want for these characters, who spend their life in the water. I’m talking about the Metkayina characters, which by design are more impervious to being in the water; their expressions are more natural underwater. We take that into account by mixing what the actors do above water with what we’re seeing them doing below water, to craft that performance together.”

Better than Ocean Photography

Obviously, the Avatar sequels required years of planning and developing VFX technologies and methodologies. “We’re not going to make a movie just to do something new,” notes producer Jon Landau. “We’re going to do it because there is a new story that we want to tell and feel people will be compelled by.
Now there are tools at our disposal from a technology standpoint that allow us to tell stories that we couldn’t tell in the past in a better, more compelling and engaging way.”

The creative and technological odyssey leading to Avatar: The Way of Water has been a highly rewarding experience for Cameron and his team. As the visionary director concludes, “The end result on the screen, which is what I’m seeing every day as it is pouring in from Wētā FX, is quite astonishing in terms of the veracity of the performances, the body movement, the hands, all of those things both in air and water. Of course, all of the water work is spectacular. It’s in the same league as ocean photography, which was our original goal, but it’s not an easy one!” ◆

Avatar: The Way of Water was released in theaters worldwide by 20th Century Studios in December.

WAVES OF INVENTION: According to the Wētā FX technicians, about 1,600 different major effects simulations were created for the film. These visuals involved the proper flow of waves on the ocean, waves interacting with characters and environments, and the thin film of water that runs down the skin.


Tech Reviews - By Todd Sheridan Perry -

Higx Point Render for Nuke

Everyone loves smoky, tendrilled and wispy effects. I don’t know why, but there’s just nothing like a gorgeous nebula in space, or the way ink moves when dropped in water, or the way Wanda conjures up wiggly-woo magic in Marvel movies. When these are created in CG, it’s usually in a 3D program with particles, a bunch of wacky forces and, for those beautiful tendrils, a simulation of a gazillion particles. Back in my day, you had to export passes with slightly different parameters and then comp them back together because our workstations couldn’t handle the particle count.

But now there is a blazingly fast plugin for Nuke called Point Render, developed by Higx (Mads Hagbarth), which allows these cool effects to be built within the compositor, using Nuke’s BlinkScript in the 3D work area. The speed comes from it being a point renderer (hence the name), which means it handles a point per pixel rather than actual particles. The points are additive, meaning that points on top of each other get brighter. There is no occlusion because there is no lighting. This is the signature look of Higx Point Render: cool, energy-like forms that swirl and undulate.

The basic workflow is pretty straightforward: You set up a point generator; modulate and modify it with nodes that add fractal noise, twists or custom expressions; and then render with the Point Render node using a camera. But it’s the layering of these that gives you the beautiful complexity of the imagery. Because you can use cameras, the points live within the 3D space of the shot, so they track with all your other elements. In addition, the render supports motion blur and depth of field. Point Render uses only Nuke native nodes, so it will keep working as Nuke gets upgraded. What’s even cooler is that the nodes are open source.
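The additive, point-per-pixel idea is easy to see outside of Nuke. Here is a minimal NumPy sketch of my own (not Higx’s BlinkScript): points from a simple generator are projected through a pinhole camera, and brightness just accumulates, with no lighting and no occlusion.

```python
import numpy as np

def render_points(points, width=64, height=64, focal=50.0):
    """Project 3D points through a pinhole camera at the origin (looking
    down -Z) and accumulate brightness additively. This mimics the core
    idea of a point renderer: no lighting, no occlusion -- points that
    land on the same pixel simply make it brighter."""
    image = np.zeros((height, width), dtype=np.float32)
    for x, y, z in points:
        if z >= 0:  # behind the camera
            continue
        # Perspective projection onto the image plane.
        u = int(width / 2 + focal * x / -z)
        v = int(height / 2 - focal * y / -z)
        if 0 <= u < width and 0 <= v < height:
            image[v, u] += 0.1  # additive: stacked points brighten
    return image

# A toy "point generator" (a noisy ring), then a render through the camera.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 5000)
pts = np.stack([np.cos(theta), np.sin(theta),
                -5.0 + 0.1 * rng.standard_normal(5000)], axis=1)
img = render_points(pts)
print("peak brightness:", img.max())
```

Layering several of these renders with different noise parameters is, conceptually, where the swirling complexity comes from.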
The BlinkScript in the nodes is accessible, so you can poke around under the hood to find out how it works. And you can even customize the code for your own purposes.

To summarize: It’s powerful, it’s fast and, at $157, it’s well worth the price. Not only can you have it in your toolkit; buying it also supports the talented folks out there making cool tools for you.

Website: higx.net/point_render
Price: $157

Adobe Creative Cloud 2023

Adobe recently announced its plans for Creative Cloud for the fall, which should be out in the world by the time this article hits the newsstands. This next generation of CC features 18 upgraded desktop applications, including Photoshop, Illustrator, InDesign, InCopy, Animate, XD, Dreamweaver, Premiere Rush, Premiere Pro, After Effects, Audition, Character Animator, Media Encoder, Prelude, Bridge, Camera Raw, Lightroom and Lightroom Classic. There are a lot of advances in most of the apps, but there is also a bunch of cross-pollination. So for this article, I’m going to focus on Premiere Pro, because it reaches into the new tools for After Effects and Audition as well.

There aren’t any huge updates to the actual “editing” part of Premiere Pro, but there are some sweet additions to the supplemental things that make our projects look and sound good. A primary upgrade is a set of tools that will change your whole color workflow, called Selective Color Grading. Those familiar with Nuke will know this as a HueShift. Basically, it lays out your color wheel along a horizontal line. You then add control points to isolate a specific hue you want to modify or select. The resulting curve can be as soft or as hard as you wish, depending on how precisely you want to isolate the hue. This can be used to adjust luma, shift hue or, as I have been using it quite a bit, despill (which allows you to specify a color from the image and separate it from the alpha bias).
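To make the hue-curve idea concrete, here is a small Python sketch of my own of how such an isolation curve can work: a circular distance around the color wheel with a soft raised-cosine falloff, applied to a toy despill. The function names and parameters are mine, not Adobe’s; Premiere’s actual implementation is interactive and certainly differs.

```python
import colorsys
import math

def hue_weight(hue, center, softness):
    """Soft selection weight for a hue (all values in 0..1 around the
    color wheel). Distance wraps around the wheel; the falloff is a
    raised cosine, so a small `softness` gives a hard key and a large
    one a gentle curve."""
    d = abs(hue - center)
    d = min(d, 1.0 - d)  # circular distance on the wheel
    if d >= softness:
        return 0.0
    return 0.5 * (1.0 + math.cos(math.pi * d / softness))

def despill(r, g, b, key_hue=1/3, softness=0.12, amount=1.0):
    """Toy despill: desaturate a pixel in proportion to how close its
    hue is to the key color (default: green screen)."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    w = hue_weight(h, key_hue, softness) * amount
    return colorsys.hls_to_rgb(h, l, s * (1.0 - w))

print(despill(0.2, 0.9, 0.2))  # a green pixel is pulled toward gray
print(despill(0.9, 0.2, 0.2))  # a red pixel is left essentially untouched
```

The same weight could just as easily drive a luma adjustment or a hue shift instead of a desaturation, which is exactly what the control-point curve in the UI lets you choose.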
Selective Color Grading is extremely powerful, and once you wrap your head around it, there won’t be a day that goes by in which you won’t crack it open.

After Effects has more powerful motion graphics templates that can be ported over to Premiere Pro. This means the motion graphics artist can design the titles or branding for a spot or TV show and then send it over to the editor. Parameters can be promoted to reveal only the components that will be changing. This way, editors can update credits, lower-thirds text, etc., but use the base animation from the artist. This is a huge timesaver, allowing the text change to happen in editorial and keeping the artist free to make new, super cool graphics for the next show.

As an extension of the motion graphics tools, Premiere Pro can accept data from a CSV spreadsheet and then utilize the data therein to automagically create infographics. This also saves a lot of time, allowing the data to drive the animation.

But my absolute favorite new tools lie in the audio. Using Sensei, Adobe’s AI/machine learning technology, Premiere Pro and Audition have intelligent de-noising and de-reverbing tools. Analysis of your audio (and, presumably, numerous other audio samples) allows Sensei to learn what is sound and what is undesirable noise; it can then separate them. The same holds for the de-reverb: After learning, Sensei detects the additional echoes (the reverberation) of the primary sound and filters them out. I can think of at least 20 things that I need this for, and that’s just off the top of my head!

Website: adobe.com/creativecloud
Price: All Apps, $82.49/mo. (monthly), $54.99/mo. (yearly), $599.88/yr. (yearly upfront); Premiere Pro, $20.99/mo.
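On the CSV-driven infographics mentioned above: the core idea is simply that data values drive animation parameters. A few lines of Python can sketch it; the data and the function here are hypothetical, since in Premiere Pro this mapping happens inside a motion graphics template rather than in code.

```python
import csv
import io

# Hypothetical quarterly data, as it might come out of a spreadsheet.
CSV_DATA = """quarter,revenue
Q1,120
Q2,180
Q3,150
Q4,240
"""

def bars_from_csv(text, max_height=100.0):
    """Map each CSV row to the final height of an animated bar, scaling
    the largest value to `max_height` -- the data drives the animation,
    so swapping in a new spreadsheet re-times the whole graphic."""
    rows = list(csv.DictReader(io.StringIO(text)))
    peak = max(float(r["revenue"]) for r in rows)
    return {r["quarter"]: max_height * float(r["revenue"]) / peak
            for r in rows}

print(bars_from_csv(CSV_DATA))  # Q4, the largest value, gets the tallest bar
```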


Adobe Characterizer

A few months ago, Characterizer was announced as part of the software company’s Character Animator CC release. In its first incarnation, I wasn’t bowled over by Character Animator. I felt it was an easy way to make things move, which opened animation up to people who had no experience doing it in the first place. That leads to horrible animation, which results in judgments against the tools rather than the talent — from people like me, for instance. However, I also knew that once it got into the hands of capable animators, we would start to see some ingenious stuff. The animated segments on The Late Show with Stephen Colbert and Tooning Out the News are perfect examples of how the technology can be used in live situations.

Feedback from these actual production situations helps Adobe focus the further development of its tools. Character Animator seems to have started as a quaint idea rather than a fully formed development plan, but it is driven by user feedback toward a better product. This is a textbook example of how app development should work: Release frequently, fail quickly, pivot and re-release.

I see this in Characterizer as well, which seems to have started as the seed of an idea that doesn’t yet know the strength of its own potential. Characterizer takes samples of your face from your webcam, analyzes the poses and, via Sensei, learns which pieces of your face are which. You can then apply different artistic styles to your visage. This is not limited to “pencil sketch” or “charcoal” or “oil painting” — you can use photos of objects whose color palette will be transferred to you. The process of creation doesn’t need to be entirely automated, either: There are tools to define which areas of the sample image apply to which of your features.
Once you have something you like, the result can be tied into Character Animator rigs driven by your own performance through your webcam or other cameras. Now you have a live performance that could be a pen-and-ink drawing ... or a wall of bricks. It's absolutely fascinating.

The results currently range from artsy to terrifying, but not necessarily artistic. That's because we are very early in Characterizer's evolution. Sensei is learning, and continues to learn, how people's faces work and how to apply looks to them. The more it learns, the more refined the results will become. And the more refined the results, the more users will be attracted to the tool. That, of course, means Sensei will gain even better knowledge and artistic expertise. Some have predicted major advances for Characterizer in the next five years. I have a feeling those people are underestimating the power of machine learning.

Website: adobe.com
Price: Included in the Adobe CC suite; see above.

Todd Sheridan Perry is an award-winning VFX supervisor and digital artist whose credits include Black Panther, Avengers: Age of Ultron, The Christmas Chronicles and Three Busy Debras. You can reach him at todd@teaspoonvfx.com.


A DAY IN THE LIFE

This month, we were invited to share a day with the amazing Ellen Jin, the talented art director of DreamWorks' animated series Kung Fu Panda: The Dragon Knight, which begins its second season this month on Netflix.

8 a.m. After a couple of decades of my career, deciding what to wear in the morning is still the most time-consuming thing in the world. Why is that? I always end up wearing something black anyways!

10 a.m. Fun time begins at work. Working with my EP Peter Hastings, CG producer Dan Godinez and the hardworking and talented crew of Kung Fu Panda: The Dragon Knight is so much fun. We turn everything into magic here: Skadoosh!

11 a.m. Whaaaat? I just get to sit and paint!? I love this!

Noon: Here's my semi-balanced lunch! Vegetables are overrated anyway, right? Just please don't tell my kids I said that!

1:30 p.m. Our incredible lighting lead Kim and I finish reviewing footage. Teamworks, DreamWorks!

2:30 p.m. Our superstar production manager Monica keeps us all on the right track.

3 p.m. Production coordinators Jada and Carolyn take a stroll around the campus with me. We all need a break!

7:30 p.m. So that was another good day at work. Now it's time to relax with my family!



