

Character Animator has a simple but highly configurable motion capture feature which captures motion data through something as simple as a webcam and allows for puppetry of an illustrated character in real time. As far as AvatarLabs is aware, this was the first time the software had been used to create a live event around feature film content.
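
Character Animator’s capture pipeline is proprietary, but the basic loop it performs (read webcam frames, extract face landmarks, map them onto puppet controls) can be sketched in a few lines. The sketch below is only an illustration of that loop, not Adobe’s implementation: it assumes the third-party OpenCV and MediaPipe packages, and `update_puppet` is a hypothetical rendering call.

```python
# A minimal sketch of webcam-driven puppetry in the spirit of Character
# Animator's motion capture; not Adobe's actual pipeline.
import cv2                 # assumes opencv-python is installed
import mediapipe as mp     # assumes mediapipe is installed

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)  # default webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV delivers BGR.
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        # Toy rig: drive the puppet's head turn from the nose tip
        # (landmark 1), normalised from [0, 1] to [-1, 1].
        head_turn = landmarks[1].x * 2 - 1
        update_puppet(head_turn)   # hypothetical renderer call
    cv2.imshow("capture preview", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```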

“What we were trying to solve with this demo was how to bring these animated properties into the live streaming space,” says JB Fondren, AvatarLabs senior digital producer. “We had already been offering live streaming, but how could we bring our clients’ animated characters into the live space as well?” They assembled a demo, which was sent to selected companies with animated properties, including regular client Lionsgate. “Lionsgate likes to take risks,” she continues, “and so do we. They had heard of Adobe Character Animator and were logistically trying to solve similar problems. They had the My Little Pony feature film coming up and they awarded us the opportunity to bring our demo to life.” Adobe Character Animator had been the driver for last year’s live episode of The Simpsons, as well as the tool underpinning live animations of figures such as Donald Trump and Hillary Clinton on The Late Show with Stephen Colbert, since developed into the Showtime series Our Cartoon President.

There were several technical challenges to solve, including appropriately synchronising and matching footage captured from the Kinect motion capture camera, depth camera and RGB camera. Footage was brought into Unity to render and was integrated with Adobe Character Animator.
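
The piece doesn’t say how AvatarLabs solved the synchronisation problem. A common approach when matching depth and RGB footage is to pair each depth frame with the RGB frame whose capture timestamp is nearest, discarding pairs that drift too far apart; here is a minimal sketch of that idea, with the frame format assumed:

```python
# Sketch of timestamp-based stream alignment, a generic technique for
# matching depth and RGB footage; this is not AvatarLabs' pipeline.
from bisect import bisect_left

def align_streams(depth_frames, rgb_frames, max_skew=1 / 60):
    """Pair each depth frame with the nearest-in-time RGB frame.

    Both inputs are lists of (timestamp_seconds, data) tuples sorted by
    timestamp; pairs further apart than max_skew seconds are dropped.
    """
    rgb_times = [t for t, _ in rgb_frames]
    pairs = []
    for t, depth in depth_frames:
        i = bisect_left(rgb_times, t)
        # Candidates: the RGB frames just before and just after t.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(rgb_frames)]
        if not candidates:
            continue
        best = min(candidates, key=lambda c: abs(rgb_times[c] - t))
        if abs(rgb_times[best] - t) <= max_skew:
            pairs.append((depth, rgb_frames[best][1]))
    return pairs
```

Nearest-timestamp matching tolerates the two streams running at different frame rates, which is usually the case for depth versus RGB capture.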

The demo showcased how 2D and 3D animated characters might be fully integrated into a scene and interact with live human performers in a live broadcast.

Meanwhile, Adobe is previewing an R&D feature for After Effects that can automatically track human movements and apply them to animations. The body tracker detects human body movement in source videos to generate track points for 18 joints across the arms, torso, and legs, which can then be transferred to the character being animated. Similar to the way Character Animator tracks facial expressions, the feature could be a quick way to create 2D body animations.
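
Adobe hasn’t published the Body Tracker’s output format. As a mental model, per-frame data for 18 joints can be pictured along the lines of the common 18-keypoint convention used by pose estimators such as OpenPose; the names and types below are assumptions:

```python
# An assumed shape for body-tracker output: per-frame 2D positions for
# 18 named joints. Joint names follow the 18-keypoint OpenPose/COCO
# convention; Adobe's actual format is not public.
from dataclasses import dataclass

JOINTS = [
    "nose", "neck",
    "right_shoulder", "right_elbow", "right_wrist",
    "left_shoulder", "left_elbow", "left_wrist",
    "right_hip", "right_knee", "right_ankle",
    "left_hip", "left_knee", "left_ankle",
    "right_eye", "left_eye", "right_ear", "left_ear",
]

@dataclass
class TrackedFrame:
    time: float                              # seconds into the source video
    points: dict[str, tuple[float, float]]   # joint name -> (x, y) pixels
```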

The feature makes use of Adobe’s artificial intelligence platform Sensei, which was trained on more than 10,000 images with manually identified body points, Adobe research scientist Jimei Yang said during a demo. “I can easily link the tracking points to the corresponding controllers on my character, so that the keyframes can be transferred,” Yang said. The body tracker isn’t meant to replace the animation process, but rather to reduce manual keyframing and save time; it can be thought of as a powerful supplemental tool, much like third-party plug-ins such as Duik Bassel, a free character animation tool that automatically rigs humanoid structures and creates walk cycles. The feature can also create a contour mask around the body, which can be used in a variety of ways, like video color grading and highlighting the foreground, or to easily create an outline around a subject you want to remove with Content-Aware Fill, which automatically fills in a selection based on its surroundings. It is also useful for attaching motion graphics or other objects to the tracked body: for example, if you want to animate a scene in which the characters play basketball, you can make the basketball stay tracked to a character’s hands.
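
In script form, the transfer Yang describes could look like the sketch below: copy a tracked joint’s positions onto a rig controller as keyframes, and pin a prop (the basketball from the example above) to the wrist track. `set_position_keyframe` and the layer objects are hypothetical stand-ins, not a published Adobe API; in real After Effects scripting the closest call is `Property.setValueAtTime`.

```python
# Sketch of transferring track points to keyframes and pinning a prop
# to a tracked joint. TrackedFrame is the structure sketched earlier;
# the controller/prop objects and set_position_keyframe are assumed.

def transfer_keyframes(frames, joint, controller):
    """Write one position keyframe on the controller per tracked frame."""
    for frame in frames:
        if joint in frame.points:
            controller.set_position_keyframe(frame.time, frame.points[joint])

def pin_prop_to_joint(frames, joint, prop, offset=(0.0, 0.0)):
    """Keep a prop layer (e.g. a basketball) glued to a tracked joint."""
    for frame in frames:
        if joint in frame.points:
            x, y = frame.points[joint]
            prop.set_position_keyframe(frame.time, (x + offset[0], y + offset[1]))
```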

“We hope [animators will be] happy with the results as an early starting point, to save a lot of time on their final product,” Yang says. The feature will be shown at Adobe’s annual MAX design conference during the Sneaks keynote, where experimental lab features are showcased. Though not all Sneaks features make it into the final product, previous ones like Content-Aware Fill have, so hopefully the Body Tracker will come to After Effects soon.
