Customer Focus: Users share their ideas, thoughts and workflows on After Effects.
Territory, a design studio whose motion work spans feature films, brand projects, and popular video games, had worked with Director Ridley Scott and Production Designer Arthur Max on Prometheus, and was asked to create the screen graphics for The Martian, recently nominated for an Oscar® for Best Achievement in Visual Effects. Although both films are set in space, The Martian is predicated on real science, and ‘authenticity’ was key to the creative.
When Territory’s team, led by Creative Director David Sheldon-Hicks and Art Director Marti Romances, broke down the script, they realized that story-led motion graphics would be a constant presence in every scene, helping to explain, clarify, or direct the dialogue and the action.
Here, Marti takes us behind the scenes, sharing the creative process so we can understand how Creative Cloud and After Effects CC helped the team achieve the stunning graphics that feature in the film.
There were eight key sets: Mission Control, the Hermes spaceship, NASA Jet Propulsion Laboratory (JPL), the Mars Ascent Vehicle (M.A.V.), the HABitat facility on Mars, the Mars Rovers, NASA offices, and Pathfinder, plus smaller projects for the Space Suit Arm Computers and the crew’s personal laptops.
Each set featured hundreds of screens, most of which needed to be animated, and I created visual design languages to help differentiate each set. From there, my team of designers, animators, and 3D artists began to create content.
Adobe Creative Cloud was with us every step of the way, with Illustrator CC, Photoshop CC and After Effects forming the backbone of our creative tools, helping us to bring our concepts to life and make our time more effective.
The biggest challenge we faced was finding the best way to combine real science, stunning design, and dramatic storytelling. Our graphics had to represent complex scientific information very clearly so that the audience could understand and keep up with plot twists. At the same time, we had to make sure that the science was still credible and met with NASA’s approval.
My approach to a project of this scale is to create all the user interfaces in Adobe Illustrator first, and then animate all the windows and widgets in After Effects. Building the graphics non-destructively (so they could be scaled up and down without losing pixel quality) was key, as we knew we were going to be repurposing lots of the graphics in different aspect ratios.
CREDIT: Territory Studio
We also knew that many of the screens required interactions or animations to tie into story points, so we designed the graphics with movement in mind. We looked carefully at the UX of the interfaces that we ‘reimagined’ for the story to make sure that the choreography felt right in terms of ease of use and expected function.
CREDIT: Courtesy Twentieth Century Fox
An authentic Mission Control
Mission Control was the biggest set and featured around 100 screens, including an 18m x 6m bank of LED monitors. Ridley and Arthur were very clear that the Mission Control screens (NASA and JPL) needed to look real and work authentically.
CREDIT: Courtesy Twentieth Century Fox
One of the film’s key scenes plays out in Mission Control so it was essential that we got the balance right between factual screen content and visual design. Each screen has a real purpose in that context and we needed to make sure that our work reflected that. And it was important to give a unique identity to the set, which features a lot of information, including realistic video feeds and telemetry data for the actors to react to and interact with.
Creating a visual language
To create a visual language to wrap around those realistic elements, I researched NASA’s current data and interface conventions: how data was prioritised and when, how it was organized and depicted on screen and in the Mission Control space, how the crew interacted with it, what commands were given, and how those commands changed the data display. We also talked to NASA about how they think all of this will evolve over the next 20 years.
I then began to visualize how to bring all that data, which in real life is displayed in a mix of styles, formats and screens, together. I applied information architecture principles to the interface designs, and thought about data priorities and the user experience. I wanted to achieve a consistent UI design that could work for NASA in real life.
CREDIT: Territory Studio
Out of the possible routes we suggested, Arthur chose the combination that was truest to the data requirements and the spirit of NASA’s current Mission Control, yet pushed it 20 years forward.
CREDIT: Territory Studio
The backgrounds were black and dark blue with white fonts and light blue indicators. Red was used to highlight mission critical data and indicate warning status. The overall look of the interface is serious and authoritative, but the hierarchy of information is clearly readable to tie in with story points.
Animation ref: Martian NASA Mission Control 02
With hundreds of animations playing concurrently, After Effects played a key role in bringing the UI to life within the context of the action. I would say the best thing was the ability to export every single layer from Illustrator to After Effects and animate them however we wanted. They were complex designs, and having everything organised was key.
The 2D graphs on the right-hand side were also Illustrator paths converted into After Effects paths, so we could animate the bezier points in After Effects. The native Numbers effect was very useful for replicating timecodes, like the countdowns for the missions, and (again) for all the running numbers on the left-hand side.
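Under the hood, a running countdown readout like the ones described above boils down to deriving a value from the current frame and formatting it every frame. The sketch below illustrates that idea in plain JavaScript; inside After Effects the Numbers effect (or an expression) does this work for you, and the 24 fps rate and T-minus format here are assumptions for the example, not the film’s actual settings.

```javascript
// Illustrative sketch of a running countdown readout: derive the
// remaining time from the current frame number and format it.
// The 24 fps rate and "T-MM:SS:FF" format are assumptions.
const FPS = 24;

function countdown(totalSeconds, frame) {
  const remaining = Math.max(0, totalSeconds - frame / FPS);
  const mins = Math.floor(remaining / 60);
  const secs = Math.floor(remaining % 60);
  const frames = Math.floor((remaining % 1) * FPS);
  const pad = (n) => String(n).padStart(2, "0");
  return `T-${pad(mins)}:${pad(secs)}:${pad(frames)}`;
}

// One second (24 frames) into a 90-second count.
const readout = countdown(90, 24); // "T-01:29:00"
console.log(readout);
```

Evaluated once per frame, a formatter like this produces the constantly ticking numbers that fill out a Mission Control screen without any hand-keyframing.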
Designing for The Hermes spaceship
The Hermes spaceship was another key set. For the Hermes screens, UI Art Director Felicity Hickson wanted a twist on the typical spacecraft avionic designs. Again, NASA’s reference material for the relevant technology and science led the design of the console screens, but as we were designing for a ‘near future’ spaceship, we also looked at what SpaceX are doing to push the designs a bit further.
We ended up with a strong set of screens using different dark tones for the backgrounds (dark blues, purples and greens) and very vivid, bright colours (mainly whites and other bright tones) for the data and buttons on those screens.
Animation ref: Martian HERMES Purnell Orbits 01
Again, the Hermes screens were all displaying mission critical status information, be that engineering schematics of the Hermes itself, or of operations performed by its equipment.
CREDIT: Territory Studio
After Effects really stood out for helping us achieve the simplicity of the main orbit graphs, such as the ones in the top-left corner of the screen above, as well as all the running numbers animated with the native Numbers effect. The Posterise Time effect also helped a lot when we were replicating a slightly delayed update of information from the satellites, with data arriving not instantly but slightly phased (at 15 or 20 frames per second). We could animate everything smoothly and then apply Posterise Time to lower the refresh rate of the animation, using the same values across the rest of the animations on that screen (numbers, graphs, etc.).
CREDIT: Territory Studio
Insights from Designer and Animator Daniel Højlund
Expressions offer a great way to add extra animation control in After Effects, and we used them occasionally to help drive certain animations, like number values and relationships between different layer properties. Building a rig with Expression Controls linked to multiple layer properties also helped us keep certain animation patterns, and the overall feel, consistent across screens. With fairly simple expressions, it was a very time-efficient way to generate some of the more generic animation elements for quick turnarounds.
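After Effects expressions are JavaScript-based, and the rigging idea described above can be sketched in plain JavaScript: one master control value drives several derived properties, so tweaking a single slider keeps the feel consistent across screens. In After Effects this would be a Slider Control referenced from each property’s expression; the property names and formulas below are assumptions for the sketch, not the studio’s actual rig.

```javascript
// One master "Expression Control" value driving several layer
// properties. Changing the master keeps every linked property in
// proportion, which is what keeps screens feeling consistent.

// Master slider: overall "activity" level of a screen, 0..1.
const activity = 0.5;

// Derived properties, all linked to the master value.
// The mappings are invented for the example.
const rig = {
  blinkRate: 1 + activity * 4,      // blinks per second
  scrollSpeed: 20 * activity,       // pixels per second
  glowOpacity: 25 + activity * 75,  // percent
};

console.log(rig); // blinkRate 3, scrollSpeed 10, glowOpacity 62.5
```

Duplicating a rigged screen and nudging only the master value is what makes this approach so fast for generating variations on generic animation elements.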
Also, the pipeline between Illustrator and After Effects is much improved and very useful. Designing the graphic elements in Illustrator and importing them into After Effects made it easy to go back into Illustrator to make design adjustments and then have those elements update live in After Effects. It is a much faster, more efficient way of working with Illustrator layers, which was very handy for us on more than a few occasions.
The fact that the screens were not just set dressing but ‘mission critical’ and necessary to both story and credibility added to the pressure, but ultimately to the satisfaction. And once the screens were programmed by Compuhire, our on-set playback partners, they really brought the sets to life; it was great to see the actors performing with live screens on-set.
Experts in animation and illustration create more fun and engaging 2D characters for growing YouTube channel
When Christian Hughes couldn’t find educational video content on YouTube for his toddlers that appealed to him, he decided to create it himself. As the Founder and CEO of a successful video production company, he used his background in animation and video to establish Toddler Fun Learning, a successful YouTube channel for the younger set.
The channel is a family affair, with Christian serving as Creative Director and his wife Amalie as Marketing Director. They also employ talented freelance animators, including Chay Hawes, who work with them to improve the quality of animated content with new tools in Adobe Creative Cloud. Most recently, they started working with Adobe Character Animator, which is available with a download of Adobe After Effects CC, to create charming 2D animated characters.
Adobe: Why did you decide to start making educational YouTube videos for kids?
Hughes: I have a video production company, Curly Productions, which specializes in online video content, promotional videos, and motion graphics for clients. After I had kids, educational video content became important to me, yet I couldn’t find anything on YouTube that reminded me of the content I watched when I was young. A lot of the content was hyperactive and used clipart style graphics, yet still got millions of views.
I decided to take a crack at creating some animated videos myself, hearkening back to what I remembered as a kid, but with a modern YouTube twist. I created some animations, posted them on YouTube, and they got more and more views. Today, we have more than 100 videos and 40 million views on Toddler Fun Learning.
Adobe: How long have you used Adobe software?
Hughes: I’ve been using Adobe software since I was 14 years old. I taught myself Adobe Premiere Pro and After Effects, and to a lesser extent Photoshop and Illustrator. I’ve implemented Creative Cloud across Curly Productions and also use it for the Toddler Fun Learning videos.
Adobe: How did you first start creating animated video content for Toddler Fun Learning?
Hughes: I created the first 10 videos myself. I came up with the idea for Number Zoo when I was on a flight to South Africa and immediately started drawing it using Adobe Ideas, which is now Adobe Draw. I’m not an illustrator by trade, but I thought kids might like drawings that looked like they were drawn by kids.
I synced the artwork to Creative Cloud, opened them in Illustrator to refine the drawings, separated them into layers, and created the arms and legs. Next, I moved into After Effects for animation and then used Dynamic Link to bring the animations into Premiere Pro where I added the music.
We release one animation each Friday. The software enables us to achieve this schedule. We just started releasing a second video on Tuesdays that is a partnership with Penguin Books. Filmed at the YouTube studio in London, Story Time for Children features live action of celebrities reading storybooks to kids.
Adobe: Who is the team behind the Toddler Fun Learning videos and how do you work together?
Hughes: In addition to me and my wife, we have three trusted animators, all with full-time jobs during the day. They enjoy spending some time on weekends and evenings doing something creative. Our channel doesn’t have a distinct style. I just tell the animators and illustrators that the content is for kids and they can be as creative and off the wall as they want. They come up with ideas, create storyboards and character designs, then we’ll create an animatic followed by the full video. It’s a quick process, and Creative Cloud makes it easy for all of us to work together.
Hawes: I initially started working with Toddler Fun Learning on a few nursery rhyme animations, and then we created the Puppy Park series together. At the time, I was starting my own animated YouTube series called Daisy and Fluff. This began as another narrated animation, but when I saw Adobe Character Animator I decided to expand it to include elements of character acting and lip sync to interact more with the audience. Christian and I agreed to do the same for our next Toddler Fun Learning series.
Adobe: How is Character Animator significant to the work you do?
Hughes: Animation can be quite time consuming and costly, so all of the money we make on the channel is reinvested into new content. We’re always trying to find new ways to create higher quality content on a really tight budget. Character Animator gives us the opportunity to try on-screen character animation with our new series Gecko’s Garage at a much lower cost than traditional character animation, which is really exciting.
Hawes: Before Character Animator, we could only do very basic on-screen animations such as adding a wiggle or an expression, waving an arm, or making the mouth move a bit. With Character Animator we can create all of the character’s movement and speech using a webcam. It was easy to get set up with the basic character for Gecko’s Garage. After creating the character in Photoshop I imported it into Character Animator. From there, Christian recorded the voiceover while I generated the puppet’s movements.
Adobe: What are your favorite Character Animator features?
Hawes: Lip sync in Character Animator is huge. After I put the initial time into making the mouth shapes and the character I did the lip sync in just a couple of minutes. Instead of going back and recreating the entire animation whenever we needed to make a change, we just did takes to update features such as the eye movements.
I also like how I can just move around in my seat and bring life to the character so easily. The breathing tool is also great. Even if the character isn’t doing something on screen, they can still be breathing, which is a simple motion to add that doesn’t require keyframing.
Adobe: What feedback have you gotten on Gecko’s Garage?
Hughes: Our kids are our biggest critics, and we often tweak characters depending on their response. In the case of Gecko’s Garage, their feedback led us to speed up the intro so you meet the characters right away. We’ve also looked at the YouTube analytics and the audience retention for the show is really good, with a 70% watch time and more than 30,000 views to date.
Adobe: What are the future plans for Toddler Fun Learning?
Hughes: We have eight episodes of Gecko’s Garage planned at this point, with a different educational slant for each one. Overall, we want to create more character-oriented videos and we have a new series with aliens coming up. Eventually we’re looking at creating content for older kids as well.
Learn more about Adobe Creative Cloud
Fictional UI designer and animator adds stunning details to fantasy worlds in major motion pictures using Adobe Creative Cloud
Jayse Hansen is a sought-after fictional UI designer and animator who learned his trade not in school, but through books and from other great designers. After working in print and web design, he taught himself Adobe After Effects and set his sights on a career in the film industry.
Ten years later, with a string of blockbusters under his belt, including X-Men Origins: Wolverine, Iron Man 3, Ender’s Game, and Robocop, he still enjoys the unique challenges each project presents. He now also consults with companies exploring augmented reality and virtual reality technologies that may someday make his amazing fictional creations available for real-world applications.
Adobe: How did you get your start in the film industry?
Hansen: I was doing motion design and commercial work when a friend of mine showed me his reel. It was full of film UI work and I thought that would be the most awesome job. A while later I booked a commercial for Intel and pitched the idea of including futuristic interfaces. I shared that work with my friend and he started referring me for jobs. It took a while to break into the film industry, but I eventually made it!
Adobe: How does your early love of design apply to what you do today?
Hansen: I’ve always liked drawing and photography—design is a combination of the two. When I was young I would create engineering drawings and blueprints that broke down the inside structure of things. Through photography I learned all about composition, color, and lighting. Both of those early explorations apply to the work I do now.
For example, when I’m compositing a holograph, such as a part of the Iron Man suit, it is transparent so I’m designing the inside structure and showing the breakdown of how it works. When I’m putting that into a shot and compositing, I’m thinking about the lighting, contrast, and composition. I put everything to work in After Effects and it is both artistic and technical at the same time.
Adobe: What were some of your first projects?
Hansen: Trust is very important in the film industry. Everyone who is hiring is on the line, things move at a fast pace, and artists can sometimes be flakey. So I was very grateful when Gladys Tong at G-Creative took a chance and gave me a job working on X-Men Origins: Wolverine and 2012. I created a few hero screens, which display full screen and tell a part of the story. It was cool that my first job was creating hero screens instead of something in the background that would get blurred out.
Along the way, I met Stephen Lawes and Sean Cushing at Cantina Creative. We struck up a friendship and that led to my working with them on Avengers and many other films. I still work with both studios today.
Adobe: Tell us about your work on The Hunger Games: Mockingjay 1 with Cantina Creative.
Hansen: I was involved on set in Atlanta while they were filming, using Adobe Illustrator and After Effects to create the graphics that played on the computer screens while they were filming. I talked to some old-school hackers to get ideas of what to show on some key analysis and hacking screens in the film.
We also did the post work on the film, replacing screens with more story-specific versions, as well as creating all of the holographic effects using After Effects and CINEMA 4D. I used to always create temp or slap comps using Illustrator and Photoshop to show the screens and comps with the actors. I’ve now started going straight from Illustrator to After Effects where I’ll do a quick mock up. If it gets approved, we can just hand it off to artists and they have all of the settings to begin animating, tracking shots, and rotoscoping right away.
Adobe: Have you worked on projects on your own as well?
Hansen: I love working with companies and being part of teams, so there are only a few films that I’ve done on my own. One example is Big Hero 6, which is probably the film I’m most proud of because it’s Disney and it was so good! I was contacted by Paul Felix, a legendary Disney Art Director. He said they were working on a new film, had stuff from my website on their inspiration boards, and were big fans.
I never thought I’d work on an animated film because most of my stuff is so realistic. But when I found out that they wanted me to work on holograms and UI screens it was perfect. They knew I worked in After Effects and Cinema 4D so they had me concept out a ton of stuff and deliver a kitbash that they could take modules from to use throughout the film. Bruce Wright was the visual effects artist who took what I created and gave it a Disney look.
Adobe: What did you do with Cantina Creative for the film Pixels?
Hansen: The filmmakers wanted us to put Easter eggs into the military interface that referenced Galaga, Pac-Man, or other old-school video games while still maintaining a hard-core, no-nonsense look. I recreated all of the Galaga icons and made them more military looking and designed controllers for their video feed that were shaped like old school video game controllers.
Adobe: What do you like about working with After Effects CC?
Adobe: How are you starting to work with augmented and virtual reality?
Hansen: There are a few companies that have reached out to me. One that is really intriguing is called Meta. They are being super ambitious, wanting to do transparent augmented reality, which is a lot harder than virtual reality. They want to make it possible to reach out and grab digital data and move it around. It’s a lot like how we’ve been designing stuff for Iron Man and Ender’s Game, but they’re looking for real-life applications.
I’m consulting with their design teams, doing concepts and mock ups in After Effects and Cinema 4D that their development team can replicate. Digital holograms are going to be a big new thing. Just imagine a doctor having access to a patient’s heart rate and other digital data without using a screen or needing to touch anything. It’s very surreal.
Leading design and effects house anticipates 10x improvement in RAM preview times with latest release of Adobe Creative Cloud
For the talented team at Cantina Creative, hanging out with superheroes or spending time immersed in future worlds is just a typical day at the office. The studio produces monitor replacements, matte paintings, heads-up displays, set extension composites, and other amazing visual effects for some of the world’s most action-packed science fiction and comic book-based films. Working for companies such as Marvel and Lionsgate, Cantina Creative’s animation and visual effects veteran and Co-Owner Stephen Lawes regularly applies his visual storytelling expertise to blockbuster feature films.
Adobe: Can you tell us about some of your recent blockbuster projects?
Lawes: It’s been a busy year. We worked on The Hunger Games: Mockingjay – Part 1 and we’re currently working on The Hunger Games: Mockingjay – Part 2. For Avengers: Age of Ultron we did a lot of HUDs, monitor graphics, and comps. Another really interesting project for us was Furious Seven. We also completed 23 design and graphics shots for Pixels.
Adobe: Is this work being done using Adobe Creative Cloud?
Lawes: Absolutely, we use Adobe After Effects CC, Illustrator CC, and Photoshop CC on nearly every shot. We use them for everything from initial design and animation of graphics to on-set and post material. We’re also doing more holographic displays, too, using a combination of After Effects and CINEMA 4D.
Adobe: How do you see the latest release of After Effects CC impacting your workflow?
Lawes: We have a rig that we built and have refined over the years since our work on Iron Man 2. For Avengers: Age of Ultron we gave it a refresh to make it simpler and faster so it is easier for artists to animate and navigate. One of the most complex rig shots we had revolved around the Mark 44 suit. It’s like a suit within a suit, so the comps were very heavy and it was hard to navigate around them and even animate simple things.
I tested this shot with the redesigned rig and saw incredible performance improvements. With 32-bit comps, we saw a three-fold speed increase playing back and reviewing content, and with 8-bit comps it was 13 times faster. That’s really a game-changer for us. It is a 62-frame shot that previously took 39 minutes to RAM preview, so we would just do it in small chunks. The latest release of After Effects CC reduced that time to just three minutes, which is just incredible. That kind of speed will be insanely good on the next projects.
Beyond refining the rig, we’re also doing a lot more work that requires a combination of CINEMA 4D and After Effects CC as we move into doing more stereo, 3D work.
Adobe: What type of work did you do for the Hunger Games films?
Lawes: These projects were interesting because we don’t usually do on-set playback graphics, but we were lucky enough to come onboard early and worked with the Director and Production Designer to create graphics that displayed on monitors as they were shooting, rather than shooting with blue or green screens and replacing the monitors in post.
In Mockingjay – Part 1, District 13 has a giant, 40-foot wide monitor that provides most of the light in the shot. We did the playback graphics on that screen to generate the correct color and lighting rather than flooding the scene with blue or green. In post, we replaced what was there with design material and still ended up doing a lot of rotoscoping using After Effects, but it gave us a better lighting scenario to start with.
Adobe: How was the Furious Seven project?
Lawes: We started off doing design for monitor graphics. We came into the production early, so we started influencing some of the storytelling elements of the film. We created content that they could use for bridging the edits. One of the big storytelling elements we helped create was the God’s Eye, which is a surveillance device that characters in the movie use to locate people and wrestle for control.
Adobe: From an insider’s perspective, what do you think of the increasing use of visual effects in film?
Lawes: Technology has been changing the way people tell stories in film for a long time. It influences how we all approach a story artistically and creatively but I think one of the biggest challenges going forward is how to use this technology wisely. Now that we can pretty much create anything with incredible detail and realism in CG, filmmakers shouldn’t get lazy and use it as a crutch to fix story lines. Sometimes the best approach involves a lack of money. It forces you to be more creative.
Adobe: What is next for you and Cantina Creative?
Lawes: We’re always pushing ourselves to evolve as a studio, so in between projects we work on internal ideas as a way to test out design and story concepts. Ideally, these concepts incorporate all the aspects of our studio, including editorial, visual effects, and color correction. Ultimately, these ideas will hopefully benefit a movie project in the future, or take on a life of their own.
Visual effects studio designs screens and graphics for blockbuster films using Adobe Creative Cloud
Territory Studios enjoys its reputation for being able to handle nearly any project that comes its way. With expertise in branding, motion, and digital, the studio works on a range of projects including feature films, brand work, and popular video games. After completing graphic and screen design for Marvel’s Guardians of the Galaxy and Avengers: Age of Ultron, Territory worked on Hitman: Agent 47, applying its design expertise across screen graphics and UI, VFX, logo, and titles. David Sheldon-Hicks, Creative Director and Co-Founder of Territory, appreciates being able to work with a talented team that regularly pushes the limits of creativity with help from Adobe Creative Cloud.
Adobe: How did Territory Studios get started?
Sheldon-Hicks: After getting my start doing computer screen graphics for Casino Royale and The Dark Knight, I met my two business partners, Lee and Nick. We decided that instead of working for other companies, we wanted to form our own studio. We pitched Electronic Arts for a project producing the opening 90-second cinematic for the game Medal of Honor, won the job, and got our first monetary investment.
We worked on video games and brand work for about a year, then one day we got a call from the art department working on Prometheus. Ridley Scott was doing a prequel to Alien and wanted us to be involved in the screen graphics for the film. It was obviously an amazing opportunity. As graphic designers we were huge fans of the title sequence and graphics in Alien. That project lasted a year and was a turning point for us in terms of producing on-set screen graphics, often with a 3D or holographic feel, and user interfaces for big name directors and films.
Adobe: Tell us about your toolset.
Sheldon-Hicks: The backbone of the work we do is with Creative Cloud. We’re all designers, and Adobe apps are the first tools you learn as a designer, then you augment with everything else. We work with CINEMA 4D for our 3D pipeline and have one license of Nuke, but we’ve found that we can do almost all of our compositing in Adobe After Effects CC.
The tight integration between CINEMA 4D and After Effects lets our team experiment more, and with the perfectly exported cameras and lights, we can do more without going back to the main 3D app. For example, we can swap backgrounds easily, test ideas out quicker, and with 3D alignment we can position our graphic elements perfectly.
For editing we were using both Adobe Premiere Pro CC and Final Cut Pro, but now we’re primarily using Premiere Pro, and occasionally Avid if that’s what our clients want. Not having to transcode footage in Premiere Pro makes for quick work when we’re putting together rip-o-matics that use lots of different sources, or when we’re bringing in rushes from a camera card and need to turn things around quickly.
One of the main workflow improvements of upgrading to Creative Cloud is how fast we can now import vectors and get them prepped for animation. The vector-to-shape-layer option has saved us lots of time; previously we would painstakingly re-draw the Illustrator files as shape layers.
Adobe: How did you get involved in the Hitman: Agent 47 project?
Sheldon-Hicks: Charlie Woods, the production designer we worked with on both Marvel Studio films, Guardians of the Galaxy and Avengers: Age of Ultron, recommended us to the producer who was looking for a team to take on screen graphics in post. We met with the editing team in London and really hit it off, especially with the Editor Nicolas de Toth.
Nick and his team were trying to solve a number of narrative challenges with the story, and needed help adding some graphics and user interface elements to support parts of the story that weren’t coming across in dialog or action sequences. We helped them figure out some of the narrative points and pull together some sequences.
Adobe: How did your role on the film evolve?
Sheldon-Hicks: When Nick realized how well we worked together building a narrative, he asked for help on the title sequence. We suggested including some backstory to give texture and history to the film before launching into the main action. Nick created an idea and we worked it up as an animatic with live action and a design treatment that included beautiful typography, creative compositing, and grading done in After Effects.
After a successful test screening, Nick got the green light to direct a second unit shoot. It was stunning, very filmic, moody, and impressionistic without giving away too much. We did all of the graphic design and integrated it into the title sequence.
Adobe: Did you help solve any other challenges on the film?
Sheldon-Hicks: During many scenes in the movie [SPOILER] the viewer sees what’s happening through the agents’ eyes. We needed a creative treatment on that footage to make it obvious to the audience that the viewpoint had changed. We used After Effects and various blurs to generate a particular look that was applied across over 40 shots. This project was great for us because we touched more elements than in any other film we’ve worked on. We were close to the storytelling and the people running the project and defining the narrative. We had a great time and want to find more projects like this one in the future.
Adobe: What other projects have you really enjoyed working on?
Sheldon-Hicks: We worked on The Martian with Ridley Scott. The film features a lot of highly detailed, story-driven screen graphics, and we worked very closely with NASA to get accurate information for all the screens. Ridley is the first director we worked with, and his creative direction of our work was formative in shaping our own approach. He’s an inspiration to us because he really values the role that screen graphics can play as strong narrative devices, and we’re incredibly thrilled to be working with him a second time.
For Guardians of the Galaxy we got a lot of inspiration from the creatives on the film set, including costume design, set decoration, and concept art. We created styleframe iterations in Photoshop and Illustrator, shared graphics using Creative Cloud Libraries to make sure they were consistent across multiple screens, and collected the typography to design the overall language. Next, we animated in After Effects, comped everything, and then moved between After Effects and CINEMA 4D to keep a quick, tight workflow so we could deliver the graphics in front of the actors and directors.
The graphics for Avengers: Age of Ultron were based around director Joss Whedon’s vision for a grittier, more human story, so our concepts were based on the characters’ lives and interests, as well as on their superhero efforts and collaborations. It needed to feel grittier than the original Avengers, so we looked at real-world references and merged them with the Marvel comic book-based design work. We varied the color palette when we designed the screens for the different characters to give each of them their own look.
Adobe: What skills do you look for when hiring new motion designers?
Sheldon-Hicks: Our motion designers need to know Adobe Illustrator CC, Photoshop CC, and After Effects CC and they need to be able to design and art direct. Ideally, they will have a secondary skill, such as photography, stop-frame animation, sculpture, or drawing and illustration that they bring to the mix. Everyone on our team knows how to design and choreograph movement and how to create emotional connections that tell a story.
Adobe: What other apps do you use in Creative Cloud?
Sheldon-Hicks: We use Adobe Acrobat for all of our mood boards, presenting initial ideas to game companies or film directors, and creating style frames, storyboards, and presentations. It is probably our most used piece of software! I’m looking forward to working with Adobe Character Animator. It’s really clever, especially for getting ideas across to clients quickly. We’ve also seen some of the mobile apps and Adobe Photoshop Sketch and Adobe Brush CC look really interesting and I’m sure we’ll be using those in the future.
Adobe: How does Creative Cloud for teams benefit your studio?
Sheldon-Hicks: Creative Cloud for teams is an essential offering because we spend less time provisioning software when new team members join us and we don’t have to worry about assigning licenses on an individual basis. It is pain-free, adaptable, and always up to date, so our teams can collaborate and expand as needed without the software being a barrier. We can also figure Creative Cloud for teams into our yearly spend more effectively and strategically plan against our ambitions for the studio.
Adobe: How has your film work influenced work with other clients?
Sheldon-Hicks: The computer game manufacturers we work with see what we’re doing in film and want similar effects incorporated into their games. Similarly, the work we do producing high-quality digital experiences and telling stories in interesting ways in films is very relevant to brands such as technology companies and automotive manufacturers. As a result, we’ve expanded the services our studio offers to include product design and service design from a brand experience perspective.
Learn more about Adobe Creative Cloud
Visual effects studio enables filmmakers to achieve amazing motion control shots at a lower cost with help from Adobe Creative Cloud
Though he started his professional career touring Europe as a musician, today Patrik Forsberg is the Creative Director at Stiller Studios, a Swedish creative agency that focuses on intricate motion control work. The path he took to get where he is today was paved with skill, creativity, and bold decisions. He is now working on achieving his vision—producing precise motion control work that rivals what can be achieved in Hollywood blockbusters, at a fraction of the cost.
While the focus of Stiller Studios is narrow, Forsberg hopes that it can help democratize the creation of advanced shots typically only available for high-budget films. To capture these shots against almost any background—from 3D scenes to pre-shot stills and moving plates—Stiller Studios uses a range of equipment, proprietary tools, and off-the-shelf products, including the video apps in Adobe Creative Cloud.
Adobe: What led you to working in visual effects?
Forsberg: My career includes producing radio commercials, films and videos, visual effects and motion graphics, and live action on computer-generated imagery (CGI). Ultimately, I wanted to do something that has never been done before. I had a vision to build the best place in the world for live action on CGI with a moving camera. It required a lot of investment and learning to get started, but it’s amazing how far we’ve come!
Adobe: What equipment do you use in your studio?
Forsberg: We built our studio in 2007 around the same type of equipment used on Harry Potter and Quantum of Solace. The first unit we bought was a Cyclops motion control rig from Mark Roberts Motion Control. It weighs 4.6 tons, reaches up to six meters, and is the most exact motion control available. We also have a six-axis Motion Base, a moving platform that can carry up to one ton, and a high-speed motion control rig called Bolt.
We use plenty of proprietary software and workflows to help us reach our goal: to be world leaders in live action on computer generated images with a moving camera and live on set previsualization from anywhere.
Bolt and Cyclops (L) Motion Base (R)
Adobe: With all of the specialized gear, how important is planning in your process?
Forsberg: Previz is critical to every project. We start off with a storyboard and talk to the director and VFX director about what they want to achieve. Then we produce a 3D previz and 3D setups, put things together, check the pace, and see if it looks right. Next, we start setting up everything for the studio.
Adobe: How do you combine the physical equipment with the software?
Forsberg: We see the studio as an add-on to the 3D or finishing program. Instead of sending things out to the compositing software and 3D program, we think of the studio as a plug-in to it. If we need another layer and that layer will be live action, we think about how it should be shot in order to be pixel on pixel when we put it on the CGI content, so everything is lined up perfectly and looks as realistic as possible.
We’re working hard at getting it perfect, and we’ve got it down to 0.0014 degrees, which is extremely precise. We don’t need tracking markers and we don’t need to do any post tracking. We can just go in and shoot, using virtual sets as if they are real. Actors see where they are, and the directors and producers see a live comp with a moving camera.
Actual Shot (L) On set (R)
Adobe: What software do you use?
Forsberg: Our whole system is made of off-the-shelf products and proprietary software. We use Maya, 3D Studio Max, Motion Builder, Unreal, QTake, Flair, Nuke, Adobe Premiere Pro CC, and various compositing software including After Effects CC.
Once we’ve shot everything it goes into compositing software, such as Premiere Pro or After Effects, as different layers. Seconds after a shot, artists can make sure the light is coming from the right direction and everything is set up right and then start working on it. We like that our artists can focus on making shots beautiful rather than fixing problems.
Adobe: What is your core value proposition for filmmakers?
Forsberg: We produce live action on computer generated images and deliver pixel on pixel for the artist in the end. We shoot technical stuff in a way that makes sense to people who are more traditional storytellers and filmmakers. Instead of getting one or two shots a day, which is typical with this type of work, we can accomplish the fifteen or sixteen shots a day that you get on feature films or commercials.
We’re the only place in the world where you get all the data and film layers on top of the 3D or pre-produced layers, seconds after you shoot. It’s a narrow technique used mainly on blockbusters as well as some big TV series. We’ve built a workflow that is available, not super expensive, and fits in with European or Swedish film budgets. We’re making it possible to shoot scenes that look as cool as big American blockbuster movies.
In an ordinary green screen, motion control shoot, filmmakers don’t see an edit until three to five days later. If they’ve done something wrong, it’s too late to change it. It’s important to be able to see what we have seconds after a shot, especially for people who don’t regularly work with visual effects. They can get a feeling for what it’s going to be and see that it is going to work. It makes things much faster and reduces the amount of content we shoot that never makes it on screen.
Actual Shot (L) On Set (R)
Adobe: Why is a workflow like this needed?
Forsberg: If you look at blockbusters for the past 10 years, somewhere between 90% and 95% have used techniques like this, but in inefficient and expensive ways. Just setting up a workflow like ours is $600,000 to $700,000. We wanted to do something that worked for European budgets, where we can deliver a setup that lets filmmakers shoot really cool shots for considerably less cost.
If our setup works for you, there’s no way you’re going to do it more efficiently anywhere. On feature films we’re saving 70% of manpower costs in post-production by having a computerized system deliver everything as a pre-comp in the compositing software of choice, including After Effects and Premiere Pro. Everything is aligned and set up, and artists can go straight in and get to work.
Adobe: What’s next for Stiller Studios?
Forsberg: We did a couple of feature films in the early 2010s. We’ve also produced some of our own technical films showcasing our high-speed motion control workflow. One hit 10 million downloads! With more and more people seeing what we can do, we’re getting a lot of interesting propositions.
We’ll likely do some work with American and British filmmakers in our studio in Stockholm. Everyone wants to know if we can build another system somewhere else. It is possible to duplicate; it’s just a matter of getting the right hardware and implementing all of the software knowledge we’ve gained.
Learn more about Adobe Creative Cloud
Andrew Kramer started creating Adobe After Effects tutorials for fun, with no idea that his hobby would lead to an enormously successful and rewarding career. After starting his Video Copilot website when he was just 20 years old, he has become a highly respected professional in the visual effects and motion graphics industry. In addition to creating software and tools for professional designers, he also works in the film industry. No matter what he’s working on, he always finds time to train and inspire others in the community to realize their creative potential.
Andrew Kramer likes staying busy, and this past year was no exception. In addition to releasing a new 3D plug-in for Adobe After Effects CC, he’s been working with Bad Robot on a couple of new, top secret projects. He also created a new city destruction tutorial that highlights the use of 3D Camera Tracker in After Effects.
For the tutorial, he shot HD aerial footage of downtown Los Angeles and broke up the city as if there were some type of monster invasion. He used the 3D Camera Tracker in After Effects to track the scene, identify the track points, and then place objects and layers in the 3D space. In one scene there is a hole punched into a skyscraper that shows the inside levels of the building in a completely photorealistic way.
After creating this tutorial, Kramer wanted to explore what it would be like to use this same effect on a human. He filmed an actor and used the same 3D tracking on his face. The tracker analyzed the geometry of the face the same way it would a landscape, added track points, and let him create the camera position for the compositing.
“We’re trying to show tutorials that have deeper uses,” says Kramer. “Our city destruction tutorial shows an innovative way to use the 3D Camera Tracker in After Effects to create a popular effect. There are so many different things you can create once you have a good track on a scene or even on a person.”
To see more of Kramer’s work with the 3D Camera Tracker in After Effects, don’t miss his presentation at the 2015 NAB Show entitled “After Effects CC: Motion Tracking the Impossible” in the Adobe theater on Tuesday, April 14th at 2:00 pm.
Watch Andrew Kramer’s presentation at NAB 2014.
Talented artist uses Adobe After Effects to create fitting typographic animation for parody video
On Tuesday, July 15, 2014, the most shared video on YouTube and Facebook was “Weird Al” Yankovic’s “Word Crimes,” a parody of Robin Thicke’s popular “Blurred Lines” single. With more than 12 million YouTube views and climbing, the song is both clever and catchy. But what really brings it to life is the video’s impressive typographic animation. Jarrett Heather, a software developer with the California Department of Food and Agriculture, spent 500 hours over three months working with Al Yankovic on the project, which relies heavily on Adobe After Effects, Photoshop, and Illustrator.
Production company creates immersive experience for well-known DJ artist at art and music festival using Adobe Creative Cloud
Plastic Reality is a production company known for branding and other video work for big corporate clients such as BP and Unilever. But unlike most corporate video companies, Plastic Reality has a wild side, called The Happiness Labs, focused on producing experiential content and graphics for live events and installations.
Seamless visual effects for “The Wolf of Wall Street” created with help from Adobe After Effects CC and Adobe Photoshop CC
Paul and Christina Graff of Crazy Horse Effects (CHE) are visual effects aficionados, with projects to their credit such as There Will Be Blood and Life of Pi. They also work with a team of some of the best matte painters and designers in the visual effects industry, and are recognized for their award-winning compositing. They recently created some seamless visual effects for The Wolf of Wall Street, directed by Martin Scorsese, with Oscar-winning VFX supervisor Rob Legato overseeing the shots.
Adobe: How did you become involved with The Wolf of Wall Street?
Paul: I actually met Rob at a panel presenting outstanding work in VFX done in After Effects. We went to have a drink afterwards and he asked me about our new office in New York. We had worked on The Aviator and Shutter Island with him, and he thought we could help with some of the shots in The Wolf of Wall Street. We were stoked to reunite with Rob, and excited to work on the project, although we joined the team late in the game when most of the effects were already well underway.
Adobe: What type of work did he send your way?
Christina: We didn’t do any of the normal set extension work that we usually do. Instead, we focused on a lot of last-minute fixes and designed several sequences. We worked on a lot of quirky shots! We contributed to several corporate identity “videos,” a few driving scenes, and a longer sequence with the real Jordan Belfort at the end of the movie. Our work is really scattered throughout the movie.