Michelle Gallina – ProVideo Coalition https://www.provideocoalition.com A Filmtools Company Sun, 22 Dec 2024 14:12:06 +0000

Crafting Culinary Magic with Le Petit Chef and Adobe Creative Cloud https://www.provideocoalition.com/crafting-culinary-magic-with-le-petit-chef-and-adobe-creative-cloud/ Tue, 10 Dec 2024 13:47:03 +0000

Image Source: Filip Sterckx, co-founder and creative director at Le Petit Chef and Skullmapping.

Restaurants are typically known for their food, service, and ambiance, but Le Petit Chef, the smallest chef, takes a unique approach to the traditional dining experience by combining theater and culinary arts to create an immersive experience for the patron.

The show, available in more than 70 restaurant locations around the world, projects a story on your dining table and features a 6-centimeter-tall animated French chef preparing a four-course meal. As you watch the charming chef travel to different places to create each course — like a summer garden to pick fresh vegetables or the deep sea for fresh fish — and plate the dish, the restaurant brings the actual meal to the table for you to enjoy.

Filip Sterckx, co-founder and creative director at Le Petit Chef and Skullmapping, relied on Adobe Creative Cloud to edit the show and craft intricate animations that complement the chef’s creativity.

Read below for more insights into Sterckx’s unique editing approach.

How and where did you first learn to edit?

I studied animation in Brussels about 20 years ago. The university used Adobe Premiere Pro for editing and for capturing animations frame by frame. Over the years, I directed and edited several different projects, from short films and music videos to projection mapping projects. Le Petit Chef doesn’t require a lot of editing since we treat the table we project onto like a stage. Most Le Petit Chef animations unfold more like a theater play.

How do you begin a project/set up your workspace?

The experience we create with Le Petit Chef is very different from traditional AV projects, but we follow similar steps to those in a standard animation pipeline. This process includes brainstorming concepts, scriptwriting, storyboarding, and creating 3D pre-visualizations in Cinema 4D. In our studio, we have a demo table set up with a projector mounted above it. Once I complete the pre-visualizations, I project them onto the plate to get a sense of what the actual experience will look like. Watching an animation on your screen is very different from viewing it projected on a plate, so I always make sure to evaluate each step of the animation while seated at the table.

Tell us about a favorite scene or moment from this project and why it stands out to you.

So far, we’ve created six different Le Petit Chef shows. It’s hard for me to choose my favorite scene or moment because we try to ensure that each show has its own identity. I would say the best moment is launching the show and seeing our animations merge with the creativity the chefs bring to the table (and trying the food)! We work for about one year on a Le Petit Chef show and focus on the audiovisual side. However, once it’s installed at an actual restaurant and combined with the service and food, then it truly becomes a multisensory experience.

Image Source: Filip Sterckx, co-founder and creative director at Le Petit Chef and Skullmapping.

What were some specific post-production challenges you faced that were unique to your project? How did you go about solving them?

As mentioned above, we treat the Le Petit Chef animation on the table as if it were a stage. Most of our 3D scenes are designed as a single, continuous ‘shot’ that can last up to three minutes. Our 3D scenes can become quite complex, containing multiple animations and models, which makes rendering them time-consuming. Effective organization is also crucial, as any changes to a scene require adjustments to everything that follows since we can’t cut to another shot. Therefore, I always aim to complete any 3D work well before the launch, allowing enough time for rendering, compositing, and sound design.

Image Source: Filip Sterckx, co-founder and creative director at Le Petit Chef and Skullmapping.

What Adobe tools did you use on this project and why did you originally choose them? Were there any other third-party tools that helped enhance your workflow?

Generally, for Le Petit Chef, we use Cinema 4D for pre-visualizations, 3D animation, and rendering; Adobe Photoshop for texturing; Adobe After Effects for compositing; and Premiere Pro for sound design. There is other software out there for sound design, but I learned to do it in Premiere Pro in college and have stuck with it ever since. 😉

Who is your creative inspiration and why?

Because of my background in animation, I mainly draw inspiration from film, and animated film in particular. I love Michel Gondry’s work. I’m also a big fan of Megaforce, a group of French directors mainly known for their super creative music videos and ads.

People working in projection mapping come from diverse backgrounds such as coding, engineering, theater, graphic design, etc. I think it’s quite amazing how the entertainment company Freckled Sky combines performance and projection mapping. Particle Ink has a new show in Vegas, which I would love to see. Also, Sila Sveta is probably the most established and impressive company in this field.

What’s the toughest thing you’ve had to face in your career and how did you overcome it? What advice do you have for aspiring filmmakers or content creators?

I think the business side of things is definitely the most challenging. When we created our first Le Petit Chef animation in 2015, we projected it onto a table, invited some friends over and filmed their reactions. We didn’t expect much, but the video went viral, and several restaurants and hotels from all over the world approached us asking to use our ‘product’ at their venue. As a creative person, I was initially clueless about contracts, sales and distribution. My recommendation is to seek good legal counsel if your project starts to gain traction. While lawyers can be expensive, they can save you a lot of headache and money in the long run.

Share a photo of where you work. What’s your favorite thing about your workspace and why?

Our studio is relatively spacious! We have everything we need to develop our projects in-house, like a sound booth and a space to test out our projection setups.

Image Source: Filip Sterckx, co-founder and creative director at Le Petit Chef and Skullmapping.

Unlock Your Creative Potential with Free Video Editing and Motion Design Training https://www.provideocoalition.com/unlock-your-creative-potential-with-free-video-editing-and-motion-design-training/ Fri, 18 Oct 2024 23:21:46 +0000


If you’re ready to level up your creative skills and master Adobe’s industry-standard software like Adobe Premiere Pro and Adobe After Effects, AdobeVideoTraining.com is your go-to resource. Sponsored by Adobe, this new free online training platform is tailored for everyone, from beginners to seasoned pros, offering expertly designed courses led by industry professionals and Adobe-certified instructors.

Learn at Any Level

Whether you’re just starting your video editing journey or looking to master complex workflows, you’ll find a course that suits your needs. Beginner-friendly training videos break down the fundamentals of editing and motion design in easy-to-understand steps. As you progress, advanced lessons introduce more intricate features, such as Premiere Productions and After Effects integration, so that you can take your projects to new levels.

In addition to training to help you master creative techniques, the platform also offers certification prep courses for Premiere Pro and After Effects to help you work toward becoming an Adobe Certified Professional. Earning this industry-recognized credential not only sharpens your skills but also boosts your professional credibility.

Hands-on Learning with Real-World Impact

What sets AdobeVideoTraining.com apart is its focus on practical application. Each course integrates real-world projects, enabling you to immediately apply what you’ve learned. The platform provides learning materials and downloadable assets, so you can follow along with instructors and achieve the same results in your projects.

Unlock Insider Tips from Industry Experts

Even experienced editors and motion designers can miss out on some of the hidden features Adobe video products offer. AdobeVideoTraining.com goes beyond the basics, providing exclusive tips and tricks from certified instructors who know the software inside and out.

From pro editing techniques that streamline workflows to advanced shortcuts that save time and increase productivity, you’ll learn how to work more efficiently. The platform also highlights how to seamlessly integrate Adobe apps, allowing you to move fluidly between Premiere Pro, After Effects, and other Adobe tools for a more cohesive post-production workflow.

Flexible Learning That Fits Your Schedule

One of the standout features of AdobeVideoTraining.com is its on-demand access. Life is busy, and it’s not always easy to stick to a set schedule. With this free training platform, you can learn at your own pace, whenever it’s convenient for you. Whether you prefer learning in the morning or late at night, AdobeVideoTraining.com gives you the flexibility to advance your skills at your own speed.

Start Your Learning Journey Today

AdobeVideoTraining.com is a game-changer for anyone serious about improving their video editing and motion design skills. With expert instruction from industry professionals and Adobe-certified instructors, hands-on projects, and a wealth of insider knowledge, you’ll gain the tools and confidence to become an expert in Adobe video software.

Whether you’re editing content for social media, corporate projects, or film, this free platform gives you the knowledge, skills, and certification to succeed. Ready to unlock your creative potential? Visit AdobeVideoTraining.com and start your journey today!

Cantina Creative on Designing HUDs for Movies and Real Life https://www.provideocoalition.com/cantina-creative-on-designing-huds-for-movies-and-real-life/ Tue, 28 May 2024 14:00:16 +0000

When the design team at Cantina Creative started visual effects work on Iron Man 2, they had no idea that their heads-up display (HUD) for Tony Stark’s Iron Man would become iconic. The visual arts studio became one of the industry’s go-to agencies for futuristic user interfaces, computer monitor graphics, and HUDs found in some of the world’s biggest superhero, action, and science fiction films.

As recreational drones, augmented reality apps, and electronic displays in cars gained popularity in the years following the movie’s release, Cantina Creative’s HUDs became the standard that many designers and consumers expected. The studio is increasingly taking its artistic storytelling approach into the real world to help companies create intuitive and engaging displays for users.

I spoke with co-founders Sean Cushing and Stephen Lawes about how they use Adobe After Effects and other Adobe Creative Cloud apps to bring interfaces to life on screen and in real life.

Cantina Creative’s Sean Cushing (L), Executive Producer & Co-Founder, and Stephen Lawes (R), Creative Director & Co-Founder. Image sources: Sean Cushing and Stephen Lawes.

Tell us about some of the work that Cantina Creative is known for.

Sean: We’re known for a lot of big action, science fiction, and fantasy movies and television shows: “Avatar,” “The Hunger Games,” “Aquaman,” and the Marvel Cinematic Universe. One project that I really loved working on was the Ryan Reynolds movie, “Free Guy.” The idea is that he’s in a video game, so when he wears special glasses, he can see the video game world on top of the real world. The HUD becomes a really instrumental part of conveying his emotions and experience and is an added element of humor. It goes beyond information and shares new layers of the story.

Stephen: Iron Man’s HUD really changed things for us. You don’t see a lot of truly memorable motion graphics in films, but that HUD became something that everyone could instantly recognize. What amazed us is how it started to influence the real world in terms of design ethos. We started to get job requests from companies who loved what we did in the movies and wanted to bring it into their products.

We’re now engaging with aerospace companies to develop UIs and HUDs for future tech. And we’re just starting to work with Apple Vision Pro goggles to add our design sense to those applications.

The instantly recognizable “Iron Man” heads-up display. Image source: Cantina Creative.

What makes your approach to HUDs stand out?

Stephen: For us, every single HUD needs to tell a story. You can see it best in “Iron Man” and how the HUD evolves throughout the movies. We start by analyzing the character, who they are, what they want, and what their world is like. Then we try to convey all of that visually, particularly paying attention to design beats that will connect audiences to that character. “Iron Man” has this little boot-up moment when Tony Stark puts on his suit to let the audiences feel like, this is it, now Iron Man is here.

We update the HUD for every Iron Man appearance in a film. It starts by looking at the suit itself. How did it change from the previous movie? What are its new capabilities? How would the HUD need to evolve for Tony Stark to control it?

Iron Man’s XLVI suit. Image source: Cantina Creative.

Does that storytelling-based design philosophy hold true for real-world HUD designs?

Stephen: Absolutely. It’s really a back-and-forth conversation because when we work on movies, our goal is to elevate very grounded scenarios. But then our creativity inspires what real-world clients are looking for. You start to see those direct design connections.

Sean: There are big differences as well. In movies, we’ll use cinematic tricks like depth of field, but in reality, adding too much dimensionality to displays can be really distracting and confusing for the human brain. We focus more on how we can add dynamic elements to 2D designs.

What does a typical HUD design workflow look like?

Stephen: It always starts with sketches. That might mean pencil and paper, Adobe Illustrator, or even drawing over a photograph in Adobe Photoshop. For movies and TV shows, we often get imagery from the production design team, so we can start to expand on those foundations to find our visual language.

Once we get a general design figured out, we’ll start doing animation tests in Adobe After Effects to see how it will work with the aesthetics of the production and the character. What I like about After Effects is that it lends itself to a very fast turnaround with iterative processes. Since it’s layer-based, you can really mess around and jump between different ideas without restrictions.

We also use a lot of Adobe Substance 3D Collection for any CG assets. For the television series “The Lord of the Rings: The Rings of Power,” we used Substance for all of the texturing and shading when we created the maps that the characters follow. Best of all, we can bring 3D models directly into After Effects and do all of the shading and rendering in the same application. It skips a stage and makes everything faster.

An “Iron Man” hologram. Image source: Cantina Creative.

What are some of your favorite new Adobe capabilities?

Stephen: Generative Fill in Photoshop is a game-changer. Oftentimes we’ll need to make a clean plate for a shot, and it can take us hours to erase objects from the foreground to get what we need. Generative Fill turns that into a five-minute task. That’s a real lifesaver, and it gives us back so much time to focus on what we’re doing on screen.

What do you think the future of technology and design looks like?

Sean: With recent advancements in AI-powered virtual assistants that can respond to voice commands, images, and video, we may soon be applying our experience in that area to real-world projects. We’ve created virtual assistants for Netflix’s “Atlas” and many HUDs in the Marvel Cinematic Universe, so why not real life?

I also think augmented reality will become much more commonplace in daily life. We’ll start to ask how we lived without it, like a smartphone. I just hope that it’s interesting and enhances our lives and isn’t defined by just one company.

Having said that, I think AR designs are a bit too austere right now. We push animations to entertain and draw visual interest in movies, but it’s often seen as superfluous in real life. There’s a way to use dynamic animations to bring interfaces to life without adding too much complexity, and I hope we’ll start to see that in the future. That’s another place where I think After Effects will really help people push the limits of creativity.

Learn more about Cantina Creative at their website.

Preserving Gene Roddenberry’s legacy and history of “Star Trek” https://www.provideocoalition.com/preserving-gene-roddenberrys-legacy-and-history-of-star-trek/ Thu, 21 Mar 2024 17:20:52 +0000

Image Source: The Roddenberry Archive.

The Roddenberry Archive is a collaboration between OTOY, The Roddenberry Estate, and iconic “Star Trek” artists Denise Okuda, Mike Okuda, and Daren Dochterman to preserve Gene Roddenberry’s legacy in all media, but especially the history and cultural contribution of all eras of “Star Trek”. In a multi-pronged approach, the team is documenting “Star Trek” through interviews with cast and crew members, 3D scanning original props, costumes, and sets, making walk-through digital recreations of the sets from the series and movies, and producing documentary featurettes.

VAD supervisor Mark Spatny, senior hard surface artist Donny Versiga, and motion graphics artist Andrew Jarvis, are a few of the members creating this experience. Check out how they are using Adobe Creative Cloud to keep Gene’s legacy alive.

You can check out this project and immerse yourself in the world of “Star Trek” here.

How did you first get into VFX design? What drew you to it?

Spatny: I started working in live theater in the late ‘80s in Los Angeles as a set and lighting designer, and in the early ‘90s began using architectural previz software to design my scenery. I soon found that I enjoyed doing computer graphics as much as the physical theater production, so I transitioned into visual effects where I can combine all those skills. I have since worked on more than 50 TV series and films on-set and in post-production as a VFX supervisor and producer, specializing in both action/stunt/superhero VFX and complex set extensions.

Versiga: I got my start back in 2000 when I was 16 years old, creating my own custom levels in first-person video games like “Doom” and “Dark Forces”. At first it was just rearranging pre-existing assets in a level map, but soon I learned the basics of modeling, texturing, and lighting to achieve more customized results. After a year or so of being active in the video game modding community, I realized that I could merge this newfound hobby with my lifelong love of “Star Trek” by bringing to life my favorite “Trek” sets in a 3D digital world via real-time rendering.

Jarvis: I really took the long way around. I graduated with a degree in philosophy and psychology and started working as a cashier at a coffee shop doing web design jobs on the side. But each job was a little better than the last, and in my free time I was learning Adobe After Effects, 3D animation and ZBrush sculpting. One day a friend of mine met someone who did screen graphics for film and television, he gave me their number, and I never looked back.

What was the inspiration behind the VFX work on this project? What were you trying to achieve?

Spatny: Space-themed TV shows have always been my passion ever since I was a kid — “Star Trek”, “Battlestar Galactica”, “Lost in Space”, “Stargate”, “Doctor Who”, and anime shows like “Space Battleship Yamato” and “Macross”. In 2018, as a member of the Television Academy board, I helped champion giving the “Star Trek” franchise the Academy’s Governors Award Emmy. So, I’m super excited to finally be working in that genre, helping preserve the history of a show I’ve loved for 50 years.

Versiga: My goal on the Roddenberry Archive is not simply to create CG replicas of the sets and props of “Star Trek”, but to immerse end users in a highly detailed virtual tour of some of the most beloved ships and settings of the fictional universe. I want users to feel like they are actually stepping aboard the “Enterprise” from the comfort of their digital devices, in awe of the degree of accuracy.

Image Source: The Roddenberry Archive

What Adobe tools did you use on this project and why did you originally choose them?

Spatny: Premiere Pro is a key editing platform for our documentaries and concept videos. Since we’re dealing with footage transferred to video across many eras and standards, we depend on Premiere Pro’s ability to easily handle a variety of video formats and standards natively.

Versiga: I am a huge fan of the Adobe Substance 3D suite of tools, particularly Substance Painter. I have used the tool for nearly a decade now to add more fidelity and customization to the materials I apply to my models. I originally started using it during the interview process for a job in the video game industry because I recognized that the biggest weakness in my skill set at that point was texturing. I learned via the official tutorials on YouTube and fell in love with the incredibly powerful program. Since then, I’ve been an advocate of Painter in every job I’ve had, even giving tutorials and lessons to those new to the software.

Jarvis: Adobe Illustrator is where I spend my most time, but I also use Adobe Photoshop and After Effects. I’ve been using the Adobe suite as an FUI (fictional user interface) designer for nearly a decade now so it was the natural choice.

How did you begin this project? Can you talk about the collaborative process with the art department teams from the TV shows and films, and the process of creating your work from start to finish?

Spatny: The Archive is a decades-long passion project of Jules Urbach, the CEO of OTOY, and the Roddenberry Estate. The CG team currently consists of artists in six countries, some of whom previously worked on “Trek” series and games. They work in a variety of DCC tools, and “Star Trek” art department legends Denise and Michael Okuda and Daren Dochterman review and approve all the assets we recreate for authenticity. We work mostly from publicly available reference sources such as screen grabs from the series, blueprints and technical drawings that have been published, and photos from the internet. We also have teams that work with private collectors to scan original props and costumes, and we can scan actors in our Lightstage capture facility in Los Angeles, CA.

Eventually, the finished models are converted to ORBX files to render on the Render Network, a decentralized high-performance GPU computing platform. Real-time environments that users can explore are rendered on Amazon cloud servers and streamed to users’ local devices, such as the Apple Vision Pro, via OTOY’s X.IO streaming service.

Describe your favorite piece or component of the project. How did it come together and how did you achieve it?

Versiga: My favorite piece so far that I’ve created for the Roddenberry Archive would have to be the “Enterprise”-A bridge scene at the very end of “Star Trek IV: The Voyage Home”. Fans were only provided with a glimpse of this new bridge in the film, but we knew the set was a heavy redress of the bridge set seen in the preceding three films. However, the big unknown factors were the details of the hundred or so display graphics that adorned the bridge’s control consoles, of which only a handful were ever seen on screen. Fortunately, Mike Okuda was able to provide us with a copy of the original overview document of the displays that he created during his work on the film, which I used as a guide for creating our digital representations of each display in Adobe Photoshop and Illustrator. It was a huge honor for me to work on that project, as for the first time we were able to provide fans with an accurate representation of the new “Enterprise” bridge set from that film.

Image Source: The Roddenberry Archive

What were some specific challenges you faced? How did you go about solving them?

Spatny: Because our artists come from different backgrounds and industries, they use a variety of workflows and tools. One of my biggest challenges ahead is helping standardize procedures and simplify communication with the international team. That way, we can more easily scale the team and pivot our work to new platforms, using the best practices I’ve learned from the VFX companies I’ve worked with previously.

If you could share one tip about any or multiple Adobe tools you used, what would it be? (e.g. favorite Photoshop/AE/Premiere Pro “hack”)

Spatny: Every week we have a 3-hour art review session with Daren, Mike, and Denise, and sometimes the notes come in hot and fast. We record the sessions and then use Premiere Pro’s transcription tool, Speech to Text, to help gather and summarize the most important points to relay to our art team. One advantage of this over other transcription tools is that it works on my local machine, not in the cloud, so it has no security issues. That’s a huge plus for companies that depend on proprietary IP.

Versiga: When I first started using Adobe Substance Painter, I was excited at the ease of adding things like dirt and edge wear to my models’ textures, so much so that I overused it in my early days. I’ve since learned to make my application of those effects far more subtle. For instance, if I’m modeling a clean Starfleet prop, I’ll add very subtle roughness variation in the form of light scratches, smudges, and fingerprints to make them appear only slightly used and have a level of realism for up-close inspection. On the other hand, I’ll save the heavier rust, metallic edge wear, dirt, and grime effects for the Klingon props and sets.

Image Source: The Roddenberry Archive.

Who is your creative inspiration and why?

Spatny: I keep David Mack’s “Kabuki” comic books in my office. He’s a brilliant artist who sometimes uses wildly different art styles on every page of his comic. It’s a reminder that we always need to be innovating in both design and technical approaches in our industry. You can never be satisfied with doing the same thing the same way twice.

Jarvis: For this project, it’s got to be Michael Okuda. His dedication to the franchise and the consistent design style he brought to the screen graphics made “Star Trek” such a cohesive universe. His work really helped spark my interest in digital art as a boy. Decades later it would be my honor to collaborate with him on “Star Trek: Picard”, and now at The Roddenberry Archive.

What’s the toughest thing you’ve had to face in your career and how did you overcome it? What advice do you have for people aspiring to get into the VFX space?

Jarvis: Creating screen graphics for on-set playback can be a very stressful job. In many ways, it’s more difficult than post VFX because you have to be ready for the day of shooting instead of doing your work after the scene is shot. You’ll get conflicting feedback from multiple departments, day-of requests from set, and ridiculous deadlines. My advice to anyone who wants to get into VFX is to just do it. Don’t wait for someone to hire you, just start making things. The barriers to creating your own content have never been lower, and with free software you can create really polished work without a formal education. Put something out that shows off your style and expertise and don’t be afraid to reach out to people who are doing the kind of work you’d like to be doing.

Share a photo of where you work. What’s your favorite thing about your workspace and why?

Spatny: Working from my home office in L.A. full-time now is a huge improvement over filming on a frozen lake in Canada or at night in pouring rain in Wales. My favorite thing about my home office is my signed photo of the “Star Trek: The Next Generation” cast hanging over my desk.

Image Source: VAD Supervisor Mark Spatny

Versiga: My favorite thing about where I work is that it’s from home. Although I miss the social component of working in an office, there’s a very comforting quality to working from home, and I find that I’m able to maintain a greater degree of concentration while I work compared to an office environment.

Image Source: Senior Hard Surface Artist Donny Versiga

Jarvis: That would of course have to be my framed poster of Christopher Cushman’s iconic “Enterprise D” cutaway poster that hung on my bedroom wall all through grade school.

Image Source: Motion Graphics Artist Andrew Jarvis.

Original Source: The Adobe Blog

Creating futuristic UIs, HUDs, and languages for Netflix’s “The Mitchells vs. The Machines” with Adobe Creative Cloud – https://www.provideocoalition.com/creating-futuristic-uis-huds-and-languages-for-netflixs-the-mitchells-vs-the-machines-with-adobe-creative-cloud/ (Fri, 04 Feb 2022)

From Iron Man’s futuristic HUD to holographic maps in a galaxy far, far away, Jayse Hansen is the designer behind some of the most recognizable displays and interfaces in films today. Created with Adobe Creative Cloud apps, his iconic interfaces combine realism with futuristic flair. Hansen is best known for his work on live-action blockbusters, but in recent years he has also started to find a niche in animated films.

In the Netflix-exclusive animated film “The Mitchells vs. The Machines” – which earned a 97% approval rating on Rotten Tomatoes and numerous honors as the best animated film of 2021 – Hansen led the design for the displays used by the titular machines. The movie is a heartwarming sci-fi comedy that follows the Mitchell family as they learn to work together to save the world when AI robots revolt. Hansen designed the HUDs and interfaces used by the robots, including the film’s unique triangular robot language and custom fonts.

Working on “The Mitchells vs. The Machines” in Jayse Hansen’s home studio-theater using Adobe Illustrator.

Hansen spoke with us about his journey from movie fan to a go-to designer for iconic graphics in Hollywood films and beyond.

For “The Mitchells vs. The Machines,” you designed a unique language that only the AI robots can read. Can you walk us through that creative process?

Hansen: The human world of the Mitchell family that is seen early in the film is very painterly, muted, and organic, so we wanted to contrast that with a robot world made up of efficient straight lines, vibrant colors, and ultra-minimalistic design. Led by the amazing team of Lindsey Olivares (Production Designer), Toby Wilson (Art Director), and Mike Rianda (Director), I began with simple shapes like squares and triangles. I tried different iterations where I softened the shapes, formed the triangles into complex hexagons, or just used very simple lines. Early versions were more complex, and some were even inspired by QR codes.

An early concept of the robot language for “The Mitchells vs. The Machines”. Designed in Adobe Illustrator.

Because the language was going to be an important aspect of the film, Mike was especially excited about also using it as a coded way of planting hidden Easter eggs throughout the film. I designed about 30 different versions before the final decision. The challenge was making the font look simple, but at the same time highly advanced.

A still from “The Mitchells vs. The Machines” showing the robot user interface and language design. Designed in Adobe Illustrator. Animated and composited in Adobe After Effects.

We ultimately circled back to the idea of basing the language on only triangles and adding gradients to give the shapes more definition. It was a big hit with fans. There are numerous Reddit pages where people have translated everything from toaster UIs to street signs after scouring the movie frame by frame.

You did all of the design work with Adobe Creative Cloud. Tell us about your workflow.

Hansen: I like to start by just scribbling something awful looking on paper. Then I’ll go back and forth between Adobe Illustrator and pen and paper. I like this process because each medium puts me in a different headspace, so I get new ideas when I hop back and forth. It can be hard to get inspiration staring at a blank page, so sometimes I’ll even make some shapes in Illustrator, print them out, and then sketch over them.

Developing the Robot AI language for “The Mitchells vs. The Machines” in Adobe Illustrator.

Once we settled on a design direction, I created the final production-ready graphics in Illustrator and then pulled them into Adobe After Effects for styling. Some artists use Adobe Photoshop for this step, but I prefer After Effects. I can add more advanced treatments like glows and film grain, and it’s ready to animate when clients want to see it in motion. I’d usually do animated examples for key scenes and then hand those source files off to Michael Schroeder and the ultra-talented motion graphics team at Sony Pictures Animation to complete the final shots in the film.

What do you like about working with Adobe After Effects?

Hansen: First, I love the integration between all of the Adobe apps. It makes my life so much easier when I can bring complex Illustrator designs into After Effects as layers. Everything can be converted to shape layers, which adds another level to how I can animate my designs. I also make heavy use of scripts and expressions in After Effects. This allows me to customize my workflow perfectly to the project and how I like to work.
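After Effects expressions are written in a JavaScript-based language, so the kind of logic they apply to layer properties can be sketched in plain JavaScript. As an illustration only (this is not taken from Hansen’s project files), here is a standalone re-implementation of After Effects’ built-in `linear()` interpolation helper, which expressions commonly use to map time onto a property value:

```javascript
// Sketch of After Effects' linear() interpolation helper, re-implemented
// in plain JavaScript for illustration. Inside AE, an expression would
// call the built-in linear() on a layer property; the logic is the same.
function linear(t, tMin, tMax, value1, value2) {
  // Clamp the input to the mapping range.
  if (t <= tMin) return value1;
  if (t >= tMax) return value2;
  // Map t proportionally from [tMin, tMax] onto [value1, value2].
  const ratio = (t - tMin) / (tMax - tMin);
  return value1 + ratio * (value2 - value1);
}

// Example: fade a layer's opacity from 0% to 100% between 1s and 2s.
console.log(linear(1.5, 1, 2, 0, 100)); // 50
```

In an actual expression this mapping is evaluated per frame against the comp’s current time; the standalone version just makes the math explicit.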

A still from “The Mitchells vs. The Machines” showing the robot HUD design. Designed in Adobe Illustrator. Composited and animated for the final film in Adobe After Effects.

You’re an Adobe Creative Cloud veteran. Is there anything you’re looking forward to using more in the future?

Hansen: I’m working a lot more with Adobe XD for UI design, and I think it might replace Illustrator in my workflow. I’m excited about the new Multi-Frame Rendering capability in After Effects. It will make a big difference in terms of how fast I can preview and render comps.

The PAL User Interface Style Guide from “The Mitchells vs. The Machines”. Created with Adobe XD and Adobe InDesign.

I’m also really interested to find out what’s next for Frame.io now that it’s owned by Adobe. We used it recently on a feature film about AR contact lenses that I executive produced called “Sight: Extended”, and it was a life saver. But I am probably most excited about Adobe Substance 3D Modeler for VR. I currently concept my holographic UIs in VR using an Oculus Quest 2 and Varjo XR-3, so I’m really looking forward to seeing what Adobe brings to the VR space. Once you’ve designed in VR, it’s really hard to go back to a flat 2D screen.

How did you get started in the film industry?

Hansen: I actually remember the exact moment when I decided I wanted to work in film. I was around eight years old, and I loved making little models of “Star Wars” ships. I’d spend hours gluing and spray painting them to look ‘real’. Then one day, in a “Return of the Jedi Official Collectors Magazine” I saw a picture of the special effects team for “Star Wars” making the Tie-Fighter models used in the film. They were gluing them together and spray painting the wings. That’s when I realized that they were using small models to portray these huge ships. They were basically doing the same thing as me. In my 8-year-old mind, that meant that I too could make my own “Star Wars” movie!

A still from “Star Wars: Episode VII – The Force Awakens” showing Hansen’s design of R2-D2’s holographic map to Luke Skywalker. Designed in Adobe Illustrator. Animated and composited for the final shots in the film in Adobe After Effects.

Years later, when the cast from “Return of the Jedi” reunited for “Star Wars Episode VII – The Force Awakens”, I returned to my eight-year-old self, only this time, I was working at JJ Abrams’ Bad Robot with Andrew Kramer recreating the iconic opening “Star Wars” logo in Illustrator. I was kerning each letter for the title crawl in After Effects. I was designing and animating all the various R2-D2 holograms and Tie-Fighter displays.

Jayse Hansen at Bad Robot designing the “Star Wars” title crawl and R2-D2 holograms in Adobe Illustrator and Adobe After Effects.

I was in heaven. I realized my childhood dream had finally come full circle when I walked into a grocery store and saw my work in several spreads of the new “Star Wars Episode VII Official Collectors’ Magazine”.

You’ve primarily done work on live-action films. How are animation projects different?

Hansen: I never thought I’d do animated films, but I did some work for Disney for the Marvel and Star Wars franchises. When “Big Hero 6” entered production, I was pulled into that, and I loved it. The biggest difference between live-action and animated films is that with animation I tend to get involved much, much earlier in the process.

A still from “Big Hero 6” showing Hiro designing and analyzing hero suits for his robot Baymax and his pals. Designed in Adobe Illustrator.

They’re building worlds from scratch, so there’s an important period where everyone is gathered around experimenting and deciding on the look and design of the animated world. Everyone, from the director down, is an artist or designer, so they really understand the creative process and how to iterate and throw ideas against the wall. That’s what I really like about being involved in animated films: that sense that everyone involved is a true designer at heart.

Have you considered working in real-world UI design?

Hansen: Film is always my first passion. But for the past few years I’ve also been designing spatial holographic real-world UIs for the aerospace and defense industries. These ongoing projects include work on next-generation fighter jet cockpit and HUD displays, next-generation volumetric UIs for tanks, and spatial triage and sense-making UIs for CIA analysts. I’ve also designed highly transparent AR combat HUDs for multi-domain Special Forces operatives for SOCOM (US Special Operations Command). These are multi-mode HUDs for paratroopers in wingsuits, which then transition to water and ground modes focused on human-AI teaming with robot dogs and flying overwatch drones. Sometimes my real-world UIs end up far more futuristic and detailed than my film UIs. But I love the challenge of using design to help the front-line people who are keeping us all safe every day.

These real-world designs all tie directly back to my fictional work on films. For instance, in “The Hunger Games: Mockingjay,” I designed the hovercraft HUD display and main UI for the scene where the character Peeta is rescued based on a real-world Combat Search and Rescue (CSAR) dashboard I had previously designed for the Joint Task Force Space Command Center.

A still from “The Hunger Games: Mockingjay” showing the hovercraft team rescuing Peeta (Josh Hutcherson). Designed in Adobe Illustrator. Animated and composited in Adobe After Effects.

Similarly, for the upcoming “Top Gun: Maverick” movie (coming May 27, 2022), I teamed up with legendary designer-animators Gmunk, Dlew, Nick Lopardo, Toros Kose, and James Heredia. Because of my real-world fighter jet experience, I led the story-telling designs on the advanced fighter jet displays. I ended up working daily hand in hand with Lockheed Martin’s Skunkworks team to ensure the ultimate accuracy for every gauge, dial, and heat setting for critical moments in the film.

A still from “Top Gun: Maverick” with Tom Cruise again feeling “the need for speed.” In theaters May 27, 2022.

Check out more of Jayse Hansen’s work at jayse.tv.

Designing futuristic user experiences with Adobe After Effects – https://www.provideocoalition.com/user-experiences-adobe-after-effects/ (Wed, 01 Dec 2021)

Watch any big budget movie today and you’re likely to see a scene where characters gather around a screen to inspect 3D schematics or hack into secure systems with a handheld device. Among many of the biggest movie franchises, Perception is the go-to design firm for these futuristic user interfaces (FUIs). Perception combines extensive research and visual artistry in Adobe After Effects to create visuals that straddle the line between science and fiction. The company’s work is influencing the look of technology on Hollywood screens—and inspiring real-world companies to rethink how creativity and technology can combine to change the future of user experiences.

Marvel Studios “Iron Man 2” Tony Stark Phone

Twenty years of visualizing the future

Perception was founded in 2001 by Jeremy Lasky and Danny Gonzalez, both creative veterans from R/GA. During the first half of the company’s history, it became well-known for its trailblazing promos and event work seen on almost every major broadcast network. But Lasky and Gonzalez dreamed of taking their science fiction thinking to the big screen.

“What sets Perception apart is how we take the designs seriously to create a story and build out these fictional universes,” explains John LePore, principal and chief creative at Perception. “We aim to do more than just put blue lines on the screen. Our approach has been to really think about these worlds and figure out how technology might work within them.”

Perception has spent the last decade designing FUIs and titles for some of the most successful Hollywood films. The company’s projects have evolved from quick promos to much deeper engagements where Perception artists spend years building out a highly detailed fictional universe.

As Perception made a name for itself in the film world, the real world took notice. From smart TVs to automobiles, visual interfaces shape how people interact with technology, and many brands became interested in how Perception could apply its creative, thoughtful visuals to move today’s interfaces into the future.

“We realized that the influence goes both ways,” says LePore. “Directors want authentic technology in their films. And then real-world companies began calling us because they want their technology to look as cool as the things people see at the movies. That’s where Perception excels. We work in parallel realms of technical reality.”

Marvelous designs for big and small screens

In the Marvel Cinematic Universe, the name Tony Stark is synonymous with out-of-this-world technology. The way those technologies appear on screen needs to do more than look cool—it needs to tell the audience about the brain of Tony Stark himself. In “Iron Man 2”, many of those visuals were created by Perception, marking just the first in a decade of notable collaborations between Perception and Marvel Studios.

Marvel Studios “Iron Man 2” Tony Stark’s Phone Before VFX
Marvel Studios “Iron Man 2” Tony Stark’s Phone After VFX

Two of Perception’s most recent technology design and titles projects include “Black Widow” and “WandaVision.” In “Black Widow,” Perception worked closely with Marvel Studios and director Cate Shortland for 10 months to design much of the technology shown in the film, from multi-layer holographic maps to medical screens and handset controls displaying biometric interfaces. All of the technology design was done in After Effects.

“It makes a dramatic difference to our workflow when we have one app where we can create shapes, work with 3D layers, adjust colors, and play with cameras all in one place,” says LePore. “It helps us work so quickly that we can approach projects like napkin sketches: exploring many different ideas until we find something that works for everyone.”

On the small screen, “WandaVision” aired on the Disney+ streaming service to rave reviews. Each episode is inspired by a different decade of American TV family sitcoms, and the opening and ending title credits reflect the changing styles of each decade. Perception designed several of the opening title sequences, but it was the company’s work on the end title sequence that earned it a 2021 Emmy Award nomination for Outstanding Main Title Design.

Marvel Studios “WandaVision” Episode 3 Opening Title Sequence

The end titles take inspiration from the television-centric theme of the series. Extreme close-ups of key visuals from the episode transition into individual RGB pixels in the hexagonal shape of Wanda’s “hex” powers. The pixels form 3D figures inspired by the story of “WandaVision”: blooming flowers, a butterfly mobile, and the house where Wanda and Vision live. The 3D figures were designed in Cinema 4D, with After Effects used for final comping, grading, editing, and effects.

“One of the reasons that we started working with Cinema 4D 15 years ago is because we loved how it integrated with After Effects,” says LePore. “After Effects is our bread and butter; we do everything with it.”

“The rise of streaming has really changed the volume of work available for us,” adds Doug Appleton, creative director at Perception. “If you look at something like “WandaVision,” every episode had slightly different end titles and completely original opening titles. I’m excited about some of the new features in After Effects like Multi-Frame Rendering because it will really help us keep up with the growing demand.”

Exploring a futuristic UI for the GMC HUMMER EV

While After Effects is the core of Perception’s visual workflows for film and television, artists also use it as a way to prototype dynamic concepts for real-world clients. When designing the instrument cluster for the HUMMER EV from GMC, Perception started with the same questions used when creating interfaces for Hollywood: What is the inspiration for this technology? What information would people need to know? How can that information be conveyed clearly and easily for viewers?

GMC HUMMER EV Instrument Cluster

“We refer to all of the little nuances and personality that we add to interfaces as the ‘soul’,” says Gonzalez. “It’s hard to explain, but you know it when you see it. Our clients really respond to it because they can instantly see that this was a design made by humans, for humans.”

For the HUMMER EV, Perception took inspiration from the moon, based on GMC’s influence on the space program. A custom, angular font reflects the angularity of the vehicle while remaining legible at a glance. Human-centered design is particularly important for automotive instruments, as distracting or confusing interfaces in a moving vehicle can lead to disastrous results.

Artists explored several possible designs to find a balance between style and readability and turned many of the designs into visualizations in After Effects.

“Working with programmers to create a prototype could take months, but we could create animated motion tests in a fraction of the time with After Effects,” says Gonzalez. “It helped streamline the back and forth between our designers and the client so that we could quickly finalize a design. It also served as a great base for GMC programmers to see exactly how the visuals are meant to move.”

From animation to visual effects, motion graphics to user design, Perception has expanded the expectations of what a creative design studio can accomplish.

“We’ve evolved into so many different areas and industries, from entertainment to real life,” says Lasky. “That’s what’s kept our studio fresh after so many years. A lot of the companies that we used to compete with are no longer around because they didn’t adapt. One thing that hasn’t changed for us is our work with After Effects. After Effects helped make our business possible, and we’re excited to see what new doors it will open for us in the future.”

Learn more about Adobe After Effects.

Bringing the wild west to life with Adobe Premiere Pro – https://www.provideocoalition.com/bringing-the-wild-west-to-life-with-adobe-premiere-pro/ (Mon, 01 Nov 2021)

Michael Buday has worked in the film industry for more than 30 years, editing everything from BBC documentaries to hit sitcoms to the MTV Movie Awards. As a freelance editor, Buday has always worked to understand the technology behind editing and has used almost every editing suite over the years. He helped launch high-definition television in the United States, serving as online editor on an episode of “Chicago Hope” for CBS that became the first prime-time HD broadcast in 1998. And in 2007, Buday co-founded the high-tech startup Fuze to explore ways of advancing video collaboration.

Buday recently finished working with co-editor Victoria McGinnis and MacGillivray Freeman Films—the pioneering filmmakers behind some of the world’s biggest IMAX productions—and Discovery Channel on the four-part documentary “Out Where the West Begins”. The editorial team used Adobe Premiere Pro to combine still images, archival film, original footage, and visual effects to bring the American wild west to life.

We sat down with Buday to talk about his background as an innovative editor and why Adobe Premiere Pro was the best choice for this documentary.

Book on table
Image source: MacGillivray Freeman Films

Why did MacGillivray Freeman decide to use Adobe Premiere Pro for “Out Where the West Begins”?

Buday: I was brought onboard several months after post production began to help meet an aggressive deadline. It was MacGillivray Freeman’s first documentary series for television, and it was started in Premiere Pro. I had previously worked on small projects in Premiere Pro, but this was my first big one. Multiple people had been working on the project, so there were a lot of duplicated media references and recovered clips, making it very unwieldy. After several discussions with lead creative editor Victoria and the rest of the editorial team, and because of my software experience, I was elected to fix it.

I asked a friend for help and he put me in touch with the Premiere Pro development team. They not only got us on track, but they also put us on an early version of Productions, which saved us. We rebuilt the structure for the series using Productions and learned a lot along the way. We had as many as eight editors touching the project at the same time. At the end, along with the chief engineer, we decided to convert the whole company to Premiere Pro.

When you started using Premiere Pro in earnest, what did you like most about it?

Buday: Simply put, I don’t think we would have finished on time if it wasn’t for Premiere Pro.

One of the biggest challenges for this documentary series was the volume and variety of media. We had hours of dailies, from reenactments to interviews to landscapes. On top of that, we also had hundreds of hours of archival footage and thousands of archival stills. And the media was coming in all sorts of formats, from HD digital to clips from YouTube. The ability for Premiere Pro to handle all types of media without encoding was so critical for us. We could just drop media into a bin and start pulling it into a timeline.

With Avid, we would have had editors just standing by, waiting for encoding to finish. Yes, Avid has AMA linking, but in practice it’s far less robust and becomes problematic if media is moved from its original location. We didn’t have time for that. We needed to just go. We tried to do as much as we could in Premiere Pro to maximize speed. We were getting changes up until the day we shipped, so we didn’t have time to be encoding and jumping between solutions.

Image source: MacGillivray Freeman Films

What else did you like about working with Premiere Pro on this project?

Buday: I appreciated that Premiere Pro could handle it all. We were originally planning to use Resolve for the color grading, but because of the massive time crunch and last-minute changes occurring daily, there was never a true picture lock. We ended up using the Lumetri Color panel in Premiere Pro instead; it did a great job and was much faster. We also did the final sound mixing in Premiere Pro.

We even created most of the visual effects in Premiere Pro. Out of the nearly four-hour run-time for the series, about three hours have visual effects in some form. Only about 30 minutes of it was done in After Effects. The rest was all done by the editors in Premiere Pro. It meant fewer shots that we needed to send to visual effects artists, which meant faster turnaround and faster changes. Our motto was that if you can do it in Premiere Pro, do it.

Viewers might be surprised to see so many visual effects in the series. Did the integration with Adobe After Effects influence the decision to work with Premiere Pro?

Buday: Absolutely. The synergy between Premiere Pro and After Effects was so important. “Out Where the West Begins” has tons of motion graphics. We manipulate still images, layer them, and turn them into animations. Since some of the older archival video elements were so low resolution, we decided to lean into the grainy feel and show the videos framed as though they’re being projected onto an old theater screen with flickering light rays. We created effects nests that other editors could use to quickly recreate the theater motif with new footage.

We did what we could in Premiere Pro, but when we needed something more advanced, we went to After Effects. Dynamic Link is great, not just for After Effects but other Creative Cloud apps. Trying to work with Avid’s effect editor and multiple layers of effects is a lot more cumbersome, so the majority of our effects work would have been forced to other applications. It would have involved lots of time spent bouncing between apps and waiting for encoding—time we didn’t have.

Image source: MacGillivray Freeman Films

Did you use any other Adobe Creative Cloud apps?

Buday: I played around with Adobe Audition a bit. We used it to clean up some of the audio on noisy tracks. The only problem with Adobe Audition is that not enough people are using it. When I actually opened it up, I was surprised at how powerful it is. I’d love to train all other editors on it and use it a lot more in future productions.

You’ve been in the business for a long time. How did you get your start in editing?

Buday: I started as an assistant editor for the BBC. I worked my way up to editor and then went independent. I worked for all of the top post-production houses in London, doing everything from government jobs and corporate work to documentaries and hit TV shows. After a few years of that, I moved back to the U.S. and started working in New York and Los Angeles for all major networks.

When you’re a freelance editor, you don’t have a lot of control over how you edit. You need to know every editing solution, otherwise you might find yourself getting really pigeonholed into one type of work. So I’ve always tried out new things. I worked as a consultant with Sony Broadcast for years to help them design their linear edit controllers and launch high-definition broadcasts of “Chicago Hope” and “The Tonight Show” back in 1998. That was back when maybe three people had HDTV, but it was the start of something new.

I also helped co-found the software company Fuze, working there until 2015. I was originally just looking for a way to do remote reviews and approvals with directors and producers so that I wouldn’t have to make that painful drive into Los Angeles so often. But it expanded and now it’s an enterprise communication and collaboration platform.

What’s the next editing project for you?

Buday: I’m still working with MacGillivray Freeman helping edit a new documentary series about the San Manuel Indian tribe in Southern California and a massive VR project. Since MacGillivray Freeman has moved entirely onto Premiere Pro, Productions is helping multiple editors work together much more smoothly. It’s great for everyone involved.

Read more about Productions in Adobe Premiere Pro.

Visit Michael Buday’s website.

Rendering 45 percent faster using Multi-Frame Rendering in After Effects on “The Voice” – https://www.provideocoalition.com/multi-frame-rendering-in-after-effects-on-the-voice/ (Mon, 11 Oct 2021)

The Voice
Image source: NBCUniversal

Over his accomplished career, Anderson has worked on some of television’s biggest events, including the Oscars, the Grammy Awards, and the Tony Awards. Like other motion graphics artists, there’s one thing he’s always wanted from Adobe After Effects – faster rendering.

Working on NBC’s top-rated singing competition program, “The Voice”, his wish finally came true. For more than 20 seasons, Anderson has helped design the show logos, insert graphics, and stage performance motion graphics for dozens of aspiring artists. During the Emmy-winning show’s most recent season, he used the beta version of Multi-Frame Rendering in After Effects and was so impressed with the results that he reached out to an After Effects engineering manager to share his experience.

We sat down with Anderson to talk about his surprising design background and how he uses Adobe After Effects to deliver jaw-dropping visuals.

Let’s start by talking about Multi-Frame Rendering in After Effects and what it means to the work that you do.

Anderson: I’ve used After Effects for 20 years, so I’ve seen its evolution. It’s always been obvious that the team that works on After Effects listens to users. The most recent feature that demonstrates this is Multi-Frame Rendering. It’s just huge.

I didn’t want to wait until it was out of beta to try it on “The Voice”. I started by taking the same project and rendering it twice: once with single-frame rendering and once with Multi-Frame Rendering. I crunched the numbers, and it was 45 percent faster with Multi-Frame Rendering. I couldn’t believe it. That’s a massive difference for our work. From that point on I rendered everything using the beta of After Effects.
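To make the “45 percent faster” figure concrete, here is a quick sketch of the arithmetic with hypothetical render times (the 240-minute baseline is an assumption for illustration, not a number Anderson quotes):

```javascript
// Percent improvement in wall-clock render time: how much faster the
// multi-frame render finished relative to the single-frame baseline.
function percentFaster(singleFrameMinutes, multiFrameMinutes) {
  return (1 - multiFrameMinutes / singleFrameMinutes) * 100;
}

const singleFrame = 240; // hypothetical single-frame render time (minutes)
const multiFrame = 132;  // same comp with Multi-Frame Rendering (hypothetical)

console.log(singleFrame - multiFrame);                        // 108 minutes saved
console.log(Math.round(percentFaster(singleFrame, multiFrame))); // 45
```

At those assumed times, a 45 percent saving returns nearly two hours per render, which is where the “more time to be creative” Anderson describes comes from.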

Rendering 45 percent faster using Multi-Frame Rendering in After Effects on “The Voice”
Image source: NBCUniversal

We output massive files for the performance screens that display behind the artists—larger than 4K resolution, 3840×2400 for the template we used, with nine screens in the master output. It takes hours and hours to render. Each artist on our team needs to create about five to seven performance visuals for live broadcasts every week. When we can save hours for each performance render, that equates to more time to be creative, and that’s what we need after 20 seasons. We’re always looking for ways to do something differently than we’ve done it before.

It started as a test, but by the finale week last season I knew I was going to have the render done so much faster, so I decided to spend time working on more creative graphics, detail work, and visual effects. The editor on our team likes to have a lot of content to choose from so he can change things up at any given time. We used the extra time that Multi-Frame Rendering provided to make more creative content.

Are there other new After Effects features that you’re excited about?

Anderson: Speculative Preview is another feature in the beta that’s really cool. The idea of a project rendering in the background while I work – or don’t work – on something else is incredible. It’s a dream workflow that we’ve not had before. I also have the Adobe Creative Cloud app on my phone, and I love that Remote Notifications will let me check on a render when I’m away from my desk. The app can even send a notification to your smartwatch when a render is done.

How did you get started working on major award shows and live broadcast events?

Anderson: It’s a funny story. There used to be a company called ICE that made a PCI accelerator board for After Effects, and their plugins made your After Effects renders really fast. I had one for about a year, but I decided to sell it on eBay in 2002. The guy on eBay who bought it wasn’t located too far from me, so I offered to drop it by.

I’d looked up his username and saw that he worked on television shows. So I went to his office near Universal Studios to deliver the ICE board. He handed me a check and then I handed him my demo reel. Two months later he called me and asked if I wanted to work on a project. His name is Allan Wells, and since then we’ve worked on everything from awards shows to the Super Bowl and even stuff for the White House. For the past 10 years we’ve found ourselves working together on “The Voice”. I’m very grateful to him for helping me get my start.

Tell us a bit about “The Voice” and how your work for the show is different.

Anderson: After 20 seasons “The Voice” is more popular than ever. Just like the show’s incredibly talented artists and coaches, we’re always pushing to evolve, to try something different, and be more creative.

I handle both house brand graphics and performance visuals for the live shows when all the singers perform on stage with massive screens. We can completely change the mood of the stage in seconds just by switching to a different graphical backdrop.

Rendering 45 percent faster using Multi-Frame Rendering in After Effects on “The Voice”
Image source: NBCUniversal

We’re dealing with dozens of artists and performances every season, so it’s sometimes a challenge to come up with new ideas. One of the creative producers might want to use neon geometric designs for a performance, but since we already used that approach in previous seasons we have to find a new and fresh way of executing that. So, we’re constantly rethinking designs to make sure that each artist’s performance feels fresh. That’s why having extra time to be more creative using features like Multi-Frame Rendering is so valuable.

We do all of our motion graphics with After Effects. It’s amazing that I’ve been in the business for decades and After Effects is still the most cutting-edge program around. When we’re looking for fresh performance screens, we can check out new features in After Effects or mess around with third-party plugins that will let us do something completely different.

“The Voice” band is also great at staying on beat and on tempo, so we can bring a rehearsal track into After Effects and use a third-party plug-in like Trapcode Sound Keys, or the built-in Convert Audio to Keyframes assistant, to apply effects based on the bass, treble, highs, lows, and mids.
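The audio-driven technique Anderson describes boils down to measuring the loudness of each video frame’s slice of audio and turning it into an animatable value. A rough Python sketch of that mapping (illustrative only; the function name and parameters are invented, and real tools like Sound Keys also split the signal into frequency bands before measuring):

```python
import math

def audio_to_keyframes(samples, sample_rate=48000, fps=30):
    """Map per-video-frame audio loudness (RMS) to keyframe values in 0..1,
    similar in spirit to After Effects' Convert Audio to Keyframes."""
    spf = sample_rate // fps  # audio samples per video frame
    keys = []
    for start in range(0, len(samples) - spf + 1, spf):
        chunk = samples[start:start + spf]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        keys.append(rms)
    peak = max(keys) or 1.0  # avoid dividing by zero on silence
    return [k / peak for k in keys]  # normalize so the loudest frame is 1.0
```

Each returned value could then drive an animatable property such as scale or glow intensity, one keyframe per video frame, so the graphics pulse with the music.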

You’ve seen After Effects evolve throughout your career. What stands out to you the most?

Anderson: I’ve provided feedback on After Effects features and enhancements over the years. Often when a new version comes out it will include something I was hoping for, even if I never asked for it. So much of it is about the details – speed, automation, resolution. I also love that there’s such a strong system of third-party plugins that help to keep After Effects on the cutting edge.

I’m a big Cinema 4D user and the way that After Effects and Cinema 4D work together keeps getting better and better. That’s been really helpful too.

You’ve been recognized for your design work since you were a teenager. How did this love of art lead to television?

Anderson: My mom is an artist, so she always had us drawing and painting. But I was actually really into industrial design at first. I won a national car design contest at 15, and it got me talking to lead designers from Nissan, Mazda, and General Motors. Then I got my first Macintosh computer and I realized that I could take designs off of the drafting table and onto the computer.

I started working in the digital side of art: graphic design, vector art, and web design. I was introduced to the world of entertainment when I designed the first website for Pittard Sullivan in the late 90s. Pittard Sullivan was one of the most prestigious creative agencies in Hollywood, and they were doing title sequences, show logos, network IDs, and all sorts of work for movies, TV shows, and major networks. Once I saw what motion graphics could do, I was hooked. I used the money from that Pittard Sullivan job to buy the fastest Mac I could afford, loaded it with After Effects, and started to learn everything I could about motion graphics.

That’s great to hear. What do you do with your free time? Anything creative?

Anderson: Over the years I’ve gotten into producing stock assets. Between work projects, I’ll sit down with After Effects and create original animations, motion graphics, 3D videos, all sorts of stuff. I have maybe 2,500 to 3,000 videos on Adobe Stock. Even though I’m working with graphics all the time, it’s fun to do something just for yourself. I can experiment with new features, try out different styles, and just be creative. It’s a lot of fun, and if other people find it inspiring, that’s great, too.

Learn more about Multi-Frame Rendering in After Effects.

Visit Chad Anderson’s website.

Hear more from Chad Anderson about how he got started in his career, working on “The Voice,” and his use of Multi-Frame Rendering in Adobe After Effects as part of an Adobe keynote at Post | Production World 2021. The session also included Victoria Nece, Adobe After Effects product manager, and Francis Crossman, Adobe Premiere Pro product manager, who demonstrated key After Effects and Premiere Pro features for faster workflows.

]]>
https://www.provideocoalition.com/multi-frame-rendering-in-after-effects-on-the-voice/feed/ 0
Designing “The Queen’s Gambit” titles with Adobe After Effects https://www.provideocoalition.com/designing-the-queens-gambit-titles-with-after-effects/ https://www.provideocoalition.com/designing-the-queens-gambit-titles-with-after-effects/#respond Mon, 27 Sep 2021 13:00:04 +0000 https://www.provideocoalition.com/?p=244758 Read More... from Designing “The Queen’s Gambit” titles with Adobe After Effects

]]>
While superheroes and thrilling action are more popular than ever, millions tuned in to watch a different kind of battle in 2020: the battle of brains in the intense world of competitive chess. “The Queen’s Gambit”, starring Anya Taylor-Joy, became a smash hit for Netflix in 2020.

Millions of viewers watched the seven-episode series chronicling a young chess prodigy’s rise to the top of the ranks in the 1950s and 60s. Critics hailed the show for its incredible performances and strong writing, earning the series two Golden Globes for Best Limited Series and Best Actress.

For the Emmy Awards, “The Queen’s Gambit” received several more well-deserved nods, including one for Outstanding Main Title Design. Designed by Saskia Marka, the mesmerizing title sequence uses geometric shapes created by animator David Whyte that dance across the screen in hypnotic patterns. The focus on monotone colors and simple square or circular shapes reminds the viewer of pieces moving across a chess board.

We sat down with Marka to get a little more insight into how she created her Emmy-nominated design.

Designing “The Queen’s Gambit” titles with Adobe After Effects
Image source: Netflix

How did you first get into motion design?

It was back in the nineties. I studied communication design and figured print can’t be all there is. As a movie fan I saw all these new title designs from movies like “Seven”, “Gattaca”, and “Enemy of the State”. I saw the impact that a great introduction had on these films. That was what drew me to it. The term “motion design” didn’t even exist in Germany at the time, and Adobe After Effects was a very young tool.

What was the inspiration behind your title sequence?

It’s an end title sequence and plays only once after the very last episode. I wanted to continue the feeling of the last scene with an abstract visualization of playing and enjoying chess and turn it into a celebration of overcoming yourself. It’s supposed to build a bridge between logic and emotion.

Designing “The Queen’s Gambit” titles with Adobe After Effects
Image source: Netflix

How did you begin this project?

I wanted to give the concept a modern approach. That’s why I created a 25-second mock-up with processed GIF animations by David Whyte, which I showed to series co-creator Scott Frank and his team. He loved it, and from that point I had total freedom over what went on behind the credits.

The look of the sequence is 1920s with a warm fuzziness, but the credits are very modern in style and typography. My intent was to create opposites on different levels.

What’s your favorite part of the title sequence?

I enjoyed animating these GIFs to dance along with the music. My favorite part is when the music gets flowery and the squares pop up like flowers blooming in a meadow.

Designing “The Queen’s Gambit” titles with Adobe After Effects
Image source: Netflix

What were some specific challenges you faced in making this sequence?

The animations in the background were so high in contrast that it was hard to read the credits at the beginning, and I knew I had to come up with a solution. At some point I put the credits over solid bars and switched the colors with every page of the credits. This gave the sequence an even more graphic and distinctive look.

What Adobe tools did you use on this project and why did you originally choose them?

For the main title logo layouts, I used Adobe Photoshop and Illustrator. The title sequence is set up as a 2D project in Adobe After Effects, including all credits. After Effects gives me the most freedom and flexibility because I like to mix things up to see what happens. I also edit with it. For feedback rounds I love Adobe Media Encoder, because it’s so fast.

Designing “The Queen’s Gambit” titles with Adobe After Effects
Image source: Netflix

Where do you find creative inspiration?

Pinterest. It’s the most creative tool for me, because you can always dig deeper in all directions, find links to new options that never get boring, or find similar pictures. I’m also inspired by nature, abstract art and design, typography, and photography.

What’s the toughest thing you’ve had to face in your career?

Watching people steal my ideas and being successful with them over and over again. And lawyers telling me that ideas cannot be protected.

You have to just keep going and eventually hard work pays off. I still have to process that my title sequence won the Jury Award for Excellence in Title Design at SXSW!

What advice do you have for people aspiring to get into the motion design space?

You should have fun with what you are doing and be as playful as you can be. Even if you might be working with a tight style guide that doesn’t leave many options – try out new things, use filters for what they are not made for, mess things up. Draw outside the lines. Even if you might not use your results now, you might need it later for something else or you get new inspiration out of it for yourself. Don’t plan everything, let things happen. Don’t try to reach perfection, but always give more than your client expects you to. That’s how it works for me at least.

Designing “The Queen’s Gambit” titles with Adobe After Effects
Image source: Saskia Marka

In 2020, many of us are changing how and where we work. What’s the one thing you can’t live without in your workspace?

It’s weird, but my favorite thing is my hot water bottle. I always have it around, and I use it in every season. It helps me concentrate!

See more of Marka’s work on her website www.saskiamarka.com or as saskiamarka on Instagram.

Read about how Adobe Creative Cloud was used to help create the title sequence for “Birds of Prey”.

]]>
https://www.provideocoalition.com/designing-the-queens-gambit-titles-with-after-effects/feed/ 0
Designing the eye-catching titles for “Birds of Prey” https://www.provideocoalition.com/designing-titles-for-birds-of-prey/ https://www.provideocoalition.com/designing-titles-for-birds-of-prey/#respond Wed, 15 Sep 2021 18:04:46 +0000 https://www.provideocoalition.com/?p=243776 Read More... from Designing the eye-catching titles for “Birds of Prey”

]]>
Bright and colorful, fast-paced and rambunctious, “Birds of Prey (and the Fantabulous Emancipation of One Harley Quinn)” embraced the cartoon origins of its titular character, Harley Quinn. This 2020 film starring Margot Robbie earned praise for its fun style, which started with the opening title sequence. The titles use stylized hand-drawn illustrations that hint at the movie’s climactic finale while preparing audiences to enter the wild and colorful world of “Birds of Prey”.

The sequence was created by the inventive design team at design and production studio Shine. Shine was honored with a 2021 SXSW Special Jury Recognition for Title Design plus the Audience Award for the main title category for “Birds of Prey”. We sat down and talked with Shine’s Creative Director, Michael Riley, to learn a bit more about what goes into these inventive titles.

How did you first get into motion design?

When I was a student at Rhode Island School of Design, I did a six-week internship under Tibor Kalman at his studio M&Co. in New York. Tibor was finding all kinds of ways to break free of norms within graphic design. He used typography and graphic elements to express ideas and emotion in film titles and music videos. His use of motion for typography and graphic design in concert with film was something really exciting and different. His motion work was really groundbreaking, and it opened my eyes to so many more possibilities within graphic design.

Can you tell us about how you got started with “Birds of Prey”?

The project started with Bob Swensen, the executive producer at Shine, and me screening the movie over at Warner Bros. Then the director, Cathy Yan, gave us her creative brief. She wanted a title sequence that served as a visual punctuation for the movie and a graphic expression of Harley Quinn. She wanted us to design something that reflected Harley’s wild punk-rock character. Also, she really didn’t want something slick. Cathy thought maybe we could take some visual and conceptual inspiration from the fun house from the end of the film, where there was a big battle scene.

Designing the eye-catching titles for “Birds of Prey”
Image courtesy of Michael Riley at Shine.

What was the main thing you tried to bring to the title sequence?

This project was all about Harley Quinn. The design of the title sequence was meant to be a visual expression of Harley’s character. She’s wild, strong, smart, compassionate, and very human. The title sequence’s visual style and some of the elements were inspired by the production design from the film. The movie was bold, bright, and colorful. For the tone of initial storyboards, I took inspiration from the spirit of Cy Twombly’s work, some of which consists of wild, expressive, bold, almost out-of-control scribbling. I thought that spoke to Harley Quinn’s character.

What was your favorite part about creating the title sequence?

For me it was all about the fun of drawing. I’m addicted to Procreate on my iPad, and the way Procreate seamlessly integrates with Adobe Creative Cloud — specifically Photoshop — made the process really enjoyable, because I was able to focus on the visual ideas and not worry about the tools. It was a very intuitive process. I spent many weeks drawing and illustrating as many visuals as I could that were inspired by the film.

I’ve always used drawing as a tool to come up with ideas, but I don’t always use illustration in the final animation. I think a title sequence should reflect the characters and the story of the film, and in this case, using wild colorful illustration seemed to be a good fit.

Designing the eye-catching titles for “Birds of Prey”
Image courtesy of Michael Riley at Shine.

What were some challenges you faced when making the titles?

The main challenge is designing so that we’re speaking to the characters and the story. Every visual we incorporated needed to be an extension of the characters, the conflicts, and the relationships in the film. So that meant the design process was fluid and constantly adjusting so that the title sequence was serving the film as a visual punctuation that tied into the story.

One thing I really love about designing for movies is that film is a collaborative medium. Between the team at Shine that’s animating the project; the director, Cathy Yan; the producers; and the studio executives at DC and Warner Bros., there are a lot of people you get to work with. With this kind of collaboration, you need to be flexible. Adobe Creative Cloud is absolutely built for this kind of project.

While the titles use hand-drawn illustrations, there’s also real depth. Was there any 3D work involved?

Yes, some shots were animated using 3D cameras in After Effects, and some were created by modeling 3D environments in Cinema 4D. Penelope Nederlander, the lead animator on this project, did some really amazing work on “Birds of Prey”. She brought deep knowledge of both After Effects and C4D and combined the Procreate illustrations into a hybrid 2D-meets-3D aesthetic that the clients really loved. Animators Amanda Gotera and Myke Chapman did amazing work too. The production process was a team effort between all of us, and we had a great time collaborating. I think each of us brought something different to the table — that was a definite plus.

Designing the eye-catching titles for “Birds of Prey”
Image courtesy of Michael Riley at Shine.

Walk us through the production process. How did you bring these titles to life?

The production phase was so much fun. Adobe Creative Cloud was key to creating this sequence. I drew all the illustrations in Procreate on my iPad, and then exported them to Photoshop to create storyboards. We created a storyboard presentation in the form of a PDF assembled in InDesign. For the client presentation, we projected each storyboard frame separately in the screening room over at Warner Bros. for the director, Cathy Yan, and the producers, Margot Robbie, Sue Kroll, and Bryan Unkless. This helped give the storyboard presentation a bigger, more cinematic feel.

Once they approved the concept, we started really rough. I simply cut together Photoshop still frames in Premiere Pro against the music. That established which Procreate illustrations we were going to use. We then advanced to animation with After Effects and Cinema 4D. Adobe Creative Cloud’s ability to interface with all the Procreate illustrations was really key on this project. It took us all the way to delivery, where we exported the final 5K frames as EXR image sequences.

Last but not least, I need to mention fonts! There are so many great fonts and talented font designers that are part of the Adobe type library. The condensed bold option of Trade seemed to be a good fit because its voice kind of yells. I felt that was a choice that was in sync with Harley Quinn’s character.

Designing the eye-catching titles for “Birds of Prey”
Image courtesy of Michael Riley at Shine.

What’s your favorite thing about working with Adobe Creative Cloud?

Adobe Creative Cloud can interface so fluidly between apps: Photoshop, Illustrator, InDesign, Premiere Pro, After Effects, Media Encoder, and many others. The way an artist can jump back and forth between each makes the creative process so intuitive and so much fun. There’s no other set of apps that compares. Some artists might have strengths in one over the other. But every app is intuitive for the beginner yet extremely deep for experts.

Also, I love that Adobe has always been open to integrating with other software. I talked about how we used the integration between Procreate and Photoshop for this project. And Penelope is amazing in Cinema 4D, and there’s a very fluid relationship between C4D and After Effects. Adobe Creative Cloud does a great job of being useful to creatives wherever they are in their path, in terms of both creative focus and skill level. As an artist, I can’t imagine designing without Adobe Creative Cloud.

Where do you find your creative inspiration?

My answer on this is a little all over the place! I like to be inspired by all kinds of art, design, music, and movies. I love to find new artists and photographers on Instagram or other places. I love movies, old and new. Lately I’ve been rediscovering a lot of art and design that I studied in art history at RISD. I also take a lot of inspiration from music. I can’t wait to get back to LACMA, MoCA, and all the New York museums. I also take a lot of inspiration from the talented people I have the opportunity to collaborate with!

What’s the toughest part about your design work?

Every time, the initial conceptual design phase is the most challenging part. If you’re working in movies or television, it’s always about designing and authoring ideas that fit the story. It’s also the most exciting part because the possibilities are wide open!

What advice do you have for people aspiring to get into the motion design space?

Treat any design opportunity as a gift. Cherish clients that bring you any creative opportunity. Work hard. And stay persistent!

Designing the eye-catching titles for “Birds of Prey”
Image by Chris Woods.

We’re always interested to learn how people work. Can you tell us a bit about your workspace?

Shine is located on the sixth floor of a grand old Art Deco building on Wilshire Boulevard in an area called Miracle Mile. I love our studio space because we have amazing views of Los Angeles on all sides. I love to design in the studio at Shine!

See more of Michael Riley’s work on his website, shinestudio.com.

]]>
https://www.provideocoalition.com/designing-titles-for-birds-of-prey/feed/ 0