Studio Extreme, an immersive experience at CES 2025

Disguise, Nikon and MRMC bring the Studio Extreme immersive activation to CES 2025 to show how easily brands can create everything from social live streams to professional ads in a turnkey virtual production studio.

Studio Extreme will take participants on an immersive journey. As they step into the activation they will be transported to a location just outside Vegas, where they will be tasked to deliver a live news report. But will they be prepared for all the weather conditions that Vegas has to offer?

The immersive experience, which will be on display at CES 2025, is the result of a partnership between Disguise (the company behind the virtual production technology used on commercials for Apple Music and Lenovo, as well as the feature film Daddio, starring Sean Penn), Nikon and its subsidiary MRMC. It aims to demonstrate how easily brands can create everything from social live streams to professional ads in a turnkey virtual production studio.

Nikon chose Studio Pro by Disguise to bring this activation to life. Designed for brands looking to tell powerful stories through immersive content, Studio Pro combines the most advanced virtual production technology with best-in-class creative and technical services, including 24/7 support, training and installation, enabling brands to shoot commercials, social media content, internal training material or important keynote or investor presentations — all from one stage on the same day.


“We’ve helped deliver 400 virtual production studios in more than 100 countries, and know just how transformational they can be for brands,” says Alexandra Coulson, VP of Marketing at Disguise. “Traditionally, however, LED technology required significant upfront training and investment, making it reserved for Hollywood studios. We are changing that. Together with Nikon and MRMC we are showcasing the possibilities at CES 2025. Attendees will experience firsthand how they can take their brand’s story to the next level, producing a versatile offering of content with our comprehensive powerhouse solution, Studio Pro.”

The technology behind the Studio Extreme activation
Visitors who step into the CES activation will be able to experience state-of-the-art virtual production technology including:

  • Studio Pro — a turnkey studio solution by Disguise
  • Studio Bot LT — MRMC’s most compact robotic camera arm solution
  • RED Komodo 6K camera
  • Lighting by Kino Flo
  • L-Acoustics spatial audio

Once they complete their weather report, visitors will receive a personalized video downloadable via a QR code.

Visit the virtual production activation Studio Extreme at the Nikon booth 19504 in LVCC, Central Hall, at CES 2025.

Samsung, Pimax and DPVR: three new VR headsets for 2025

CES 2025 will be the stage to discover the new VR headsets coming to market: the 27-million-pixel Pimax Dream Air, a new “mystery product” from DPVR and, probably, the Samsung Moohan Android XR headset.

Virtual Reality is about to take a new step forward with the introduction of new VR/MR headsets that promise to attract more people to experience what these technologies have to offer. One of the new headsets, announced as a “mystery product”, comes from DPVR, a company that has not delivered any exciting consumer solutions, despite its promises, so it will be interesting to see what this new announcement will bring. DPVR will be at CES 2025, showing the company’s products at Booth #15945 in the Central Hall at the Las Vegas Convention Center (LVCC).

The second VR headset now announced is the Pimax Dream Air, presented as the world’s smallest full-feature 8K resolution VR headset. Weighing only 200 g (strap included), it offers a 102° horizontal FOV across 27 million pixels (3840 x 3552 per eye at a 90 Hz refresh rate on micro-OLED panels) and features head, hand and eye tracking, integrated spatial audio, a DisplayPort connection, and a self-adjusting backstrap.

The Dream Air is a PCVR headset, and, according to Pimax, “borrows a lot of components from the previously announced Crystal Super, including the micro-OLED panels and pancake lenses — but packs this into a small form factor headset, to satisfy different use cases. It breaks with previous Pimax headsets, with a new design language, signalling the small form factor era for Pimax.”

Pimax Dream Air: is it real?

The Pimax Dream Air is a 6DoF PCVR headset with inside-out tracking by default, but it is Lighthouse compatible if you have SteamVR base stations. Pimax, which will be at CES 2025, reveals that the headset can become a standalone solution with the addition of Cobb, a standalone puck compute device being developed, powered by Qualcomm’s Snapdragon XR2 chip and a battery.

With a price starting at USD $1,900, the Pimax Dream Air will be available in May 2025, but the company is already taking pre-orders. Given Pimax’s history of announcing products it does not deliver on time, and the many problems users have had with Pimax headsets, the Dream Air may be another example of, as one user noted online, “Pimax being Pimax”.

Still, the Pimax Dream Air looks interesting in terms of specifications, and it promises to be a full-featured PCVR headset with a DisplayPort interface for uncompressed images from the PC, which continues to be the best way to explore Virtual Reality. Will the Dream Air become reality?

First Android XR headset comes from Samsung

The third VR headset coming in 2025 is much more appealing as it marks the return of Samsung to the VR market, where the company left a good impression with its Windows Mixed Reality headset, Odyssey, released in November 2017, following an announcement in October 2017. The Odyssey was the company’s first VR headset release, designed to work with Microsoft’s Windows Mixed Reality… which is now dead.

Microsoft put an end to its own Windows Mixed Reality, a platform it began promoting in 2016, when the company announced that multiple OEMs would release virtual reality headsets for the Windows Holographic platform. WMR was officially “killed” in December 2023, when the company announced its deprecation, with complete removal in Windows 11 version 24H2, expected to arrive in late 2024.

What this means is that owners of one of the 10 VR headsets launched since 2016 (from companies such as Samsung, Lenovo or Acer), the last of them being the HP Reverb G2, will own a mere paperweight once they update Windows 11. It’s Microsoft being Microsoft…

With WMR dead, a new “universal” platform is about to be born, and the first VR headset designed for it is Samsung’s Project Moohan. In fact, Samsung Project Moohan is the first Android XR headset to make it to the market. And Google says that more devices are coming for the new operating system built for this next generation of computing, created in partnership with Samsung and Qualcomm. Android XR combines years of investment in AI, AR and VR to bring helpful experiences to headsets and glasses, promising a platform to extend your reality to explore, connect and create in new ways.

Android XR platform supports OpenXR

Google says that the company is “working to create a vibrant ecosystem of developers and device makers for Android XR, building on the foundation that brought Android to billions” and that the preview of the new operating system made available to developers will allow them to start building apps and games for upcoming Android XR devices. Google also revealed that with Qualcomm partners like Lynx, Sony and XREAL, “we are opening a path for the development of a wide array of Android XR devices to meet the diverse needs of people and businesses. And we are continuing to collaborate with Magic Leap on XR technology and future products with AR and AI.”

One important aspect of the new Android XR platform is that it will support OpenXR, a royalty-free, open standard that provides a common set of APIs for developing XR applications that run across a wide range of AR and VR devices. This reduces the time and cost required for developers to adapt solutions to individual XR platforms, while also creating a larger market of easily supported applications for device manufacturers that adopt OpenXR.
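
To make that concrete, here is a minimal sketch of what targeting OpenXR looks like from application code, written against the standard C headers (openxr/openxr.h) and assuming a conformant runtime is installed; the application name is arbitrary and error handling and session setup are omitted for brevity.

```c
#include <openxr/openxr.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Describe the application to whichever OpenXR runtime is installed. */
    XrInstanceCreateInfo create_info = {XR_TYPE_INSTANCE_CREATE_INFO};
    strncpy(create_info.applicationInfo.applicationName, "PVCSample",
            XR_MAX_APPLICATION_NAME_SIZE);
    create_info.applicationInfo.applicationVersion = 1;
    create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    /* Create an instance; the same call works against SteamVR, Meta, Monado, etc. */
    XrInstance instance = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateInstance(&create_info, &instance))) {
        fprintf(stderr, "No OpenXR runtime available\n");
        return 1;
    }

    /* Ask the runtime to identify itself. */
    XrInstanceProperties props = {XR_TYPE_INSTANCE_PROPERTIES};
    xrGetInstanceProperties(instance, &props);
    printf("Runtime: %s\n", props.runtimeName);

    xrDestroyInstance(instance);
    return 0;
}
```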

The Khronos Group, promoters of the OpenXR standard, said that “most major XR platforms have transitioned to using OpenXR to expose current and future device capabilities. Vendors with conformant OpenXR implementations include Acer, ByteDance, Canon, HTC, Magic Leap, Meta, Microsoft, Sony, XREAL, Qualcomm, Valve, Varjo, and Collabora’s Monado open source runtime. OpenXR is also supported by all the major game and rendering engines, including Autodesk VRED, Blender, Godot, NVIDIA’s Omniverse, StereoKit, Unreal Engine, and Unity.”

According to Google, “with headsets, you can effortlessly switch between being fully immersed in a virtual environment and staying present in the real world. You can fill the space around you with apps and content, and with Gemini, our AI assistant, you can even have conversations about what you’re seeing or control your device. Gemini can understand your intent, helping you plan, research topics and guide you through tasks.”

The company says that it is also “reimagining some of your favorite Google apps for headsets. You can watch YouTube and Google TV on a virtual big screen, or relive your cherished memories with Google Photos in 3D. You’ll be able to explore the world in new ways with Google Maps, soaring above cities and landmarks in Immersive View. And with Chrome, multiple virtual screens will let you multitask with ease. You can even use Circle to Search to quickly find information on whatever’s in front of you, with just a simple gesture.”

Android XR will also support glasses for all-day help in the future. Google notes that the company wants “there to be lots of choices of stylish, comfortable glasses you’ll love to wear every day and that work seamlessly with your other Android devices. Glasses with Android XR will put the power of Gemini one tap away, providing helpful information right when you need it — like directions, translations or message summaries without reaching for your phone. It’s all within your line of sight, or directly in your ear.”

Android XR: an open, unified platform for XR

Android XR is designed to be an open, unified platform for XR headsets and glasses. For users, this means more choice of devices and access to apps they already know and love. For developers, it’s a unified platform with opportunities to build experiences for a wide range of devices using familiar Android tools and frameworks.

One question remains, though: will Android XR become the “universal” platform for Virtual Reality experiences? Or will Google tire of it and add one more name to the list of abandoned projects, which already includes Chromecast, YouTube VR, Google Cardboard, Google Daydream, VR180 Creator and Stadia? In fact, Google and Microsoft both have a proven track record of abandoned projects.

Still, with OpenXR supported, Samsung’s Project Moohan is, as the company puts it, “one of our most ambitious endeavors yet”, the first headset designed for Android XR. The name “Moohan”, meaning ‘infinity’ in Korean, connotes Samsung’s belief in delivering unparalleled, immersive experiences within an infinite space. According to Samsung, “equipped with state-of-the-art displays, passthrough capabilities and natural multi-modal input, this headset will be your spatial canvas to explore the world through Google Maps, enjoy a sports match on YouTube or plan trips with the help of Gemini. All these experiences come with lightweight, ergonomically optimized hardware designed to ensure maximum comfort during use.”

Virtual Desktop can be ported to Android XR

“XR has quickly shifted from a distant promise to a tangible reality. We believe it has the potential to unlock new and meaningful ways to interact with the world by truly resonating with your everyday lives, transcending physical boundaries,” said Won-Joon Choi, EVP and Head of R&D, Mobile eXperience Business. “We are excited to collaborate with Google to reshape the future of XR, taking our first step towards it with Project Moohan.”

“We are at an inflection point for the XR, where breakthroughs in multimodal AI enable natural and intuitive ways to use technology in your everyday life”, said Sameer Samat, President of Android Ecosystem, Google. “We’re thrilled to partner with Samsung to build a new ecosystem with Android XR, transforming computing for everyone on next-generation devices like headsets, glasses and beyond.”

First revealed in February 2023, when Google, Samsung and Qualcomm announced that they would co-develop an XR device, the Project Moohan headset continues to be a mystery in terms of specifications. All that is known right now is that it uses a Snapdragon XR2+ Gen 2 processor, a more powerful version of the chip in the Quest 3 and Quest 3S, suggesting it’s a standalone device, quite different from Pimax’s Dream Air, a native PCVR headset. Still, as Project Moohan is compatible with OpenXR, it should also be easy to use as a wireless PCVR solution, since it will be able to run Virtual Desktop, the most popular software for wireless PCVR experiences on headsets such as the Quest, HTC Vive or Pico 4.

Guy Godin, developer of Virtual Desktop, revealed to the website UploadVR that bringing his native OpenXR app to Android XR “took only a few hours and the basics just worked out of the box”, adding: “Personally I think it’s refreshing to work with a platform that wants to collaborate with developers rather than one who tries to block and copy us. Grateful to have more options for consumers in the near future and I’m very excited to bring the best PC streaming solution to Android XR.”

Sony MR Cruise: Mixed Reality on wheels

Meet the “Time Trip Taxi”, a special service launched this month in Japan: it’s a taxi specially equipped to provide an MR Cruise experience, allowing passengers to travel back and forth in time during the ride.

Sony Group Corporation announced that as the first phase in the launch of its MR Cruise service, which utilizes the mixed reality (MR) technology cultivated through its Sociable Cart: SC-1, the service will be provided in “Time Trip Taxi,” starting this December.


Delivered jointly with Daiwa Motor Transportation Co., Ltd., this service is the first practical implementation of MR Cruise. By riding a taxi specially equipped to provide an MR Cruise experience, passengers will be able to enjoy content themed around ukiyo-e Japanese art and Edo period culture in the Asakusa, Yanaka, and Ueno areas of Taito ward. According to Sony, “they can experience the feeling of traveling back and forth in time by simultaneously viewing the outside scenery presented inside the vehicle, together with CG and audio related to that location.”

A new type of experience

With this project Sony found a new use for Mixed Reality that goes beyond the limits of the VR/MR headset we often associate with the experience. Sony’s MR Cruise is a content service that allows the mixed reality experience that Sony has cultivated until now to be installed not only on the Sociable Cart SC-1 that introduced the concept, but on all types of vehicles such as taxis, buses, and trains. This creates the space for a new type of experience that content creators can explore, as the need for content to be used on these platforms may expand, if the idea is adopted in other countries.

Sony says it will apply its technology cultivated through the development of SC-1 in a variety of places to evolve “every movement into an enjoyable movement.” The “Time Trip Taxi” is a new step in a path started on November 22, when Sony announced that it had commercialized the mixed reality (MR) experience it has been providing through its “Sociable Cart: SC-1” as a new business named “MR Cruise”.


A seamless fusion of reality and digital content

The SC-1 is equipped with mixed reality technology developed by the Sony Group. By superimposing various CG onto images of the surrounding environment shown on the in-car display, the car window, which was previously used only for viewing scenery, can be transformed into an entertainment space, enabling passengers to enjoy the journey itself even more. To date, it has been deployed at a range of locations, including golf courses, botanical gardens, parks, and shopping malls, and experienced by more than 20,000 people.

Sony also announced the launch of the second “MR Cruise” service – “MR ZEN DRIVE” – that enables passengers to experience mixed reality. The service will enable passengers to enjoy content themed around the history and culture of the Eiheiji Town area by riding a Level 4 autonomous driving vehicle specially equipped to provide an MR Cruise experience. They will be able to simultaneously view the outside scenery presented inside the vehicle, together with CG and audio related to Eiheiji Town, in a seamless fusion of reality and digital content.

Understanding Zeiss’ CinCraft Scenario System

Pro Video Coalition sent contributor Alec C. Cohen to Cinegear Expo Atlanta in order to document some exciting industry developments.

The Zeiss CinCraft Scenario system is a cutting-edge virtual production tool that revolutionizes how filmmakers approach camera tracking and LED wall integration. Designed to simplify and enhance the production workflow, CinCraft Scenario stands out for its ability to offer precise real-time tracking of cameras across diverse environments, both indoors and outdoors. This essay will explore the components, functionality, and impact of the Zeiss CinCraft Scenario system, while also discussing its role in the future of virtual production.

 

Components of the CinCraft Scenario System

At the heart of the CinCraft Scenario system is the CamBar, a piece of hardware that attaches directly to the camera. The CamBar uses infrared (IR) emitters, stereoscopic cameras and synchronization technology to track natural features in the environment and follow the camera’s movements with precision. This data is then transmitted to the system’s software, allowing for real-time camera tracking even in complex environments. One of the main innovations of this system is that it simplifies the traditionally cumbersome tracking setup.

In most traditional virtual production environments, camera tracking relies on multiple tracking reference nodes placed around an LED wall. These nodes provide the necessary context for the camera to understand where it is in the space. However, the CinCraft Scenario system eliminates the need for these external nodes, reducing the setup to a single CamBar device. This minimal hardware setup enables a streamlined workflow while maintaining accuracy, making the system accessible for a wider range of productions.

Another key component is the Scenario software, which collects data from the CamBar and integrates it into the production’s 3D environment. This software is both functional and user-friendly, designed to take the guesswork out of managing complex camera and LED wall interactions.

 

Functionality of the CinCraft Scenario System

The core functionality of the CinCraft Scenario system lies in its ability to track a camera’s position and orientation in real time, ensuring the seamless integration of live-action footage with virtual elements. The CamBar chooses a master node for tracking and then feeds information into the Link box, a data bridge between the CamBar and the camera’s lens data. That information is sent over Ethernet to the server and fed into the Scenario software, which processes the data for integration with the LED wall and 3D environments. This data-driven approach provides several advantages, particularly when working with virtual production stages or volumes, which are often used in high-end cinematic productions.
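
As a purely illustrative sketch (not Zeiss’s actual protocol), the kind of per-frame record such a pipeline carries from the tracking hardware to the render server might look something like this: a timestamped camera pose plus live lens values. Field names and units here are assumptions made for the example.

```c
#include <stdint.h>

/* Hypothetical per-frame tracking record handed to the render engine.
   Not Zeiss's real data format; shown only to make the data flow concrete. */
typedef struct {
    uint64_t timestamp_ns;     /* capture time, used to align with genlock/sync */
    double   position_m[3];    /* camera position in stage space, metres */
    double   rotation_quat[4]; /* camera orientation as a quaternion (x, y, z, w) */
    double   focal_length_mm;  /* from lens encoders or lens metadata */
    double   focus_distance_m; /* current focus pull */
    double   iris_t_stop;      /* current iris setting */
} TrackingSample;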

One of the most notable features of the Scenario software is its capability to input specific camera and lensing information. This means that filmmakers can enter the exact specifications of their camera and lenses, and the software will use that data to adjust the LED wall’s perspective and tracking range. For example, when a camera zooms in or shifts focus, the software compensates for those changes by altering the projected virtual background accordingly. This ensures that the virtual environment remains in sync with the camera’s movements, providing effects like true parallax and distortion, which are crucial for creating realistic depth and perspective in virtual scenes.
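
For a rough sense of why that lens data matters, the sketch below computes the horizontal field of view of a simple pinhole camera from sensor width and focal length; the sensor and focal-length figures are only example numbers, and real tracking systems layer distortion models and entrance-pupil corrections on top of this.

```c
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Horizontal field of view (degrees) for a pinhole model, from sensor width
   and focal length in millimetres. Illustrative only. */
static double horizontal_fov_deg(double sensor_width_mm, double focal_length_mm)
{
    return 2.0 * atan(sensor_width_mm / (2.0 * focal_length_mm)) * 180.0 / M_PI;
}

int main(void)
{
    /* Example values: a Super 35-sized sensor roughly 24.9 mm wide. */
    printf("35 mm lens: %.1f deg\n", horizontal_fov_deg(24.9, 35.0));
    printf("70 mm lens: %.1f deg\n", horizontal_fov_deg(24.9, 70.0)); /* zooming in narrows the frustum the wall must cover */
    return 0;
}
```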

 

The Technological Innovation of CinCraft Scenario

CinCraft Scenario represents a significant shift in real-time camera tracking by making the process both more accessible and more precise. One of the key innovations of this system is its ability to leverage natural markers. Unlike older systems that rely heavily on placing reflective markers throughout a set, CinCraft Scenario uses natural objects in the environment, like the edge of a table or a window frame, to track camera movement. This innovation greatly reduces setup time, enabling camera operators to begin shooting with minimal preparation.

For those working in environments that lack natural markers, such as green screens, CinCraft Scenario still offers flexibility. It allows for the use of virtual markers on LED walls or reflective markers in green screen environments, ensuring that tracking remains highly accurate regardless of the shooting conditions.

In addition to its adaptability, CinCraft Scenario’s hardware is lightweight, portable, and modular. The system is designed for easy transport, making it feasible to use in various settings—from on-location shoots to studio productions. This flexibility extends to its ability to work with older lenses that lack built-in metadata. For example, the system integrates with lens control motors from companies like Preston Cinema Systems and ARRI to calculate lens distortion and other parameters, even when using vintage lenses.
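
As an illustration of what “calculating lens distortion” typically involves, here is a minimal radial (Brown-Conrady-style) distortion model in C; the k1 and k2 coefficients below are invented for the example, since a real system calibrates them per lens and per focus/zoom position.

```c
#include <stdio.h>

/* Apply simple radial distortion to a point in normalised image coordinates.
   Illustrative only; real lens models add tangential and higher-order terms. */
static void distort(double x, double y, double k1, double k2,
                    double *xd, double *yd)
{
    double r2 = x * x + y * y;                    /* squared radius from optical centre */
    double scale = 1.0 + k1 * r2 + k2 * r2 * r2;  /* radial scaling factor */
    *xd = x * scale;
    *yd = y * scale;
}

int main(void)
{
    double xd, yd;
    distort(0.5, 0.25, -0.12, 0.03, &xd, &yd);    /* hypothetical coefficients */
    printf("(0.500, 0.250) -> (%.3f, %.3f)\n", xd, yd);
    return 0;
}
```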

 

Streamlined Workflow and Applications

One of the standout features of CinCraft Scenario is its integration of lens data, particularly from Zeiss’s own eXtended Data lenses. This information is crucial for ensuring that the system can accurately calculate the camera’s field of view, which is essential for realistic compositing. This precision dramatically reduces the time and effort needed for post-production tasks such as match-moving, where tracking data is manually aligned with 3D software.

By incorporating real-time tracking, the system also enables pre-visualization on set, allowing filmmakers to see how computer-generated elements will interact with live-action footage before finalizing the shot. For example, when shooting in a green screen studio, directors can position CGI elements in real time and adjust them as the camera moves. This functionality is especially valuable when working with LED volumes, as the background can shift dynamically to match the camera’s perspective.

The system’s ability to shorten prep time by eliminating the need for lens calibration further enhances its appeal. Previously, filmmakers had to calibrate lenses by shooting test charts in advance—a time-consuming process. CinCraft Scenario automates much of this, streamlining both pre-production and on-set workflows.

 

Future Implications for Virtual Production

CinCraft Scenario’s impact on the future of filmmaking is profound. As virtual production continues to expand, the need for efficient, real-time camera tracking will only grow. The system’s democratization of VFX—by making high-quality tracking affordable and accessible—means that even smaller productions can now integrate complex visual effects without the traditionally high cost and expertise requirements.

Moreover, CinCraft Scenario’s modular design allows it to be used in a wide range of production environments. Whether shooting outdoors or in a green screen studio, the system can adapt to the unique needs of the project. This flexibility will be particularly valuable as more productions look to combine practical and virtual elements to create richer, more immersive worlds.

Ultimately, CinCraft Scenario is poised to become a standard tool in virtual production, not only for large studios but also for rental houses and owner-operators. Its seamless integration with cameras and lenses, along with its ability to reduce post-production time, makes it an essential addition to any filmmaker’s toolkit. As the technology continues to evolve, CinCraft Scenario will likely become a cornerstone of how visual effects are integrated into modern filmmaking, enabling faster, more dynamic, and more cost-effective productions.

NYU Tisch opens the Martin Scorsese Virtual Production Center

NYU students now have a leading-edge production facility in which to learn, as the fully functional Virtual Production stage will serve as a training platform and a cutting-edge commercial Virtual Production studio.

ProVideo Coalition wrote in March 2023 that the state-of-the-art Martin Scorsese Virtual Production Center would open in 2024, at the same place chosen for the 2023 edition of Cine Gear Expo, Industry City. The space is now open, offering hands-on training in the latest production techniques and further establishing NYU Tisch as a global leader in cinematic arts training by pioneering new technology and expanding training and collaborative opportunities in the performing arts and design.

As ProVideo Coalition noted then, the center was made possible by a major gift announced in 2021 from the Hobson/Lucas Family Foundation, led by Mellody Hobson, co-CEO of Ariel Investments, and filmmaker George Lucas. In addition, the donation funds the Martin Scorsese Institute of Global Cinematic Arts, which includes the Virtual Production Center, the Martin Scorsese Department of Cinema Studies, and support for student scholarships. The gift is the largest in the history of the Tisch School of the Arts.

“We are thrilled to be able to honor our dear friend Martin Scorsese. Through this gift in his name, the Scorsese Institute of Global Cinematic Arts deservedly highlights his legacy as a quintessential American filmmaker and will inspire generations of diverse, talented students. Through time-honored scholarship and hands-on instruction on the state-of-the-art digital technology at the Institute, artistic vision will come to life where storytelling meets innovation,” said Mellody Hobson and George Lucas.

The Martin Scorsese Virtual Production Center will help to put world-leading Virtual Production on the map on the East Coast, as the fully functional Virtual Production stage will serve as a training platform for post-graduate NYU students and a cutting-edge commercial Virtual Production studio for the film and advertising industry.

Here is some more information about the project:

Rosanne C. Limoncelli, Senior Director of Filmmaking Technologies at NYU Tisch and the internal driving force for the project, explains, “The reason I really wanted to do this program is that I kept hearing from designers, directors, and cinematographers that there are not enough people with experience in Virtual Production that we can hire. We aim to help bridge that gap and introduce new talent into the industry.”

The school offers a new 36-credit Master of Professional Studies degree in Virtual Production with a cutting-edge curriculum for filmmakers to learn how to use Virtual Production as a toolset in their storytelling process. The 45,586 square-foot facility lives on the top floor of Building 8 at Industry City, a 35-acre innovation campus on the Brooklyn waterfront. The Center features two double-height, column-free stages, two television studios, industry-standard broadcast and control rooms, dressing/make-up rooms, a lounge and bistro, scene workshops, offices, postproduction labs, finishing suites, and training spaces.

As the lead systems integrator, AbelCine led the entire facility’s production and broadcast systems build and worked with partner Lux Machina on the virtual production integration. AbelCine also outfitted the facility with ARRI ALEXA 35 cameras, ZEISS Supreme Prime lenses, and an integrated lighting grid featuring ARRI SkyPanels for the virtual production stage.

“At AbelCine, we pride ourselves on supporting the next generation of creatives, which is why we’re so excited to have NYU Tisch located within the vibrant Industry City campus,” says Pete Abel, CEO & Co-Founder of AbelCine.

“Not only will NYU students now have a leading-edge production facility in which to learn, but they will do so alongside the more than 90 media companies on the IC campus. This will spark immeasurable collaborative and creative opportunities for the students. We’re grateful to be involved in this project from the outset and to witness firsthand the kernel of an idea blossoming into the amazing and unique educational facility that has been unveiled.”

The Virtual Production Stage is a 180° LED volume measuring 26′ deep x 41′ wide x 17′ high on a 3,500-square-foot soundstage with innovative mobile workstations. As part of the high-end technical specification, ROE Visual Black Pearl 2V2 panels were selected utilizing the Megapixel Helios LED processing platform. Lux Machina’s custom-developed ARCA media servers power the stage. They are optimized for Virtual Production and can switch between multiple content rendering platforms, including pre-installed Pixera licenses. The NYU Center has also been outfitted with Vicon’s Shōgun entertainment market software and 40 Vicon cameras: the combination of the two provides a state-of-the-art motion capture system for virtual production.

Lux Machina has long been considered one of the leading innovators and proponents of Virtual Production. Co-Founder and Executive Vice President of Operations and Finance at Lux Machina, Zach Alexander comments on the recent expansion of business into the educational environment, “In terms of the ‘why now?’ and ‘why universities?’, I would expand on Rosanne’s statement and note that while many of these aspects of Virtual Production have been around for quite some time, they’re developing extremely quickly and we’re at the point where we have outpaced the available resources on a very practical level. We need to educate the next round of professionals in the industry-standard technologies and workflows that are used daily on film, television, and live productions worldwide.”

Britainne Pedersen, Senior Producer, elaborates on the approach, “The importance of education in Virtual Production is that we allow students to access a body of knowledge and a skill set that Lux Machina has been developing over the last 12 years. Students don’t have to start at zero. They can join us where we are now, we can level up the industry, and build the next version of this technology together.”

NYU Alumnus Sang-Jin Bae counts Rosanne C Limoncelli as a mentor and was thrilled to be approached and hired as the Director of the Martin Scorsese Virtual Production Center. Sang has taught for 26 years and worked commercially on long-form TV series and episodic animation. Speaking on the relationships formed with this project, Sang states, “During the research process, we discovered industry leaders AbelCine and Lux Machina. AbelCine was one of the first companies to bring production to Industry City, and Lux Machina has been a leading innovator in Virtual Production for over a decade. We are excited to capitalize on the experience that both groups bring to the table, and we are excited to bring this technology to the next generation of filmmakers… even though it’s called Virtual Production, it is eventually going to just be called Production.”

Rosanne C. Limoncelli comments on her vision for the facility’s future: “We’re looking to partner and collaborate to put world-leading Virtual Production on the map on the East Coast, establishing a strong community where our beautiful visual artist graduates and the wider commercial industry can mutually benefit.”

The facility officially opened with an initial class of 24 students from all over the world. To learn more about the program go to www.tisch.nyu.edu.

Play for Dream MR: the world’s first Android-based spatial computer

As the term Mixed Reality is used loosely to support the marketing purposes of different companies, a new Chinese AR and VR headset is announced, with specs like Apple Vision Pro… but with battery.

It’s “a blatant Apple Vision Pro clone but with a Quest Pro style rear battery” according to the website UploadVR, but this new mixed reality headset comes from a company with experience in the market and two other headsets already produced: the YVR 1, introduced in 2021, and the YVR 2, the company’s first all-in-one VR headset with pancake lenses.

Play for Dream introduced its technology at the 2024 Mobile World Congress and now the company reveals the Play for Dream MR, claiming it’s the world’s first Android-based spatial computer, designed to “seamlessly blend digital and reality worlds to deliver powerful spatial entertainment experiences that revolutionize how people reach the world.”


Play for Dream MR was announced in Singapore, as part of a strategic move that marks Play for Dream’s first international expansion, establishing its Asia-Pacific marketing headquarters in the Lion City. The global launch ignited in Singapore, drawing in key media from APAC markets, investors and other stakeholders.

Eight exclusive DTS-customized spatial sound effects

As the world’s first Android-based spatial computer, Play for Dream MR is a mixed reality (MR) headset that sets, the company says, “a new standard in spatial entertainment” as “it elevates your experience across various scenarios including IMAX-level movies, eight exclusive DTS-customized spatial sound effects, immersive MR gaming, stunning 3D spatial video recording and photography, and more to explore. Featuring industry-leading specifications, a powerful self-developed algorithm architecture, it creates a seamless fusion of digital and real worlds, delivering an unparalleled spatial entertainment.”

The YVR 2 VR headset was already used for IMAX-style movie streaming

Play for Dream has experience with movie streaming, as it is one of the key selling points of its YVR 2 VR headset, and the company is now teaming up with industry leaders to revolutionize Virtual Reality… pardon, “spatial entertainment”. As the product of the world’s first IMAX and DTS strategic partner in the spatial computing and MR field, and one of the pioneering devices equipped with the Snapdragon XR2+ Gen 2, Play for Dream MR delivers, the company says, “a whole new experience” thanks to collaborations with giants like IMAX, DTS, Qualcomm, Unity and BOE that “ensure an unparalleled premium user experience, combining advanced technology with seamless integration.”

Your own 1000-inch IMAX-level giant screen theater

Play for Dream MR sounds like a dream for those who want to watch IMAX-style movies at home. Promising to set a new standard for spatial cinema with a 1000-inch IMAX-level giant screen theater, it also features DTS Simulated 7.1 Channel Spatial Sound Effect, recreating cinema-level spatial reverb and surround sound, while offering 8 adaptive personalized sound settings for an immersive experience in movies, gaming or work.

The specifications for the new headset are exciting: boasting 8K Micro-OLED screens and an ultra-light resin Pancake optical solution, it delivers an astonishing 27 million pixels at 3882 PPI, with 45 PPD at the center. Powered by the Qualcomm Snapdragon XR2+ Gen 2 chip, Play for Dream MR ensures exceptional performance with minimal power use. Its cutting-edge camera and sensor array, including 11 cameras, 7 types of sensors, and 22 infrared LEDs, provide precise and efficient tracking with a color VST latency as low as 14ms.

Integrated lightweight curved battery

The headset has its own integrated lightweight curved battery. The company says that compared to flat batteries, this solution offers higher space utilization, a thinner battery compartment, and more comfortable wear. The built-in high-capacity battery and an external power bank provide dual power solutions, supporting 3 to 3.5 hours of worry-free play, and a fast-charging 45W GaN adapter keeps the headset easy to carry and quick to recharge. The built-in battery also helps balance the weight distribution (3:2, according to the company) to minimize pressure on the face and make the headset more comfortable to wear.

The Play for Dream MR will run its own OS but will be compatible with Steam and will allow for wireless PC streaming through Wi-Fi. Compatible with a vast number of Android apps, it will also support various Bluetooth peripherals. With controllers available and eye tracking as part of the specifications, this is a headset that offers new options in terms of Virtual Production, besides being compatible with mainstream office applications and multi-screen streaming, enabling multi-screen work for a blended virtual and real immersive experience. And entertainment too: users can watch movies, seamlessly switch between MR and VR games, play 2D games on a large screen, and stream over 6,000 Steam AAA games with a single click for a truly captivating experience.

The price for the new VR headset has not been announced yet, and it will not be cheap, but the company suggests it will be under $2,000… which might explain why Apple is working on a new, more affordable version of its Apple Vision Pro.

VR and mocopi change how audiences experience live performances

The local theater community in La Jolla, San Diego is using new technologies to merge the physical and virtual worlds by allowing real-time interaction between audiences and cast members.

Combining theater performances with technology is the way forward, according to Trisha Williams, Director of UC San Diego’s MAVERiC Studio and CEO of Origami Air, who worked with the local theater community in La Jolla, San Diego on the latest version of “Without Walls” (WOW), the La Jolla Playhouse’s signature performance program.

Without Walls (WOW) takes art outside traditional theatre walls and into unique spaces, including the annual WOW Festival and WOW standalone productions. From a car to a bar, from a beach to a basement, WOW invites audiences of all ages to interact with artists and art in unexpected ways, bringing people together and reimagining what storytelling can be.

WOW has become one of the most acclaimed immersive performance programs in the United States. Since its inception in 2011, La Jolla Playhouse has commissioned and presented a diverse series of immersive, site-inspired and virtual productions under the umbrella of WOW. These works take place throughout San Diego and include nine stand-alone productions, six WOW Festivals and 14 Digital WOW pieces.

The theater used the new technologies provided by Sony’s 27-inch 3D Spatial Reality Displays (SRD) and mocopi wireless motion capture system to merge the physical and virtual worlds by allowing real-time interaction between audiences and cast members and elevate the art of immersive, interactive storytelling.

Actors wearing the mocopi devices

The partnership with MAVERiC Studio, located in UC San Diego’s Design and Innovation Building, and Origami Air Co., a San Diego-based XR development company, allowed the theater to develop and build a virtual environment, featuring highly expressive motion capture-generated characters, which was published to VRChat. The results showcased the theater experience at its best, when audiences are transported to a different place and allowed to immerse themselves in a performance.

“SRD and mocopi became part of the performance for Without Walls,” said Trisha Williams. “Audiences could see a full 360-degree spatial view of actors wearing the mocopi devices to create this huge 15-foot character with large gestures and then react in real time through the SRD. It felt so real and mystical. Everyone enjoyed seeing a motion capture performance come to life in a completely different way.”

Here is some more information, shared by Sony, about the project and its importance to the future of theater performances. The whole project also reveals a real-world usage of mocopi, confirming the potential for this accessible motion capture system solution:

An Easy and Seamless Creative Workflow

The SRD and mocopi technologies quickly became the foundation for everything the creative teams were trying to accomplish, from giving student-performers easy, mobile access to full-body avatars to the design process for creating new characters and environments.

“We started using the SRD to show our artists and action performers who were inhabiting our avatars what they were capable of and how they could move around in a space without having to step into that space,” said Joe Unger, Chief Ecosystem Officer at Origami Air and the content manager at UC San Diego’s MAVERiC Studio.

Unger added he enjoyed working with the glasses-less 3D displays because, “I can see what I’m doing in real-time without having to put on glasses or a headset. You can see 3D objects, move them around, scale them and work on them side-by-side with my artists and with the other devices we’re using to create content. I can tell what an object is going to look like before it ever enters a scene.”

Using the combined Sony technologies, Williams created and designed the virtual scenes and worlds using mocopi and the SRD plugin for Maya (a 3D computer graphics application) for real-time creation, before moving to the actual performance. Designing in Maya and seeing modifications, as they happened, on the SRD and mocopi helped the creative team decide and finalize the best approach for this project.

Williams also found the glasses-less SRD design to be a significant improvement. “Having to put a headset on to see what a character looks like in virtual reality takes you out of the workflow,” Williams said. “It’s so nice to be able to look at the SRD and see that full 3D, 360 view of a character and then go right back into the Maya workflow.”

The decision to use mocopi came down to mobility and convenience.

“It was the easiest solution for everyone to quickly jump into full body avatars and we could also send that pocket-sized device home with our actors so they could continue to perform,” Unger said. “It was up to the actors to continue to rehearse the performance on their own and mocopi enabled that. I can hand this to an artist, they can set it up at home and begin animating in real-time, right away.”

A New Language for Theater

The MAVERiC Studio and Origami Air teams worked with Hortense Gerardo, Director of UC San Diego’s Anthropology, Performance and Technology Program, to create a unique form language that was optimized for mocopi and used during performances to converse with audiences. Gerardo also wore the mocopi devices during the performances, getting first-hand experience with this new motion-capture system. mocopi enabled the actors to fully express themselves with the full-body avatar within the VRChat environment, which is essential to a theatrical performance, and helped give the characters life as if they were doing a live performance.

“Once I familiarized myself with how to synchronize the system and how to move my whole body to accommodate the way that it functions, it was easy to use and allowed me to embody the character fully,” Gerardo said. “People genuinely reacted whenever the creature responded to the audiences. They sensed fear and excitement, and they could tell friendliness versus if the creature was reluctant to engage. It drove a wide range of emotions.”

Gerardo added that the addition of mixed reality technologies to theater performances mirrors the shift happening across other media and entertainment platforms, as audiences now expect more interactive and immersive viewing experiences.

“This will inspire artists and technologists to create in VR spaces as we enter a new era where people are increasingly going to be engaged in these new technologies,” she said. “It was also just a lot of fun!”

A “Center of Technological Learning”

Unger described MAVERiC Studio as a center of technological learning, a place where students can get hands-on time with the newest tools that are reshaping the professional workplaces they will soon enter. SRD and mocopi are now part of the technology the studio uses to showcase and teach XR students how to create projects for virtual worlds.

“Before we started experimenting with mocopi, our emotions were centered around the front of our body,” said Aria Grossfia, Student, UC San Diego. “That didn’t show up well in VR, so with mocopi we were able to shift the movements to be larger and more ‘outside’ and around us. It made sense that these motions would be larger and a little more intimidating to the audience and it was exciting to watch the actors use mocopi to bring this new language to life.”

Jack Meir, Student, UC San Diego, also found that mocopi helped him focus on effectively communicating with the audience and translating his movements freely while controlling the character in VR, without having to think about animations or rigging of the model.

“The most useful feature of mocopi for me is its ability to stream to different outlets and devices,” he said. “I was able to use a character in VR smoothly, controlling the legs and the arms, and then easily switch over to live mocap streaming on different devices or any other type of real-time simulation in addition to virtual reality. It’s a versatile device.”

Williams is confident they are not only positively impacting the theater-going experience today but also setting the stage for meeting heightened audience expectations in coming years.

“Watching students put these technologies to use in ways we never ever imagined was really exciting,” Williams said. “This is the future of live performance, especially combining theater and technology the way we are.”

Closed Loop Colour for Virtual Production

Reverse side of an LED video wall showing interconnections and power cables for each foot-square panel.
At some point, most of these video wall displays are operating in something like DCI P3.

Almost everyone reading this will have had the experience of pointing a camera at a real-world scene and then observing the resulting image on a monitor, and will know that it’s never really been expected that the two should match with any accuracy. Given the eyelid-twitching preoccupation we often have with calibration and accuracy, that might come off as carelessness, though it’s also an inevitable concomitant of displays that don’t have infinite dynamic range or colour gamut. In digital cinematography in general, that’s created a none-too-wonderful situation in which every camera has a couple of different ways of representing colour and brightness. It’d be great if that didn’t happen in virtual production, too.

It’s not entirely a new idea. Manufacturers don’t like the idea that all cameras should look the same, but normalisation of cameras has actually been implemented in software like Cinematch, based on precision analysis of a lot of cameras and their behaviour. Matching cameras to each other and to the real world is clearly possible, but what we might call a universal opto-electronic transfer function has never been normal.

Perhaps it should be normal

One specific world in which that has begun to matter is in virtual production, where we might shoot material which will be displayed as part of a scene and rephotographed on other cameras. It’s a less flexible, but potentially cheaper approach than realtime 3D rendering. Camera-fresh material will often need quite significant compositing work before being ready for the display, which has a cost implication, but either way, colorimetry and brightness must be right.

At the time of writing, “right” is a moving target that’s mostly approached with nothing more sophisticated than tweaking until it looks right. That can usually only be done on the day, and it’s really slow for people shooting film. That’s generally true both for real-time computer-rendered backgrounds and for scenes which have been photographed in reality, although there’s a lot more flexibility in the computer world. The surprise is that many video walls used for virtual production are operated roughly as Rec. 709 or DCI P3 displays, even though the emitters would probably allow for a wider colour gamut. Brightness can be different from the written standards, but the colour is often operated quite conservatively.

Either way, there remains a need for an experienced hand to adjust until things fall into place. If this seems to you like a slightly unsatisfactory state of affairs, you’re not alone, and virtual production people are increasingly expressing the desire to regularise things, creating a closed loop for colour control in virtual production. This is difficult given the huge quantity and variety of equipment used in virtual production, which is a world of very many moving parts, from cameras, rendering and playback servers, through graphics cards, video wall processors, the tiles themselves, another camera and its associated monitoring.

Racks of control equipment for an LED video wall in virtual production, with many controls and connections.
Whether this is ever going to get any simpler remains to be seen.

Virtual production colour calibration

We could speculate about solutions, but we would be speculating. It is, for instance, quite valid to point a monitor calibration probe at an LED video wall in exactly the same way we’d point it at a projection screen, making it theoretically possible to close that loop. Doing that might require forward combinations of quite a lot of lookup tables, in ways that would require some external tools. The problem is, while that might work, any such solution would very likely only remain valid for one very specific configuration of equipment, so it’s not that clear how much it’d help.
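
To illustrate what “forward combinations of lookup tables” means in practice, the sketch below bakes two made-up stages (a log-style camera decode and a 2.4-gamma display encode) into a single 1D LUT; real cameras and LED processors each publish their own curves, so treat this purely as the shape of the workflow, not anyone’s actual transform.

```c
#include <math.h>
#include <stdio.h>

#define LUT_SIZE 4096

/* Invented transfer functions for the example: a generic log2-style camera
   decode and a simple 2.4-gamma display encode. */
static double fake_log_to_linear(double code)  { return (pow(2.0, 10.0 * code) - 1.0) / 1023.0; }
static double linear_to_display(double linear) { return pow(linear, 1.0 / 2.4); }

int main(void)
{
    static double lut[LUT_SIZE];
    for (int i = 0; i < LUT_SIZE; i++) {
        double code = (double)i / (LUT_SIZE - 1);             /* input code value, 0..1 */
        lut[i] = linear_to_display(fake_log_to_linear(code)); /* two stages baked into one LUT */
    }
    printf("code 0.50 -> display %.4f\n", lut[LUT_SIZE / 2]);
    return 0;
}
```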

So the practicality of this is uncertain. To some extent, this is a standardisation problem, where “flexibility” and “standardisation” are often concepts in competition with one another. As so often, the vicious circle is that high-end, rarefied and expensive technologies such as virtual production stages lack standardisation because they need the flexibility to solve problems caused by the lack of standardisation. In theory, that’s a very easy problem to solve by simply dictating to everyone how they’re required to do things. In reality, that sort of enthusiastic cat-herding is likely to run into significant resistance from people who want to indulge their creative desires, invent new things, and show off their capabilities.

Technicians operating an LED video wall.
Two technicians at NAB 2022 making it look… well, not all that easy, actually.

It’s a complex problem, but it would be nice to think that we could avoid the parlous situation that’s been allowed to emerge in monitoring. Every camera and display manufacturer has at least a couple of different ideas about how brightness encoding should work, as if the phrase “base two logarithm” was somehow ambiguous. Virtual production colour control is a more complicated problem, but it’s also a smaller field, and it’d be nice to imagine a future that’s a little less reliant on leaning on the controls until it looks right. Or perhaps that’s inevitable.

]]>
https://www.provideocoalition.com/closed-loop-colour-for-virtual-production/feed/ 0
Rob Lowe’s first encounter with Virtual Production https://www.provideocoalition.com/rob-lowes-first-encounter-with-virtual-production/ https://www.provideocoalition.com/rob-lowes-first-encounter-with-virtual-production/#respond Thu, 14 Mar 2024 10:11:16 +0000 https://www.provideocoalition.com/?p=277606 Read More... from Rob Lowe’s first encounter with Virtual Production

]]>
Blackmagic URSA Mini Pro 12K digital film cameras were used to create digital twins of the real world, to transport Rob Lowe to historically significant Mystic Seaport… but without leaving Los Angeles.

It all started with an inquiry that set in motion a search for a solution. Global Objects, a production studio utilizing the latest 3D capture technology to create digital twins of the real world, “received an inquiry from a FOX executive wondering if virtual production could address some logistical challenges for a project,” explained Jess Loren, CEO, Global Objects. “The task was to transport Rob Lowe to the charming and historically significant Mystic Seaport but without leaving Los Angeles.”

Eager to demonstrate the pinnacle of its virtual production capabilities, the Global Objects team collaborated with Stephen David Entertainment, traveling to Mystic Seaport in Connecticut for detailed image capture and Lidar scans, which included 12K plate footage captured with the URSA Mini Pro 12K.

Making the virtual environment more believable

“The URSA Mini Pro 12K was pivotal because its ultra high resolution allowed us to shoot plates in 12K with the flexibility to reframe and adjust content directly on set without losing any detail,” said Loren. “For a project like ours, where precise and high quality imagery was crucial, this capability was invaluable.”

This on set flexibility also translated to post production. “The 12K resolution also meant that we could shoot the plates wider than normal, so we could punch in on specific areas of the footage and still maintain crisp, high quality detail, providing us with more options in post production,” added Loren. “This flexibility was essential for fine tuning the final visual output to match our exact creative vision.”
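For a rough sense of the headroom that workflow provides, the sketch below compares the camera’s nominal 12K frame width with a UHD deliverable. The figures are published specs, not details of this particular shoot.

```python
# Back-of-the-envelope reframing headroom: how far a 12K plate can be punched
# in before the crop falls below a UHD-wide frame. Widths are nominal
# published figures, not production specifics.
PLATE_WIDTH = 12288        # URSA Mini Pro 12K sensor width in photosites
DELIVERABLE_WIDTH = 3840   # UHD frame width

max_punch_in = PLATE_WIDTH / DELIVERABLE_WIDTH
print(f"Maximum punch-in before dropping below UHD: {max_punch_in:.1f}x")  # ~3.2x
```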

“The high resolution and flexibility in reframing ensured that the final digital environments were not only visually stunning but also rich in detail. This enhanced the overall immersive experience, making the virtual environment more believable and engaging for the audience,” she continued.

Creating a 3D level in Unreal Engine

In addition to the digital replication of Mystic Seaport, Global Objects was tasked with managing the entire workflow during the shoot in Los Angeles. “Overall, our responsibilities were twofold: to capture and ensure that every detail of Mystic Seaport’s unique historical setting was accurately replicated in a digital twin format; and to manage the entire production and virtual production workflow during the shoot with Rob Lowe, which included implementing the scanned content by creating a 3D level in Unreal Engine for filming on the LED wall,” said Loren.

During the live production phase in Los Angeles, Global Objects employed an ATEM SDI Pro ISO live production switcher, which played a key role in managing the live cut of the multicam shoot, ensuring a smooth and efficient workflow. A Blackmagic Pocket Cinema Camera 4K digital film camera was also used to capture behind the scenes footage, and DaVinci Resolve Studio was used to edit and color grade the project’s promo reel.

The impact and potential of Virtual Production

“On set, the ATEM switchers are indispensable for recording and monitoring the footage being shot. They also facilitate the quick provision of proxies to the editorial team, streamlining the production process significantly,” noted Loren.

“Any time you get to work with a star like Rob Lowe is fun and always a highlight. What made it particularly memorable and enjoyable was the fact that it was Rob Lowe’s first encounter with virtual production,” she concluded. “Introducing someone of his stature and experience in the film industry to the innovative world of virtual production was not just a highlight for us, but a moment of pride as well. It underscored the impact and potential of virtual production in revolutionizing traditional filmmaking, and to be at the forefront of this transformation was a remarkable experience.”

The docuseries “Liberty or Death: Boston Tea Party” celebrates the 250th anniversary of one of the most legendary moments in the American Revolution. Rob Lowe hosts this retelling of the Boston Tea Party in a must-see four-part docuseries. This was the spark that lit the torch of liberty: in 1773, the people of Boston faced a choice between living under British rule without representation and standing up against tyranny. In an act of defiance, the Sons of Liberty stood up to the crown.

]]>
https://www.provideocoalition.com/rob-lowes-first-encounter-with-virtual-production/feed/ 0
Pixotope Pocket: Virtual Production tool for iPhone available now https://www.provideocoalition.com/pixotope-pocket-virtual-production-tool-for-iphone-available-now/ https://www.provideocoalition.com/pixotope-pocket-virtual-production-tool-for-iphone-available-now/#respond Fri, 08 Mar 2024 15:21:24 +0000 https://www.provideocoalition.com/?p=277444 Read More... from Pixotope Pocket: Virtual Production tool for iPhone available now

]]>
Users only need a smartphone running iOS, a Windows PC and a Pixotope Graphics license to create powerful and immersive content anywhere, without the need for a fully equipped studio.

Introduced in May 2023 as the next evolution of the Pixotope Education Program, the Pixotope Pocket app gives aspiring VP professionals what they need most: easy, unfettered access to augmented reality (AR) and virtual studio (VS) tools and workflows. Initially an exclusive offering for Pixotope Education Program partners, the app democratizes the creation of immersive content. Now Pixotope, the leading software platform for end-to-end real-time virtual production solutions, announces that the app is commercially available.

“Pixotope Pocket is maturing beyond its initial educational focus, transforming into a powerful tool for all content creators,” states David Dowling, Pixotope’s Chief Revenue Officer. “This accessible solution empowers creatives to explore and test virtual environments with ease, ultimately enhancing pre-production efficiency and streamlining the overall virtual production process. By leveraging Pixotope Pocket, creators are no longer confined to studios; they have the freedom to work from anywhere with minimal equipment.”

No need for a broadcast camera

Unlike traditional AR and VS workflows, Pixotope Pocket requires no broadcast camera and no dedicated tracking hardware or software. The creative can use their mobile phone camera to shoot footage while Pixotope Pocket takes care of the camera tracking, combining device motion tracking, camera scene capture and advanced scene processing. Video and tracking data are transmitted as an SRT stream over a local network to the machine running Pixotope Graphics.
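Purely to illustrate what that local link looks like, the sketch below uses a generic tool (ffmpeg built with libsrt) to listen for an SRT stream on a hypothetical port. In the actual workflow Pixotope Graphics ingests the stream itself, and the tracking-metadata side of the link is not shown.

```python
import subprocess

# Illustrative only: listen for an incoming SRT stream on a hypothetical port
# and remux it to a file for inspection. In the real workflow Pixotope Graphics
# consumes the stream directly; this just shows the kind of link involved.
# Requires an ffmpeg build with libsrt support on the PATH.
subprocess.run([
    "ffmpeg",
    "-i", "srt://0.0.0.0:9000?mode=listener",   # wait for the phone to connect
    "-c", "copy",                                # no re-encode, just remux
    "phone_feed.ts",
], check=True)
```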

David concludes: “We aim to democratize this powerful technology by placing it in the hands of creatives. We envision camera operators unlocking the potential of AR visualization and social media content creators pushing the boundaries of storytelling. We’re excited to see how these professionals will leverage the technology to shape the future of virtual production.”

Pixotope Pocket is currently available for iOS with a planned expansion to Android platforms, opening up a world of possibilities to creatives across various operating systems. Pixotope will be at NAB 2024, in April, to demonstrate the Pixotope Pocket solution as well as the company’s other products.

]]>
https://www.provideocoalition.com/pixotope-pocket-virtual-production-tool-for-iphone-available-now/feed/ 0