A few months before the coronavirus pandemic forced production to shut down, the sound team at Sony Pictures Entertainment was introduced to the 360 Virtual Mixing Environment headphone tech from the company’s Japan R&D team. At the time, no one could have realized the full scope of the benefits provided by the technology, which enables sound professionals to mimic the characteristics of any studio or environment, even if they’re working from a home garage.
Tommy McCarthy, executive VP of post-production services for SPE, had planned to roll the system out to a few members of the team to test and demo, but once COVID hit, he says, “the tool became a game changer.”
With soundstages and theaters closed, the Virtual Mixing Environment could capture the acoustics of a 360-seat theater like the Cary Grant Screening Room on Sony’s lot, and McCarthy could listen to the sound “for an exact replication.”
Supervising sound editor, sound designer and re-recording mixer Steven Ticknor is among those for whom 360 VME has made a difference. For years, Ticknor (“Bad Boys for Life,” “Spider-Man: Far From Home”) has had the luxury of working on the Sony lot, surrounded by Atmos speaker systems in his office. He was introduced to the VME headphones six months before the lockdown.
When the work-from-home pivot happened, Ticknor started using them; the technology performed as advertised, allowing him and others to do the same quality of work at home that typically required a Hollywood soundstage. “Those headphones got us to a place where we could keep working, where we could be efficient,” he says. “It gave us a lifeline to keep having a living.”
Another feature of the Virtual Mixing Environment has enabled teams around the country and the world to use the tool simultaneously on projects including “Ghostbusters: Afterlife” and the highly anticipated “Venom” sequel. “If you had a TV show mixing in Burbank, you could have a showrunner anywhere do playback using a livestream,” says McCarthy. “They would listen to it at the same time as our mixers are hearing it, as if we were in the room.”
Supervising sound editor Kami Asgar discovered an additional attribute of the technology. He was working in a 14-by-20-foot garage on the sound design for “Venom: Let There Be Carnage.” With his kids home, surrounded by a washing machine and AC units, Asgar had been using regular Sony headphones with subwoofer speakers. “I was blowing up things — the crescendo of the movie with sound effects,” he says. Every 10 minutes someone would interrupt him, asking him to turn down the sound. McCarthy introduced him to the VME headphones. “It was the biggest game changer of my life,” says Asgar. “I was able to work, playing scenes loud, without disturbing anyone.”
At one point, Asgar had to mix the last reel with director Andy Serkis in London. A tad skeptical of the new technology, he went onto the Sony lot the next day to ensure the sound matched. “It was stunning,” he says. “It sounded exactly how it sounded in my headphones. Before this, headphones would sound different because they don’t match the listening environment. This equalizes it, and everyone is hearing the sound the same way.”