Let’s Get Experimental: Behind the Adobe MAX Sneaks

By Adobe Conversations Team

November 7, 2016

Does this sound familiar? You send your client a completed video project and they ask you to make a last-minute change to the voiceover…but the voiceover artist is already on a plane to Hawaii. Well, thanks to the technology behind “Photoshopping Voiceovers,” this may soon no longer be an issue.

“Photoshopping Voiceovers,” or what we affectionately refer to as #VoCo, was one of 11 experimental technologies demoed at Adobe MAX 2016. The MAX Sneaks session invites our engineers out of the lab and onto the stage to show off what they’ve been working on. Co-hosted by television personality and comedian Jordan Peele and Adobe’s Community Engagement Manager Kim Chambers, this year’s Sneaks had us alternating between laughing and gasping in awe.

Fortunately, you can enjoy the show even if you weren’t able to join us in San Diego. Take a sneak peek at our 2016 Sneaks. We’ve highlighted three of our favorites from the session below.

While these technologies are not yet part of Creative Cloud, many Sneaks from previous years have later been incorporated into our products. As always, we’d love your feedback. In fact, we’ve given each of the demos its own #hashtag for this very purpose. You’re welcome.


When recording voiceovers, dialogue, and narration, wouldn’t you love the option to edit or insert a few words without the hassle of recreating the recording environment or bringing the voiceover artist in for another session? #VoCo allows you to change words in a voiceover simply by typing new words. Have to hear it to believe it? Check out a live demo using a recording of co-host Jordan Peele’s voice.


You’ve heard it said that art imitates life. But what about technology imitating art? #Stylit allows you to transfer your personal artistic style from paper to screen, making your digital art appear as if it were colored by hand. How does it work? Watch the video to see it in action. Spoiler alert: It involves oil pastels, spheres, and 3D models.



Because you can’t edit 360° video while immersed in it, the current process for editing VR footage is tedious and clumsy, requiring ad hoc workflows and desktop plugins. At least that’s what we’ve been told by very smart people. #CloverVR aims to solve this problem by giving editors dedicated tools to view and edit directly in VR. Check out the video to go inside the headset.

Inspired/amazed/awed by what you saw here? Don’t forget to tell us on social, and check back here next week for the remaining eight demos.
