THE FUTURE FOR ADOBE –
INTERVIEW WITH BILL ROBERTS

Digital technology has permanently changed the way we make films. No longer are editors shut away in dark rooms, developing and splicing large reels of celluloid. These days you can grab your laptop or tablet, sit at your favourite cafe and edit scenes whilst sipping lattes.

In fact, at no other time in the history of cinema has film been so inexpensive to make, or its tools so readily available to all.

It’s easy to see how Adobe has been at the forefront of these technological shifts, largely thanks to its Creative Cloud subscription service, which puts a relatively inexpensive stable of programs covering every aspect of the post-production workflow into the hands of indie filmmakers.

Editor, colourist and guest writer Julien Chichignoud caught up with Bill Roberts, Adobe’s Sr. Director of Product Management for Video, during the recent SMPTE conference in Sydney.

This is the first of our two-part series on the future of Adobe and what that means for indie filmmakers.

Bill Roberts, Adobe Sr. Director of Product Management for Video.

As the Sr. Director of Product Management for Video, could you explain what your role entails at Adobe?

My job has many aspects, but primarily it’s to ensure we’re developing the right products for the market. This involves my team spending extensive time working with customers and our engineers – often trying to bridge the two worlds.

Customers are a great resource for feedback on how to iterate or improve existing products and engineers often have brilliant, revolutionary ideas. My role requires me to ensure these ideas are focused and applied to the products in the right way, leading the team to develop truly meaningful products and features that directly address customer concerns.

A great example of this is how our team developed Time Tuner – a feature found in Adobe Premiere Pro CC 2015. The original concept, called ‘Faster Forward’, analyzed the complexity of a shot, created metadata based on that complexity, and then used the metadata to deliver a consistent rate of ‘visual change’ at the speed the user selected when fast-forwarding. This means images might play at 16X for a locked-off shot with no action but only 0.5X for a high-action fight scene.

The team thought the feature was valuable but was surprised to learn that users hated the experience. However, we realized the underlying metadata on visual complexity was very valuable and revisited the idea when a major US broadcaster asked us about improved techniques for re-timing completed programming to sell in overseas markets. Our product team explored what would happen if we used the ‘Faster Forward’ metadata to decide where to compress time aggressively in areas of low action while leaving the high-action areas alone. We concluded that if we made the metadata ‘compositionally aware’ (understanding the cuts in a timeline so that re-timing is spread across the entire timeline, not just compressed at the start), we could have an incredible feature for our customers.
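To make the re-timing idea concrete, here is a rough sketch in Python. It is only an illustration of the concept as described above, not Adobe’s implementation; the function names and the crude complexity measure are assumptions made for the example.

import numpy as np

def shot_complexity(frames):
    # Crude 'action' score: mean absolute frame-to-frame pixel change.
    diffs = [np.abs(a.astype(float) - b.astype(float)).mean()
             for a, b in zip(frames[:-1], frames[1:])]
    return float(np.mean(diffs)) if diffs else 0.0

def retime_plan(shot_durations, complexities, target_total):
    # Spread the required time reduction across the whole timeline,
    # weighting low-action shots more heavily than high-action ones.
    reduction = sum(shot_durations) - target_total
    weights = [d / (c + 1e-6) for d, c in zip(shot_durations, complexities)]
    total_weight = sum(weights)
    return [d - reduction * w / total_weight
            for d, w in zip(shot_durations, weights)]

# Example: trim a 60-second, three-shot cut down to 57 seconds.
durations = [20.0, 25.0, 15.0]     # seconds per shot
complexity = [2.0, 30.0, 5.0]      # locked-off shot vs. action scene
print(retime_plan(durations, complexity, 57.0))

In this toy plan, almost all of the three seconds comes out of the static shots, while the high-action shot is left nearly untouched – the same behaviour Bill describes for Time Tuner.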

Time Tuner was released in Premiere Pro CC as part of a major Adobe Creative Cloud update in June this year. By working with a broader group of customers through our pre-release program, we’re often able to deliver fantastic new features. Watching an idea like that transform into a real-world innovation is amazing!

Adobe Premiere Pro CC

While After Effects and Photoshop have long been the flagship products driving customers to Adobe, more recently eyes have turned to Premiere Pro as the focus of both users’ and the development teams’ attention.

Do you see it as the central product all the other video tools revolve around?

Premiere Pro CC is the hub for all Adobe video workflows. With Creative Cloud, users have access to the full collection of Adobe tools and services. We found that video professionals are the most voracious users of Creative Cloud, using more tools and services across a range of disciplines to complete their work. For example, it’s not uncommon to have a logo from Illustrator CC and an image from Photoshop CC combined in an After Effects CC comp and dynamically linked to a Premiere Pro timeline. Ultimately, telling stories with video is about presenting the viewer with imagery and sound over time, which explains why Premiere Pro CC is so well-suited as a hub for many creatives.

Also, if you look across the different creative disciplines that Adobe serves, each creative type will have a different ‘hub product’ from which they explore other Creative Cloud tools. We see an increasing number of graphic designers exploring motion graphics with After Effects CC, as well as photographers embracing the video tools due to the changing capabilities of their cameras.

After Effects CC libraries.

On that point, Adobe’s product development has very much been “horses for courses”, with each product serving a different task.

With Premiere Pro progressively gaining features from other applications of the ecosystem, is there a shift in the Adobe philosophy?

The exchange of technology has always been a part of Adobe’s DNA. The shift in focus is about addressing the increasing demands of film/TV editors. In the past, they were exclusively responsible for editorial. Today, they’re asked not only to edit but also to create complex motion graphics, sweeten audio and color grade. Historically, the only content that would get this level of finish would be high-budget shows, but expectations are now higher all around. Even a daily news magazine show wants each segment to look and sound stellar, upping the demand for more tools that are easily accessible to the editor.

Our philosophy is not to cram the full complexity of a ‘craft’ tool into the NLE user interface. We look at the full workflow and the connections between creative disciplines, and then design the optimized workflow from that perspective.

A great example of this is our “Live Text Templates.”  We found many users were creating titles for shows that required the deep toolset of After Effects CC to design the style and motion. Even though Adobe’s Dynamic Link technology made it easy for users to bounce between Premiere Pro CC and After Effects CC, it was still two interfaces for editors to master. As a response, we created Live Text Templates so that even junior editors could deliver amazing content quickly and easily. We’re able to arm creatives at all levels with useful feature sets by focusing on the intention of what they’re trying to achieve and finding ways to empower users to share their skills.

SpeedGrade

In the latest update, Premiere Pro has gained a new toolset of colour grading features that seem to overlap with, and sometimes exceed, the tools of SpeedGrade.

Adding into the mix the success of DaVinci Resolve, is there a future for SpeedGrade or will it join Encore in the group of obsolete applications?

Let’s start by comparing the depth of features in Premiere Pro CC versus SpeedGrade CC. While Adobe has some of the top color scientists in the world, they are finite in number. Thus, given the scope of work in Premiere Pro CC in the latest release, we just didn’t have the cycles to do a huge feature release for SpeedGrade CC. It is worth noting that both Premiere Pro CC and SpeedGrade CC share Direct Link technology, which means everything users do in a multi-track timeline in Premiere Pro CC is available as a full Premiere Pro CC sequence in SpeedGrade CC for further editing.

Color is a hugely exciting area with the advent of HDR (High Dynamic Range) being part of the emerging UHD (Ultra High Definition) specs, which is why we’re making color and light more central in the editing process – something our customers are telling us they want. Editors need color and light to drive the narrative by creating a specific mood. The Adobe team believes giving optimized controls in the editing experience and deep controls in a dedicated interface is the natural approach. This workflow is amplified with cameras that have HDR capabilities – you want full control and to work natively, making changes as close to ‘RAW’ as possible.

Two different looks.

Premiere Pro has a very active online community of users where Adobe representatives seem to always be lurking. How valuable is this direct exchange with your customers?

This customer focus is part of the collaborative culture we value at Adobe. The forums are a public extension of this ethos. Customer connections are not about one-off events, such as a meeting at NAB. It’s about spending the time to visit customers in their shops and understand their work. I don’t think there is a feature we release that isn’t driven or substantially shaped by customers. Quite often our features come out of engagements where a customer is switching to an Adobe workflow. Many vendors don’t address these kinds of issues, but we implemented a process to assess all feature requests and determine whether they align with the desired function of our products. If a feature is accepted, the customer is embedded with one of our Sprint Teams (an Agile software development process) and helps us define and test the feature, which rolls into the next release. The benefits are immeasurable. We understand the workflow in detail and can test the feature before we release it. This helps Adobe build deep relationships with our customers. At any given point, we have 10–15 features being developed in this manner.

Additionally, we gather feedback all the time through the Feature Request/Bug Submission process. We search for themes that emerge from these frameworks and address as many as possible in each release. To get a sense of what we have done over the years, check out this page.


Adobe’s video products have been placed in the difficult position of having to serve every type of user, from the indie guerrilla filmmaker to the enterprise multi-user company.

Does this make the product harder to develop, and is there a risk of over-complicating the product, putting it out of reach of the new user?

I’d argue there has never been a time when the tools of the independent filmmaker align [more] with the tools of Hollywood! The basics of any pipeline are almost identical: cloud-based pre-production, file-based capture (often with the same cameras being used), native or proxy pipeline, VFX, audio and grading happening simultaneously, and output to files for digital distribution. One cool benefit of Creative Cloud is that any individual can use the same tools as a Hollywood feature team for all aspects of production, from scriptwriting to VFX.

In terms of application design, if you compare Premiere Pro CS5 to Premiere Pro CC 2015, you’ll see a tremendous simplification in user interface design. You’ll notice the user experience has been optimized to cut time and keystrokes. The main reason we’re able to do this is that today’s users no longer start with film or tape-to-tape editing, nor do they want tools that emulate old workflows. They demand speed and efficiency of experience, which is where our focus remains.

Adobe Anywhere

What is the vision behind Adobe Anywhere, and will the technology ever reach smaller users?

One of the biggest areas of lost creative potential is trying to collaborate around a project file. How many of us have seen a file named like this:

Program.final.really_final.I_really_mean_it_this_is_final.this-better-be-the-last-change.IF_THIS_IS_NOT_THE_LAST_CHANGE_I_QUIT.proj

With pervasive network connectivity, collaboration can occur across a facility and around the world. File-based versioning means maintaining multiple copies of the project, even if the media is synced. Users need to work on the same project at the same time, have their version history preserved and have any conflicts due to simultaneous changes intelligently managed. This is our vision: Keep the tools the same. Don’t change the creative process. Build a system that is smart and allows you to work better with others and never lose your work.
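As a rough illustration of what that means in practice, here is a hypothetical Python sketch of a shared project whose version history is preserved and whose simultaneous edits are flagged as conflicts. Everything in it is an assumption made for the example, not Adobe Anywhere’s actual design.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ProjectVersion:
    version_id: int
    author: str
    parent: Optional[int]   # the version this edit was based on

@dataclass
class SharedProject:
    name: str
    versions: list = field(default_factory=list)
    next_id: int = 1

    def save(self, author, based_on):
        # Every save is kept; nothing is ever overwritten or lost.
        version = ProjectVersion(self.next_id, author, based_on)
        self.versions.append(version)
        self.next_id += 1
        return version

    def conflicts(self):
        # Two saves based on the same parent were made simultaneously
        # and need to be merged intelligently.
        by_parent = {}
        for v in self.versions:
            by_parent.setdefault(v.parent, []).append(v)
        return [group for group in by_parent.values() if len(group) > 1]

project = SharedProject("Program")
base = project.save("editor_a", based_on=None)        # version 1
project.save("editor_a", based_on=base.version_id)    # version 2
project.save("editor_b", based_on=base.version_id)    # version 3, conflicts with 2
print(project.conflicts())

The point of the sketch is simply that the system, not a file name, carries the history – which is exactly what retires the ‘I_really_mean_it_this_is_final’ naming scheme.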

We have had tremendous success since this product rolled out to the biggest media creators in the world – companies with a global footprint and dedicated teams to manage workflows. At NAB this year, we previewed a ‘Collaboration Only’ version. Up until now, Anywhere has been a compound system that offered collaboration features like versioning, and also a system for streaming remotely stored video, optimized in real-time for your specific network bandwidth. While this is extremely powerful and opens amazing workflow opportunities, it also requires more hardware than a small post facility would want to invest in. The collaboration-only preview shows how we can provide the collaboration component and rely on the existing house network to access local shared media. We are going to keep pushing to make collaboration something that every Creative Cloud subscriber uses daily.

Adobe has been a pioneer in tackling technical challenges these past few years. What are the new challenges ahead?

It can be summed up in an acronym: UHD (Ultra High Definition). We see three interesting challenges in UHD related to our work:

  • Larger raster images
  • Higher frame rates
  • Wider color gamut

Aside from changes in resolution, the home experience has not altered much since the advent of color broadcasting. We are on the cusp of delivering more color and light to the home at a higher frame rate, which is going to be incredible to watch. This does come with substantial challenges, the biggest of which will be the co-existence of SDR and HDR devices in the home. Today, most home devices have a luminance capability of around 100 nits (brightness units). Going forward, this range will get much broader: displays have already been demonstrated at around 1,500 nits, and some see this reaching 6,000 nits. With such a spectrum of possibility, content distributors won’t want to put out different streams for different devices. For the first time the content stream will handshake with the viewing device and a deferred ‘grade’ will be applied by the device itself to optimize the viewing experience. Luckily, users will be able to target a specific profile for SDR devices and a target nit profile that can be extrapolated to serve the variable spectrum of capabilities of the new devices.
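To illustrate the ‘deferred grade’ handshake, here is a deliberately simplified Python sketch of playback-time tone mapping: one HDR master, rolled off toward whatever peak luminance the viewing device reports. The curve and the numbers are assumptions for illustration only, not a broadcast standard or an Adobe algorithm.

def tone_map(nits, device_peak, knee=0.75):
    # Pass values the screen can show through unchanged, and compress
    # everything above the knee into the remaining headroom so that
    # highlights roll off instead of clipping.
    threshold = knee * device_peak
    if nits <= threshold:
        return nits
    excess = nits - threshold
    headroom = device_peak - threshold
    return threshold + headroom * excess / (excess + headroom)

# The same mastered 1,200-nit highlight, optimized per device at playback:
for device_peak in (100.0, 1500.0):    # SDR panel vs. HDR panel
    print(device_peak, round(tone_map(1200.0, device_peak), 1))

The 100-nit set gets a heavily compressed highlight that still reads as bright, while the 1,500-nit set shows it almost as mastered – one stream, two optimized viewing experiences.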

Mastering for this reality will initially be the domain of high-value content, but as the technology gets more established in the home, there will be a need for more content. Therefore, mastering needs to be simple. Users need to be able to understand what is happening at each phase of the creative process.


One could argue that one of the reasons 3D hasn’t really taken off is that we haven’t fully mastered its powers or avoided its technical dangers, and that it has added little artistic value.

What do you think is key to these new technologies avoiding the same pitfalls?

Technologies that are successful should work for all content, all the time, and not incur a massive increase in cost for the producer. This was the problem with stereoscopic television. Firstly, the effect was interesting, but for ‘day-to-day’ content (documentary, news magazine, sport, etc.) it did not deliver enough benefit in experience. Secondly, the active shutter systems forced users to wear glasses (which were expensive and prone to loss or breakage), and lenticular screens never worked well for a group situation. Lastly, the costs of production were much higher. Combined, it meant that stereoscopic was only well suited to event content. It was for this reason that Adobe took a partner approach for these workflows. We made sure that our architecture correctly supported stereoscopic workflows and allowed our partners, such as Vision 3, to provide professional tools for those with these types of needs.

UHD, however, is a different proposition. When a large raster is combined with a high frame rate, everything looks better. If you add extended dynamic range to the mix, it’s a truly amazing experience for everyone gathered around the screen. The whole production and distribution chain is organically migrating to the enabling technologies:

  • Cameras are extending gamut and frame rate
  • Software can deal with the increased processing required for HDR and HFR content
  • Home TVs are reaching price-points where it would appear that UHD sets will become the norm

This is an area where Adobe is investing heavily. Lars Borg, our Principal Color Scientist, is the chairman of the SMPTE drafting group on Dynamic Metadata for Color Transforms of HDR and WCG Images. Additionally, he is working with Dolby to ensure that the Dolby Vision workflow can deliver amazing results, utilizing an XMP sidecar to carry the deferred grading information.

Outside of this change in the home experience, the thing that gets me most excited is the increasing use of mobile technologies for content creation. As the computing capabilities of mobile devices increase, we end up with not only a powerful computer in our pocket, but also a powerful camera with network connectivity. This allows creativity to be truly mobile. At Adobe, we have pushed and explored this space on many fronts. Recently, we developed Adobe Premiere Clip for video. This app isn’t trying to squeeze Premiere Pro into a 4-by-6 inch screen. Instead, it’s an experience tailored to the screen size of the device and an interaction model using your finger as a pointer. You can do amazing work right in the palm of your hand, but if you want to take your work further, it syncs to Creative Cloud. This means when you sit down with Premiere Pro CC on the desktop, you can pick up the same project and continue with a richer set of tools for more editing power and control.

The mobile experience extends out to other aspects of production. As discussed earlier, color and light as a creative function is key for our users. We’ve taken advantage of the incredible camera in the iPhone to create Adobe Hue CC, which allows you to capture the mood of a moment and create a “Look” that is instantly available to you via Creative Cloud Libraries. Imagine watching a glorious sunset at the beach with your family and being able to capture the essence of that moment by creating a complex 3D color transform with the push of a button. It’s pretty special.
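Under the hood, a ‘Look’ of this kind amounts to a 3D lookup table (LUT): every input RGB value is mapped to a graded output RGB value. The Python sketch below shows the mechanics with a trilinear-interpolation apply; it is an illustration of the general technique, not Adobe Hue’s code, and the identity LUT at the end is only a sanity check.

import numpy as np

def apply_3d_lut(rgb, lut):
    # rgb: floats in [0, 1] with shape (..., 3); lut: shape (N, N, N, 3).
    n = lut.shape[0]
    pos = np.clip(rgb, 0.0, 1.0) * (n - 1)   # continuous LUT coordinates
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                              # fractional weights
    out = np.zeros(rgb.shape, dtype=float)
    # Blend the 8 surrounding LUT entries (trilinear interpolation).
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                r = hi[..., 0] if dr else lo[..., 0]
                g = hi[..., 1] if dg else lo[..., 1]
                b = hi[..., 2] if db else lo[..., 2]
                w = ((f[..., 0] if dr else 1 - f[..., 0]) *
                     (f[..., 1] if dg else 1 - f[..., 1]) *
                     (f[..., 2] if db else 1 - f[..., 2]))
                out += w[..., None] * lut[r, g, b]
    return out

# Identity LUT: colours should come back unchanged.
n = 17
axis = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)
print(apply_3d_lut(np.array([[0.2, 0.5, 0.9]]), identity))   # ~[[0.2, 0.5, 0.9]]

A real ‘Look’ is simply a non-identity table (warmer shadows, lifted mids, whatever the captured moment suggested) applied by the same interpolation.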

Adobe Hue

