Bow to your Sensei! Adobe adds machine learning and design tool to Creative Cloud

Everything's getting cloudier

Adobe MAX Adobe has announced a series of updates to its Creative Cloud offering at its MAX event under way in San Diego.

What Adobe calls Creative Cloud has always in fact been a hybrid product, with large desktop applications like Photoshop, Illustrator, InDesign and Premiere Pro forming the main part of its value. Now the company is adding more cloud-first features, focusing on three areas – collaboration, storage, and machine learning.

At Microsoft's Ignite event in September, Adobe announced that Microsoft Azure would be its preferred cloud, but declined to elaborate on the details or to say whether it would move applications away from Amazon Web Services.

On Wednesday Adobe announced Sensei, which it calls "a framework and set of intelligent services built into the Adobe Cloud Platform". Despite the announced Microsoft partnership, Adobe says that Sensei does not run on Azure. However, with or without Microsoft, Adobe is promising "cloud-capable mobility" and "the ability to work interchangeably across desktop and mobile," according to Creative Cloud VP and GM Mala Sharma.

In March Adobe announced Experience Design (XD), a design and prototyping tool. An updated beta, due in the first half of 2017, will feature real-time collaboration, so that users can work simultaneously on shared documents and see one another's changes. The product is also getting support for Layers and Symbols, an obvious omission from the first beta. Adobe is also set to release a Windows 10 version, apparently a UWP (Universal Windows Platform) app, so no use to users of earlier versions of Windows.

On the storage side, Sharma spoke about going beyond file synchronisation between devices to cloud-first storage, so "you can work wherever you are in high resolution with complete backup".

The most intriguing area, though, is machine learning. While Creative Cloud has long had features driven by machine learning, such as content-aware fill in Photoshop, Sensei will enable new features, such as the previously announced visual search in the Adobe Stock image library.

Sharma also spoke about analysing usage data to drive new features, though without specifics. "Because we can see what assets are in the cloud, and because the collaboration is cloud-enabled, and we understand the connections that our customers have, and we understand what mobile devices they have, we can enable machine learning at a new level," she said. It looks as if designers who, for whatever reason, do not want Adobe to know everything about what they are working on will have to take care.

Project Felix is a new 3D design application within Adobe's Creative Cloud, and will be available in beta before the end of 2016. "We are enabling graphic designers to create photo-realistic images and scenes by combining 2D and 3D assets," said Sharma. The idea is that you construct scenes in Project Felix, including materials, lighting and camera angles, and then render the image ready for export to Photoshop for finishing.

Other announcements at MAX include Android versions of all Adobe's mobile apps, including Photoshop Sketch, Comp and Fix. Adobe Stock is being updated to enable a contributor portal, so that anyone can upload images for licensing. Creative Cloud 2017 will also include updates to all the main applications, with features such as a new 3D rendering engine in After Effects, better VR (Virtual Reality) support in Premiere Pro, improved puppet animation in Character Animator, and a revamped user interface and coding engine in Dreamweaver.®
