
Photoshop’s AI neural filters can tweak age and expression with a few clicks

Adobe wants to make a big splash with its new machine learning tools

One of the neural filter options lets you adjust someone’s age with a simple slider.
Image: Adobe

Artificial intelligence is changing the world of image editing and manipulation, and Adobe doesn’t want to be left behind. Today, the company is releasing Photoshop version 22.0, an update that comes with a host of AI-powered features, some new, some already shared with the public. These include a sky replacement tool, improved AI edge selection, and — the star of the show — a suite of image-editing tools that Adobe calls “neural filters.”

These filters include a number of simple overlays and effects but also tools that allow for deeper edits, particularly to portraits. With neural filters, Photoshop can adjust a subject’s age and facial expression, amplifying or reducing feelings like “joy,” “surprise,” or “anger” with simple sliders. You can remove someone’s glasses or smooth out their spots. One of the weirder filters even lets you transfer makeup from one person to another. And it’s all done in just a few clicks, with the output easily tweaked or reversed entirely.

“We can now say that Photoshop is the world’s most advanced AI application”

“This is where I feel we can now say that Photoshop is the world’s most advanced AI application,” Maria Yap, Adobe’s vice president of digital imaging, told The Verge. “We’re creating things in images that weren’t there before.”

To achieve these effects, Adobe is harnessing the power of generative adversarial networks — or GANs — a type of machine learning technique that’s proved particularly adept at generating visual imagery. Some of the processing is done locally and some in the cloud, depending on the computational demands of each individual tool, but each filter takes just seconds to apply. (The demo we saw was done on an old MacBook Pro and was more than fast enough.)
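Adobe hasn’t detailed how its sliders work under the hood, but a common way to build attribute sliders on top of a GAN is latent-space arithmetic: encode the portrait as a latent vector, then nudge that vector along a learned direction for the attribute in question. Here is a minimal, runnable toy of that idea; the random “generator” matrix and “age direction” below are invented stand-ins, not Adobe’s model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a random linear "generator" mapping a 512-d latent vector
# to an 8x8 "image", and a unit vector standing in for a learned attribute
# direction (e.g. age). Nothing here is Adobe's actual model.
G = rng.standard_normal((64, 512))
age_direction = rng.standard_normal(512)
age_direction /= np.linalg.norm(age_direction)

def generate(w):
    """Decode a latent vector into an 'image'."""
    return (G @ w).reshape(8, 8)

w = rng.standard_normal(512)  # latent code for one portrait

# The slider simply scales how far we move along the attribute direction;
# dialing it back to zero reverses the edit, as the article describes.
for slider in (-2.0, 0.0, 2.0):
    img = generate(w + slider * age_direction)
```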

Many of these filters are familiar to those who follow AI image editing. They’re the sort of tools that have been turning up in papers and demos for years. But it’s always significant when techniques like these go from bleeding-edge experiments, shared on Twitter among those in the know, to headline features in consumer juggernauts like Photoshop.

As always with these sorts of features, the proof will be in the editing, and the actual utility of neural filters will depend on how Photoshop’s many users react to them. But in a virtual demo The Verge saw, the new tools delivered fast, good-quality results (though we didn’t see the facial expression adjustment tool). These AI-powered edits weren’t flawless, and most professional retouchers would want to step in and make some adjustments of their own afterwards, but they seemed like they would speed up many editing tasks.

Neural filters can be used to colorize old photos — a popular application of machine learning.
Image: Adobe

Trying to beat AI bias

AI tools like this work by learning from past examples. So, to create the neural filter that smooths away skin blemishes, for example, Adobe collected thousands of before-and-after shots of edits made by professional photographers and fed this data into its algorithms. The GAN operates like a paired student and teacher: one network (the generator) tries to copy the professional edits, while the other (the discriminator) tries to distinguish that output from the training data. Eventually, when the discriminator can no longer reliably tell the difference between the two, the training process is complete.
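In code, that student/teacher loop looks roughly like the sketch below: a minimal adversarial training step on paired before/after photos, in the style of image-to-image GANs. The tiny networks and the `dataloader` of paired tensors are placeholders for illustration, not Adobe’s pipeline:

```python
import torch
import torch.nn as nn

# Placeholder networks; the real models would be far larger.
generator = nn.Sequential(                       # the "student"
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1))
discriminator = nn.Sequential(                   # the "teacher"
    nn.Conv2d(3, 16, 3, stride=2), nn.ReLU(),
    nn.Flatten(), nn.LazyLinear(1))

bce = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# `dataloader` is assumed to yield (before, after) batches of photo tensors:
# raw portraits paired with a professional retoucher's edits.
for before, after in dataloader:
    real = torch.ones(after.size(0), 1)
    fake_lbl = torch.zeros(after.size(0), 1)

    # Teacher step: learn to tell real retouches from the student's attempts.
    fake = generator(before)
    d_loss = (bce(discriminator(after), real) +
              bce(discriminator(fake.detach()), fake_lbl))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Student step: produce edits the teacher can no longer distinguish
    # from the professionals' work.
    g_loss = bce(discriminator(generator(before)), real)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```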

“Basically, we’re training the GAN to make the same corrections a professional retoucher would do,” Alexandru Costin, Adobe’s vice president of engineering for Creative Cloud, told The Verge.

It sounds straightforward, but there are lots of ways this training can go wrong. A big one is biased data. The algorithms only know the world you show them, so if you only show them images of, say, white faces, they won’t be able to make edits for anyone whose complexion doesn’t fit within this narrow range. This sort of bias is why facial recognition systems often perform worse on women and people of color. These faces just aren’t in the training data.

Costin says Adobe is acutely aware of this problem. If it trained its algorithms on too many white faces, he says, its neural filters might end up pushing AI-edited portraits toward whiter complexions (a problem we’ve seen in the past with other ML applications).

“One of the biggest challenges we have is preserving the skin tone.”

“One of the biggest challenges we have is preserving the skin tone,” says Costin. “This is a very sensitive area.” To help root out this bias, Adobe has set up review teams and an AI ethics committee that test the algorithms every time a major update is made. “We do a very thorough review of every ML feature, to look at this criteria and try and raise the bar.”

Users will be able to send “inappropriate” results to Adobe to improve the filters.

But one key advantage Adobe has over other teams building AI image-editing tools is its catalog of stock photography — a huge array of images that span different ages, races, and genders. This, says Costin, made it easy for Adobe’s researchers to balance their datasets to try to minimize bias. “We complemented our training data with Adobe stock photos,” says Costin, “and that allowed us to have [as] good as possible [a] distributed training set.”
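Adobe hasn’t published the specifics of its curation, but the goal Costin describes can be pictured as a simple rebalancing step: given photos labeled with a demographic attribute, either top up underrepresented groups from a stock library or downsample the rest until each group is equally represented. A toy sketch of the downsampling version, with invented field names:

```python
import random
from collections import defaultdict

def balance_by_attribute(samples, attribute, seed=0):
    """Downsample so every value of `attribute` (e.g. a skin-tone group)
    appears equally often in the training set."""
    groups = defaultdict(list)
    for sample in samples:
        groups[sample[attribute]].append(sample)
    target = min(len(members) for members in groups.values())
    rng = random.Random(seed)
    balanced = []
    for members in groups.values():
        balanced.extend(rng.sample(members, target))
    rng.shuffle(balanced)
    return balanced

# Invented example records; in Adobe's case, underrepresented groups could
# instead be topped up with matching stock photos rather than downsampled.
photos = [{"path": "a.jpg", "skin_tone": "III"},
          {"path": "b.jpg", "skin_tone": "VI"},
          {"path": "c.jpg", "skin_tone": "III"}]
training_set = balance_by_attribute(photos, "skin_tone")
```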

Of course, all this is no guarantee that biased results won’t appear somewhere, especially when the neural filters get out of beta testing and into the hands of the general public. For that reason, each time a filter is applied, Photoshop will ask users whether they’re happy with the results and, if they’re not, give them the option of reporting “inappropriate” content. If users choose, they can also send their before and after images anonymously to Adobe for further study. In that way, the company hopes not only to remove bias but also to expand its training data even further, pushing its neural filters to greater levels of fidelity.

Selecting a new light source is another application of neural filters.
Image: Adobe

Machine learning at speed

This sort of speedy update based on real-world usage is common in the fast-moving world of AI research. Often, when a new machine learning technique is published (usually on arXiv, an open-access repository of scientific papers that haven’t yet been published in a journal), other researchers will read it, adopt it, and adapt it within days, sharing results and tips with one another on social media.

Some AI-focused competitors to Photoshop distinguish themselves by embracing this sort of culture. A program like Runway ML, for example, not only allows users to train machine learning filters using their own data (something that Photoshop does not), but also operates a user-generated “marketplace” that makes it easy for people to share and experiment with the latest tools. If a designer or illustrator sees something cool floating around on Twitter, they want to start playing with it immediately rather than wait for it to trickle into Photoshop.

Adobe wants to bring the fast pace of AI research to Photoshop

As a widely used product with customers who value stability, Adobe can’t truly compete with this sort of speed, but with neural filters, the company is dipping a toe into these fast-moving waters. While two of the filters are presented as finished features, six are labeled as “beta” tools, and eight more are only listed as names, with users having to request access. You can see a full list of the different filters and their respective tiers below:

Featured Neural Filters: Skin Smoothing, Style Transfer
Beta Neural Filters: Smart Portrait, Makeup Transfer, Depth-Aware Haze, Colorize, Super Zoom, JPEG Artifacts Removal
Future Neural Filters: Photo Restoration, Dust and Scratches, Noise Reduction, Face Cleanup, Photo to Sketch, Sketch to Portrait, Pencil Artwork, Face to Caricature

Yap says this sort of approach is new to Photoshop but will hopefully let Adobe temper users’ expectations about AI tools, giving the company license to update them more quickly. “We’ve built this framework that allows us to bring models [to users] faster, from research to Photoshop,” says Yap. “Traditionally when we do features, like sky replacement, they’re really deeply integrated into the product and so take a longer time to mature.” With neural filters, that update cycle will ideally be much faster.

“It’s this pace that we’re trying to bring into Photoshop,” says Costin. “And it will come at the cost of the feature not being perfect when we launch, but we’re counting on our community of users to tell us how good it is [...] and then we will take in that data and refine it and improve it.”

In other words: the flywheel of AI progress, wherein more users create more data that creates better tools, is coming to Photoshop. Tweaking someone’s age is just the start.