Adobe has integrated Firefly, its family of generative AI models, into Photoshop, starting a major initiative to bring AI to existing creative workflows across Creative Cloud, Document Cloud, Experience Cloud, and Adobe Express.
Photoshop now supports Generative Fill, a Firefly-powered feature that lets users extend images and add or remove objects using simple text prompts.
“By integrating Firefly directly into workflows as a creative co-pilot, Adobe is accelerating ideation, exploration and production for all of our customers,” said Ashley Still, senior vice president, Digital Media at Adobe. “Generative Fill combines the speed and ease of generative AI with the power and precision of Photoshop, empowering customers to bring their visions to life at the speed of their imaginations.”
Generative Fill automatically matches the perspective, lighting, and style of the source image, saving significant editing time. According to Adobe, it "expands creative expression and productivity and enhances creative confidence of creators with the use of natural language and concepts to generate digital content in seconds."
It lets users edit images non-destructively by generating content on new generative layers, and it supports Content Credentials, helping people know whether a piece of content was created by a human or by AI.
Adobe also updated Photoshop itself, adding new Adjustment Presets, a Contextual Task Bar, a Remove Tool, and enhanced gradients.
Generative Fill is available today in the Photoshop desktop beta app, with general availability planned for the second half of 2023. It is also available now in the Firefly beta.