
Generating 3D Materials With DALL-E & Substance 3D

Senior Material Artist Stan Brown showed how the AI-powered toolkit can be used with Substance 3D tools.

Last week, we shared a story on materials created by Senior Material Artist Stan Brown with Midjourney, an AI-powered tool that can turn any text description into an image. The artist generated a set of images from various text prompts and then turned them into materials via Substance 3D Sampler.

This time, the artist decided to use DALL-E 2 and photos he had taken with his phone while traveling. "You can usually spot a Material Artist as they are the one taking close-up photos of the ground or walls in the street," wrote the artist. "I've gathered a lot of photos over the last few years, and with my recent experience with Substance Sampler and DALL-E 2, it was a great time to put some to use."
Brown said he was very impressed with the results from Sampler and amazed by "DALL-E 2's ability to fill in the gaps". The artist shared a post showing the source photos, the offset versions, and the final outputs. Gigapixel AI was used in this experiment to upscale the DALL-E 2 results, which are currently capped at 1K resolution.
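For readers curious how the offset-and-fill trick can work in practice, here is a minimal sketch of the general idea, not Brown's actual pipeline: wrap a square crop of a photo by half its size so the tiling seams meet in the middle, then ask DALL-E 2's image-edit endpoint to repaint a masked band over those seams. It assumes the legacy openai Python client and Pillow; the file names, prompt, and seam width are placeholders.

```python
# Hypothetical sketch: offset a photo so its tiling seams cross the centre,
# then have DALL-E 2 inpaint the seam region. Not the artist's actual setup.
import openai                      # legacy (pre-1.0) client; reads OPENAI_API_KEY from the environment
from PIL import Image, ImageChops, ImageDraw

SIZE = 1024   # DALL-E 2 edits are capped at 1024x1024
SEAM = 96     # placeholder width of the band DALL-E is asked to repaint

# Load a square crop of the source photo and wrap it by half its size,
# so the original borders (the future tiling seams) meet in the middle.
photo = Image.open("ground_photo.png").convert("RGBA").resize((SIZE, SIZE))
offset = ImageChops.offset(photo, SIZE // 2, SIZE // 2)

# Build a mask: opaque everywhere except a transparent cross over the seams.
# For the edits endpoint, fully transparent pixels mark the area to repaint.
mask = offset.copy()
draw = ImageDraw.Draw(mask)
draw.rectangle([0, SIZE // 2 - SEAM // 2, SIZE, SIZE // 2 + SEAM // 2], fill=(0, 0, 0, 0))
draw.rectangle([SIZE // 2 - SEAM // 2, 0, SIZE // 2 + SEAM // 2, SIZE], fill=(0, 0, 0, 0))

offset.save("offset.png")
mask.save("mask.png")

# Ask DALL-E 2 to fill in the gaps along the seam cross.
result = openai.Image.create_edit(
    image=open("offset.png", "rb"),
    mask=open("mask.png", "rb"),
    prompt="seamless top-down photo of cracked dry ground, texture",
    n=1,
    size=f"{SIZE}x{SIZE}",
)
print(result["data"][0]["url"])  # upscaling (e.g. Gigapixel AI) and Sampler work happen outside this script
```

The resulting 1K image would then go through an upscaler and into Substance 3D Sampler for material extraction, as described above.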
"Most of the results have minimal edits in Designer, though for the Wood Wall I created a mask for the logs and flood-filled some rounded gradients to blend with the normal and height to get the proper look, added Brown. "These mostly took around 45 minutes to an hour each."

You can find the original post here or check out the Midjourney experiment here. Don't forget to join our new Reddit page and our new Telegram channel, and follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.
