Rosebud AI will let you place your face, or any face, onto any body for diverse stock images

[How strange to know that the people in images we see in the media may never have existed; now Motherboard reports that a company will make its collection of AI-generated stock models “customizable and personalizable. Users will be able to algorithmically place any face onto any body in their collection,” including their own uploaded pictures. The story addresses some of the ethical concerns; it also includes a 41-second video. See Rosebud AI’s website for more details. –Matthew]

This Company Promises to Place Any Face Onto Any Body, Using an Algorithm

Rosebud AI wants to add diversity to stock images—and is doing it by swapping anyone’s face onto stock models’ bodies.

By Samantha Cole
November 21, 2019

In September, we saw the launch of Generated Photos, a collection of 100,000 images of AI-generated faces for use in stock images. Another company, Rosebud AI, is now taking that concept a step further, with faces that aren’t just part of a static stock database, but customizable and personalizable. Users will be able to algorithmically place any face onto any body in their collection.

Maybe you’re thinking, another AI face generator? Yes, another AI face generator. But this time, you’ll be able to upload any face into a system that places it onto another person’s stock-image body.
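
To make the general idea concrete, here is a minimal sketch of a naive “place face A onto body B” swap using OpenCV’s bundled Haar-cascade face detector and Poisson blending. This is not Rosebud AI’s method (its system is generative and its pipeline has not been published); the function names and parameters below are illustrative assumptions only.

    import cv2
    import numpy as np

    def naive_face_swap(face_path, body_path, out_path):
        """Toy face swap: paste the largest face from face_path onto the
        largest face region found in body_path. Illustration only; a real
        generative tool would synthesize the result rather than blend pixels."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        def largest_face(img):
            # Return (x, y, w, h) of the biggest detected face.
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces) == 0:
                raise ValueError("no face detected")
            return max(faces, key=lambda r: r[2] * r[3])

        face_img = cv2.imread(face_path)
        body_img = cv2.imread(body_path)

        sx, sy, sw, sh = largest_face(face_img)
        tx, ty, tw, th = largest_face(body_img)

        # Resize the source face to the target face region and Poisson-blend it
        # in place (assumes the target region sits fully inside the body image).
        src_face = cv2.resize(face_img[sy:sy + sh, sx:sx + sw], (tw, th))
        mask = 255 * np.ones(src_face.shape, src_face.dtype)
        center = (tx + tw // 2, ty + th // 2)
        result = cv2.seamlessClone(src_face, body_img, mask, center, cv2.NORMAL_CLONE)

        cv2.imwrite(out_path, result)

A pixel-level paste like this is what generative approaches move beyond: instead of blending one photo into another, a model synthesizes a new image in which the chosen face is rendered consistently with the body, lighting, and background.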

Rosebud AI, a San Francisco-based synthetic media company, launched Generative.Photos this week with a Product Hunt page and demo site. The demo only uses its pre-loaded models for now, but includes placeholders for uploading your own photos and a signup for a user waitlist.

“Generative.photos is a first step in our synthetic stock photo and API offering, which will eventually allow users to edit and fully synthesize visual content with an intuitive interface,” Lisha Li, the founder of Rosebud AI, wrote on Product Hunt. “We focused on bringing forth a way to diversify stock photo content since it was a need we heard voiced by stock photo users. All the faces in our 25k photo collection are not of real people.”

If this diversity line is sounding familiar, that’s because it’s also what Generated Photos claimed it was setting out to fix. Li also says the company wants to give “consumers the power to choose an advertising model that they can relate to,” with more diverse models. She wrote that what makes Generative.Photos different from other attempts is the context: It’s giving a fictitious, generated face a stock body and background, and adjusting it to whatever skin color or gender an advertiser or marketer wants.

Li told Motherboard that Rosebud AI’s tools are still in closed beta. But releasing something into the world before establishing public terms of use—or considering any kind of guidelines or prevention measures for the tool’s potential for malicious use—is unfortunately not uncommon. We see it again and again with AI programs hustled out into the wild before any ethical guidelines are established, as with deepfakes, DeepNude, and Generated Photos.

In addition, Generated Photos and Rosebud AI are allowing people to create their own realities, letting companies demonstrate artificial diversity where there actually isn’t any. Rather than real diversity, we get algorithmically generated, customizable stock images.

“It’s pretty harmful and a major oversight to launch any kind of project where users can add content to a repository and not check and verify if that content is ‘harmful’ or not,” machine learning designer Caroline Sinders, a fellow with Mozilla Foundation and the Harvard Kennedy School who studies biases in AI systems, told Motherboard. “It’s even more of an oversight and downright neglectful not to have policies that define ‘harm’ in terms of content and actions. In 2019, this is a major issue for a company to not have these things.”


Update: Following publication, Li told Motherboard that Rosebud AI’s self-serve tool is not open yet, as it is still in closed beta, and will require users to sign terms of service that reflect a code of ethics before using the beta version of the tools.
