A deepfake camera can undress people almost in real time – to send a message about AI

[The camera in an art project called Nuca uses generative artificial intelligence to create photos in which people who are dressed normally appear to be naked. Rather than being prurient, the goal is to make us think about the potential of AI to “destroy reality.” This story about the project is from Fast Company, where the original includes four more images and a 2:38 minute video (which is also available on Vimeo). –Matthew]

A new camera can undress people almost in real time—to send a message about AI

Nuca, a new deepfake camera, is an art project that shows how artificial intelligence can undermine reality and privacy, taking photos of subjects and completely “nudifying” them in just 10 seconds.

By Jesus Diaz
April 9, 2024

German artist Mathias Vef and designer Benedikt Groß thought it would be a great idea to create Nuca, a consumer compact deepfake camera capable of generating nonconsensual naked photos of its subjects. They are right—not because of pervy reasons but because it serves a higher purpose: to highlight once again that we are so very screwed. Generative artificial intelligence is destroying reality. And now it can do it almost in real time, right in your very own hands, right in front of your very own eyes, at the touch of a button.

In an email interview, Vef tells me that he and Groß studied speculative design at the Royal College of Art in London, where they explored the future of technology and its implications. “When we met last year in summer for a reunion, we had quite an extensive chat about generative AI, which we both worked with for a while. And then the topic of deepfakes came up,” he says.

That’s when they thought of the worst possible consequences of the technology and came up with the idea to create this “nudifying camera.” Although at university they usually worked with props rather than actual technology in their projects, something as advanced as Nuca was now possible. “As we learned to think very critically about [this] technology, it seemed very tempting to realize it,” Vef says.

Publicly available tools

Nuca uses artificial intelligence to process a captured image in the cloud, reconstructing a nude from the real-world subject. Its design follows the format of a standard compact camera and vaguely reminds me of an old Nikon reflex. Its 15-ounce body is 3D-designed and printed. Equipped with a 37-millimeter lens, the deepfake camera has a mobile phone for guts, which captures the image and acts as the viewfinder.

The software displays the input image as well as other data, like the subject’s pose, in real time. When you press the shutter button, the deepfake camera sends the photo to the cloud for processing. There, the image gets into a workflow that studies the pose, body landmarks, and face, analyzing 45 identifiers such as gender, age, ethnicity, expression, and body shape.

The information is sent to a Stable Diffusion engine that runs one of the many public diffusion models specialized in creating nudes that are available on the popular site Civitai. The model interprets this information and generates the base nude. The resulting image then gets processed with a deepfake tool that seamlessly adds the face of the real-world subject onto that naked body.

“The processing time is quite short, 10 seconds,” Vef tells me. “One of the challenges was to reduce the time for the generation of the images. We used mostly tools that are publicly available, but to combine them in a very efficient way and put it in a working device is something that hasn’t been done before.”


For now, Nuca is a working prototype created as an artistic experiment to demonstrate how we are progressively getting closer to an apocalyptic fate of unreality. You will be able to experience it in real(?) life only at a nüüd.berlin gallery exhibition this summer. But I wouldn’t be surprised if, at some point in the very near future, something with similar capabilities were actually made easily available to consumers, either as a simple app for your phone or as a dedicated gadget—perhaps a real deepfake camera, some sort of Mission Impossible X-ray glasses, or hacked cyber-eye contact lenses.

Nuca was made to provoke us, but the project has already been far surpassed by real web applications that are used for all things bad: undressing people to humiliate or bully them, manipulating the masses with fake everything, scamming hundreds of thousands of people with deepfake video chat or voice, or just literally bringing about the end of reality as we know it.

So far, many of these tools are out of reach for all but the most technologically gifted users, but it’s likely that one-touch reality fabrication far better than Nuca will be available sooner than we think. In a way, even big companies like Apple are already doing this, subtly training us for what’s coming: Products like the Vision Pro or AirPods Pro actively process the analog reality that we perceive naturally into new synthetic realities. It’s only a matter of time before we fall into a black hole that we are not going to be able to escape.

Will we learn any lessons?

Which takes us right back to AI itself.

Still, Vef and Groß believe that their artistic provocation may push us into thinking about the perils. “We both think the debate is only about to begin because the possibilities seem endless. We are only at the beginning of this journey,” Vef points out. “It is very important to us to be both critical and explorative, as we need to know what’s coming to be able to discuss possibilities and their implications. Our camera is a way to do that.”

I’m betting all the Bitcoins I don’t have on the idea that no lessons will be learned from this experiment and that we will be able to buy something like Nuca on AliExpress—or worse—soon.
