Camera Strips People Naked to Spark Debate on AI and Privacy 

German artist Mathias Vef and designer Benedikt Groß created a new ‘deepfake camera’ called Nuca, which uses AI to completely undress photos of clothed people within seconds. 

Nuca raises questions about privacy, but its creators argue they are doing it for the greater good. They say the camera is an art project that aims to provoke debate on how generative AI can be used to erode reality and, of course, privacy itself.

How does it work?

Nuca doesn’t need consent to undress you. Once the camera captures an image of a clothed subject, Vef says, it takes just 10 seconds to process the photo and generate a naked version of that person.

“One of the challenges was to reduce the time for the generation of the images,” Vef said, as reported by Fast Company.

“We used mostly tools that are publicly available, but to combine them in a very efficient way and put it in a working device is something that hasn’t been done before.”

Nuca camera. Image credits: Nuca

In its current form, the Nuca is a prototype compact camera made from 3D-designed and 3D-printed parts. The device weighs 430g and is equipped with a 37mm wide-angle lens and an ergonomic grip.

The ‘deepfake camera’ features a smartphone that takes the image and works as the viewfinder. It displays the input picture and other data, like the person’s pose, in real time, the project’s website says.

When you press the trigger button, the camera captures the image of the subject and sends it to a Stable Diffusion engine where it is processed for things such as face landmarks and body shape.

The AI model analyzes the photo for 45 identifiers, like gender, age, ethnicity, hair, and even glasses. Using this data, the AI then adds the face of the real person to an AI-generated, fully naked body.
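The pipeline described above — capture, landmark and attribute extraction, then conditioned generation — can be sketched roughly as follows. This is a hypothetical illustration only, not code from the Nuca project: every function, field, and value here is invented, the placeholder bodies stand in for real computer-vision models, and the actual system's Stable Diffusion conditioning would be far more involved.

```python
from dataclasses import dataclass

# A handful of the ~45 identifiers the article mentions (illustrative subset).
ATTRIBUTE_KEYS = ["gender", "age", "ethnicity", "hair", "glasses"]

@dataclass
class CapturedImage:
    pixels: bytes  # raw image data from the smartphone viewfinder

def extract_landmarks(image: CapturedImage) -> dict:
    # Placeholder: a real system would run a face/pose estimator here.
    return {"face": [(0.5, 0.3)], "pose": "standing"}

def classify_attributes(image: CapturedImage) -> dict:
    # Placeholder: a real system would predict ~45 identifiers from the photo.
    return {key: "unknown" for key in ATTRIBUTE_KEYS}

def generate_composite(landmarks: dict, attributes: dict) -> str:
    # Placeholder for the generation step: condition output on the extracted
    # attributes, then composite the subject's real face onto the result.
    prompt = ", ".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return f"diffusion_output(conditioned_on=[{prompt}])"

def pipeline(image: CapturedImage) -> str:
    landmarks = extract_landmarks(image)
    attributes = classify_attributes(image)
    return generate_composite(landmarks, attributes)
```

The point of the sketch is the data flow, not the models: each stage consumes the previous stage's output, which is why reducing end-to-end latency (the 10-second figure Vef cites) means optimizing the whole chain, not any single step.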

Nuca camera generates reaction

At the moment, Nuca remains only a working prototype, meaning it is not yet available for public use. However, the artistic experiment by Mathias Vef and Benedikt Groß is already generating debate.

“Nuca has already sparked diverse reactions, ranging from fears of AI’s bias towards body cult and beauty mania to enthusiasm for its celebration of natural human beauty and form,” the website reads.

“This project prompts a crucial discussion on AI’s potential, emphasizing consent, algorithmic fairness, and the societal impacts of AI-generated imagery.”

Citing some users, the project says the world must confront that it is facing “a new type of pornography in which humans are only a memory that’s copied and remixed to instantly generate whatever sexual image a user can describe with words”.

The German creators are hoping that their cheeky sales pitch will get people to start thinking and talking about the dangers of using artificial intelligence to reproduce deepfake nudes.

“We both think the debate is only about to begin because the possibilities seem endless. We are only at the beginning of this journey,” Vef told Fast Company.

“It is very important to us to be both critical and explorative, as we need to know what’s coming to be able to discuss possibilities and their implications. Our camera is a way to do that.”

Vef and Groß will reportedly showcase the Nuca prototype camera at an exhibition in the German capital, Berlin, later this year.

Generative AI has been used by bad actors to generate non-consensual pornographic material targeting celebrities like Taylor Swift and Emma Watson, and even underage schoolchildren. AI has also been used to create deepfakes to commit fraud.

Image credits: Shutterstock, CC images, Midjourney, Unsplash.