What is the problem?
DeepNude apps are dubious "entertainment"
The development of artificial intelligence (AI) has led to the emergence of applications that can use people's personal data without their knowledge for various purposes, including the creation of offensive or pornographic images. App users can then post such images as supposedly real and use them to harass or blackmail the person whose photo was taken.
One example is the whole series of DeepNude applications: programs that use a neural network to edit photos of women, removing clothing and drawing naked body parts in its place.
Recently, a similar Ukrainian development came to light: the online service Makenude, in which the AI itself "draws" the parts of the body covered by clothing, creating "unreal but realistic naked bodies of people in photos."
What is the solution?
Impossible to prevent but possible to punish
The development outraged many social media users and, at the same time, raised questions about what can be done to counter such offensive and false content.
For clarification, Rubryka turned to Nataliia Semchuk, an attorney, candidate of legal sciences, and a graduate of Warsaw University's postdoc program on the law of new technologies.
Currently, the European Union, which Ukraine aspires to join, is developing legislation to regulate the ethical issues of using artificial intelligence technologies. Still, even now, Ukrainian legislation contains norms that allow those who create a DeepNude from your photo to be brought to justice.
How does it work?
"Whoever hinders us will help us"
Nataliia Semchuk divides this issue into several legal components.
- Legal status of photography in general.
According to the lawyer, this area of jurisprudence has changed little since the 19th century. The norms differ somewhat from country to country but are similar throughout Europe.
It concerns the right to photograph a person on the street without their consent (so-called reportage photography) and the right to distribute photographs taken and published in this way. The person in the photo has the right to contact whoever distributed their image and demand that its distribution be stopped. If it is, for example, a poster, it will be taken down at the expense of the person who objects to the use of their photo. This is roughly the general scheme, explains Semchuk.
- Norms related to the protection of personal data, the so-called GDPR (General Data Protection Regulation)
GDPR (General Data Protection Regulation) is an EU regulation that establishes requirements for protecting the personal data of natural persons. In Ukraine, it is currently applied only in part, according to the expert.
These norms provide for two types of protection for a photo:
- simply as an object;
- against unauthorized processing by software for purposes such as profiling.
In addition, because of the growing role of fakes, the EU passed the Digital Services Act (DSA), which obliges social networks to remove questionable content and contains other restrictions on false information. The EU Code of Practice on Disinformation has also been updated, and a separate draft regulation on artificial intelligence (the AI Act) is being prepared.
The GDPR protects a photo precisely as an object from which the person in the image can be identified, not only by another person but also by software that searches for it in a particular database, says Semchuk. "There were several unfortunate cases when a girl suspected of committing a crime was detained with the help of such software, but it turned out there had been a software failure and it was not her."
Initially, these programs were not regulated in any way, but as they spread, states began to restrict them and introduce legal regulation.
In Ukraine, too, such regulation is gradually being introduced. For now, there are very few restrictions on such programs, the expert continues. "In principle, anyone can buy such a program, for example, for personal use. But this is purchased software, and the buyer registers in the database, so all these requests are, to a certain extent, visible to the software's author. The information there is usually taken from open sources. That is, such a system, as purchased software, is legal in Ukraine."
Semchuk cited a positive example from the US, where a photo from a dating site helped identify a criminal.
A girl and a young man went to a cafe for a date and spent the night together, and then the man disappeared with the girl's purse. But a surveillance camera recorded him, and the girl still had his photo from the dating site.
The victim went to a lawyer who had legal software for identifying the person in a photo. They ran the photo through it and obtained the man's name, surname, and legal address. That allowed them to sue him over the theft, win the case, and get the money back. In other words, the software helped find a robber about whom there was too little data.
- Pan-European norms on the regulation of artificial intelligence technologies, which are still in the process of development.
There are certain international acts on combating disinformation and the creation of deepfakes.
You can be sued for spreading deepfakes
Deepfake (from "deep learning" + "fake") is a method of synthesizing human images based on artificial intelligence, used to combine and superimpose some images and videos onto other photos or videos.
"As it was once said in a movie, whoever hinders us will help us," says Semchuk. There is software that allows you to make deepfakes, but there is also software that allows you to identify deepfakes.
Accordingly, the lawyer continues, for the distribution of a deepfake, a person can sue whoever posted it and is responsible for it, and receive compensation for the insult to their honor, dignity, and business reputation.
At the same time, Semchuk returns to her first point: regardless of whether it is a deepfake or an offensive photo or not, the person whose image is used has the right to demand that the photo no longer be publicly displayed: "Any woman who sees a photo like this, in which it is claimed to be her and not just some random brunette, for example, can say that on this basis, according to the law, she demands that it be stopped, and it must be stopped."
- The use of artificial intelligence itself has not yet been settled in law, but in many countries, Ukraine in particular, the distribution of pornography and the distribution of false information are regulated.
If an attacker creates a DeepNude image and distributes it publicly, that image will bear certain signs of illegality, not because it was created by artificial intelligence, but because it is false or offensive, Semchuk explains.
What to do if this has already happened?
Complain, sue, and contact the police
Unfortunately, Ukraine can limit the distribution of such content only on its territory.
To do this, it is worth:
- contacting the administrators of the network or site where the content is posted and demanding its removal,
- and, if they refuse, seeking a court order restricting its display in Ukraine.
In addition, if the images are pornographic, you can contact the police.
At the international level, everything is more complicated, the expert notes. But there are options: "Usually, it is advisable to submit a corresponding statement, by electronic means of communication, to European and/or American cybersecurity and personal data protection authorities (however, success depends on many factors in each case)."