An FGV professor believes legislation on sexual material created with artificial intelligence is getting attention, but Congress’s slow response to technology remains a problem.

“The law is important, but the problem is far from being resolved”

Passing a law that criminalizes the production and sale of nudes generated by artificial intelligence would be an important first step. That is the view of Alexandre Pacheco da Silva, law professor at São Paulo’s Fundação Getulio Vargas (FGV) and coordinator of its Center for Teaching and Research in Innovation. “It [the bill] responds to a specific, serious and important episode, but the problem is far from being resolved,” he points out.

The measure, introduced by deputy Erika Kokay (PT-DF) and reviewed by the Chamber of Deputies on December 7, still has to be discussed by the Senate. Anyone convicted of publishing “montages that aim to include people in nude scenes, including the use of Artificial Intelligence in video, audio and photography” could face a fine and a prison sentence of one to four years. Unauthorized disclosure of sexual intimacy is also subject to punishment.

The legislation also covers minors. Under the ECA (the Child and Adolescent Statute), the use of artificial intelligence in such montages would become a crime punishable by fines and prison terms of two to six years. The provision was proposed in response to recent cases of photo manipulation involving teenagers.

In November, the circulation of nude photos of students from the Santo Agostinho school, altered with artificial intelligence, prompted an investigation by the Civil Police of Rio de Janeiro. Last month, forty students from the Marista São Luís school in Recife (PE) reported finding altered photos of themselves online, and the Pernambuco Civil Police has opened an inquiry.

For Alexandre Pacheco, when it comes to AI the legislation ignores “several parts of our body and our representation”

The greatest difficulty, he argues, is that our representatives respond to news stories. The phenomenon is far bigger than the “deepnude”, so why not recognize that? “Deepfake” is the more suitable concept, since it covers synthetic representations of bodies and of people. In his assessment, a bill that takes this into account is likely to gain more traction.

We need to understand the context and impact of the technology, says Alexandre Pacheco da Silva. Without following the features and trends of emerging technologies, we cannot tell whether the new rules will work.

A recent example is the programs that simulate a naked body from a clothed photo; the platform generates the image from the person’s features and silhouette. Several such cases have occurred in schools in Barra da Tijuca, in Rio de Janeiro. What does this law do? It makes producing “deepnudes”, or any other kind of artificially generated nude image, illegal.

However, AI can go further: it can dub images, simulating not only a person’s nudity but also their voice, timbre, cadence and even lip movement. Our inability to control these forms of exposure has become a serious social problem. By fixating on the naked body, the legislation overlooks this broader issue. It deals with a serious and important episode, yet the problem remains.

What can be done to close the gap between the expansion of technology and the strengthening of democratic institutions?

A broad and sustained mobilization of federal, state and municipal officials is needed to discuss such a complex issue. Universities and researchers think about this daily, and I believe politicians should make an effort to talk to them: specialists in fields such as computer science, psychology and law. That dialogue is crucial for closing the disconnect between parliament and technological progress.

The greatest difficulty is that our representatives respond to whatever gets the most attention. Why not recognize that this phenomenon extends far beyond the “deepnude”? “Deepfake” is the better concept, because it lets me speak of synthetic versions of our bodies and our representations. A measure that takes this into account will have more traction over time than one focused only on the body. We need to react less hastily to shocking episodes in order to open a discussion with society.

Is this law the first of its kind, or does it build on existing legislation?

Recent amendments to the Penal Code have made it possible to prosecute new kinds of digital crime. Even so, the Penal Code was ill-prepared for the “deepnude”, because it still relies on responses designed for an analog world.

Digital voice cloning, for example, seems to be left out. With technology evolving so quickly, how do we combat “deepfakes” at the same time? What do we do about the growing tendency to copy and abuse identifying traits? These questions will remain.

Can Brazil draw on international legislation to build its own legal framework?

This is a fascinating question. We used to believe that technology originated in rich countries and took a long time to reach poorer ones, whatever the subject: cars, films and so on. Now these applications are being distributed in many countries at the same time.

We are used to looking for a solution that has already worked somewhere else and then weighing its pros and cons. Schools in India and the United States have faced problems similar to Rio de Janeiro’s. Not having that ready-made reference is unnerving, but it also gives us the opportunity to think for ourselves and place Brazil in the global conversation.

How exactly are these “deepnudes” created? How long does it take?

In the past, a montage had to be assembled by someone skilled with image editors. The process was slow, artisanal and low-quality, done by a small number of people, which made the fakes easy to recognize. Artificial intelligence has refined these techniques, making everything digital and far more precise. For as little as $30 per license, these tools produce a result that is almost impossible to identify as fake. They are easy to use: you submit a photo to one of these applications, it removes the clothing and simulates a naked body, and the image can then be shared. It is straightforward, and it will probably become even easier.

A second issue is that we will soon have to think about three-dimensional representations of the human body. We need to consider the limits of AI representation and how to design laws that protect our bodies.

How do platforms fit into this process?

Brazilian law largely exempts platforms from liability for content produced by third parties, whether a social media post or an image-editing app available in stores such as Apple’s App Store or Google Play.

What do these companies actually check? Whether apps are compatible with their operating systems: if it works, there is no issue. Censorship by the online stores is something I disagree with. WhatsApp, Facebook and Instagram, for their part, have tried to block nude photographs.

In my opinion, the bills are trying to discourage companies from offering this software in their online marketplaces. But the internet is much bigger than these spaces, and anyone can still find such an application.

Having a lawyer who is well-versed in the digital world helps to classify the acts that took place and to explain the harm in practical terms. It also reduces the likelihood of the case being dismissed as insignificant, which happens fairly often. If no specialized police station is available, the case should be taken to the most appropriate unit closest to the victim’s home so that the inquiry and investigation can begin.
