Australia shuts down websites using AI to create nude images of underage girls
Australia has shut down three "nudify" websites that used artificial intelligence to create sexually explicit images of children.
According to a global news agency, Australia's eSafety Commissioner said that platforms using artificial intelligence to generate nude images of any person are proving disastrous for Australian schools.
eSafety Commissioner Julie Inman-Grant added that, before the three websites were shut down, they had been formally warned that violations could attract fines of up to 49.5 million Australian dollars.
Despite the warning, she said, the sites failed to put adequate safeguards in place to prevent child exploitation on their platforms; instead, some of their features were marketed in ways that encouraged the misuse of images of underage girls.
The websites attracted around 100,000 Australian visitors a month and were also linked to several high-profile cases involving fake sexual images of schoolchildren.
Australia has been taking significant steps to protect children online: just days ago it barred children under 16 from social media and launched a crackdown on deepfake apps.
International surveys show that the problem of non-consensual AI deepfake images among teenagers is growing rapidly.
According to the US organization Thorn, 10% of young people aged 13 to 20 know someone who has been targeted with fake nude imagery, while 6% have been victims themselves.