Mikie
I'll agree with you on "pitiful" -- this is one of the most embarrassing crashouts I've seen on this site.
You couldn't just acknowledge that it is disinformation and move on with your life. — Mijin
Punshhh
This is an interesting angle. Maybe it doesn't matter if the image is of the actual body of the person in the photo. As long as it is believable, or the public can be persuaded that it is. There is also the visceral reaction people have to indecent or explicit material. This can increase the impact, and where it is used maliciously to blackmail or abuse a vulnerable person, it is a serious issue.

There was no public vs. private distinction, or any real privacy concerns. So one could claim to be embarrassed by an AI image getting into the public, but I highly doubt this would be the same "embarrassment" as that claimed when the image is a real, private image.
AmadeusD
As long as it is believable, or the public can be persuaded that it is. — Punshhh
This can increase the impact, and where it is used maliciously to blackmail or abuse a vulnerable person, it is a serious issue. — Punshhh
There have been reports in the U.K. of a rapid increase in the amount of pedophilia-related material, where the line between real images and AI-generated images is becoming blurred. I heard reports that photos of Renee Good were edited into micro-bikini images and spread on social media within hours of her murder last week.
Then there are people in the public eye being depicted with bruising, smeared in blood, or with tattoos, where defamation may be involved. — Punshhh