• Mikie
    7.2k
    I'll agree with you on "pitiful" -- this is one of the most embarrassing crashouts I've seen on this site.
    You couldn't just acknowledge that it is disinformation and move on with your life.
    Mijin

    I missed that entire exchange a while back, but agreed. Embarrassing. I was thinking about doing a thread about this, and it’s a good example: “if anything contradicts me, it’s biased.”

    So my citations are complete bullshit? YOU are to blame somehow: you don’t read carefully enough, you misunderstood, you’re biased, you’re a bad “interlocutor.” Basically, you’re doing something wrong, not me.

    It’s not that common, and not as blatantly obvious as in this example, but it’s common enough to warrant a little reflection.
  • Punshhh
    3.5k
    Please don’t apologise, I welcome the interest.

    I thought the issue about Grok should be included here because it has been treated as a free speech issue in the press. It was the lead story in the U.K. yesterday, after Iran.

    I’m not sure what to think about that, but I do think it’s a publication issue, because people are using Grok to alter images of others into skinny bikinis on X, and these are immediately posted on a public forum. That means embarrassing, non-consensual pictures of people are being published. This seems to fall foul of the U.K. Online Safety Act.

    Interestingly, the two main political parties are in agreement that it should be stopped, while only Elon Musk and Nigel Farage are claiming censorship, or a free speech issue.
  • AmadeusD
    4k
    Oh, right - cool, thanks. Yeah, I get that and I agree. I think that, though, can be considered a privacy issue, with publication being the trigger, so I totally agree.

    I think if you, in the privacy of your own technological world, create AI images of a public figure you have a crush on for sexual gratification, then as long as that never leaves your technological bubble, I don't see the harm. But making anything of this kind public immediately violates several things that we don't even need to look at digital communications legislation for, I think.
  • Punshhh
    3.5k
    Yes, agreed, it seems clear cut to me.
    But this shines a light on what Musk and Farage are doing. They are claiming it is censorship, a free speech issue. So they are claiming that free speech includes the right to publish embarrassing or defamatory images of anyone on public forums, without their consent.
    This looks like an overreach of the free speech narrative, exposing it as a false populist narrative, or culture war.
  • AmadeusD
    4k
    Yeah, overall I agree with your sentiment. On defamation, that's a bit nuanced - I think publishing embarrassing pictures of someone is entirely legal where they do not enjoy an expectation of privacy, right? Paparazzi can publish nude photos of public figures, if taken in public. But for AI, there's nothing 'real' to adjudicate in the normal way the legal system would. There is no public vs. private distinction, or any real privacy concern. So one could claim to be embarrassed by an AI image getting into the public, but I highly doubt this would be the same "embarrassment" as that claimed when the image is a real, private image.

    I think Musk, at least, is clumsily trying to point out the amorphous nature of 'defamation' (and this is somewhat corroborated by the history of common law on the matter) and the various ways it can be claimed. He's just both autistic and has a huge ego, so it's difficult to parse anything that specific publicly.
  • Punshhh
    3.5k
    We can draw a parallel with deepfakes, which have been going around for a few years now. The legal ramifications have been worked out and tested in the courts. What is different now is that apps like Grok make it possible to produce this kind of material in seconds, as an alteration to a photo, and it can be accessed by anyone on the platform.
    There is no public vs. private distinction, or any real privacy concern. So one could claim to be embarrassed by an AI image getting into the public, but I highly doubt this would be the same "embarrassment" as that claimed when the image is a real, private image.
    This is an interesting angle. Maybe it doesn’t matter whether the image shows the actual body of the person in the photo, as long as it is believable, or the public can be persuaded that it is. There is also that visceral reaction people have to indecent or explicit material. This can increase the impact, and where it is used maliciously to blackmail or abuse a vulnerable person, it is a serious issue.

    There have been reports in the U.K. of a rapid increase in the amount of paedophilia-related material, where the line between real images and AI-generated images is becoming blurred. I heard reports that photos of Renee Good were micro-bikinied and spread on social media within hours of her murder last week.
    Then there are people in the public eye being depicted with bruising, smeared in blood, or with tattoos, where defamation may be involved.

    It looks like Musk backed down yesterday and is taking down the facility, as there were indications of government action against it in most European countries.

Welcome to The Philosophy Forum!