• HiddenLayer5@lemmy.ml · 1 year ago

    The last quote danced around it, but if the implication is that they were seeking out and collecting CSAM (which is a sex crime to access, possess, and distribute), why the fuck are the boards of both companies not in prison and on the sex offender list?!

    I mean, I know why, but

    • SacrificedBeans@lemmy.world · 1 year ago

      I’m sure there’s some loophole there, maybe between countries’ laws. And if there isn’t, Hey! We’ll make one!

    • Meowoem@sh.itjust.works · 1 year ago

      They could be working with the governments of relevant countries to develop filters and detection systems.

    • Clbull@lemmy.world · 1 year ago

      Isn’t CSAM classed as images and videos which depict child sexual abuse? Last time I checked, written descriptions alone did not count, unless they were being forced to look at AI-generated image prompts of such acts?

      • Strawberry@lemmy.blahaj.zone · 1 year ago

        That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.

        This is the quote in question. They’re talking about images.

    • aidan@lemmy.world · 1 year ago

      IIRC there are a few legitimate and legal reasons to seek out CSAM, such as journalism, and certainly developing methods to prevent its spread.

    • smooth_tea@lemmy.world · 1 year ago

      I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any other option than to deal with this material?

      • HiddenLayer5@lemmy.ml · 1 year ago

        If absolutely nothing else, and even assuming for the sake of argument that work of this nature is completely justified, they still have to answer for the fact that they severely underpaid foreign workers in click farms to do this and traumatize themselves on the companies’ behalf, presumably so no one in the West had to.

        Personally, my opinion is very strongly that if you can’t develop a technology without committing such serious ethical breaches, for example seeking out and accumulating CSAM, then it’s either too early to develop that technology or it’s not worth developing at all. One may counter this with something like “well it’s basically inevitable that unscrupulous people will harm others to develop technology” but I would also argue that while that is true, the inevitability of something doesn’t make the act itself any less unethical.

        As a bit of context: the reason even accessing and possessing CSAM is illegal almost everywhere in the world is that the generally accepted philosophy around this kind of material is that every time someone views it, for any reason, it victimizes that child all over again. That is also very consistent with the views of actual CSAM survivors, so I don’t feel it’s something the rest of us can really question. I obviously cannot speak on their behalf in any way, but my guess would be that the vast majority of CSAM victims do not want photos and videos of the most terrifying and traumatic moments of their lives being used this way, especially not by a for-profit company so it can develop a product with the goal of making itself richer.

      • Floshie@lemmy.blahaj.zone · 1 year ago

        Consider the impact on human psychology. Not everyone has the stomach to read, let alone look through, this material, and even those who seem to are still scarred inside.

        Maybe there is no alternative for now, but don’t do that to people on such low pay. Consider the backgrounds of the people who may end up working on these tasks, not to live but just to survive. I would have preferred to wait ten years rather than offload these horrifying tasks onto those people.

        I’m sure there are lots of people in jail for creating, sharing, or even profiting off of this content. Could they do that work? But then again, even though it bothers me less than forcing it on people who have no other way to make a living, that is still an idea I find ethically very questionable.

      • barsoap@lemm.ee · 1 year ago

        Very much yes: police authorities have CSAM databases. If what you want to do with it really is above board and sensible, they’ll let you access that stuff.

        I don’t doubt that anything OpenAI could do with that stuff can be above board, but sensible is another question: any model that can detect something can be used to train a model that can generate it. As such, those models are under lock and key just like their training sets; (social) media platforms which have a use for these things, and the resources, run them under the watchful eye of the authorities. Think faceboogle. OpenAI could, in principle, try to get into the business of selling companies at that scale models it can, and has, trained itself, but I don’t really see that making sense from a business POV, either.