• FaceDeer@kbin.social

    And it didn’t exactly offer it, as in “why don’t you try this?” The AI was set up so that you could give it some ingredients and it would generate recipes using those ingredients.

    New Zealand political commentator Liam Hehir wrote on Twitter that he asked the Pak‘nSave bot to create a recipe that only included water, ammonia and bleach.

    When you mix ammonia and bleach you get chloramine, so the AI was basically told to make a recipe that would produce that.

    • Chozo@kbin.social

      Like most stories these days about AI doing some weird shit, it’s almost always because it was explicitly instructed to do weird shit.

    • Very_Bad_Janet@kbin.social

      True, but when most people put a few ingredients into a search engine, they usually get a recipe that uses most (but not necessarily all) of the ingredients. If you used “ammonia bleach recipe” as search terms in a search engine, you would not get any results for drinkable recipes, likely just articles and blog posts warning not to mix them. The people using the AI recipe bot probably started out using it like a search engine, then noticed that it will use all ingredients no matter how disharmonious, and began testing how bad the bot really was, pushing it to absurd levels.

      The real story is that it creates recipes using every suggested ingredient (a serious bug) and the results are all crap; it’s useless.