Several attacks involving OpenAI’s chatbot—including Tumbler Ridge and FSU—raise urgent questions about the technology.

“From the outside, it looks like OpenAI had the opportunity to prevent this horrific loss of life, to prevent there from being dead children,” said BC Premier David Eby after the Journal reported on the shooter’s ChatGPT use. “I’m angry about that. I’m trying hard not to rush to judgment.” Canadian authorities demanded accountability and vowed to create new national requirements for tech companies to report threats brewing on their platforms.

OpenAI told Canadian government leaders in late February that under the company’s newly revised protocols, the shooter’s account from June 2025, if discovered today, would be flagged to law enforcement. “Mental health and behavioural experts now help us assess difficult cases, and we have made our referral criteria more flexible to account for the fact that a user may not discuss the target, means, and timing of planned violence in a ChatGPT conversation but that there may be potential risk of imminent violence,” VP of Global Policy Ann O’Leary stated in an open letter.

  • BlameThePeacock@lemmy.ca · 4 days ago

    So what you’re saying is that we shouldn’t blame the actual tool that people are using to kill people, but we should blame the tool people are using to plan for the use of that tool?

    That is some fucked up logic right there.

    • ArmchairAce1944@lemmy.ca · 4 days ago (edited)

      Caelen Conrad explained it better than I could here

      But I’ll give you the tl;dr (or tl;dw, whichever suits you). Having a gun, or access to a gun, is not by itself enough to make a mass shooting or mass murder happen, especially when the person could have been stopped or talked out of it in any number of ways at any time, as with the Tumbler Ridge shooter or the Nazi incel that Conrad talks about. ChatGPT not only did none of that, it greatly reduced the friction in both the psychological process of working up to the act (killing people is not easy. I don’t mean that just in the physical sense of shooting or beating someone to death; it takes a LOT to get one human being to kill another human being) and the actual planning of the shooting.

      Do you know why we don’t see a lot of successful bombings? Why bombings are fairly rare? The reason is that the skills needed to pull something like that off are hard to acquire. You would have to spend a lot of time learning chemistry, electronics, and craftsmanship, and sourcing the required materials is not easy in and of itself. The internet has been awash with bomb-making guides for decades, but even buying the materials needed to build a bomb (and in some places even ASKING where to find them) often raises alarm bells in systems that would have the police take you in for questioning. But if an LLM could somehow tell you ‘here’s how to buy what you need to make a big bomb without raising suspicion,’ provide the links, and then give clear step-by-step instructions for your specific purpose, it would have removed 90% of the work.

      In short, without ChatGPT’s support, the shooting would likely never have happened, or if it had, it would have resulted in far fewer deaths. I mean this for both Tumbler Ridge and the Nazi incel shooting.

      Edit: Also, the fact that ChatGPT had no ability to report these shooters meant that disclosing their plans to it, even if it hadn’t otherwise helped them, still gave them a place to get their plans out of their heads without consequence. Many shooters are thwarted precisely because they talk about what they want to do to another human being, who DOES report their asses to the authorities.

      • BlameThePeacock@lemmy.ca · 4 days ago

        If these people didn’t have guns, no amount of talking to ChatGPT would have mattered in terms of them going through with a mass shooting.

        It’s painfully clear that at least for the Tumbler Ridge shooter there should not have been any guns in that home. Multiple humans KNEW there was an issue, and that didn’t stop it at all.

        I know it’s hard, but use your fucking brain. You’re so caught up with trying to justify your own ownership of firearms that you refuse to even consider that they’re the largest component of the problem.

        We don’t have a lot of mass knife attacks, we don’t have a lot of bombings, we don’t have a lot of poisonings, because as you pointed out they have a much higher skill/knowledge requirement. The thing you intentionally ignored is that the skill/knowledge requirement for pulling a trigger is really low.

        I’m not saying guns shouldn’t exist. However, I personally think that handguns and even semi-automatic rifles/shotguns should not be allowed to be owned by civilians. If you want a gun for personal use, you can use a bolt action rifle or break-action shotgun. Those cover 99% of legitimate use cases for firearms. All firearms should also be legally required to be bright pink.

        Job based firearms can be licensed separately, because there are legitimate use cases for handguns and semi-automatics in very specific roles or tasks.

        • ArmchairAce1944@lemmy.ca · 4 days ago

          We don’t have a lot of mass knife attacks, we don’t have a lot of bombings, we don’t have a lot of poisonings, because as you pointed out they have a much higher skill/knowledge requirement. The thing you intentionally ignored is that the skill/knowledge requirement for pulling a trigger is really low.

          And LLMs would absolutely reduce that friction to next to nothing, which would make it happen more often and more successfully.

          The Nazi incel shooter would NEVER have succeeded in any of his goals without the chat. It doesn’t matter that he had a gun (his dad’s gun, not his own); the chats revealed that he knew so little about firearms, and about when people gather most densely, that if he had tried to do what he did without ChatGPT he would likely have failed to kill anyone. Injuries maybe, but not deaths. He was so ignorant about how to carry out an attack that without ChatGPT’s support he might have gotten over his desire to kill, and even his suicidality, with minimal or no intervention.

          BTW, about the explosives I mentioned? There is precedent for how reduced friction can drive a massive rise in bombings: the availability of bomb-making instruction manuals. As I said, the internet is awash with them, but once upon a time they weren’t available… until BBSes in the late 80s and early 90s started hosting them. And once people were able to get them, there was a massive spike in pipe bomb incidents in the US.

          Timothy McVeigh used such books in making the bomb he used to destroy the Murrah Federal Building in Oklahoma City. They are still available.

          A bomb-making guide reduces the friction of carrying out a successful bombing, but combined with an LLM that ‘just wanted to help with a fun experiment,’ whatever friction is left would be reduced to nothing.

          • BlameThePeacock@lemmy.ca · 4 days ago

            No, it really fucking won’t.

            LLMs don’t do shit to reduce the friction of a knife attack; that’s not something an LLM can help with significantly. YouTube videos are a better how-to source for knife attacks.

            LLMs don’t help people make bombs. Go try asking one; it will refuse on all the major platforms. Anyone willing to learn enough to jailbreak a modern LLM can just as easily find the same information you already mentioned. There’s no friction reduction here.

            And you STILL don’t get the fact that without that gun, the Nazi incel shooter would have shot exactly 0 people.

            • ArmchairAce1944@lemmy.ca · 4 days ago

              Watch the fucking video and read the damn linked article. The LLM did more to make the shooting happen than anything else, in both the Canadian Tumbler Ridge shooting and the one in Florida. You have no idea what you are talking about.

              • BlameThePeacock@lemmy.ca · 3 days ago

                You cannot shoot people with a gun if you do not have a gun.

                Logic does not get any simpler than that. Yet you refuse to accept it.