A shocking story was promoted on the “front page” or main feed of Elon Musk’s X on Thursday:

“Iran Strikes Tel Aviv with Heavy Missiles,” read the headline.

This would certainly be a worrying world news development. Earlier that week, Israel had conducted an airstrike on Iran’s embassy in Syria, killing two generals as well as other officers. Retaliation from Iran seemed plausible.

But there was one major problem: Iran did not attack Israel. The headline was fake.

Even more concerning, the fake headline was apparently generated by X’s own official AI chatbot, Grok, and then promoted by X’s trending news product, Explore, on the very first day of an updated version of the feature.

  • Deceptichum@sh.itjust.works · 7 months ago

    It’s pretty simple: trending is based on . . . what’s trending among users.

    Or, as the article explains for those who can’t comprehend what trending means:

    Based on our observations, it appears that the topic started trending because of a sudden uptick of blue checkmark accounts (users who pay a monthly subscription to X for Premium features including the verification badge) spamming the same copy-and-paste misinformation about Iran attacking Israel. The curated posts provided by X were full of these verified accounts spreading this fake news alongside an unverified video depicting explosions.

      • Deceptichum@sh.itjust.works · 7 months ago

        It does say it’s likely hyperbole, so they probably just tased and arrested the earthquake.

        Also I’m impressed by the 50,000 to 1,000,000 range for officers deployed. It leaves little room for error.

        • PopShark@lemmy.world · 7 months ago

          I wonder if the wide margin is the AI trying to formulate logic and numbers in the story, but it realizes it doesn’t know how many officers would be needed to shoot the earthquake, since that would logically depend on the magnitude of the earthquake, which the AI doesn’t know. So it figures, well, alright, tectonic plates are rather resistant to firearms discharge and other potential law enforcement tactics, so it starts high at 50,000 but decides 1,000,000 is a reasonable cap, as there just can’t be more than that many officers present in the state or country.