Elon Musk has until the end of Wednesday to respond to demands from Brussels to remove graphic images and disinformation linked to the violence in Israel from his social network X — or face the full force of Europe’s new social media rules.

Thierry Breton, the European Union commissioner who oversees the bloc’s Digital Services Act (DSA) rules, wrote to the owner of X, formerly Twitter, to warn Musk of his obligations under the bloc’s content rules.

If Musk fails to comply, X could face fines of up to 6 percent of its annual revenue under the EU’s rules. Under the regulations, social media companies are obliged to remove hate speech, incitement to violence, and gruesome images or propaganda that promote terrorist organizations.

Since Hamas launched its violent attacks on Israel on October 7, X has been flooded with images, videos and hashtags depicting — in graphic detail — how hundreds of Israelis have been murdered or kidnapped. Under X’s own policies, such material should also be removed immediately.

  • atetulo@lemm.ee · 1 year ago

    Umm… why? Why are they censoring the truth?

    This is how people don’t take war seriously. All they do is hear about it; they never see the gruesome reality.

    • SpaceBishop@lemmy.zip · 1 year ago

      > Why are they censoring the truth?

      Oh, that must be really embarrassing, but…

      > graphic images and disinformation

      Maybe work on your reading comprehension to make sure you don’t embarrass yourself like that again.

      • atetulo@lemm.ee · 1 year ago

        What are you talking about?

        I’m specifically referring to the videos and images.

        > gruesome reality.

        > other gruesome images

        Maybe work on your reading comprehension to make sure you don’t embarrass yourself like that again.

        Oh the ironing.

        • BreakDecks@lemmy.ml · 1 year ago

          Child sex abuse is a reality we have to confront head-on, but we don’t share images of it for awareness.

          Likewise, you shouldn’t be sharing images of the slaughtered bodies of civilians to draw awareness to terrorism.

          • atetulo@lemm.ee · 1 year ago

            Whoa, child sex abuse isn’t the same as war, though. People already take it plenty seriously, and nobody is glorifying it (out in the open).

            Your analogy isn’t a 1:1 representation of the topic at hand. All it does is pivot from the actual topic to something that’s easier for you to argue against.

            • BreakDecks@lemmy.ml · 1 year ago

              I never said it was the same; it’s called a comparison. We ban images of sex abuse because of the harm that sharing those images does to victims. Hamas went to great lengths to film and disseminate what they did in Israel online, with the intent of doing harm to the victims’ families. While there may not be laws in the USA prohibiting the sharing of this content, I would still argue that it is morally reprehensible, given that you are participating in something intended to do harm.

              • atetulo@lemm.ee · 1 year ago

                What it’s ‘intended’ to do doesn’t really matter. If you notice, people aren’t supporting Hamas. They see these videos and rally in support of Israel.

                Wow. It’s almost like, exactly as I said, showing people instead of telling them makes them take war seriously.

    • pflanzenregal@lemmy.world · 1 year ago

      Traumatizing ≠ making people take war seriously.

      Believe it or not, journalism and educating people involve much more than uploading graphically disturbing images to some website and leaving it at that.

      • atetulo@lemm.ee · 1 year ago

        > Traumatizing ≠ making people take war seriously.

        That’s actually not true, and most people who watch these videos aren’t ‘traumatized’, so it’s not really an argument.

        > Believe it or not, journalism and educating people involve much more than uploading graphically disturbing images to some website and leaving it at that.

        Who said it isn’t? They should include the footage with their articles. This way people can see instead of just being told.

        If they don’t want to look, then there should be explicit content warnings.

        • BreakDecks@lemmy.ml · 1 year ago

          There have been plenty of studies on gore and death content suggesting that exposure to it causes trauma similar to PTSD. Some people are affected more than others. On X right now, you’re pretty likely to be presented with some extremely violent images if you go looking for information about what is happening, so you can’t really avoid them other than by avoiding X entirely. Plenty of these images and videos aren’t even related to this conflict and are just misinformation or ragebait.

          • atetulo@lemm.ee · 1 year ago

            Can you show me what studies you’re talking about?

            I have a feeling you’re referring specifically to studies that focus on people who are paid to moderate this content. If you share the studies you’re talking about, we can know for sure.

            • BreakDecks@lemmy.ml · 1 year ago

              You really don’t need to look further than the clinical data on PTSD. A sufficient amount of any form of trauma can cause mental health issues including but not limited to PTSD. Watching an execution video has a large potential to cause a severe trauma response, especially if the victims are people you know or love, or are members of your community.

              There are plenty of real-world examples of content moderation teams at social media companies suffering from their exposure to extreme content.

              Traumatizing people is one of the core goals of terrorism, because it does damage.

              • atetulo@lemm.ee · 1 year ago

                Thanks for not linking to a single study like I asked.

                Sorry, but I won’t take you seriously until you do. You mentioned ‘studies.’ Show them to us.

                I’ve seen studies that disprove your studies.