The rollout of the Digital Services Act in the EU has made it hard for tech giants like X to function without transparency.
In keeping with that theme, the EU is now seeking more information from popular social media app X over its recent decision to cut back the company’s content moderation team.
The news comes after the firm cut roughly 20% of the workforce in that division, and the matter is now being examined under the tougher online content rules in force in the region.
The European Commission recently requested more data from X on this front under the Digital Services Act. The law has been causing a stir because it’s a groundbreaking piece of legislation that forces online companies to adopt stricter measures to police illegal and harmful activity on their platforms.
Right now, regulators have serious concerns about Elon Musk’s company, which reported a certain number of content moderators in October 2023, a figure that has dropped dramatically this year.
The company also reduced its linguistic coverage in the region, cutting the EU languages it supports from 11 down to just 7. Other points covered by the request include the impact of generative AI on politics and voting in elections, the dissemination of illegal content on the app, and how the platform safeguards fundamental rights.
X has not yet responded to media requests for comment on the subject.
But from what we can confirm right now, X must hand the EU the requested material on its content moderation resources and generative AI by May 17, with answers to the remaining questions due by May 27, so we should be seeing things pick up the pace really soon.
This is another example of the formal steps the EU has taken against platforms suspected of breaching the DSA.
The Commission began formal proceedings against the company late last year after raising serious concerns about how it removes illegal material linked to the controversial Israel-Hamas war.
The Commission went on to say that the investigation focuses on whether X is complying with its duties to prevent the dissemination of illegal material across the EU, along with how effective its measures against the spread of misinformation are and how it plans to improve transparency.
The investigation into X is ongoing: EU officials have gathered plenty of evidence under the DSA and are now assessing what steps the tech giant is taking to stop the spread of disinformation as generative AI continues to pose major security threats.
The Digital Services Act came into force toward the end of 2022, and it has been causing a major stir among online platforms like X, which must mitigate the risks of false information spreading, build rigorous processes for removing hateful content, and balance all of this against freedom of expression.
And if a company is found to be in breach, it can be fined up to 6% of its global annual revenue.
Image: DIW-Aigen