Plenty of tech giants have come under fire in the past for withholding data from researchers studying platform risks. With that in mind, the EU is now demanding more information on this front, listing 17 leading platforms that are being asked to report on the matter soon.
The news comes as the DSA takes effect, and the EU regards researcher access to these companies' data as an entitlement that cannot be denied under the transparency clauses of the Act.
The studies in question are aimed at identifying and assessing the systemic risks these platforms pose in the EU.
Moreover, the EU issued notices to a long list of major online platforms, including AliExpress, Apple's App Store, Amazon, the Google Play Store, Pinterest, LinkedIn, YouTube, TikTok, Google Maps, Bing, Google Search, and a host of others.
They are being asked to explain, without delay, how they are providing that access. The goal is to keep accountability and scrutiny in place. More research on this front is important, the regulator added, especially with major events such as the upcoming elections approaching. Effective monitoring, it argued, would help keep unlawful activity at bay and ensure these platforms do more good than harm.
Notable names left off the list include Wikipedia, XVideos, Pornhub, and X Corp. That does not mean Musk's X is off the hook, however; it is already under a formal investigation that began in December of last year.
Since the top porn sites were only recently designated as very large online platforms (VLOPs), they have featured less in research so far and will therefore face less scrutiny over researcher access. To date, 23 VLOPs have been designated by the Commission.
As for which company is actually getting things right, it is Wikipedia, which has readily given researchers the data they need for risk studies. That may have to do with the fact that it is a not-for-profit while the rest are commercial; its edits are open to all and publicly visible by default.
The others are simply less willing to face outside questions about how they make money and how that might shape the way people view them. Under the new DSA, the topics many studies are expected to dig into include child safety, disinformation, mental health, and gender-based violence.
Regulators see this as the right first step to put pressure on the companies and to get better answers than have been offered so far. The EU set a deadline in the first week of February for fulfilling the request for data, after which the Commission said it would decide on its next steps.
As a reminder, offending platforms could eventually face penalties of up to 6% of annual turnover, and the Commission could take more stringent action against repeat offenders.
Past reports show that the EU has rolled out RFIs (requests for information) covering a range of areas, including disinformation risks tied to the ongoing Israel-Hamas war in the Gaza Strip and the upcoming elections.
Photo: Digital Information World - AIgen