The EU has proposed a new law that would force tech giants to scan users’ private chats for child sexual abuse material (CSAM) and other explicit content.
But while it might seem like a sensible measure to pass into EU law, experts are warning against the endeavor, arguing it would create a major privacy and security threat in the form of widespread false positives.
Concern has been mounting among security advocates who oppose the plan ever since the Commission first put the idea forward two years ago.
Many experts now say the European Parliament is wading into dangerous waters with this proposal and that it could have serious consequences, which is why they are sounding the alarm.
The report further explains that specific technology would be used to detect known CSAM, while other, less targeted tools would try to determine whether any form of grooming is involved. It is this nonspecific technology, critics say, that could end up doing more harm than good as the field continues to evolve.
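To make that distinction concrete, here is a minimal Python sketch of how matching against a database of known material works in principle. The hash value and helper name are hypothetical; deployed systems use perceptual hashes such as PhotoDNA, which also match slightly altered copies, whereas the cryptographic hash below only catches exact byte-for-byte duplicates.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# Real systems use perceptual hashes (e.g. PhotoDNA); SHA-256
# here is a simplification that only flags exact duplicates.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_known_material(image_bytes: bytes) -> bool:
    """Exact-match check against the known-hash database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

Matching known material this way is relatively precise. It is the second task, spotting new material or grooming with statistical classifiers, that has no comparable precision and drives the false-positive concerns discussed below.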
Meanwhile, critics argue that such proposals demand the technically impossible and may never achieve their stated aim, which is, at the end of the day, ensuring that kids remain protected.
Instead, the law would wreak havoc online and erode users’ privacy by forcing apps to place a blanket of surveillance over everyone, which is deeply threatening.
To date, there is no evidence of any technology that can meet demands of this scale without doing harm along the way, yet the European Union is moving full throttle on this front, which says a great deal.
The latest open letter addresses changes to the EU’s CSAM-scanning regulation, with signatories arguing that fundamental rights were given no consideration in the plan.
Researchers, activists, critics, and others signed the letter, urging a reevaluation before it is too late. Many continue to raise red flags, warning that the move could give rise to more threats and attacks in the future.
Much of the discussion centered on how the amended plan fails to address false positives: with detection errors unavoidable, distinguishing a genuine person of interest from an innocent user becomes a serious challenge, creating further security threats.
Remember, scanning the chats of billions of people means generating millions of false positives. With detection errors repeated across the billions of messages sent every day, false alarms would be going off left and right.
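A back-of-the-envelope calculation shows the scale. The figures below are illustrative assumptions, not numbers from the proposal or the open letter:

```python
# Illustrative base-rate arithmetic; both figures are assumptions,
# not numbers taken from the EU proposal or the open letter.
messages_scanned_per_day = 10_000_000_000  # assume ~10 billion messages daily
false_positive_rate = 0.001                # assume a 99.9%-accurate detector

false_alarms = messages_scanned_per_day * false_positive_rate
print(f"{false_alarms:,.0f} innocent messages flagged every day")
# -> 10,000,000 innocent messages flagged every day
```

Even an implausibly accurate detector would flag ten million innocent messages a day, and every one of those flags means a private conversation pulled out for review.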
Moving on to encryption, many feel that any detection bolted onto end-to-end (E2E) encrypted services defeats encryption’s very purpose, which is protecting the conversation.
The goal of E2E encryption is to make sure only the intended recipient can read the data in transit. For a while now, police chiefs have denied calling for encryption backdoors, yet they have offered no ideas on how tech companies could lawfully access criminals’ chats without one. So, as you can see, it’s one big dilemma.
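As a minimal sketch of the guarantee at stake, here is a two-party example using the PyNaCl library’s Box construction (a simplification; real messengers layer full protocols such as Signal’s on top of these primitives):

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server relaying `ciphertext` cannot read it; only Bob can.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

The server relaying the ciphertext has nothing it can scan. Any detection would therefore have to happen on the device itself, before encryption or after decryption, which is precisely the client-side surveillance critics object to.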
Image: DIW-Aigen