Reports from the FBI reveal that scammers are using deep-fake technology to impersonate other people in job interviews.
Honestly, this sounds like the funniest Black Mirror episode never made. But let me get something off my chest: despite all the security ramifications and privacy infringements the technology can lead to, I really think deep-fakes can be fun sometimes. One of my favorite video formats nowadays is watching users who’ve either edited new characters into older movies, or edited older characters to resemble their current actors. A major example is the myriad of videos in which the live-action Spider-Man actors are deep-faked to look like each other. Seriously, look up Tobey Maguire in Far From Home, or Tom Holland in Spider-Man 2, and you’ll encounter loads of results. It’s a fun flex of technology that, while occasionally coming off as creepy (Luke Skywalker’s weird flat face in The Mandalorian Season 2 finale), can still be entertaining to watch.
However, since this is the internet and we really can’t have nice things, deep-fakes have been weaponized on any number of occasions, either to insert famous celebrities into rather unflattering and/or inappropriate videos, or to scam people across the globe. While the majority of such efforts are now flagrantly illegal, it’s difficult to press charges against some snotty thirty-year-old loser using a VPN in their mother’s basement on the other side of town. The saving grace is that deep-fakes aren’t easy to pull off, and even impressive attempts can produce flawed results. The technology is much more akin to a very complex Snapchat filter than to the rubber masks from Mission Impossible.
The Federal Bureau of Investigation sees deep-fakes as a threat to employers, and I can see why. Lord knows the USA cares more about employers, managers, and corporations than the millions of other people living there. Still, whether you’re from a younger generation or an older one, you will more often than not be tipped off to a deep-fake. Let’s talk about the uncanny valley.
The uncanny valley is the idea that as artificial renderings of human faces approach photorealism without quite reaching it, they strike us as unsettling rather than convincing. It’s the Grand Moff Tarkin effect from Rogue One, it’s the weird baby from those weird Twilight movies, it’s that terrible Bruce Lee Johnnie Walker ad. Since we attune ourselves to human faces far more than to animals or inanimate objects, we notice even the tiniest missing details quickly. Be it missing pores or features that aren’t quite right, there’s a lot that can tip us off.
The FBI has been rather vocal about this as well, assuring many that deep-fakes can be easily spotted with just a little careful consideration.
Via @onamastudio/freepik