Artificial intelligence has received much attention in the past few years, fueling the creation of many deepfake videos. Recently, research by Deeptrace concluded that the number of deepfake videos has roughly doubled in the past seven months.
The research showed that most deepfake videos are pornographic, which is far removed from the original purpose of the technology. Many of the videos superimpose celebrities' faces onto the bodies of porn performers engaged in sexual activity. It has become a global problem, as American, South Korean and British actors and actresses have all appeared in these videos.
Deepfake technology was first developed for use in political campaigns, where it found some success; its use for pornography, however, is an area of concern. Revenge porn and cyberbullying are threats to watch out for.
The very existence of Deeptrace shows how seriously governments and corporations take the deepfake phenomenon. Deeptrace has found four websites hosting deepfake pornographic material, with over 134 million views between them.
Apart from this, software used to synthetically remove clothes from images of women is still being sold widely. Such software feeds into the creation of deepfake content, which is another factor technologists need to look at. Another aspect highlighted by Deeptrace is the use of deepfake videos by online businesses to sell their products.
In addition, a report recently released by the Witness Media Lab suggests that creating deepfake videos is no easy task and requires certain skills. A team of specialists is normally needed to simulate real faces convincingly; however, automation has made the process a lot easier. Now, even a person without specialist knowledge can produce these videos, albeit in a less sophisticated form.
Other videos, such as those posted by Ctrl Shift Face, are meant to entertain viewers rather than deceive them, yet they still have the potential to affect election campaigns in the future. It is expected that most victims of deepfake videos will be women and other individuals rather than corporations and governments, and not everyone can afford to hire a specialist to deal with this kind of attack.
Read next: New Report Explains Even AI Can’t Protect Us from Deepfakes Now