For those hearing about it for the first time, DeepNude is the name of a rather infamous app that took the capabilities of emerging AI technologies to new heights, allowing its users to create realistic-looking nude images of women from any picture with just a click.
While the official website of DeepNude has already been taken down after attracting criticism from different sectors, this post is meant to highlight what went wrong, and how the app, created around 3 months ago by an unknown developer under the nick "technology enthusiast", could have turned out differently.
Notably, the DeepNude app targeted only women, which, according to the creator, was because images of undressed women are pretty easy to find anywhere on the internet. But this is certainly not the first instance of technology being used to ruin people's lives on the web.
Adobe's Photoshop has arguably wrought even greater harm, with countless fake images having been photoshopped to fuel the growing fake-news menace. So why didn't Photoshop fade into oblivion, and why is DeepNude facing such a huge backlash, when utilities like Photoshop are still heavily patronized and used to create fake porn of celebrities and even for revenge porn?
DeepNude is proof that machine learning algorithms are advancing at a rate faster than we could ever have imagined. And with the latest AI capabilities come huge possibilities that are both exciting and horrifying at the same time.
The caveat is that even though the original app has been taken down, myriad copies are already in circulation on the web. And as more curious people search for download links to the supposed app, bad actors can leverage that demand to spread malware to unsuspecting web users.
Some developers have even claimed to have tweaked the original DeepNude app. But the threat posed by deepfake technology isn't limited to online platforms; the impending reality is that if regulations and policies are not put in place to prevent the harm that technologies like DeepFakes and DeepNude are capable of doing, nothing else will stop it.