Wednesday, June 1, 2022


(Photo: Shahadat Rahman/Unsplash)
Without fanfare, Google has banned users from creating deepfakes on its Colaboratory computing service.

Colaboratory (more commonly referred to as Colab) launched in 2017 as a way to let users write and execute Python code in their web browser. The mostly free-to-use resource gives users a number of tools to power data analysis, machine learning, and other complex projects, all ostensibly in the name of scientific research. Unfortunately, that hasn’t always been the case.
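To give a sense of what the service is normally used for, here is a minimal sketch of the kind of cell a Colab user might run in the browser: a tiny data-analysis and model-fitting exercise with pandas and scikit-learn, both of which ship preinstalled in the Colab runtime. The dataset is made up purely for illustration.

```python
# A typical Colab cell: load a small dataset and fit a simple model.
# pandas and scikit-learn come preinstalled in the Colab runtime.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Illustrative data; in practice this might be uploaded or pulled from Drive.
df = pd.DataFrame({
    "hours_studied": [1, 2, 3, 4, 5],
    "exam_score": [52, 58, 67, 71, 80],
})

model = LinearRegression().fit(df[["hours_studied"]], df["exam_score"])
new = pd.DataFrame({"hours_studied": [6]})
print("Predicted score for 6 hours:", model.predict(new)[0])
```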

Deepfakes, for the uninitiated, are hyper-realistic fabricated videos of things that didn’t actually happen. While they’re sometimes made for well-intentioned laughs, they’re often created to spread political misinformation or to make it appear as though someone starred in pornography they actually had nothing to do with (i.e. “involuntary pornography”). Leading up to the 2020 presidential election, people generated deepfakes of candidates “saying” jarring things; others have used tools like Colab’s to create “revenge porn” involving former romantic partners.

Digital tricks like these are made possible by machine learning algorithms and intricate editing tools—the first of which users can train on Colab’s CPUs, GPUs, and TPUs. The process typically involves a facial recognition algorithm, which enables the user to make it appear as though the targeted person is making certain facial expressions or saying specific things. The possibilities are virtually endless, which can be dangerous, especially when it comes to fake news and porn someone never consented to being in. 
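As a rough illustration of where a pipeline like that starts, the sketch below detects and crops faces in a single video frame using OpenCV’s bundled Haar cascade (available through the opencv-python package preinstalled in Colab). Real deepfake tools use far more sophisticated detectors and then train a face-swap model on the resulting crops, which is where Colab’s GPUs and TPUs come in; the frame filename here is hypothetical.

```python
# Rough sketch of the face-detection/cropping step a face-swap pipeline
# starts with, using OpenCV's bundled Haar cascade. Real deepfake tools use
# stronger detectors, but the idea is the same: find the face, crop it, and
# feed the crops to a model trained on a GPU or TPU.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("frame_000.png")  # hypothetical frame extracted from a video
if frame is None:
    raise FileNotFoundError("frame_000.png not found")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    crop = frame[y:y + h, x:x + w]  # the region a swap model would be trained on
    print(f"Found face at ({x}, {y}), size {w}x{h}")
```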

Embedded TikTok from @deeptomcruise: “Not gonna lie… I love nerds 👨‍🎓 @harvard” (original sound – Tom)

It’s unknown whether this TikTok user relies on Colab to make their Tom Cruise deepfakes, but their videos are pretty convincing, especially to the untrained eye.
With great power comes great responsibility and all that, and some Colab users may have abused that power. Or perhaps Google simply decided to nip a potential problem in the bud out of ethical concern. (I’d like to believe it’s the latter, but we’re talking about a massive tech company here.) Given that deepfake training eats up a lot of computational resources, Google may also have brought down the ban hammer for financial reasons.

Regardless of its rationale, Google banned deepfake training in mid-May. The company also banned a number of other questionable practices it understandably doesn’t want to be associated with, like password cracking, cryptocurrency mining, and torrenting. Anyone attempting to engage in these practices now receives an error code that reads: “You may be executing code that is disallowed, and this may restrict your ability to use Colab in the future. Please note the prohibited actions specified in our FAQ.”

As Bleeping Computer points out, deepfake enthusiasts often refer newbies to Colab to try their hand at creating highly detailed false media. This means Google’s ban might have far-reaching implications for the deepfake “community,” whether or not individual developers have bad intentions.

from ExtremeTech https://ift.tt/0CqWjxE
