A.I. can mimic voices convincingly enough to fool you, and scammers are taking advantage

Steve Mollman 2023-03-07
Are you sure it's really your loved one on the other end of the call? Beware of A.I. voice cloning scams.

Image credit: GETTY

You may very well get a call in the near future from a relative in dire need of help, asking you to send them money quickly. And you might be convinced it’s them because, well, you know their voice.

Artificial intelligence changes that. New generative A.I. tools can create all manner of output from simple text prompts, including essays written in a particular author’s style, images worthy of art prizes, and—with just a snippet of someone’s voice to work with—speech that sounds convincingly like a particular person.

In January, Microsoft researchers demonstrated a text-to-speech A.I. tool that, when given just a three-second audio sample, can closely simulate a person’s voice. They did not share the code for others to play around with; instead, they warned that the tool, called VALL-E, “may carry potential risks in misuse…such as spoofing voice identification or impersonating a specific speaker.”
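
VALL-E itself remains unreleased, but the workflow it demonstrates (conditioning synthesized speech on a short reference recording) already exists in open-source form. As a rough illustration of how little code the technique requires, here is a minimal sketch assuming the open-source Coqui TTS package and its XTTS v2 voice-cloning model, not Microsoft's tool; the reference clip and output file names are hypothetical:

    # Sketch of speaker-conditioned text-to-speech, the technique VALL-E
    # demonstrates. This is NOT Microsoft's unreleased code; it assumes the
    # open-source Coqui TTS package (pip install TTS) and its XTTS v2 model,
    # which imitates the voice heard in a short reference recording.
    from TTS.api import TTS

    # Download and load a multilingual voice-cloning model.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # "reference.wav" stands in for a few seconds of the target speaker's
    # voice; the synthesized audio mimics that speaker's timbre.
    tts.tts_to_file(
        text="This sentence was never actually spoken by the person you hear.",
        speaker_wav="reference.wav",
        language="en",
        file_path="cloned.wav",
    )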

But similar technology is already out in the wild—and scammers are taking advantage of it. If they can find 30 seconds of your voice somewhere online, there’s a good chance they can clone it—and make it say anything.

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice. Now…if you have a Facebook page…or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice,” Hany Farid, a digital forensics professor at the University of California at Berkeley, told the Washington Post.

“The money’s gone”

The Post reported last weekend on the peril, describing how one Canadian family fell victim to scammers using A.I. voice cloning—and lost thousands of dollars. Elderly parents were told by a “lawyer” that their son had killed an American diplomat in a car accident, was in jail, and needed money for legal fees.

The supposed attorney then purportedly handed the phone over to the son, who told the parents he loved and appreciated them and needed the money. The cloned voice sounded “close enough for my parents to truly believe they did speak with me,” the son, Benjamin Perkin, told the Post.

The parents sent more than $15,000 through a Bitcoin terminal to—well, to scammers, not to their son, as they thought.

“The money’s gone,” Perkin told the paper. “There’s no insurance. There’s no getting it back. It’s gone.”

One company that offers a generative A.I. voice tool, ElevenLabs, tweeted on Jan. 30 that it was seeing “an increasing number of voice cloning misuse cases.” The next day, it announced the voice cloning capability would no longer be available to users of the free version of its tool, VoiceLab.

Fortune reached out to the company for comment but did not receive an immediate reply.

“Almost all of the malicious content was generated by free, anonymous accounts,” it wrote. “Additional identity verification is necessary. For this reason, VoiceLab will only be available on paid tiers.” (Subscriptions start at $5 per month.)

Card verification won’t stop every bad actor, it acknowledged, but it would make users less anonymous and “force them to think twice.”
