Computers That Can Read Your Emotions Are Coming

Stacey Higginbotham, January 8, 2016
Will we use this new technology to do good, or just to sell things?

Computers are performing jobs once reserved for humans, like translators, personal assistants, and hotel bellmen. In fact, they are getting so good at some things that it's starting to feel a little creepy. But next year, expect that ick factor to multiply. Computers will be able to figure out whether you're happy, sad, or angry by merely watching the tiny involuntary muscle movements in your face.

Andrew Moore, dean of Carnegie Mellon's computer science school, says a combination of better algorithms and high-definition cameras means this technology will make its way out of the labs and into stores, websites, and even hospitals. The idea is to help humans interact with computers, which otherwise miss the subtle cues that come from being able to read a person's expressions.
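
As a rough illustration of the kind of pipeline Moore describes, the sketch below grabs webcam frames, finds faces, and hands each face crop to an emotion classifier. It is not Carnegie Mellon's or any vendor's actual code: the OpenCV Haar-cascade detector is just one convenient face finder, and classify_emotion is a hypothetical stub standing in for a trained model that maps facial-muscle movements to labels such as happy, sad, or angry.

```python
# Minimal sketch of a camera -> face detection -> emotion classification pipeline.
# Assumes the opencv-python package; the emotion model itself is a placeholder.
import cv2

# Haar cascade face detector bundled with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_img):
    """Hypothetical placeholder for a trained emotion model."""
    return "neutral"

def read_emotions_from_camera(max_frames=30):
    cap = cv2.VideoCapture(0)  # default webcam
    results = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            results.append(classify_emotion(gray[y:y + h, x:x + w]))
    cap.release()
    return results

if __name__ == "__main__":
    print(read_emotions_from_camera())
```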

For example, the emotion-reading technology could help a customer on a website interact with an automated customer service agent. The computer would understand whether the customer was actually getting the help they needed or just typing that they did.

People talking face to face inevitably make minor movements that signal their interest or confusion and that help the conversation along, says Moore. But until recently, computers have lacked the ability to read facial expressions and have instead focused only on written or spoken words, using natural language recognition technology.

Moore says adding an extra layer of understanding will improve interactions tremendously. That same dynamic will play out in healthcare settings, he predicts, where having the ability to recognize emotions could help a human or even a robot caregiver recognize pain or depression in patients who may not even disclose it.

In Japan, robots are already being used to interact with patients. In the U.S., policymakers are experimenting with telemedicine to bring rural areas better access to doctors.

However, it’s also a little unsettling to imagine using artificial intelligence in certain environments. The technology creates the possibility for misuse or privacy violations.

In a retail store, for example, emotion-reading computers could be used to identify whether shoppers or employees are nervous and flag them as potential shoplifters. They might also pick out customers who really covet a particular Gucci purse or Armani coat and send a salesperson over to give them a hard sell.

There are also concerns about using artificial intelligence as a kind of high-tech lie detector that would read micro-expressions to tell whether someone is telling the truth. That might be used at borders, in law enforcement settings, or even in job interviews.

I downloaded emotion-testing software called IntraFace on my phone to test its accuracy. A marketing video says it is being used in augmented reality and to add makeup to people's faces in retail settings, so people can try a color out before they buy. The company advertises that its technology could be used to stop distracted driving by detecting whether people are looking at the road. Another scenario is in a classroom, where a student sneaks glances at his phone instead of at the teacher.
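
The distracted-driving idea can be boiled down to a simple rule: flag the driver only when their gaze stays off the road for a sustained stretch, not for a quick glance at a mirror. The sketch below is an assumption about how such a check might work, not IntraFace's actual method; the head-yaw threshold, the frame count, and the upstream face tracker that supplies the angles are all made up for illustration.

```python
# Flag sustained off-road gaze from a stream of per-frame head-yaw angles
# (degrees away from straight ahead), assumed to come from a face tracker.
def flag_distraction(yaw_angles, max_off_road_deg=20.0, max_consecutive=15):
    """Return the frame indices at which a distraction alert would fire."""
    alerts = []
    off_road_run = 0
    for i, yaw in enumerate(yaw_angles):
        if abs(yaw) > max_off_road_deg:      # looking away from the road
            off_road_run += 1
        else:
            off_road_run = 0
        if off_road_run == max_consecutive:  # sustained, not a quick glance
            alerts.append(i)
    return alerts

# Example: a brief mirror check (frames 5-9) is ignored, but a long look
# down at a phone (frames 20 onward) triggers an alert.
angles = [0.0] * 5 + [30.0] * 5 + [0.0] * 10 + [45.0] * 20
print(flag_distraction(angles))
```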

It’s all a bit too surveillance state for me, but as Moore says, it is coming. And I found that the IntraFace software worked fine at recognizing five different emotions when I made faces at my phone’s camera, even expressions that weren’t terribly exaggerated.

Like any technology tool, it can be used for good or ill. Next year, we’ll see how computers do as they start understanding how we feel.
