Getting Rid of Human Editors Won't Solve Facebook's Bias Problem


Mathew Ingram, September 1, 2016
Computer programs carry human biases too, because algorithms are written by human beings.



Facebook recently announced that it has changed the way it handles a key section of its website. Instead of being curated and edited mostly by human beings, the social network said its “Trending Topics” feature will now be almost entirely produced and structured by computer algorithms.

This change was driven in part by a controversy that flared up earlier this year in which human editors who worked on the feature said that they were encouraged to exclude certain conservative news sites and prevent them from trending, although Facebook denied this.

In its blog post about the new approach, the social network says it found no evidence of “systemic bias” in the way trending topics were selected, but nevertheless hopes that the change will make the feature “a way for people to access a breadth of ideas and commentary about a variety of topics.”

Presumably, Facebook is hoping that handing the feature over to an algorithm will make it easier to defend against these kinds of accusations because computer code is seen as being more objective and/or rational than human beings, and thus not susceptible to bias.

The code that operates Facebook’s news feed and trending algorithms, however, isn’t some kind of omniscient or ruthlessly objective engine, as technology analysts continually point out. It’s designed and programmed by human beings, and in most cases incorporates the biases of those human programmers.

As it turns out, Facebook isn’t actually taking all of the human beings out of the Trending Topics process. The company noted in its post that human editors will still be used to weed out certain topics that don’t refer to actual news events. “For example, the topic #lunch is talked about during lunchtime every day around the world, but will not be a trending topic,” it said.

Another example presented itself on Monday when a fake news story about Fox News host Megyn Kelly appeared in the Trending Topics section, and was called out by a number of journalists and other users.

The real point, however, is that simply moving from using human editors to using algorithms isn’t going to change the reality of whether Facebook’s news feed and trending topics algorithms are biased. If either human beings or computer software are choosing which items qualify as interesting or newsworthy, then that decision automatically excludes certain other things.

Maybe the algorithm and the human editors will exclude topics like #lunch, but they may also exclude other things, and most users will never know. That creates a potential risk for a social network that seems to want to become a hub for journalistic content.

In the aftermath of the shooting of a black man in Ferguson, Mo. in 2014, the Trending Topics feature showed nothing about the event to most users, but instead showed innocuous posts about the “Ice Bucket” challenge. Was that because most users weren’t sharing posts about Ferguson? Facebook would undoubtedly say yes, but the simple fact is that we don’t know.

These problems can become recursive as well. Even if Trending Topics does faithfully represent what people are actually sharing or interacting with the most, if Facebook’s news feed algorithm hides or even excludes certain types of posts—which it routinely does—then they will never trend.

The bottom line is that Facebook’s programmers, who in a very real sense are also editors, are choosing what we see and when—and that has very real implications, not just for journalism but for society as a whole.
