
[The Guardian] End the tyranny of the "web": the world belongs to "humans"

You may not have noticed, but many of the articles you see online are now written automatically by programs. They break up existing articles and, following whatever topics are trending, stitch the pieces into a new piece predicted to attract a high number of clicks. More striking still, it is no longer just text: video can now be synthesised in the same way. If this continues, programs may well end up running the world. What is left for us humans to do?



Author: Jonathan Freedland

Translator: 倪凌晖

Proofreader: 刘璠

Coordinator: 刘璠


From Peppa Pig to Trump, the web is shaping us. It’s time we fought back 



Selected from The Guardian | Original translation by 取经号



Forget the canary in the coal mine: these days, the warning comes from a cartoon pig in a dentist’s chair. And it’s no exaggeration to say it’s pointing to a threat facing all humanity. 


canary in the coal mine: Canaries are highly sensitive to mine gas, so miners used to take one underground as a living gas detector. As long as the bird kept singing, the air was safe; if it died, everyone had to evacuate at once.


The pig in question is Peppa, beloved by children everywhere. What could be safer than settling a child in front of a few Peppa Pig videos, served up in succession by YouTube, knowing they’ll be innocently amused while the adults chat among themselves? Except they might not be so safe.



In a recent, revelatory blog post, writer James Bridle described how what might seem to the naked eye – or distracted parent – to be harmless cartoons available on YouTube’s kids’ channel are often, in fact, unofficial knock-offs, edited and titled in order to rise up the algorithmic rankings and attract lucrative page views. A human hand need not even be involved. A simple bot can simply take words or images it knows the algorithm will favour, chop them up and generate a video that meets those criteria. The result will be served up automatically, played to a child the moment the previous video has finished. 



Bridle took a look, and what he saw was deeply disturbing. One video turns a legitimate Peppa story – visiting the dentist – into a scene of torture. Another involves smiling clowns slaughtering a series of familiar cartoon characters against a soundtrack of cheery nursery tunes. For children, it’s the terrain of nightmares. And yet few parents would have any idea it was there. They might have selected a wholly innocent cartoon a few minutes earlier, only to find their child traumatised by the one that came next.



It’s hard to tell, Bridle admits, how and where this material originated. Some of it will have been created by trolls who find the idea of a decapitated Smurf funny. But automation is central, if not in the initial creation of these videos then in their distribution. They’re being viewed in their tens of millions because a bit of computer script puts them in front of kids. 


Troll: a person who deliberately posts inflammatory, off-topic, irrelevant or provocative comments, or behaves in a similarly provocative way, in order to draw a desired emotional reaction (usually anger) from others.


This may seem like a technical problem. But it’s not. It’s a political problem, one that touches on the central question of politics: who governs our lives? 



At the dawn of political theory that question applied to kings or emperors, as subjects sought to set limits on the powers of the ruler. In recent decades it’s become obvious that governments are not the only, or even the most, powerful players needing to be tamed: global corporations hold huge sway over us too. But now we must face the fact that machines are deeply shaping our lives, and they are currently answerable to no one. We are slaves to the algorithm. 



Perhaps this political point is illuminated best by politics. There’s much focus, rightly, on Russian meddling in the 2016 US presidential election, with Facebook’s admission, for instance, that Moscow-funded messages were seen by 150 million Americans. But such enormous reach was only possible because of the way Facebook works, an algorithm designed to “maximise engagement”, showing people nuggets of news that they are likely to pass on – even when that “news” is bogus and fact-free. 



That was the system those infamous Macedonian teenagers realised they could exploit for cash – spreading the lie that Pope Francis had endorsed Donald Trump – and which meant that in the last three months of the US campaign, the biggest fake election stories generated more engagement than the biggest, and true, stories produced by the likes of the New York Times and the Washington Post. It’s the same system the Trump campaign itself used to such great effect, with its micro-targeted ads aimed at specific demographic groups, which were then shared and shared again. 



What’s more, the very operating systems of the social media giants – Google, Facebook, YouTube – rest on algorithms that function in ways mysterious even to those who own them. My Guardian colleague Alex Hern explains that the companies know that these strings of code achieve the outcome with which they’re tasked – but they don’t always know exactly how they do it. “They’re a bit of a black box.” There’s an echo here of the financial crisis, when it emerged that the CEOs of the big banks and investment houses were selling complex derivatives that they themselves did not understand. But the stakes here are even higher. 



Tim Berners-Lee, the father of the worldwide web, this week warned: “The system is failing. The way ad revenue works with clickbait is not fulfilling the goal of helping humanity promote truth and democracy.” At the heart of the matter, he said, is “very finely trained” artificial intelligence. These are not the robots of science fiction past, lumbering towards us, their arms stiff and eyes cold. They are unseen, ghosts in the machine. But they are exerting enormous influence on our lives. They understand our foibles, poke at our fears, keep us hooked to our screens – and are now involved in raising our children and picking our leaders in ways we would never have chosen. 


clickbait n. a pejorative term for web content whose main goal is to get users to click on a link to go to a certain webpage, typically by using exaggerated or sensational headlines that have little or nothing to do with the actual content.

foible /ˈfɔɪbl/ n. a silly habit or a strange or weak aspect of a person’s character, that is considered harmless by other people


What can we do? Governments don’t have to be impotent. They could insist on regulating the tech giants the way they regulate the utilities: the information supply is scarcely less important than the water supply and right now they’re fouling it. In the name of child protection, politicians could demand that YouTube deal with the nightmare videos it’s playing automatically to the youngest and most vulnerable. Or that Instagram identify digitally altered pictures so that teenagers anxious at the sight of apparently perfect-looking peers realise those images are not real. 


impotent /ˈɪmpətənt/ adj. having no power to change things or to influence a situation

foul /faʊl/ verb [VN] to make sth dirty, usually with waste material


As individuals, we can assert ourselves. Don’t let Twitter or Facebook “curate” your news feed; go to settings and select “show most recent”, rather than what the algorithm regards as the “top” items. 


assert oneself: to speak and act in a forceful way so that people take notice of you; to insist on one’s own views or rights


Above all, we can use our power as consumers to exert pressure on the tech behemoths. If only for the sake of their corporate self-image, they don’t want to be the tobacco companies of the 21st century. Users can demand that, say, peddlers of fake news are not presented as the equals of proven media organisations. We can demand they take the steps that might well make their services a tad less addictive – but which will make them safer and healthier. 


a tad adv. slightly; a little


We’re used to raging at politicians or bureaucrats, in the way that our forebears railed against princes and kings. But today the masters of so much of our universe are invisible strings of ones and zeroes and the corporations that own them. They’re shaping our lives much more than Brussels ever did. We need to take back control. 


rage /reɪdʒ/ verb ~ (at/against sb/sth) to show or express violent anger about sth or towards sb

rail /reɪl/ verb ~ (at/against sth/sb) (formal) to complain about sth/sb in a very angry way



Original article: https://www.theguardian.com/commentisfree/2017/nov/17/peppa-pig-donald-trump-internet-social-media-algorithms
