Why ‘deepfake’ videos are becoming more difficult to detect
2019-06-12 00:00:00


JUDY WOODRUFF: There is growing alarm over the use of altered videos online, especially those known as deepfakes, which are highly realistic-looking and inaccurate.

There are concerns about their growing sophistication and the risks they pose to national security.

It's the focus of a hearing tomorrow in the House Intelligence Committee.

Miles O'Brien has a look at how those videos, once the source of some fun, are being manipulated, and how artificial intelligence scientists are trying to respond.

It's part of our weekly segment on the Leading Edge of science.

HAO LI, University of Southern California: All right, let's see you being me.

(LAUGHTER)
MILES O'BRIEN: Scary as deepfake videos may be, there are times when they can be fun, a place where a 3-D model of my face gets electronically plastered onto computer scientist Hao Li's head, making him the puppet master and me the dummy.

Really a scary-looking individual overall.

What do you think?

I do need to change my hair, don't I, yes?

Li is an associate professor at the University of Southern California, and co-founder of Pinscreen, an app that allows consumers to make instant custom 3-D avatars for virtual reality gaming and shopping.

HAO LI: So now I created your avatar, right?

So, we have your...

MILES O'BRIEN: A nice, trim Miles O'Brien.

But the real-time puppet master trick is how he refines the technology.

And here I am as our president.

Yes, Shinzo Abe, prime minister of Japan.

Leader of China.

Trudeau.

It's not a bad look for me.

Me as Justin Bieber.

What do you think?

I think I'm going to do this on the "NewsHour" all the time now.

This will be good for my career, don't you think?

(LAUGHTER)
MILES O'BRIEN: Li says he never saw it as anything more than entertainment.

HAO LI: Of course, it can be used for something really bad, but the main purpose was never for that.

It was for entertainment, a fun tool that could give us more things to do for fashion, lifestyle, et cetera.

MILES O'BRIEN: Deepfake videos cleverly combine what's real with what is synthesized by a computer to make people appear to say things they never did or never would.

HAO LI: I like vodka.

MILES O'BRIEN: The ever-increasing speed of computers, along with the advancement of the artificial intelligence technique called machine learning, is making these composites harder and harder to detect with the naked eye.

HAO LI: We all assume that there will be a point where there's no way to tell the difference.

I mean, for visual effects, I think you can get pretty close already.

It's just the question of how much effort you put into it.

But in terms of content that can be created by anyone, I think it's getting very close to that point.

MILES O'BRIEN: One technique is the face swap, which put Steve Buscemi's face on Jennifer Lawrence's body, Nicolas Cage onto a series of marquee stars in iconic roles, or Jimmy Kimmel's mug on mine.

I have had to relearn very simple things.

But there is a deep, dark side as well.

Indeed, the technology has been used to paste the faces of celebrities onto the bodies of porn stars.

Computer scientist Hany Farid is a professor at Dartmouth College:

HANY FARID, Dartmouth College: I am worried about the weaponization, and I'm worried about how it's impacting us as a society.

So, we are working as hard as possible to detect these things.

JORDAN PEELE, Filmmaker: Killmonger was right.

MILES O'BRIEN: This video crystallized much of the deep concern, what seems to be President Barack Obama making a speech...

JORDAN PEELE: You see, I would never say these things.

MILES O'BRIEN: ... is actually comedian and filmmaker Jordan Peele doing his excellent Obama impersonation synched with software created with artificial intelligence, or A.I.

HANY FARID: The A.I. system synthesized the mouth of President Obama to be consistent with the audio stream, and it made it look like President Obama was saying things that he never said.

That's called a lip-synch deepfake.

MILES O'BRIEN: Just this week, the technique was used to put some pretty outrageous and comical words into the mouth of Facebook founder Mark Zuckerberg.

MAN: Spectre showed me that whoever controls the data controls the future.

MILES O'BRIEN: It's a potent technology that is ripening at a time of deep polarization and suspicion fueled by social media.

REP. NANCY PELOSI (D-CA): So it's really sad.

And here's the thing.

MILES O'BRIEN: Just last month, something much less sophisticated than a deepfake, a doctored video of House Speaker Nancy Pelosi that made her seem drunk, went viral.

REP. NANCY PELOSI: We want to get this president the opportunity to do something historic.

MILES O'BRIEN: Deepfakes ratchet up the risks.

HANY FARID: The nightmare situation is that there's a video of President Trump saying, "I have launched nuclear weapons against North Korea."

And somebody hacks his Twitter account, and that goes viral, and, in 30 seconds, we have global nuclear meltdown.

Do I think it's likely?

No.

But it's not a zero probability, and that should scare the bejesus out of you, right?

Because the fact that that is not impossible is really worrisome.

MILES O'BRIEN: Farid is most worried about deepfakes rearing their ugly head during the 2020 election.

So he and his team are carefully learning the candidates' patterns of speech and how they correlate with gestures as a way to spot deepfakes.

HANY FARID: We do that, of course, by analyzing hundreds of hours of video of individuals.

We're focused on building models for all of the major party candidates, so that we can upload a video to our system.

We can analyze it by comparing it to previous interviews, and then asking, what is the probability that this is consistent with everything we have seen before?
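[Editor's illustration: the consistency question Farid poses can be sketched as a simple statistical test. Everything below is an illustrative assumption, not his actual system — a hypothetical scalar "mannerism" feature measured per video, modeled with a normal distribution fit to authentic interviews.]

```python
import math
import statistics

# Hypothetical feature: correlation between two mannerism signals
# (say, head tilt and speech rate), one value per authentic interview.
historical = [0.61, 0.58, 0.64, 0.60, 0.63, 0.59, 0.62, 0.57, 0.65, 0.61]

mu = statistics.mean(historical)
sigma = statistics.stdev(historical)

def consistency(feature_value):
    """Two-sided tail probability under a normal model of past behavior."""
    z = abs(feature_value - mu) / sigma
    return math.erfc(z / math.sqrt(2))  # P(|Z| >= z)

# A value near the historical pattern scores high; an outlier scores
# vanishingly low, flagging the video for closer inspection.
print(consistency(0.60))
print(consistency(0.20))
```

A real system would use many correlated features and a richer model, but the question asked is the same: how probable is this video given everything we have seen before?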

MILES O'BRIEN: Computer scientists have pushed this technology using generative adversarial networks, or GANs.

A GAN pits two artificial intelligence algorithms against each other.

One strives to create realistic fake images, while the other grades the effort.

HANY FARID: So, the synthesis engine says, I'm going to create a fake image, I give it to this A.I. system that says, this looks fake to me.

So it goes back and you change it.

And you do that a few billion times in rapid succession, and the computers are teaching each other how to make better fakes.

And that's what has democratized access.
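[Editor's illustration: the loop Farid describes can be shown in miniature. Below is a toy one-dimensional GAN in pure Python — the "synthesis engine" is a linear generator, the grader is a logistic-regression discriminator, and the distributions, learning rate, and step count are all illustrative choices, nothing like a real image model.]

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

# Real data ~ N(4, 0.5); generator g(z) = a*z + b starts far away at mean 0.
a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters (logistic regression)
lr = 0.02

for step in range(5000):
    z = random.gauss(0, 1)
    x_real = random.gauss(4, 0.5)
    x_fake = a * z + b

    # Discriminator update: push D(real) toward 1, D(fake) toward 0.
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator update (non-saturating): push D(fake) toward 1,
    # i.e. "it goes back and you change it."
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * (1 - d_fake) * w * z
    b += lr * (1 - d_fake) * w

# After training, generated samples drift toward the real distribution.
fake_mean = sum(a * random.gauss(0, 1) + b for _ in range(1000)) / 1000
print(fake_mean)
```

The same adversarial pressure, scaled up to deep networks and millions of images, is what makes each generation of fakes harder to spot.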

MILES O'BRIEN: And that's why the Pentagon is interested in deepfakes.

Its research enterprise, the Defense Advanced Research Projects Agency, or DARPA, is exploring ways to defend against the threat of deepfakes.

Computer scientist Matt Turek runs DARPA's media forensics, or MediFor, project.

MATT TUREK, Defense Advanced Research Projects Agency: So, there's an opportunity here for us to essentially lose all trust in images and video.

MILES O'BRIEN: Turek showed me some of the 70 counter-deepfake techniques DARPA is helping nurture.

WOMAN: Necessary for one people to dissolve the political bands which have connected them with another.

MILES O'BRIEN: This software is designed to characterize lip movement and compare it to the audio.

MATT TUREK: And so, when we see these red bars, that means the sounds of the speaker are not actually consistent with the movement of the lips.
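[Editor's illustration: the red-bar idea reduces to comparing two per-frame signals. The numbers and threshold below are hypothetical — a real system would extract mouth openness from video landmarks and predict it from the audio with a learned model.]

```python
# Hypothetical per-frame mouth openness: measured from the video track
# versus predicted from the audio track. Large disagreement -> red bar.
video_open = [0.1, 0.8, 0.7, 0.2, 0.1, 0.9]
audio_open = [0.1, 0.7, 0.8, 0.2, 0.9, 0.1]

THRESHOLD = 0.4  # illustrative; tuned on real data in practice
red_bars = [i for i, (v, a) in enumerate(zip(video_open, audio_open))
            if abs(v - a) > THRESHOLD]
print(red_bars)  # prints [4, 5]: frames where lips and sound disagree
```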

MILES O'BRIEN: Take a look at this video, supposedly two people sitting together.

But software that determines the lighting angle on faces concludes it is a composite.

MATT TUREK: So, it estimates a 3-D model for the face.

Along with that 3-D model, it estimates the reflectance properties of the face, and also the lighting angles.

And so here we're primarily using the lighting angles to see whether those are consistent or not.
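[Editor's illustration: once lighting directions have been estimated from the 3-D face models, the consistency check itself is just geometry. The direction vectors and the 20-degree tolerance below are made-up values for the sketch.]

```python
import math

# Hypothetical estimated lighting directions (unit-ish vectors) for two
# faces in the same frame, recovered from fitted 3-D face models.
face_a = (0.3, 0.9, 0.3)
face_b = (-0.8, 0.5, 0.3)

def angle_deg(u, v):
    """Angle between two 3-D direction vectors, in degrees."""
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

# Two people genuinely in the same room share one illumination; faces lit
# from directions far apart suggest a composite.
print(angle_deg(face_a, face_b) > 20)  # prints True: flagged as composite
```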

MILES O'BRIEN: In this example, video apparently gathered by a security camera shows only one car.

This artificial intelligence algorithm is designed to predict how things should move.

MATT TUREK: What that is triggering off of is discontinuities in the motion.

And so that gives us a signal to look at an image or a video and say, well, perhaps frames were removed here.

MILES O'BRIEN: And it flags the video as altered.

Another vehicle was edited out.
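[Editor's illustration: the motion-discontinuity signal can be sketched with a tracked object's positions. The positions and the 3x-median rule below are invented for the example; a real detector would use learned motion models over dense pixel flow.]

```python
# Hypothetical x-positions of a tracked car in successive frames. The car
# moves ~5 units per frame; a sudden jump suggests frames were removed.
positions = [0, 5, 10, 15, 40, 45, 50]

steps = [b - a for a, b in zip(positions, positions[1:])]
median = sorted(steps)[len(steps) // 2]

# Flag any frame whose displacement deviates wildly from typical motion.
suspect = [i + 1 for i, s in enumerate(steps) if abs(s - median) > 3 * median]
print(suspect)  # prints [4]: motion is discontinuous entering frame 4
```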

MATT TUREK: There's a cat-and-mouse game.

The more aspects that you can use to debunk an image or video, the more burden that you put on the manipulator.

MILES O'BRIEN: But none of these ideas will work without the cooperation of the big social media platforms YouTube and Facebook, which would need to deploy the software and delete the fakes, something Facebook refused to do when the Pelosi video emerged.

HANY FARID: And the platforms have been, for the most part, very cavalier about how they deal with this type of illegal content, harmful content, misinformation, fake news, election tampering, non-consensual pornography, and the list goes on and on, because it gets eyes on the platform, and that's good for business.

MILES O'BRIEN: A fake video amplified in an echo chamber can go an awfully long way before the facts even enter the picture.

For the "PBS NewsHour," I'm Miles O'Brien in Los Angeles.


All News Articles fetched from PBS RSS Feeds and copyrighted by pbs.org