NPR: In An Era Of Fake News, Advancing Face-Swap Apps Blur More Lines
Posted: 2019-01-16 | Category: NPR, February 2018
SCOTT SIMON, HOST:
Most people familiar with face-swapping know it as a harmless, fun feature on social media apps. An algorithm captures a person's face and places it on somebody else's head. The result is rarely seamless, and often it's pretty funny.
But face-swapping has recently been used to superimpose the faces of celebrities into pornographic films. This isn't just alarming for actors and actresses who appear to perform in movies they never made. As the technology becomes more advanced and accessible, people with not-so-famous faces are worried about where they might show up online. Is face-swapping a dark sign for online identities?
Samantha Cole is an editor at Motherboard and has been covering this. Thanks very much for being with us.
SAMANTHA COLE: Sure. Thanks for having me.
SIMON: You've seen one of these, right?
COLE: Yes. I've seen probably dozens, if not a hundred, of them by now.
SIMON: Well, you tracked down and interviewed a Reddit user who goes by the name of Deepfakes who, I guess, has created three adult films with celebrity faces, yes?
COLE: He's created probably a lot more than that, to be honest. He was the person who first posted one of these on Reddit, and his name has become the name for this form of video - these fake porn videos.
SIMON: How does it work?
COLE: So basically, it's generated using a machine-learning algorithm. Someone takes a dataset - a lot of pictures of one person's face - and then a video that they want to put that face on. And they run a machine-learning algorithm and train it on those two sets of images. And after a few hours, it gives you the result, which is these very realistic fake porn videos.
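Cole is describing the shared-encoder, two-decoder autoencoder design widely attributed to the early face-swap tools. The sketch below is a minimal, illustrative version of that idea in PyTorch, not the Reddit user's actual code; the 64x64 face crops, layer sizes, and training details are assumptions chosen for brevity.

```python
# Minimal sketch of the shared-encoder / two-decoder face-swap idea.
# All sizes and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # shared latent code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 64, 16, 16)
        return self.net(h)

encoder = Encoder()                          # one encoder shared by both identities
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per face

opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """faces_a / faces_b: batches of 64x64 RGB crops of person A and person B."""
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# After training, the "swap" is just crossing the decoders, e.g. rendering
# person A's face onto frames of person B:
#   fake = decoder_a(encoder(faces_b))
```

Because both faces pass through the same encoder, the latent code learns pose and expression that either decoder can render, which is why the method needs many photos of both faces, as Cole notes.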
SIMON: So hypothetically, could you take somebody's photos or videos off their social media feeds and put them into adult films?
COLE: So yes. Hypothetically, it's definitely possible if you have enough images of someone. It's not something that we've seen happen yet. But as quickly as this technology is moving, it's definitely possible.
SIMON: Is it legal? Or does anyone care?
COLE: (Laughter) I think both sides care quite a bit - the people making them and the people who are the targets of them. The legality is honestly in a very gray area. It's all very hazy right now. We're not really sure what to make of it. Celebrities could sue for misappropriation of their images, like when you use a celebrity's face for an ad without their permission - things like that. But the average person has little recourse, honestly. Revenge porn laws don't include the right kind of language to cover this situation because it's a mashup of two things.
SIMON: Yeah. Revenge porn is when someone takes a...
COLE: Right.
SIMON: ...Intimate film of someone, and they don't have their permission.
COLE: Exactly. Yeah. So this is not quite that. And that's creating a lot of problems legally and a lot of questions of how we're going to handle this.
SIMON: I have to tell you my biggest worry as a citizen is not porn but that somebody might put somebody's face - let's say - at a crime scene or in some other - you know, at a rally that you never attended or something like that.
COLE: That's definitely possible, and that's something that we're thinking about. It's splashy right now because it is porn. And celebrities and porn performers are two groups of people that have lots of images of themselves publicly out there, so they're easy targets for this. But so are politicians, you know, anyone who's on TV or on the Internet, showing their face quite a bit.
SIMON: And what about regular citizens who just have a lot of pictures and videos on social media sites? Could they be victimized, too?
COLE: I mean, it's theoretically definitely possible. You would need hundreds of pictures of someone. It's worth taking a look at your privacy settings and thinking about how you use the Internet and whether or not you're sharing your face in all these private forums.
But then again, that puts a lot of pressure on users to self-regulate, rather than on the platforms. And those are the ones that really need to be accountable for taking care of the people who are using them, regulating how people use them and making sure they're not used for harm.
SIMON: I mean, if the solution is just don't put pictures or videos on social media platforms, that also kind of destroys the utility of social media platforms, doesn't it?
COLE: Sure. And that's definitely not - that's not what I'm saying. I'm not saying don't put pictures of yourself out there. That's an extreme solution to this. The better solution would be more stringent laws around revenge porn and ownership of our own images, and more responsive platforms that act quickly and serve their users better.
Yeah. It's - right now, it's just easier to say think twice about your privacy settings because that's all we can do. That's all we have control of right now.
SIMON: Samantha Cole at Motherboard, thanks so much for being with us.
COLE: Thank you for having me.