"Dear people of Belgium. This is great. As you know, I had the balls to pull the United States out of the Paris Agreement. You should do the same," says Trump in a video that appeared on social media in May of last year.
The video shows Donald Trump, but the person in it is not the U.S. president. The clip was produced by the Belgian political party Socialistische Partij Anders (sp.a), writes The Economist.
- An inattentive viewer can let himself be fooled, but in the Trump clip you can see that the lip movements and the audio do not match, says Torgeir Waterhouse, director of internet and new media at ICT Norway, to Dagbladet.
If you have not noticed at first glance that the video is fake, the Trump impersonator nevertheless admits it at the end of the clip:
"We all know that climate change is imaginary. Just like this video."

Easy to falsify video footage
That images can easily be manipulated is well known to most people. Less well known is that videos are now becoming just as vulnerable to manipulation.
So-called "deepfake" videos can be created from images retrieved, for example, from social media.
- It is about making fake films using available images, computing power and technology that can manipulate and forge video. The result is video that we believe is real, says Waterhouse.

Fear of political attacks in Norway
While the Trump video is not very realistic, a staged warning featuring Barack Obama shows just how realistic video manipulation can become.
In the video, which was created by Buzzfeed last year, the former U.S. president appears to call Donald Trump a "bastard". In reality, it was the film director Jordan Peele who mimicked the former president.
Waterhouse does not rule out that this type of video manipulation could be used for politically targeted attacks in Norway.
- I fear that individuals or groups may attack parties and democracy by using it against certain politicians or parties. I do not think the serious parties will do it themselves, but that does not rule out that someone might, he says.
According to Waterhouse, it is not difficult to weaken a political opponent if you really want to.
- You can imagine a power struggle within a party. It is not at all difficult to defame a political adversary by cutting audio from a private conversation, or video filmed in secret. Just look at the recent debate about the abortion law here in Norway. Who said what, and when, was incredibly important. With film and sound clips taken out of context, it could have been easy to manipulate the debate, he says.

The U.S. election
Tor Henning Ueland, a senior adviser, believes Norway is in no way spared from the phenomenon.
- Everything that can be used as a weapon will be used as one, including so-called deepfake videos. Even though I am not aware of such videos having been used yet, it is not a question of if, but when they will be, he says.
Ueland points out that fake news, conspiracy theories and influence campaigns are thriving like never before. He does not rule out that we may see this in the run-up to and during the American election campaign in 2020.
- It can come from any of the candidates' supporters, or from external parties who want to influence the outcome in one direction or the other, he says.

Risk of extortion
According to the Washington Post, these videos also represent a new way to harass women.
Images of women's faces are retrieved from social media and superimposed onto the bodies of porn actors.
Waterhouse is not aware of cases where deepfake videos have been used to malign people in Norway, but:
- It is important to look at what can happen over the next 10-15 years. This represents a significant risk of abuse, for example by editing people into porn films in order to harass them or extort money. It is a bit like image editing. Where image manipulation used to require a lot of computing power, one can now do this with a smartphone, says Waterhouse.