
Seriously...

Available episodes

  • 5. The Future Will Be Synthesised
    What do we want the synthetic future to look like? It’s seeping into our everyday lives, but are we ready? We need a conversation about the legal, policy and ethical implications for society. Deepfakes’ murky origins are in a form of sexual image abuse that is being used against hundreds of thousands of people, most of them women. Presenter and synthetic media expert Henry Ajder speaks to journalist Sam Cole, who first reported on deepfakes in 2018. She uncovered a Reddit forum sharing pornographic videos with the faces of famous Hollywood actresses transposed onto the bodies of porn performers. Since then the technology has become much more accessible and ordinary women have become the target. Henry interviews a woman who was targeted with deepfake image abuse, and considers what we can do to protect citizens from synthetic media’s malicious uses. Interviewees: Sam Cole, Vice; Noelle Martin, campaigner; Jesselyn Cook, NBC
    5/20/2022
    14:59
  • 4. The Future Will Be Synthesised
    If anything can be a deepfake, perhaps nothing can be trusted - and politicians can take advantage of the so-called "Liars' dividend" by dismissing real media as fake. In satire, deepfakes have already had a controversial impact, targeting politicians, business leaders, and celebrities. Meanwhile, convincing deepfake audio and video have the potential to create a new wave of fraud where faces, voices and bodies can be stolen. These malicious uses of deepfake technology started out targeting celebrities and people in the public eye, but have become a mainstream challenge for cyber security professionals and ordinary individuals whose images have been used without their consent. Deepfakes can be used to defame or discredit people - but on the flip side, the cry of ‘deepfake’ could undermine trust in the use of video evidence in the justice system. What can we do to protect citizens from synthetic media’s malicious uses? And might there be some positive applications for deepfakes in politics? Interviewees: Sam Gregory, Witness; Nina Schick, author; Victor Riparbelli, Synthesia
    5/20/2022
    15:21
  • 3. The Future Will Be Synthesised
    If anything can be a deepfake, perhaps nothing can be trusted - and politicians can take advantage of the so-called "Liars' dividend" by dismissing real media as fake. In satire, deepfakes have already had a controversial impact, targeting politicians, business leaders, and celebrities. Meanwhile, convincing deepfake audio and video have the potential to create a new wave of fraud where faces, voices and bodies can be stolen. These malicious uses of deepfake technology started out targeting celebrities and people in the public eye, but have become a mainstream challenge for cyber security professionals and ordinary individuals whose images have been used without their consent. Deepfakes can be used to defame or discredit people - but on the flip side, the cry of ‘deepfake’ could undermine trust in the use of video evidence in the justice system. What can we do to protect citizens from synthetic media’s malicious uses? And might there be some positive applications for deepfakes in politics? Interviewees: Sam Gregory, Witness; Nina Schick, author; Victor Riparbelli, Synthesia
    5/20/2022
    15:57
  • 2. The Future Will Be Synthesised
    Ever since the 2018 mid-term elections in the US, people have been sounding the alarm that a deepfake could be used to disrupt or compromise a democratic process. These fears have not yet come to pass, but recently deepfakes of Zelensky and Putin were deployed as the Ukrainian conflict escalated. How much disruption did these deepfakes cause? How convincing were they? And are they an omen of things to come? Could deepfakes enhance disinformation campaigns that already cause significant harm? Presenter and synthetic media expert Henry Ajder unpicks the most recent deepfake video and speaks to a journalist who reported on an unusual news report which used a deepfake news presenter to attempt to spread disinformation in Mali. Interviewees: Kateryna Fedotenko, Ukraine 24; Sam Gregory, Witness; Catherine Bennett, Le Monde/ France 24
    5/20/2022
    14:51
  • 1. The Future Will Be Synthesised
    What do we want the synthetic future to look like? It’s seeping into our everyday lives, but are we ready? We need a conversation about the legal, policy and ethical implications for society. Deepfakes’ murky origins are in a form of sexual image abuse that is being used against hundreds of thousands of people, most of them women. Presenter and synthetic media expert Henry Ajder speaks to journalist Sam Cole, who first reported on deepfakes in 2018. She uncovered a Reddit forum sharing pornographic videos with the faces of famous Hollywood actresses transposed onto the bodies of porn performers. Since then the technology has become much more accessible and ordinary women have become the target. Henry interviews a woman who was targeted with deepfake image abuse, and considers what we can do to protect citizens from synthetic media’s malicious uses. Interviewees: Sam Cole, Vice; Noelle Martin, campaigner; Jesselyn Cook, NBC
    5/20/2022
    15:11

About Seriously...

Station website
