Digitally altered and synthetic media are becoming a growing problem.  Openly available tools, including AI deep-learning models, make it easy to modify pictures and videos for distribution on the Internet.  Most alterations are benign: clearing up acne, improving image lighting, creating a funny meme, or perhaps narrowing a waistline for aesthetic reasons.  More disturbing is the generation of videos of known personalities, making them appear to make caustic statements or take part in inappropriate activities.  These fakes have appeared in political posts, social satire, news media, and pornographic material.  Motivations range from humor and vanity to vindictiveness and the desire to sway public opinion.

The most malicious uses are just around the corner.  Cybercriminals, who innately understand the value of impersonation and counterfeit identities, are eager to use this technology to open entirely new and lucrative branches of scams, phishing, and identity theft.  Every day, the technology to create synthetic digital representations becomes more believable and more accessible, bringing it ever closer to the hands of criminals.

The societal problems are only beginning, as the tools to create fakes are far outpacing the capabilities to detect them.  Several organizations are working toward the goal of confidently identifying digital manipulation in pictures, audio, and video.
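Detection efforts generally fall into two camps: forensic analysis of the content itself, and provenance verification of the file it arrived in.  As a minimal sketch of the latter (the file contents here are hypothetical, and real provenance systems such as cryptographically signed metadata are far more involved), a cryptographic hash published alongside a media file will reveal any subsequent modification, however subtle:

```python
# Minimal sketch of provenance-based tamper detection: if a publisher
# releases a cryptographic hash alongside a media file, any later edit
# to the file changes the hash. The "media" bytes here are hypothetical
# stand-ins for a real video file.
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw media bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"...raw bytes of the original video..."
published_digest = sha256_of(original)

# Even a tiny edit to the media yields a completely different digest.
tampered = original.replace(b"original", b"deepfake")
assert sha256_of(tampered) != published_digest

# An unmodified copy still matches the published digest.
assert sha256_of(original) == published_digest
```

The obvious limitation is that a hash only proves a file is unchanged from what the publisher signed; it says nothing about whether the publisher's original was itself authentic, which is why forensic detectors remain necessary.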

Microsoft recently announced one such tool for analyzing videos, purposely released in advance of the U.S. elections to help media sites and social watchdogs detect misleading political deepfakes.  Microsoft Research is aware that its technology will eventually be undermined, but having some tools to help identify the truth as the election cycle begins is better than nothing.

The war on deepfakes is just starting.  Technology innovation is working on both sides: to create realistic synthetic content, and to detect such creations before they are accepted as truth.  Society will be caught in the crossfire, as we all must consider whether what we see and hear is actually real.



Interested in more? Follow me on LinkedIn, Medium, and Twitter (@Matt_Rosenquist) to hear insights, rants, and what is going on in cybersecurity.

