Way back in 1934, writer Upton Sinclair swept the Democratic primary for California’s gubernatorial election. One of his opponents needed something big to beat him. What came out of that was one of the dirtiest campaigns in history — and the first on-screen attack ad.

Hollywood directors created short “news” segments, featuring residents and so-called experts, that screened before movies. Yeah, all the content was fake.

Almost a century later, deepfakes are fooling voters into believing things that never happened or at least went down differently. The latest example: Ron DeSantis’ attack ad against Donald Trump.

Let’s make sure we’re all on the same page. What are “deepfakes”?

Deepfakes are videos or images created with special computer programs that make it look like someone said or did something they never actually did. The software can put one person’s face on another person’s body or put words in their mouth.

The favorite targets? Famous people, like celebrities and politicians, shown doing or saying things they never did or said.

First, a word between us

This next presidential campaign will have a lot of tech you need to know about. Don’t even think about dropping me a line saying, “Oh Kim, you’re only reporting this ’cause you hate Trump” or “You’re such a Trump lover.”

I’m telling you about this ad because the campaigning for anyone running is about to get worse. Put your politics aside so I can tell you what’s happening with AI and deepfakes. Got it? This is not politics. It’s being tech-aware and intelligent.


Here’s what you need to know

This week, the “DeSantis War Room” account on Twitter shared a video packed with examples of Trump’s support for Anthony Fauci (you know, the former White House chief medical advisor).

Woven in with the authentic clips of Trump and Fauci are six images of the two together. Three of those images are AI-generated fakes. Those moments never happened, but the video sure makes it look like they did.

AFP Fact Check was the first to spot the fakes. Let’s take a look at the tells:

  • The fakes don’t bring up any results in reverse image searches. You can bet any real pics of Trump will appear across many sites.
  • The images are strangely glossy and blurry. (Take a look at their hair and skin.)
  • The poses and the way the two are positioned together look weird and unnatural.
  • The White House briefing room and signs in the background are off, too. Look at the nonsense text on one of the signs.

How can you spot deepfakes?

OK, it’s easy to see those fakes now that we’ve reviewed the details. But what about the next time you run into something strange online?

  • Try the reverse image search trick. From your computer, right-click the pic in your browser, then choose “Search image with Google” (that’s the wording in Chrome; other browsers have similar options). It’s a big red flag if you can’t find it posted by any other reputable source.
  • AI has a notoriously hard time getting hands right. Count the fingers, seriously.
  • Look closely at skin and hair, too. Anything too smooth, shiny or blurry should make you think twice.
  • Do a gut check. Does it feel real? AI images often have an odd quality that’s hard to put your finger on.

This is important stuff, no matter your politics. Don’t be that person sharing deepfake images and news. Share this story to help the people in your life spot fakes, too.


