One Tech Tip: How to spot AI-generated deepfakes

LONDON (AP) — AI fakery is quickly becoming one of the biggest problems confronting us online. Deceptive pictures, videos and audio are proliferating as a result of the rise and misuse of generative artificial intelligence tools.

With AI deepfakes cropping up almost every day, depicting everyone from Taylor Swift to Donald Trump, it’s getting harder to tell what’s real from what’s not. Video and image generators like DALL-E, Midjourney and OpenAI’s Sora make it easy for people without any technical skills to create deepfakes — just type a request and the system spits it out.

These fake images might seem harmless. But they can be used to carry out scams and identity theft, or to spread propaganda and manipulate elections.

Here is how to avoid being duped by deepfakes:

HOW TO SPOT A DEEPFAKE

In the early days of deepfakes, the technology was far from perfect and often left telltale signs of manipulation. Fact-checkers have pointed out images with obvious errors, like hands with six fingers or eyeglasses that have differently shaped lenses.
