If AI and deep fakes can listen to video or audio of a person and then successfully reproduce that person's voice and likeness, what does this entail for trials?

It used to be that audio or video recordings provided strong evidence, often weighing more than witness testimony, but soon enough near-perfect forgeries could enter the courtroom, just as they are already entering social media (where you're not sworn to tell the truth, though the consequences are still real).

I know fake information is a problem everywhere, but I started wondering what will happen when it creeps into testimony.

How will we defend ourselves while still using real video or audio as evidence? Or are we just doomed?

  • conciselyverbose@sh.itjust.works · 2 months ago

    Again, you have to completely ignore that the core premise is evil, intended to give big players even stronger monopoly control. It's anti-free in every sense, and as an added bonus, it would almost certainly make possession of specific hardware sufficient grounds for execution in some countries, because everything it had ever captured would be traceable to it.

    But if you do that, there is already a system that does exactly what you’re asking. You don’t need to invent anything. It’s certificate authorities.

    I’m not actually trying to be an asshole, though I’m sure I’m coming off as one. But the only thing blockchain actually does is validate transactions. It’s a shared ledger.
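    To make the certificate-authority point concrete, here is a rough sketch of the idea: a recording device signs footage at capture time, and anyone can later check that the bytes haven't been altered. This is only an illustration, not anyone's actual proposal; a real system would use asymmetric X.509 certificate signatures issued by a CA, while this sketch uses stdlib HMAC with a hypothetical `DEVICE_KEY` as a stand-in.

```python
import hashlib
import hmac

# Hypothetical per-device secret; a real camera would instead hold a
# private key whose certificate chains back to a trusted CA.
DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"

def sign_recording(data: bytes) -> str:
    """Camera-side: produce a tag binding the device key to this exact footage."""
    return hmac.new(DEVICE_KEY, data, hashlib.sha256).hexdigest()

def verify_recording(data: bytes, tag: str) -> bool:
    """Verifier-side: any edit to the bytes invalidates the tag."""
    expected = hmac.new(DEVICE_KEY, data, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

footage = b"raw video bytes..."
tag = sign_recording(footage)
print(verify_recording(footage, tag))            # genuine footage verifies
print(verify_recording(footage + b"x", tag))     # tampered footage fails
```

    The point is that the hard part isn't the cryptography, which already exists, but deciding who gets to hold and certify the keys.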

    • webghost0101@sopuli.xyz · 2 months ago

      Sure, I'll have a look at decentralized certificate authority options.

      It's very possible to adapt my idea to whatever technology provides those functions, honestly.

      The only actual connection my idea has with blockchain is that reading about it when it was new directly inspired a possible way to combat fake news.