The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan “Undress anybody with our free service!”

    • taladar@feddit.de · 1 year ago

      Not even that. It only allows you to verify that the source is identical to the (potentially wrong) information that was claimed at the time of recording by the person adding it to the blockchain. Blockchain, as usual, adds nothing here.

      • devils_advocate · 1 year ago

        It proves that the video could not have been created at a later time.

      • fiah@discuss.tchncs.de · 1 year ago

        Blockchain, as usual, adds nothing here.

        it can add trust. If there’s a trusted central authority where these hashes can be stored, then there’s no need for a blockchain. If there isn’t, however, a blockchain could be used instead, as long as it’s big and established enough that everybody can agree the data stored on it cannot be manipulated
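
        For example, a minimal sketch of the idea (assuming SHA-256 as the fingerprint and a plain dict standing in for the trusted authority’s database or the blockchain; the names here are just illustrative):

            import hashlib
            import time

            def fingerprint(path: str) -> str:
                # Hash the file contents; any change to the file changes this value.
                with open(path, "rb") as f:
                    return hashlib.sha256(f.read()).hexdigest()

            # "ledger" stands in for whatever store people agree to trust,
            # whether a central authority's database or a public blockchain.
            ledger = {}

            def record(path: str) -> None:
                ledger[fingerprint(path)] = time.time()  # when the hash was anchored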

        • nudny ekscentryk@szmer.info · 1 year ago

          but false, nonconsensual nudes are not collectible items that need to have their authenticity proven. they are there to destroy people’s lives. even if 99% of people seeing your nude believe you that it’s not authentic, it still affects you heavily

          • fiah@discuss.tchncs.de · 1 year ago

            nonconsensual nudes are not collectible items that need to have their authenticity proven

            of course not, but that’s not what this comment thread is about. It’s about this:

            Ironically, in a sense we will revert back to the era before photography existed. To verify if something is real, we might have to rely on witness testimony.

            that’s where it can be very useful to store a fingerprint of a file in a trusted database, regardless of where that database gets its trust from
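
            the verification side is just as simple, a rough sketch assuming the same kind of hash store as above (and note it only tells you the file is unchanged since it was recorded, not that what it shows is real):

                import hashlib

                def verify(path: str, ledger: dict) -> bool:
                    # Recompute the file's fingerprint and look it up in the trusted store.
                    # A match proves this exact file existed when the entry was made;
                    # it says nothing about whether the recording itself is genuine.
                    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
                    return digest in ledger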

                • nudny ekscentryk@szmer.info · 1 year ago

                  it very much is:

                  OP: In Spain, dozens of girls are reporting AI-generated nude photos of them being circulated at school: ‘My heart skipped a beat’

                  parent reply: Thats why we need Blockchain Technology

                  • fiah@discuss.tchncs.de · 1 year ago (edited)

                    a discussion can have multiple, separate threads with branching topics; that’s what this threaded comment system is specifically made to facilitate

    • nudny ekscentryk@szmer.info · 1 year ago

      yeah but the problem is the mere existence of tools allowing pornographic forgery, not verifying whether the material is real or not