A U.K. woman was photographed in front of a mirror whose reflections didn’t match her pose, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.
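
The “mistake” comes from how modern phone cameras assemble one photo from several frames captured milliseconds apart: if the subject moves between captures and different regions of the final image come from different frames, a single photo can show two incompatible poses at once. The sketch below is a minimal, hypothetical illustration in Python/numpy; the half-and-half region split and the “arm” stand-in are invented for the example, and Apple’s actual pipeline is not public.

```python
# Hypothetical illustration only: how compositing regions from frames taken
# at different instants can put two poses into one photo. The names and the
# half-and-half split are invented; the real pipeline is proprietary.
import numpy as np

H, W = 8, 12

def synthetic_frame(arm_column: int) -> np.ndarray:
    """A blank 'photo' with one bright column standing in for the arm."""
    frame = np.zeros((H, W), dtype=np.uint8)
    frame[:, arm_column] = 255
    return frame

# Two captures a few milliseconds apart: the arm has moved between them.
frame_t0 = synthetic_frame(arm_column=3)   # arm down
frame_t1 = synthetic_frame(arm_column=8)   # arm raised

# Naive composite: left half (say, the mirror) from the first capture,
# right half (the subject) from the second.
composite = np.hstack([frame_t0[:, : W // 2], frame_t1[:, W // 2 :]])

# The single output image now contains both arm positions at once.
print("arm columns in composite:", np.unique(np.nonzero(composite)[1]))
# -> arm columns in composite: [3 8]
```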

  • Ook the Librarian@lemmy.world · 24 points · 1 year ago

    This was important in the Kyle Rittenhouse case. The zoom resolution was interpolated by software. It wasn’t AI per se, but because a jury couldn’t be relied upon to understand a black-box algorithm and its possible artifacts, the zoomed video was disallowed. (A sketch of what that kind of interpolation does is at the end of this thread.)

    (this in no way implies that I agree with the court.)

    • Rob T Firefly@lemmy.world · 3 points · 1 year ago

      The zoom resolution was interpolated by software. It wasn’t AI per se

      Except it was. All the “AI” junk being hyped and peddled all over the place as a completely new and modern innovation is really just the same old interpolation by software, albeit software which is fueled by bigger databases and with more computing power thrown at it.

      It’s all just flashier autocorrect.

      • Ook the Librarian@lemmy.world · 1 point · 1 year ago

        As far as I know, nothing about AI entered into the arguments. No precedents regarding AI could have been set here. Therefore, this case wasn’t about AI per se.

        I did bring it up as relevant because, as you say, AI is just an over-hyped black box. But that’s my opinion, with no case law to cite (IANAL). So to say that a court would or should treat AI and fancy photo editing as the same thing is misleading. I know that wasn’t your point, but it was part of mine.

    • wagoner@infosec.pub · 2 points · 1 year ago

      I watched that whole court exchange live, and it helped the defendant’s case that the judge was computer illiterate.
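
For readers wondering what “the zoom resolution was interpolated by software” means concretely: digital zoom has to invent pixel values the camera never recorded, typically by interpolating between neighbouring pixels. The sketch below is a generic bilinear-interpolation example in Python/numpy, not the software at issue in the trial (which is not identified in this thread); it only shows that most values in a zoomed image are computed rather than captured.

```python
# Generic sketch of pixel interpolation during digital zoom; not the tool
# used in the Rittenhouse trial. New pixel values are computed from
# neighbours, not recorded by the sensor.
import numpy as np

def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Upscale a 2-D grayscale image by bilinear interpolation."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * factor)  # sample positions in the original
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# A 2x2 "captured" patch: only four values were actually measured.
patch = np.array([[  0.0, 100.0],
                  [100.0, 200.0]])
zoomed = bilinear_upscale(patch, factor=4)

# Almost every value in the 8x8 zoomed patch was never measured by the
# sensor; it is a product of the chosen interpolation scheme.
print(zoomed.round(1))
```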