• paraphrand@lemmy.world

      I’ve watched most of DS9 upscaled, and it was an improvement for sure. I never noticed anything strange like these examples.

      But I also don’t think the upscaling/cleanup was this aggressive.

      • ragebutt@lemmy.dbzer0.com

        AI upscaling generally gives a perceived improvement in fidelity at the cost of some finer detail and grain. It’s a trade-off, and it’s almost never an objective improvement.
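
        That loss of detail is measurable, too. Here’s a toy sketch of the idea, assuming Pillow and numpy and a hypothetical frame file, using edge-filter variance as a crude proxy for grain and fine texture (a smooth resample round trip is standing in for the upscaler):

        ```python
        # Crude grain/detail proxy: variance of an edge-filtered image.
        # A smooth upscale/downscale round trip lowers it, i.e. detail is lost.
        import numpy as np
        from PIL import Image, ImageFilter

        def detail_score(img: Image.Image) -> float:
            """Variance of the edge image; drops as grain and fine texture get smoothed."""
            edges = img.convert("L").filter(ImageFilter.FIND_EDGES)
            return float(np.asarray(edges, dtype=np.float32).var())

        frame = Image.open("frame.png")  # hypothetical 480-line source frame
        w, h = frame.size
        upscaled = frame.resize((4 * w, 4 * h), Image.LANCZOS)
        round_trip = upscaled.resize((w, h), Image.LANCZOS)  # back to source size for a fair comparison

        print("original:  ", detail_score(frame))
        print("round trip:", detail_score(round_trip))  # typically lower
        ```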

      • MudMan@fedia.io

        Yeah, I don’t know what they did, how, or from what source material.

        You can get less artifacty AI upscaling in real time on a mid-size PC these days.

    • crossover@lemmy.world

      I went through a phase of testing out Topaz AI upscale tools on videos. Ultimately I didn’t like the results: as impressive as they are, you always end up with some hallucinations ruining details.

      The exception is cartoons. They upscale really well.

  • Matriks404@lemmy.world

    I don’t see the point of watching an upscaled version of an old TV show. It ruins the atmosphere. It’s like playing the NES version of Tetris on an emulator and using HD textures for some reason.

    Also, if you’d want to upscale it anyway, why not provide the source material and let customers use any upscaler they want?

    • Prunebutt@slrpnk.net

      Also, if you’d want to upscale it anyway, why not provide the source material and let customers use any upscaler they want?

      Because upscaling is incredibly resource-hungry. You can’t do it on a 250€ “smart” TV with the computational equivalent of a Raspberry Pi 2.
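
      The back-of-the-envelope numbers bear that out. A quick sketch (resolution, frame rate, and the ops-per-pixel figure are all illustrative assumptions):

      ```python
      # Rough throughput needed to neurally upscale a 480-line source to 4K
      # in real time. All numbers are illustrative, not benchmarks.
      out_w, out_h, fps = 3840, 2160, 30      # 4K output at source frame rate

      out_pixels_per_sec = out_w * out_h * fps
      print(f"output pixels/s: {out_pixels_per_sec:,}")  # ~249 million

      # Even a tiny CNN at ~100 ops per output pixel needs ~25 GFLOPS
      # sustained, and real upscaling networks are orders of magnitude
      # heavier. Desktop-GPU territory, not TV-SoC territory.
      ops_per_pixel = 100
      print(f"approx ops/s: {out_pixels_per_sec * ops_per_pixel:,}")
      ```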

      And then, sunk cost fallacy goes brrrrr.

  • Alexstarfire@lemmy.world

    I see the appeal for things like this: taking pretty low-quality footage, 480i at best, and making it presentable on modern TVs.

    That said, AI isn’t anywhere close to doing this well. Better to have the original than some poorly upscaled crap.

  • PineRune@lemmy.world

    It would probably look better with an emulated CRT filter, including scanlines, giving the appearance of the type of TV these shows were originally made for.
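
    The core of such a filter is surprisingly small. A toy sketch assuming numpy and Pillow; real CRT shaders also model phosphor glow, mask geometry, curvature, and gamma, so this only gives the flavor of the idea:

    ```python
    # Toy CRT look: darken alternate rows (scanlines) and tint pixel columns
    # with a repeating RGB pattern (a crude stand-in for an aperture grille).
    import numpy as np
    from PIL import Image

    def crt_filter(img, line_dim=0.55, mask_dim=0.8):
        px = np.array(img.convert("RGB"), dtype=np.float32)
        px[1::2, :, :] *= line_dim   # scanlines: every other row darkened
        mask = np.full_like(px, mask_dim)
        mask[:, 0::3, 0] = 1.0       # each column of a triad passes one of
        mask[:, 1::3, 1] = 1.0       # R, G, B at full strength, dimming
        mask[:, 2::3, 2] = 1.0       # the other two channels
        return Image.fromarray(np.clip(px * mask, 0, 255).astype(np.uint8))

    # crt_filter(Image.open("frame.png")).save("frame_crt.png")  # hypothetical files
    ```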

    • MudMan@fedia.io

      This is an interesting option that I’m surprised hasn’t made the jump from games to other media.

      Admittedly they are different challenges: games were half-res but progressive scan, while video was interlaced. But hey, in the gaming world we’re at the point of adding high-refresh support to emulate the CRT scan flicker. I can see a world where you create this high-res 120Hz picture to simulate a shadow mask and interlaced output on modern TVs. Probably alongside a raw pixel option and an upscaled one; no reason to do just one other than storage space.
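
      A sketch of the field-alternation half of that idea, in numpy for clarity (how a player would pace the fields at 120Hz is left out; all of this is an assumption about how such a mode could work):

      ```python
      # From one progressive frame, build two "fields" with the opposite
      # lines blanked, mimicking how an interlaced CRT lit odd and even
      # lines on alternating passes. At 120Hz each field could get one
      # refresh, with black frames in between to fake phosphor decay.
      import numpy as np

      def split_fields(frame: np.ndarray):
          even, odd = frame.copy(), frame.copy()
          even[1::2] = 0   # even field: blank the odd lines
          odd[0::2] = 0    # odd field: blank the even lines
          return even, odd

      # e.g. on a hypothetical 480-line test frame:
      # even, odd = split_fields(np.random.randint(0, 256, (480, 640, 3), np.uint8))
      ```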