• s_s@lemm.ee · 2 months ago

    They are not bad at this. You are bad at understanding it.

    Don’t get mad when you could instead learn something.

    Yes, it gets complex. It’s a 25-year-old protocol that does almost everything. Of course it will be.

    But the names are not hard if you bother to learn them.

    • NotAnOnionAtAll@feddit.org · 2 months ago

      They are not bad at this. You are bad at understanding it.

      I work with this stuff, and I do understand it. Some of my colleagues actively participate in USB-IF workgroups, although not the ones responsible for naming end-user-facing things. They come to me for advice when those other workgroups have once again changed some names retroactively and we need to make sure that we are still backwards compatible with things that rely on those names, and that we are not confusing our customers more than necessary.

      That is why I am very confident in claiming those naming schemes are bad.

      “don’t even bother learning it” is my advice for normal end users, and I do stand by it.

      But the names are not hard if you bother to learn them.

      Never said it is hard.

      It is more complex than it needs to be.

      It is internally inconsistent.

      Names get changed retroactively with new spec releases.

      None of that is hard to learn, just not worth the effort.

      • InvertedParallax@lemm.ee · 2 months ago

        They’re bad because manufacturers want to pass off their USB 2.0 gear as “USB 3.0 compliant”, which it technically is, and their USB 3.0 gear as “USB 3.2”, because 3.2 Gen 1x1 is also 5 Gbps.
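
        To make that concrete, here is a sketch of the renaming chain (dates and labels per the successive USB-IF spec revisions as I understand them; treat it as illustrative, not official branding):

```python
# One 5 Gbps signaling rate has carried three spec-era labels
# (sketch; dates and names per USB-IF spec revisions, verify against the specs).
five_gbps_labels = {
    "USB 3.0 (2008)": "SuperSpeed, 5 Gbps",
    "USB 3.1 Gen 1 (2013)": "SuperSpeed, 5 Gbps",    # 3.0 renamed
    "USB 3.2 Gen 1x1 (2017)": "SuperSpeed, 5 Gbps",  # renamed again
}

# Every label resolves to the same capability, which is exactly what
# lets a plain 5 Gbps device be advertised as "USB 3.2".
assert len(set(five_gbps_labels.values())) == 1
```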

        Also, the whole Alternate Mode thing is awesome, but cheap hub chips don’t bother trying to support it; the only ones who do are laptop makers, so they can save $0.40 on a separate HDMI port.

        And don’t get me started on all the USB-C chargers that only put out 1.5 A because it’s just a normal 7805 linear regulator on the back end.
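
        On the 1.5 A point: a compliant Type-C source advertises its current capability with a pull-up resistor (Rp) on the CC line; a bare 5 V back end can’t advertise beyond the default. The tiers below are a sketch from the Type-C spec as I recall it (verify before relying on them):

```python
# Type-C source current advertisement via the Rp pull-up on CC
# (Rp-to-5V values; sketch from memory of the Type-C spec, verify).
rp_to_advertised_current = {
    56_000: "Default USB power",  # 500 mA (USB 2.0) / 900 mA (USB 3.x)
    22_000: "1.5 A at 5 V",
    10_000: "3.0 A at 5 V",
}

def advertised_current(rp_ohms: int) -> str:
    """Look up what a source with this Rp is telling the sink."""
    return rp_to_advertised_current.get(rp_ohms, "unknown/invalid Rp")

print(advertised_current(22_000))  # 1.5 A at 5 V
```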

        • s_s@lemm.ee · 2 months ago

          They’re bad because manufacturers want to pass their usb 2.0 gear as “usb 3.0 compliant”, which it technically is, and their usb 3.0 gear as “usb 3.2” because 3.2 Gen 1x1 is also 5gbps.

          The “USB X.X” number is just the version of the standard and doesn’t say anything about the capabilities of a physical device.

          When a new standard comes out it supersedes the old one. Devices are always designed and certified according to the current standard.

          Soooo…What are you talking about?

          • InvertedParallax@lemm.ee · 2 months ago

            I’m talking about using the standard traditionally to denote the performance of the connection.

            You don’t go around talking about your “USB 3.0 device” that runs at 480 Mbps unless you’re trying to be a massive dickhole.

            That’s what I’m talking about.

            • s_s@lemm.ee · 2 months ago

              480mbps

              A device or port that does 480 Mbps transfer speeds is a “Hi-Speed” device/port. That’s the real name, and it always has been.

              It doesn’t matter what version of the USB spec it was certified under. If it was designed between 2000 and 2008, it was certified under USB 2.0 or 2.1.

              If that device was certified between 2008 and 2013, then it was certified under USB 3.0. That absolutely doesn’t make it a “SuperSpeed” device/port, but that’s more than clear when we use the real names.
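
              The “real names” scheme being defended maps out roughly like this (a summary sketch of the USB-IF marketing names up through USB 3.1; check the current spec for later branding):

```python
# USB-IF marketing names for signaling rates (summary sketch; verify
# against the current spec, especially for post-3.2 branding).
speed_names_mbps = {
    1.5: "Low-Speed",       # USB 1.0
    12: "Full-Speed",       # USB 1.0/1.1
    480: "Hi-Speed",        # USB 2.0
    5_000: "SuperSpeed",    # USB 3.0
    10_000: "SuperSpeed+",  # USB 3.1
}

def marketing_name(mbps: float) -> str:
    """Return the marketing name for a signaling rate, if listed."""
    return speed_names_mbps.get(mbps, "unknown")

print(marketing_name(480))  # Hi-Speed
```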

              • InvertedParallax@lemm.ee · 2 months ago

                Nobody uses those names; people use the spec number because that’s what they’ve been taught, and they identify with it more than the incredibly stupid ‘full/high/super/duper/ultramegahyperspeed’ convention, which the idiots at the SIG decided to break again in 3.2.

                Literally everybody on the planet agrees the system is moronic; you’re the only person who dissents. Congratulations on that.

                • s_s@lemm.ee · 2 months ago

                  Nobody uses that…Everybody literally on the planet agrees the system is moronic

                  Then just stay as mad as you want; that’s the whole point of the news cycle anyway! Why bother learning? Congrats, chaos wins!

                  • InvertedParallax@lemm.ee · 2 months ago

                    I’ve integrated the IP on silicon (mostly copy-paste with AXI); it’s not me who has the problem. It’s normal people who don’t live this shit and just want the plug to work at its best, which isn’t what happens at all.

                    The naming is a joke to everyone but keep being proud of it.

      • s_s@lemm.ee · 2 months ago

        They come to me for advice when those other workgroups changed some names retroactively again

        Can you give a specific example of this?

        I’d love to believe all your ethos arguments if you could give me some logos.

    • rhandyrhoads@lemmy.world · 2 months ago

      There is some stuff to be learned, but especially with USB-C I’d say the vast majority of ports are not labeled. There are even some devices charged with USB-C that can’t be charged with a PD charger and need an A-to-C cable. Phones are a great example where you have to look up the specs to know data transfer capabilities.

      Additionally, they renamed the USB 3.0 standard, which had been established for over a decade, to USB 3.1 Gen 1, which is completely unnecessary and just serves to confuse.

      The standard used to be largely understandable, with USB 3.0 generally being blue, or at least a color other than black, and USB 2.0 being black on decently modern devices. With USB-C, indication has just about gone out the window, and what used to be a very simple standard has become nearly impossible to understand without having researched every device and cable you interact with.

      • s_s@lemm.ee · 2 months ago

        There’s even some devices charged with USB C that can’t be charged with a PD charger and need an A to C cable

        Phones with Qualcomm chips briefly had their own proprietary fast-charging standards that were not USB standards. You are unlikely to be using those devices in 2024. But is it USB-IF’s fault that manufacturers tried to create proprietary standards to collect royalties?

        Additionally they renamed the USB 3.0 standard which has been established for over a decade to USB 3.1 Gen 1 which is completely unnecessary and just serves to confuse

        No they didn’t?

        The 5 Gbps transfer rate introduced in 2008 is called “SuperSpeed”, and it always has been.

        “USB X.X” is not a port or a transfer speed. It’s the standard (i.e., a technical whitepaper). The standard is updated as time marches on and new features are added.

        The standard was largely understandable with USB 3.0 generally being blue or at least a color other than black and on decently modern devices USB 2.0 would be black.

        This was never a requirement, but it was nice to know which Type-A ports had 9 pins vs. 4 pins.

        With USB-C indication has just about gone out the window and what used to be a very simple to understand standard has now become nearly impossible to understand without having researched every device and cable you interact with.

        For the most part you just plug it in and it works. If you need something specific like an external GPU connection, you can’t use your phone charging cable, sure. Is that really that big of a deal?

        • NotAnOnionAtAll@feddit.org · 2 months ago

          But is it USB-IF’s fault manufacturers tried […]

          Yes, it absolutely is USB-IF’s fault that they are not even trying to enforce some semblance of consistency and sanity among adopters. They do have the power to say “no soup certification for you” to manufacturers not following the rules, but they don’t use it anywhere near aggressively enough. And that includes not making rules that are strict enough in the first place.