Why aren’t motherboards mostly USB-C by now?

I’m beginning to think that the Windows PC that I built in 2015 is ready for retirement (though if Joe Biden can be president at 78, maybe this PC can last until 2029?). In looking at new des…

  • BetaDoggo_@lemmy.world · 1 year ago

    There’s no reason to replace USB A on most desktops since it would break 20+ years of backwards compatibility without any real benefit. Maybe 1 or 2 would be useful.

    • DacoTaco@lemmy.world · 1 year ago

      That’s the thing: with a small passive adaptor that has no logic/silicon, a USB-A device is fully compatible with a USB-C port. And companies like Framework solved this issue ages ago by making hardware expose USB-C, USB-A, or both…

      If anything, I think the USB-C price might be why it’s nowhere to be seen. The EU laws might change that in the next 8 years, but I doubt it, since USB-C-to-USB-A adaptors are a thing.

      • JohnEdwa@sopuli.xyz · 1 year ago

        If manufacturers start making printers, mice, keyboards, headsets and all other peripherals with USB-C cables and providing C-to-A adapters in the box, then motherboard manufacturers should start adding more ports to support them without those adapters.

        But the Apple way of changing all ports to USB-C because “you can just use dongles!” is dumb. Motherboards have plenty of space for both; USB-C is about the smallest connector that exists, right after the 3.5mm audio plug.

        • Captain Aggravated@sh.itjust.works · 1 year ago

          Having lived through the initial rollout of USB, I remember a period when a PC would come with a few USB ports, printers had both parallel and USB ports, mice came with USB-to-PS/2 adapters in the box, etc., so there was a transitional period. Everyone seemed to be on board with the idea that USB was the future. Within a decade, PS/2, RS-232 and parallel ports had disappeared from PCs.

          That same drive to move the fuck on and complete the transition doesn’t seem to be there this time. Mobile device manufacturers have adopted USB-C as completely as they can because of their weird obsession with making devices uselessly thin. Peripheral manufacturers really haven’t: displays are still HDMI or DP, and Logitech outright refuses to make a USB-C Unifying receiver… “dongle life.” And desktop PCs have relatively few USB-C ports, meaning that if you do manage to collect USB-C peripherals for your mobile life, they’re a pain on desktop.

          • CmdrShepard@lemmy.one · 1 year ago

            I think the difference this time is that PS/2-to-USB was a move from a proprietary connector to a universal standard, while USB-A-to-USB-C is a move from one universal standard to another.

    • AA5B@lemmy.world · 1 year ago

      If y’all still have desktops, there’s just no excuse. There’s room to include any port that may be convenient, and having some extra would let you modernize as you need to replace accessories.

      At least with laptops, there may be a space argument for limited ports

      • CmdrShepard@lemmy.one · 1 year ago

        Still, any more than 2 seems like a waste, since PCs also have dedicated video, audio, power, and data ports. USB-C makes sense on laptops and phones because you can lump all those things into one or two ports. This isn’t necessary on a PC and just adds extra cost with little benefit.

        My board from 2018 has a rear USB-C port and a header for front USB-C. I’ve used only one of them, a handful of times in all these years, to transfer large files to/from a phone, and this is coming from someone with a lot of different devices that use USB to interface with the PC.

        • AA5B@lemmy.world · 1 year ago

          You’re assuming you only need USB-C for things where it’s uniquely suited, whereas I’m assuming we want to transition everything to the new standard, so we have one port, one connector, one wire.

          • CmdrShepard@lemmy.one · 1 year ago

            It’s a worthy goal, but I don’t see it happening for decades, as everything still uses USB-A, and again there isn’t a whole lot of downside to the current configuration as long as you have at least one or two USB-C ports for the times when it’s uniquely suited.

  • orclev@lemmy.world · 1 year ago

    So, much as I hate to admit it, the real reason for this is bandwidth. Let’s look at the best-case scenario without dipping our toes into server-grade hardware. AMD CPUs tend to have more I/O bandwidth allocated than Intel, so we’ll take the top-of-the-line desktop AMD CPU as of right now, the Ryzen 9 7950X (technically the X3D version is the actual top of the line, but it makes certain tradeoffs, and for the purposes of this discussion both chips are identical).

    On paper, the 7950X has 24 PCIe 5.0 lanes and 4 onboard USB 3.2 ports on its built-in USB controller. So already we could have a maximum of 4 Type-C ports if we had no Type-A ports; in practice, most manufacturers split the difference and go with 1 or 2 Type-C ports, with the remaining 2 or 3 as Type-A. You can have more USB ports, of course, but then you need to include a USB controller in your motherboard’s chipset, and that in turn needs to be wired into the PCIe bus, which means taking up PCIe lanes, so let’s look at the situation over there.

    We start with 24 PCIe lanes, but immediately we sacrifice 16 of those for the GPU, so really we have 8 PCIe lanes. Further, most systems now use NVMe M.2 drives, and NVMe uses up to 4 PCIe lanes at its highest supported speed. So we’re down to 4 PCIe lanes, and this is without any extra PCIe cards or a second NVMe drive.

    So now you need to plug a USB controller into your PCIe bus. The USB 3.2 spec defines the highest supported bandwidth as 10 Gbps. PCIe 5.0 defines the maximum bandwidth of a single lane as a bit over 31 Gbps. So the good news is that you can drive up to 3 USB 3.2 ports off a single PCIe 5.0 lane. In practice, though, USB controllers are almost always designed with even numbers of ports, typically 2 or 4. In the case of 4, one lane isn’t going to cut it; you’ll need at least 2 PCIe lanes.

    I think you can see at this point why manufacturers aren’t in a huge rush to slap a ton of USB Type-C connectors on their motherboards. In a modern desktop there’s already a ton of devices competing for limited CPU I/O bandwidth. Even without an extra USB controller, it’s entirely feasible to come dangerously close to saturating all the available bandwidth.
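    The lane arithmetic above can be sketched in a few lines (the figures are the ones quoted in this comment, not authoritative spec values):

```python
# Lane budget as described in the comment above; numbers are the ones
# quoted there, not authoritative spec values.
PCIE5_LANE_GBPS = 31        # claimed bandwidth of one PCIe 5.0 lane
USB32_PORT_GBPS = 10        # claimed max per USB 3.2 port

cpu_lanes = 24              # Ryzen 9 7950X PCIe 5.0 lanes, as claimed
gpu_lanes = 16              # a typical x16 GPU slot
nvme_lanes = 4              # one NVMe M.2 drive at full speed

# Lanes left over after the GPU and one NVMe drive:
spare_lanes = cpu_lanes - gpu_lanes - nvme_lanes
print(spare_lanes)          # 4

# Full-speed USB 3.2 ports a single PCIe 5.0 lane could feed:
ports_per_lane = PCIE5_LANE_GBPS // USB32_PORT_GBPS
print(ports_per_lane)       # 3
```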

    • AA5B@lemmy.world · 1 year ago

      They don’t all have to be high speed. For example, we already see a distinction among USB-A ports based on things like power and data speed. I don’t see why anyone would be surprised by a similar arrangement for USB-C. Let me have my low-speed keyboard and mouse ports and my low-power watch-charging port.

      • orclev@lemmy.world · 1 year ago

        While that’s true, it does cause some headaches for end users. There’s a (barely followed) color code for differentiating the speeds of Type-A connectors, but I’m not aware of any such system for Type-C. People generally expect a Type-C connection to run at full USB 3.2 or USB4 speeds (not to mention the absolute state of the USB spec, with the nomenclature changing constantly). If you started putting USB 2.0 ports behind Type-C connectors, you’d quickly find people complaining, I’m sure.

        Really, in the long term I’m sure that in another CPU generation or two we’ll have enough bandwidth to spare that manufacturers can start putting extra USB 3.2 or USB4 controllers on motherboards, at which point they’ll be able to replace most of the Type-A ports with Type-C without losing speed. In practice, though, I expect we’ll see history repeating itself, with “low-speed” Type-C ports alongside high-speed Type-C ports that support whatever the latest and greatest USB spec is, and no doubt some kind of distinguishing mark to differentiate them. We already see something like that with Lightning, although that’s a little too proprietary to really cut it; we’ll need something that’s part of the USB spec itself.

        • anyhow2503@lemmy.world · 1 year ago

          Almost none of the alternate modes or advanced features are required for USB-C devices. Most smartphones don’t support high data rates over their single USB-C port. There are probably more USB-C ports running at USB 2.0 speeds than anything else, for example on peripherals like mice and keyboards. Beyond stuff like DisplayPort alternate mode, there still isn’t big demand for more than one or two USB-C ports with high data rates or the full feature set.

      • abhibeckert@lemmy.world · 1 year ago

        The latest USB standard has a minimum speed of 20 Gbit/s. Of course, they could support only USB 2, but there would be complaints.

    • NaibofTabr@infosec.pub · 1 year ago

      I think power delivery is a concern too. If a motherboard had 4 USB-C ports on it, you know someone would try to plug in 4 USB-C monitors at 100W (20V/5A) each, i.e. 400W going across your I/O bus. At that point, even if your motherboard doesn’t just burn out and you have a big enough power supply to provide it, you’re still going to have a serious heat problem.
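      A back-of-envelope sketch of that worst case, using the figures from this comment:

```python
# Four USB-C PD ports each maxed out at the 100 W (20 V / 5 A) profile.
volts, amps, ports = 20, 5, 4

watts_per_port = volts * amps
total_watts = ports * watts_per_port

print(watts_per_port)   # 100
print(total_watts)      # 400
```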

      • Pantsofmagic@lemmy.world · 1 year ago

        Yeah, I recently started using a motherboard that has a 6-pin, GPU-style power header for the USB-C ports. Power delivery capacity is limited if you don’t plug the connector in, but if you do, it supports 100W ports.

        • NaibofTabr@infosec.pub · 1 year ago

          100W on each port or 100W total output for all 6 ports? I seriously doubt your power supply will deliver 600W on one peripheral cable.

          • Pantsofmagic@lemmy.world · 1 year ago

            No, it limits the total amount, but it’s reasonable that they added a dedicated power input. I’m guessing we’ll see even more of that on ATX12VO motherboards and the like. It seems like power standards are changing a lot, and manufacturers are waiting for things to settle down.

      • lud@lemm.ee · 1 year ago

        Don’t support 100W power delivery on all ports then.

        • NaibofTabr@infosec.pub · 1 year ago

          I think it’s easy to say this, but harder to actually do in practice. There’s a color code system for USB-A, but a lot of manufacturers didn’t follow it reliably, and most users don’t know what the differences are anyway (I’d certainly have to look up what Yellow and Red are specifically for). You’d have the same problem with trying to mark USB-C ports, and without some easily identifiable marking most users will just expect that a USB-C port is a USB-C port.

    • lud@lemm.ee · 1 year ago

      Not all USB-C ports have to be Gen 2x2; just a few 3.0-speed ports would be neat.

    • MHLoppy@fedia.io · 1 year ago

      Isn’t this glossing over the fact that (when allocating 16 PCIe lanes to the GPU, as in your example) most of the remaining I/O connectivity comes from the chipset, not directly from the CPU itself?

      There’ll still be bandwidth limitations, of course, since you can only max out the bandwidth of the chipset link (which in this case is 4x PCIe 4.0 lanes), but this implies that it’s not only okay but normal to implement designs that don’t support the maximum theoretical bandwidth of all available ports at once, so we don’t need to allocate PCIe lanes to USB ports as stringently as your example calculations require.

      Note to other readers (I assume OP already knows): PCIe lane bandwidth doubles/halves when going up/down one generation respectively. So 4x PCIe 4.0 lanes are equivalent in maximum bandwidth to 2x PCIe 5.0 lanes, or 8x PCIe 3.0 lanes.

      edit: clarified what I meant about the 16 “GPU-assigned” lanes.
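      That equivalence can be written as a tiny helper (taking PCIe 3.0 at roughly 8 Gbps per lane as the baseline, a ballpark figure):

```python
# Approximate total bandwidth of a PCIe link: lanes * per-lane rate,
# where the per-lane rate doubles each generation after 3.0 (~8 Gbps).
def link_bandwidth_gbps(lanes: int, gen: int) -> int:
    return lanes * 8 * 2 ** (gen - 3)

# 4x PCIe 4.0 == 2x PCIe 5.0 == 8x PCIe 3.0 in maximum bandwidth:
print(link_bandwidth_gbps(4, 4))  # 64
print(link_bandwidth_gbps(2, 5))  # 64
print(link_bandwidth_gbps(8, 3))  # 64
```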

      • apt_install_coffee · 1 year ago

        Typically no: the top two PCIe x16 slots are normally wired directly to the CPU, though when both are populated they drop down to x8 each.

        Any PCIe x4 or x1 slots run off the chipset, as do some I/O and any third or fourth x16 slot.

        So yes, motherboards typically do implement more I/O connectivity than can be used simultaneously, though they try to avoid disabling USB ports or dropping their speed, since regular customers won’t understand why.

        • MHLoppy@fedia.io · 1 year ago

          Typically no, the top two PCIE x16 slots are normally directly to the CPU, though when both are plugged in they will drop down to both being x8 connectivity.

          Any PCIE x4 or X1 are off the chipset, as well as some IO, and any third or fourth x16 slots.

          I think the relevant part of my original comment might’ve been misunderstood – I’ll edit to clarify that – but I’m already aware that the 16 “GPU-assigned” lanes come directly from the CPU (including when split 2x8, if the board is designed that way); the GPU-assigned lanes aren’t what I’m getting at here.

          So yes, motherboards typically do implement more IO connectivity than can be used simultaneously, though they will try to avoid disabling USB ports or dropping their speed since regular customers will not understand why.

          This doesn’t really address what I was getting at, though. The OP’s point was basically “the reason there isn’t more USB is that there’s not enough bandwidth; here are the numbers”. The missing bandwidth they mention is real, but the reality is that we already design boards with more ports than bandwidth, which is why it doesn’t seem like a great answer despite being a helpful addition to the discussion.

    • MonkderZweite@feddit.ch · 1 year ago

      Nah, they usually advertise one USB-C port with full speed, and that’s the only one that gets it, even if the board has 2, 3 or even 4 of them.

      Btw, the DeskMini is the only full-spec PC I know of that doesn’t use additional chipsets for I/O. There may be a few more boards like it, dunno, but additional I/O chipsets are incredibly common.

  • SkyNTP@lemmy.ca · 1 year ago

    Am I throwing away all my mice, keyboards, DAC, digital pens, and other peripherals just so I can have a connector with more bandwidth than I’ll ever need? Nah.

    Am I buying them, or adapters, all over again just so I can be compatible with a new universal standard that I don’t need? Double nah.

    KVM switches, or breakout hubs that these devices plug into with a single USB-C cable running to the computer, are the most logical avenue for a migration. But that will take a long time, and most people don’t even have that kind of luxury.

    • AA5B@lemmy.world · 1 year ago

      On the other side of that, I’m already stuck throwing away all my Lightning cables and chargers, and ideally want to change only once. Why is it so hard to jump right to C for everything?

      • Revv@lemmy.blahaj.zone · 1 year ago

        What on earth would possess folks to replace their often-expensive existing peripherals for no benefit? To totally get rid of USB-A, a person will either be out a bunch of money or be stuck keeping track of adapters for all the devices they can currently just plug in. An industry move to do so would create a huge amount of e-waste and would net everyone else precisely nothing.

        USB-C is great for mobile devices, as it’s small, relatively robust, easier to connect, and does pretty much everything, from power delivery to video to connecting any device imaginable. Desktops (and even laptops, really) don’t need to place such a premium on port size. Laptops and other mobile devices standardizing on USB-C for power is great. We can charge all our devices from the same charger. Fantastic!

        Making 20+ years of working equipment harder to use and forcing billions of people to buy stuff they don’t need (and that many can’t afford) would be wild.

        Expect to continue seeing USB-A for a long, long time. No need to replace anything with a USB-C version until it breaks (and maybe not even then).

        • cole@lemdro.id · 1 year ago

          Sure, but some people are already in a position where they only have USB-C (me). I have adapters for the USB-A ports and it sucks. Just let me choose to get rid of the old shit that I don’t need, please.

        • AA5B@lemmy.world · 1 year ago

          If you had multiple USB-C ports, you’d have the option of switching whenever you needed to replace something. No one ever said you had to throw everything out and start over.

    • patatahooligan@lemmy.world · 1 year ago

      But we’re not even at the point of debating whether users should replace all of their devices. If motherboards with a single USB-C port are this common, we’re actually at a place where we expect users to buy all their new peripherals in USB-A as well.

  • kadu@lemmy.world · 1 year ago

    I agree most motherboards should come with at least 2 or 4 USB-C ports.

    That being said, people upgrade all their peripherals significantly less often than the PC itself, and 90% of my current setup relies on USB Type-A, so if a motherboard (especially mATX) has to decide which ports fit into limited space, I’d prioritize USB-A for sure.

    • jonne@infosec.pub · 1 year ago

      Yeah, I’ve got mostly USB-A peripherals, and the USB-C ones are using a USB-A-to-C cable anyway. What I’d actually want is graphics cards with the same Thunderbolt-style ports that laptops have, so the USB traffic can be pushed through the same cable to your monitor.

  • echo64@lemmy.world · 1 year ago

    USB-C connectors are expensive and more difficult to drive. USB-A connectors are cheap and easy to drive.

    • Lantern@lemmy.world · 1 year ago

      Not to mention the sheer number of accessories that use USB-A. My keyboard, mouse, and flash drives all use USB-A.

      In my cable collection, odds are that if a cable has USB-C on one end, then either USB-A or C is on the other end. That means every other connector still requires USB-A or a dongle.

      USB-A’s longevity (~20 years) basically ensures that it won’t be replaced until USB-C is much cheaper to use.

  • CmdrShepard@lemmy.one · 1 year ago

    How the heck is USB-A a legacy port, and what would I do with 11 USB-C ports on a PC when everything I plug into it besides my phone (depending on the cable) has a USB-A connector? How would I even use something as simple as a flash drive or a Bluetooth/Wi-Fi/radio transmitter?

    USB-C makes a ton more sense for mobile devices, docks, and charging, but not so much when you’re plugging things into a suitcase-sized brick that doesn’t move. I could see useful applications for something powered that needs a lot of bandwidth, but PCs also come with dedicated ports for all those peripherals.

    • desconectado@lemm.ee · 1 year ago

      A cheap USB hub solves everything you’re describing. You can just leave it dangling behind the PC.

      That said, I’m not in favour of a USB-C-only motherboard either.

      • dustyData@lemmy.world · 1 year ago

        Cheap USB hubs get fried if you use them with something that needs real power, or they feed the whole voltage to one of the connectors; have fun finding out which one. Fully competent dock hubs are not cheap.

    • barsoap@lemm.ee · 1 year ago

      I just recently bought a cheap kitchen scale that has a USB-C connector, used for charging the LIR2450 inside.

  • cmnybo@discuss.tchncs.de · 1 year ago

    Because there is no reason to have more than 1 or 2 since almost everything uses a type A connector.

  • NaibofTabr@infosec.pub · 1 year ago

    Most desktop peripherals are still USB-A. For low-power, low-data things like keyboards and mice, what would be the point of USB-C? It would increase the cost of the product but provide no real benefit to the user.

    Also, if you had a new desktop motherboard with, say, 6 USB-C ports, would you expect all of them to be capable of delivering 20V at 5A so they can drive USB-C monitors etc.? That’s a lot of power to run across your motherboard, even if you have a power supply that can handle it. You’d need a separate cooler just for the USB-C bus controller, and you’d pray nothing ever goes wrong with power delivery, because it would probably fry the whole board.

  • Kalash@feddit.ch · 1 year ago

    Wouldn’t make much sense. You still want USB-A ports for most peripherals; using a USB-C port to connect a single mouse would pretty much waste the port.

    However, adding a Thunderbolt 4 port or two alongside the usual USB-A ports would be nice.

    • TwanHE@lemmy.world · 1 year ago

      Of course you’d still want some Type-A ports, but I have 6 Type-A ports and a single Type-C on my rear I/O, and I would definitely give up 1 or 2 A ports for 2 more Type-C.

      • AA5B@lemmy.world · 1 year ago

        Why not half and half? Then I can afford both to keep my current accessories and to buy USB-C when I need to replace them.

        • TwanHE@lemmy.world · 1 year ago

          I’d only need 2 extra, since I have dual A and dual C on my case’s front I/O as well.

          And since most devices I plug in are still Type-A, I can’t sacrifice too many.

  • Asifall@lemmy.world · 1 year ago

    I don’t understand why I would want a bunch of USB-C ports. On a phone, where there obviously isn’t space for a full-sized port, sure, but I find that fiddling with the one USB-C port on the back of my desktop is a pain in the ass, and the port really struggles to keep a good connection when attached to a stiff or heavy cable.

    • Captain Aggravated@sh.itjust.works · 1 year ago

      Back in the late ’90s, why did we want USB ports when serial, parallel and PS/2 worked so well? There were decades’ worth of hardware compatible with the old standards.

    • MalditoBarbudo@programming.dev · 1 year ago

      Setups with two extra monitors. My slim laptop has 3 USB-A ports, 1 USB-C and 1 HDMI. With the USB-C and the HDMI I run two monitors in my office setup, and it’s super nice. I understand that USB-A is preferable for common devices, but for video, USB-C saves a lot of physical space compared to VGA or HDMI on a small laptop.

    • LwL@lemmy.world · 1 year ago

      Yeah, not once have I had any need for a USB-C port on my PC. Not having to deal with orientation is nice, but I’ve also found that USB-C is worse at keeping a connection, and I have so many cables with USB-A on one end anyway.

  • squirrelwithnut@lemmy.world · 1 year ago

    Standard USB Type-A ports are cheaper and, more importantly, STURDIER than USB-C ports. That matters for peripherals that stay plugged in and don’t need to be disconnected and reconnected often.

    USB-C is great for convenience for certain things, but it’s a weaker port in terms of physical connection strength.

  • wolre@lemmy.world · 1 year ago

    I’d honestly love to see everything USB-C-ified. Would be great to finally just have one standard to concern yourself with.

    • WaxedWookie@lemmy.world · 1 year ago

      Nobody tell them about the massively fragmented set of standards using the USB-C connection.

      • wolre@lemmy.world · 1 year ago

        I know, but at least we’d only have one physical connector at that point. While there are indeed a lot of standards for USB C, many of them are not all that relevant in day-to-day use when you’re mostly just looking to connect some basic USB peripherals like a mouse, a thumb drive or charge your phone.

        • gornius@lemmy.world · 1 year ago

          I disagree.

          More technical people would understand, but your average Joe would try to plug in an external monitor and RMA the PC because it’s “not working”; same with slow phone-charging speeds, etc.

          I’m honestly all for keeping USB-A for basic I/O devices. Although inventing a USB-A female connector that works either way up and is backwards compatible would be neat.

          • Scribbd@feddit.nl · 1 year ago

            Don’t think this didn’t happen with USB-A. People who wanted to copy something to or from an external drive RMA’d their PCs because the transfer was too slow. They plugged the drive into a black USB 2.0 port instead of a red one, because red looked dangerous. Oh wait, no, that motherboard manufacturer used green ports for USB 3.2. What do you mean you didn’t try them because you didn’t know what they were for? Your hard drive cable has blue plugs; didn’t you at least try the blue ports? No? Because there was a lightning bolt printed nearby… I understand you don’t want to lose the data. Do you have a backup? … You should. Ok, well, you can test it with the mouse or keyboard. Yes, the top two USB ports do have icons for those, but that doesn’t mean… Oh, you already put the PC in the mail. See you in 2 weeks then.

            Also, reversible USB-A is already a thing, but it’s very flimsy because it needs moving parts.

            • gornius@lemmy.world · 1 year ago

              The difference between generations of USB-A is speed. If a user notices a difference in speed, they’re far more likely to learn the difference between USB versions.

              The differences between USB-C and USB-A are capabilities. USB-C is already confusing for many people. My boss (an IT project manager) thought he could use USB-C to connect his monitor, but he couldn’t, because his laptop doesn’t support DisplayPort over USB-C.

              There’s already a huge mess of USB-C capabilities. Some ports are just glorified USB-A, some carry DisplayPort over USB-C, some are Thunderbolt (with different versions, of course), some do QC (again, with different versions).

              I can just imagine the confusion from users who expect all of the USB-C ports on a motherboard to work the same way, when only one or two of the 8 total have DisplayPort capability.

              “If it doesn’t fit, it’s not supposed to go here” is a great way to tell the user what capabilities a port has.

              • Scribbd@feddit.nl · 1 year ago

                Yeah, that’s true. But I was more or less portraying that customers gonna custom-er, and PCs will be RMA’d for stupid reasons no matter what. USB-A also had customers confused; sure, C is worse, but don’t make it out like A was so magnificent. SuperSpeed, QC, trying to plug the male printer end into the Ethernet port, different grades of cables for different speeds, expecting a bump in speed from a “golden cable” while the PC and peripheral were both on USB 2.0: all of these happened with USB-A too.

                I know because I’ve had all those conversations. The man was aware, yes, but not aware enough, and too afraid to lose his precious data. (But not willing to pay for extra drives or remote storage. But that’s a different story.)

  • Dkarma@lemmy.world · 1 year ago

    This is about cost. Standard USB-A ports are far cheaper, and manufacturers probably already have a billion of them on hand. Plus, all the existing board layouts already use standard USB. And you’re not really getting any advantage from USB-C, size-wise or performance-wise.

    Furthermore, you’d now have to make USB-C-to-whatever cables and make customers buy them.

    If you had to choose between two computers, and one made you buy completely new cables for every peripheral, which would you buy?

    • shalafi@lemmy.world · 1 year ago

      billion of them on hand

      I’ve been thinking that for a long time! For example, I can imagine Chinese warehouses jammed with micro-USB connectors. Want to build a low-cost widget? Meh, save some pennies per unit and put a micro port on there.

  • stolid_agnostic · 1 year ago

    Weirder still is that you can’t really find USB-C mice and keyboards, though I really don’t know why.

    • SatyrSack@lemmy.one · 1 year ago

      Because motherboards are mostly USB A, because peripherals are mostly USB A, because motherboards are mostly USB A, bec…

    • dual_sport_dork 🐧🗡️@lemmy.world · 1 year ago

      There are a few, but certainly not many, and they mostly seem to be aimed at plugging into Mac laptops. For the moment, manufacturers can count on every computer made in the last 20 years or so having at least one USB-A port, and on most computers still having zero USB-C ports. The options are to make a peripheral USB-C and pay extra to include an adapter, or just say the hell with it and make it USB-A. Or relegate yourself to selling basically to Mac users only.

      Most manufacturers, naturally, pick the second option.