• Bytemeister@lemmy.world · +22/−1 · 8 months ago

      Or when you’re trying to feed that fucker back through the passthrough on a desk.

    • spirinolas@lemmy.world · +10 · 8 months ago

      I do tech support in a school filled with old computers all connected with VGA. One day I’ll hang myself with one of those.

  • Siegfried@lemmy.world · +70 · 8 months ago

    The actual retro problem was when those tighty boys would start unscrewing the port’s standoffs instead of themselves.

  • jeffw@lemmy.world · +63/−1 · 8 months ago

    Pretty sure the little slit was so that you could use a flathead screwdriver. Had to do that a couple of times.

    • qprimed · +13 · edited · 8 months ago

      Those slots were near useless.

      Edit to say: one trick was to use a blank expansion-slot plate to gently break the vice-like grip the screw had in the hex standoff. The metal on the cheap “digit remover” cases was sometimes soft enough that the thumbscrew could be loosened via the driver slot without breaking.

      Still nearly useless though.

  • mozingo@lemmy.world · +42/−3 · 8 months ago

    This happens because the connector is at an angle. Since it’s at an angle, the screw presses against the side and jams itself in place. All you have to do is tilt the connector the other direction and the tight screw loosens right up. Easy peasy.

    • SuperApples@lemmy.world · +21 · 8 months ago

      I tightened them, and it saved my monitor! Robbers broke into our house and stole a bunch of stuff. The computer monitor was still there, connected to the computer, dangling from the table.

      How do I know they tried to steal it? Because they tried to cut through the cable with PAPER SCISSORS, because they didn’t know how to unscrew the cables.

      I feel sorry for the dumb robbers. I hope they didn’t pawn it and are still enjoying playing Wii Fitness without the balance board, which they neglected to take with the console.

    • al177@lemmy.sdf.org · +4 · 8 months ago

      Other than niche Keysight gear that has three layers of nameplates because it’s ’90s-vintage NOS, LXI and USB-TMC have replaced GPIB.

      • Fosheze@lemmy.world · +3 · 8 months ago

        You would think that, but where I work we are still manufacturing NEW equipment with GPIB. Industry moves at a glacial pace, and plenty of companies will still pay to have GPIB as an option.
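
        (An aside for the curious: a minimal PyVISA sketch of what those transports look like in practice. The resource addresses below are made-up placeholders, not real instruments, and a VISA backend such as pyvisa-py is assumed.)

```python
# The same SCPI identify query over GPIB, LXI, and USB-TMC via PyVISA.
# Requires a VISA backend, e.g. "pip install pyvisa pyvisa-py".
# All three addresses are illustrative placeholders.
import pyvisa

rm = pyvisa.ResourceManager()

for address in (
    "GPIB0::14::INSTR",                         # classic GPIB: board 0, address 14
    "TCPIP0::192.168.1.50::INSTR",              # LXI instrument on the LAN
    "USB0::0x0957::0x1796::MY12345678::INSTR",  # USB-TMC: vendor, product, serial
):
    try:
        inst = rm.open_resource(address)
        print(address, "->", inst.query("*IDN?").strip())  # standard SCPI identify
        inst.close()
    except pyvisa.errors.VisaIOError:
        print(address, "-> no instrument found")
```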

  • dejected_warp_core@lemmy.world · +15 · 8 months ago

    All I can say is that we are fortunate that the overlap between “VGA ports everywhere” and “battery-operated impact drivers” is almost zero on the timeline. Imagine trying to unscrew a VGA plug by hand that was tightened down to ugga-dugga foot-pounds of torque. Of course, that assumes the driver didn’t shear the screws first.

    • MystikIncarnate@lemmy.ca · +14 · 8 months ago

      I like DVI. I prefer it most of the time.

      I like the screw in connector because I don’t have to worry about it falling out of the PC or monitor, and it is more robust, less likely to be pulled/bent/broken.

      Unfortunately, even monitor vendors don’t seem to agree that DVI was/is good, and I’ve seen a lot of displays shipping without it recently. GPU makers have gone entirely to DisplayPort/HDMI. It’s the end of an era, as far as I’m concerned.

      I’ve switched almost entirely to DP, since I can’t get DVI anything anymore. I don’t hate DP. I like it more than friction-fit HDMI, which is prone to pulling itself out of the port for no good reason just as your opponent is about to come around the corner, and all you can do is stare at yourself in the black mirror your monitor has become and listen in horror as fartmaster69420 frags you again, bragging about it and telling you that you suck and how he does unspeakable things to your mother over VC in his prepubescent voice.

      Anyways. I miss DVI.

        • MystikIncarnate@lemmy.ca · +3 · 8 months ago

          You obviously don’t use HDMI the same way I’ve seen it used by some people.

          I do IT support for a living and I’ve had a non-zero number of tickets where I literally have to go over and plug in someone’s display because they managed to disconnect it.

        • borari@lemmy.dbzer0.com · +2 · 8 months ago

          I’m trying to figure out what this person is doing that would lead to an HDMI cable, or any cable really, getting pulled out of the port on the monitor or the computer while gaming. The only situations I can think of would be more of a hindrance to playing the game than the monitor blanking out, like the laptop or desktop falling off a desk or something.

          • MystikIncarnate@lemmy.ca · +1 · 8 months ago

            I was mostly being facetious for effect/comedy.

            But working IT support, I’ve had users complain that their computer doesn’t work, then travel to their location and find the HDMI connection fell out.

            I’ve wasted countless hours troubleshooting a plug. It’s a big reason I like the latch on DP and I prefer DVI when possible. No user error with things just getting unplugged.

            I use DP for my computer, HDMI for all my TVs, and it works fine. I don’t make it a habit to mess with the cables, for laptops I tend to try to use docks so I’m only plugging in one cable while I’m stationary, and my displays are always connected to the dock.

            The example rant I provided had no basis in reality; it was just something I came up with because I thought it would be funny. The only point with any real-world relevance is that HDMI can become unplugged if it isn’t properly seated, if it’s pulled at all, or if the friction fit has loosened from wear and tear. That’s all. I’m just trying to be funny beyond that.

            Either way, I’m not going to tell you how to live your life; so if you prefer HDMI, that’s fine. You use what you want to use. I’m not about to tell you that your choices are invalid because I don’t prefer it. Your decision doesn’t affect me, so you can do as you wish. I won’t try to change your mind.

            Have a good day.

            • borari@lemmy.dbzer0.com · +1 · edited · 8 months ago

              Oh dude, you totally misinterpreted my intent. I’m a DisplayPort-only household. I’ve got DP cables going from PCs and docks into KVMs, and from KVMs to way too many monitors, and all of them are DisplayPort, with active adapters when necessary. I refuse to buy any cable that isn’t DisplayPort at this point. I guess except for my TV, but that shit is in the wall, and if rats start tugging on it or the TV falls off the wall, we’ve got bigger problems.

              I was just genuinely confused about the apparent frequency of these cable mishaps, like monitor video cables are as frequently ripped out as n64/psx/ps2 controllers lol.

              • MystikIncarnate@lemmy.ca · +1 · 8 months ago

                All good. Working in IT support has its set of challenges. I’m not sure what users are doing with their equipment, but they keep getting in dumb situations where a complaint of “my computer doesn’t work” has about a 50% chance of the problem being an HDMI cable that’s either damaged or unplugged. Every once in a while it’s a powered off PC, and the user just thinks that the power button on the display “turns off [their] PC”. Those are fun.

                For people who make a living working in some computer program, some are so willfully ignorant of how a computer functions… Usually they simply state that they’re “not very techy” and think that’s an acceptable excuse for not having learned the basics of operating a PC in the past two decades.

                My point is, I have no idea how it keeps happening, all I can say is that since DP became the default standard for workstations, those calls have all but completely stopped happening. Calls like that on VGA/DVI were rare, usually because the install tech was too lazy to actually screw in the connector, then it was the HDMI hellscape, now it’s displayport bliss. Hard to be a lazy installer when you only need to push in the connector to have it properly latched into the system.

                It still happens, usually when someone breaks the DP connector, but like I said, that’s pretty rare.

                Oh, in case you thought I worked with complete idiots: most of the people I support are professional white-collar workers. Office drones in lawyers’ offices, accounting offices… even dental practices. These are people with certificates and diplomas representing 4+ years of education per person, and yeah, they still can’t figure out that the button on the screen doesn’t power off the computer.

          • thereisalamp@reddthat.com · +1 · 8 months ago

            I do have this problem with the monitor I hook up to my laptop for gaming occasionally. It’s looser because it gets plugged and unplugged more often, and it can occasionally slip out if I move my laptop to my lap so I can lean back when my back starts to ache.

            But this is not a common situation, I think.

      • Red Army Dog Cooper · +5 · 8 months ago

        VGA has outlived DVI… I can buy a new monitor with VGA and get a new VGA cable at almost any store… with DVI, it’s hard to find anything but a DVI-to-VGA adapter.

          • Red Army Dog Cooper · +1 · 8 months ago

            Would you hate me if I said I think the correct screw-in port won… mutters in hating DVI for no good reason

            • MystikIncarnate@lemmy.ca · +2 · 8 months ago

              You’re entitled to that opinion. I don’t hate you for it. I’d be lying if I said I understood, though.

              DVI could operate in three modes: DVI-A, which was basically just VGA adapted to the DVI connector; DVI-D, the primary digital mode; and dual link, which doubled the bandwidth of the digital mode, allowing higher resolutions and refresh rates.

              By comparison, HDMI can only do a single digital link.

              DVI is great, IMO.
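
              (A back-of-the-envelope sketch of why dual link mattered. The 165 MHz per-link limit is from the DVI spec; the ~12% blanking overhead is an assumption in the spirit of CVT reduced blanking, so treat the numbers as estimates.)

```python
# Rough single- vs dual-link DVI capacity check.
SINGLE_LINK_MHZ = 165.0              # TMDS pixel-clock limit of one DVI link
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ  # dual link doubles the TMDS pairs

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.12):
    """Estimated pixel clock in MHz, including blanking intervals."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

for w, h, hz in ((1920, 1080, 60), (1920, 1200, 60), (2560, 1600, 60)):
    clk = pixel_clock_mhz(w, h, hz)
    link = ("single link" if clk <= SINGLE_LINK_MHZ
            else "dual link" if clk <= DUAL_LINK_MHZ
            else "beyond DVI")
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz -> {link}")
```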

              • Red Army Dog Cooper · +1 · 8 months ago

                I am not going to lie: while I appreciate the three modes, that is the part I think I ended up hating. Not that it could do that, but that so many times you would get either a cable or a port that would only accept -A, or more often -D, which made it incredibly hard.

                I can also appreciate that on paper DVI is amazing and should still be around (also, DisplayPort should be more popular than HDMI… HDMI should be the port in the grave), but that does not mean I do not have this irrational hatred for DVI that makes no sense at all…

                • MystikIncarnate@lemmy.ca · +1 · 8 months ago

                  I’m not judging. I just wanted to detail a couple of my favorite things about it.

                  I’m not foolish enough to think I’m going to change your mind about it. Your criticisms are valid, and you are free to like or dislike anything you wish.

                  Have a good day.

      • thawed_caveman@lemmy.world · +4 · 8 months ago

        I’ve recently plugged and unplugged a lot of monitors, and DP keeps itself attached with those little claws, so you have to push a button to release it. But when there are four monitors plugged into the same GPU, you can’t access those buttons. The struggle was real.

        In comparison, the DVI connector just needed a screwdriver.

  • Pantherina@feddit.de · +17/−2 · 8 months ago

    At least they had screws? I don’t trust HDMI, or, even worse, USB-C. I’m still using VGA monitors with adapters and have never broken a single plug.

    • ultranaut@lemmy.world · +29 · 8 months ago

      I sort of miss the screws too, but it’s so much better when a cable accidentally gets yanked and it just comes right out instead of transmitting the force into whatever it’s attached to.

    • Votes@lemm.ee · +16 · 8 months ago

      Good news: USB-C has two formats with screws, one with a screw on either side like VGA, and one with a screw on top. Though I’ve never seen them in real life.

    • mihnt@lemy.lol · +12 · edited · 8 months ago

      Why are you using VGA when DVI-D exists? Or DisplayPort, for that matter.

          • Pantherina@feddit.de · +1 · edited · 8 months ago

            Why should I? It’s full HD and working well, so there’s no reason to switch. New displays are 100€+, which is freaking expensive for that improvement.

            • mihnt@lemy.lol · +2 · 8 months ago

              Because there are plenty of used monitors out there with DVI on them in some capacity, for very reasonable prices.

              For instance, I just purchased four 24-inch Samsung monitors for $15 USD each.

      • renzev@lemmy.world · +3/−15 · edited · 8 months ago

        All those new video standards are pointless. VGA supports 1080p at 60Hz just fine; anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, keeping prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
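
        (A quick sanity check on the 1080p60 claim. The 2200 × 1125 total raster is the published CEA-861 timing for that mode; the RAMDAC range in the code comment is a typical late-era spec, quoted from memory.)

```python
# 1080p60 with standard CEA-861 timings: the total raster including
# blanking is 2200 x 1125 at 60 Hz.
h_total, v_total, refresh_hz = 2200, 1125, 60
pixel_clock_hz = h_total * v_total * refresh_hz
print(f"{pixel_clock_hz / 1e6:.1f} MHz")  # 148.5 MHz

# Late-era VGA RAMDACs were typically rated around 350-400 MHz, so
# 1080p60 sits comfortably within range; in practice, analog cable
# quality and length limit sharpness more than the DAC does.
```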

        • Fuck spez@sh.itjust.works · +10 · edited · 8 months ago

          VGA is analog. Have you ever looked at an analog-connected display next to an identical one connected with HDMI/DP/DVI? Also, a majority of modern systems are running at around 2–4× 1080p, and that’s hardly unnecessary for someone who spends 8+ hours in front of one or more monitors.

          • renzev@lemmy.world · +4/−3 · 8 months ago

            I look at my laptop’s internal display side-by-side with an external VGA monitor at my desk nearly every day. Not exactly a one-to-one comparison, but I wouldn’t say one is noticeably worse than the other. I also used to be under the impression that lack of error correction degrades the image quality, but in reality it just doesn’t seem to be perceptible, at least over short cables with no strong sources of interference.

        • mihnt@lemy.lol · +4 · edited · 8 months ago

          I think you are speaking on some very different use cases than most people.

          • renzev@lemmy.world · +5/−7 · edited · 8 months ago

            Really, what “normal people” use cases are there for a resolution higher than 1080p? It’s perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you’re sitting too close to your monitor and hurting your eyes. Go any higher than 1080p and, at best, you don’t notice any real difference; at worst, you have to use hacks like UI scaling or non-native resolution to get UI elements to display at a reasonable size.

            • everett · +11 · edited · 8 months ago

              Sharper text for reading more comfortably, and viewing photos at nearly full resolution. You don’t have to discern individual pixels to benefit from either of these. And stuff you wouldn’t think of, like small thumbnails and icons can actually show some detail.

            • mihnt@lemy.lol · +7 · 8 months ago

              You had 30Hz when I read your comment, which is why I said what I said. Still, there’s a lot of benefit to having a higher refresh rate, as far as user comfort goes.

            • borari@lemmy.dbzer0.com · +1 · 8 months ago

              I think a 1440p monitor is a good compromise between additional desktop real estate on an equivalently sized monitor and dealing with the UI being so small you have to scale back the vast majority of that usable space.

              People are getting fucking outrageous with their monitor sizes now. There are monitors that are 38”, 42”+, and some people are using monstrous 55” TVs as monitors on their fucking desks. While I personally think putting something that big on your desk is asinine, the pixel density of even a 27” 1080p monitor is pushing the boundary of acceptable, regardless of how close to the monitor you are.

              Also, I just want to point out that the whole “sitting too close to the screen will hurt your eyes” thing is bullshit. For people with significant far-sightedness it can cause discomfort in the moment, mostly due to difficulty focusing and the resulting blurriness. For people with “normal” vision or near-sightedness it won’t cause any discomfort. In any case, no long-term or permanent damage will occur. Source from an edu here
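
              (The pixel-density arithmetic behind that point, as a quick sketch; the sizes match the ones mentioned above.)

```python
# Pixels per inch for a few of the monitor sizes mentioned above.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal resolution in pixels over diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

for diag, (w, h) in [(24, (1920, 1080)), (27, (1920, 1080)),
                     (27, (2560, 1440)), (42, (3840, 2160))]:
    print(f"{diag}in {w}x{h}: {ppi(w, h, diag):.0f} PPI")
# -> roughly 92, 82, 109, and 105 PPI: a 27in 1080p panel is the
#    coarsest of the bunch, which is why it looks pixelated up close.
```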

            • Pantherina@feddit.de · +1 · 8 months ago

              It’s unneeded perfectionism that you get used to. And it’s expensive and makes big tech rich. Know where to stop.

        • borari@lemmy.dbzer0.com · +1 · 8 months ago

          I have a 2560×1080 monitor, and while I want to upgrade to 1440p since the monitor’s control joystick nub recently broke off, I can’t really justify it. I have a 4080S and just run all my games with DLDSR, so they render in-engine at 1440p or 4K, then I let Nvidia’s AI magic downsample and output the 1080p image to my monitor. Shit looks crispy, there’s no aliasing to speak of so I can turn off the often-abysmal in-game AA, and I have no real complaints. A higher-resolution monitor would look marginally better, I’m sure, but it’s not worth the cost of a new one to me yet. When I can get a good 21:9 HDR OLED without forced OLED care cycles, or another screen technology with equally good blacks and spot brightness, I’ll make the jump.

          From what people have told me, 144Hz is definitely noticeable in games. I can see it feeling better in an online FPS, but I recently had a friend tell me that Cyberpunk with maxed-out settings and ray tracing enabled was “unplayable” on a 4080S and “barely playable” on a 4090, just because the frame rate wasn’t solidly 144 fps. I’m more inclined to agree with your take on this and chalk his opinion up to trying to justify his monitor purchase to himself.

          All that said, AFAIK you can’t do VRR over VGA/DVI-D. If you play games on your PC, FreeSync or G-Sync compatibility is absolutely necessary, in my own opinion.

    • motor_spirit@lemmy.world · +7/−2 · 8 months ago

      Do you live ON train tracks? How often is shit just falling out around you? It’s usually a pretty cozy fit on most things, IMO 🤔

      Do you like the DisplayPort push tab? I feel like many of those are a PITA, for real.

      • deadbeef79000@lemmy.nz · +3 · 8 months ago

        Hate it. Though there is one that’s worse:

        The Mini DisplayPort retention clip. There seem to be either wide and narrow variations, or simply on- and off-spec variants.

        Those clips just jam right into the back plate of the video card.

    • dejected_warp_core@lemmy.world · +2/−1 · edited · 8 months ago

      I’m still waiting for the other shoe to drop on USB-C/Thunderbolt. Don’t get me wrong - I think it’s a massive improvement for standardization and peripheral capability everywhere. But I have a hard-used ThinkPad that’s on and off the charging cable all day, constantly getting tugged in every possible direction. I’m afraid the physical port itself is going to give up long before the rest of the machine does. I’m probably going to need Louis Rossmann-level skills to re-solder it when the time comes.

      Edit: I’m also wondering if it’s a coincidence that the sudden fragility of peripheral connections (e.g. headphones, classic iPod, USB mini/micro) emerged alongside the RoHS standard (lead-free solder).

      • Pantherina@feddit.de · +2 · 8 months ago

        On my ThinkPad, both ports were soldered to the mobo, rather than sitting on a separate USB daughterboard. Really annoying; on my T430 the port is a separate piece and can be easily replaced along with its cable.

        But no, USB-C is pretty tough for me when done right. It’s still too small for no reason in laptops, though.