And when pulling it out from the mess of cables
Or when you’re trying to feed that fucker back through the passthrough on a desk.
I do tech support in a school filled with old computers all connected with VGA. One day I’ll hang myself with one of those.
hey would you mind using an impact to screw one in… just to mess with someone
Left side: Unscrews from the standoff. Right side: Unscrews the standoff from the IO plate.
Every fucking time. Why do I keep buying monitors with a VGA port 🤣😭
But seriously, why would you buy a VGA monitor in 2024?
(Edit: typo)
Fighting games and the absolutely lowest possible video latency with the tech available. VGA puts you literally a frame or two ahead of the opponent. For players at their peak, this is a pretty big advantage.
Don’t think the best VGA monitors will beat current OLEDs in response times.
Using a gpu that still has vga will be a problem as well.
Apologies, I should have clarified. VGA plus CRT. I only ever used VGA with LED panels for a very short time, so I tend to conflate all the old tech as a package deal.
They’re just so cheap at goodwill :(
I got 4 1680x1050 monitors for 5€ each like 5 years ago and they’re still working perfectly fine
Cheap multi monitor setup gang here we go
I wonder if I can get an adapter to mount an old massive CRT to a monitor arm, and if any monitor arm has pistons that can support the thing in the first place.
Undermount and a passive hydraulic arm will do the job, I think. As a fellow CRT enjoyer I approve of your idea. I have a 32-inch CRT TV stored, waiting for a PS3 and Xbox 360 to be bought.
Get some HDMI to VGA adapters, the kind that screw into the VGA port and then have an HDMI port. I have a bunch of old VGA monitors I use with Raspberry Pis and as test displays when working on PCs and never have to deal with the annoyances of VGA since they’re basically HDMI displays now.
CRTs are highly desirable for retro gaming.
Every time
Every. Fucking. Time.
Video card manufacturers, why u no threadlocker?
I used to use Loctite on my stand-offs.
Now there’s an idea.
You have to tighten the loose one to loosen the tight one. My fingers hurt just looking at it
you a director yet? that’s gandalf level wisdom
Amen to this…or just say fuck it and break out the screwdriver
You mean that thing I set down right there but has somehow transitioned into a different dimension?
You can only find a screwdriver by first releasing your intent to use it.
You also got ADHD?
Absolutely!!
Usually the VGA connector has slots for a screwdriver.
Best part is when this sucker unscrews from the port and comes off with the cable:
Ugh this stresses me out just thinking about it
The actual retro problem was when those tighty boys would start unscrewing the port instead of themselves
Pretty sure the little slit was so that you could use a flathead screwdriver. Had to do that a couple times
… aaaaaand there goes the infinitesimally tiny nut inside the case
clatter click … oh no…
I learned not to do that with the system plugged in. Only lost one expansion slot somehow.
Then one side of the driver notch shears off
those slots were near useless.
edit to say: one trick was to use the blank expansion slot plates to gently break the vise-like grip the screw had in the hex stand-off. the metal used on the cheap “digit remover” cases was sometimes soft enough to loosen the thumb screws via the driver slot without the thumb screw breaking.
still nearly useless though.
We referred to those blank expansion slot metal pieces as “keys.” They were useful lockpicks.
TIL!!
I mean, I could’ve been doing it wrong lol
This happens because the connector is at an angle. Since it’s at an angle, the screw presses against the side and jams itself in place. All you have to do is tilt the connector the other direction and the tight screw loosens right up. Easy peasy.
This would have been really good for me to know about 20 years ago.
Holy Diver!
Retro problem? I used a DVI connector on my monitor until December last year.
Yep, you are retro
DVI-D is basically HDMI with a large connector, so nothing wrong with it
I still do. If it works, it works. Until my video card self-immolates, I’ll keep using it. Damn these modern infants and their cable endings! shakes fist at sky
>.>
¯\_(ツ)_/¯ it’s just thicc HDMI
Retro problem? All of our monitors at work use VGA… not to mention pretty much all servers
Am I the only one that never tightened them?
I tighten them and it saved my monitor! Robbers broke in to our house, stole a bunch of stuff. The computer monitor was still there, connected to the computer, dangling from the table.
How do I know they tried to steal it? Because they tried to cut through the cable with PAPER SCISSORS, because they didn’t know how to unscrew the cables.
I feel sorry for the dumb robbers. I hope they didn’t pawn it and are still enjoying playing Wii Fitness without the balance board, which they neglected to take with the console.
Oh wow, I didn’t see that coming. Screw-terminal cable connections are now the Manual Transmission of computer parts.
Probably not, there are plenty of people in the world
right? just put it in the vga port like any other cable?
GPIB users and instrumentation-automation folks know the problem is very modern.
Other than niche Keysight gear that has three layers of nameplates because it’s ’90s-vintage NOS, LXI and USB-TMC have replaced GPIB.
You would think that, but where I work we are still manufacturing NEW equipment with GPIB. Industry moves at a glacial pace and plenty of companies will still pay to have GPIB as an option.
also, both stripped somehow?
All I can say is that we are fortunate that the overlap between “VGA ports everywhere” and “battery operated impact drivers” is almost zero on the timeline. Imagine trying to unscrew a VGA plug by hand that was tightened down to ugga-dugga-foot-pounds of torque. Of course that assumes that didn’t shear the screws first.
You know you have given me a wonderful idea, I have a few friends that are in VGA heavy places
“Tightened down to ugga-dugga-foot-pounds of torque” sent me into an absolute gigglefit.
I still have a DVI monitor connected to my main pc, so it’s not that much of a retro problem for me
I like DVI. I prefer it most of the time.
I like the screw in connector because I don’t have to worry about it falling out of the PC or monitor, and it is more robust, less likely to be pulled/bent/broken.
Unfortunately, even monitor vendors don’t seem to agree that DVI was/is good, and I’ve seen a lot of displays shipping without it recently. GPU makers have entirely gone to displayport/HDMI. It’s the end of an era, as far as I’m concerned.
I’ve switched almost entirely to DP, since I can’t get DVI anything anymore. I don’t hate DP. I like it more than the friction fit HDMI which is prone to pulling itself out of the port for no good reason just as your opponent is about to come around the corner and all you can do is stare at yourself in the black mirror that your monitor has become and listen in horror as fartmaster69420 frags you again, bragging about it and telling you that you suck, and how he does unspeakable things to your mother over VC in his prepubescent voice.
Anyways. I miss DVI.
I’ve never had an HDMI cable fall out of anything.
You obviously don’t use HDMI the same way I’ve seen it used by some people.
I do IT support for a living and I’ve had a non-zero number of tickets where I literally have to go over and plug in someone’s display because they managed to disconnect it.
I’m trying to figure out what this person is doing that would lead to an HDMI cable, or any cable really, getting pulled out of the port on the monitor or the computer while gaming. The only situations I can think of would be more of a hindrance to playing the game than the monitor blanking out, like the laptop or desktop falling off a desk or something.
I was mostly being facetious for effect/comedy.
But working IT support, I’ve had users complain that their computer doesn’t work, then travel to their location and find the HDMI connection fell out.
I’ve wasted countless hours troubleshooting a plug. It’s a big reason I like the latch on DP and I prefer DVI when possible. No user error with things just getting unplugged.
I use DP for my computer, HDMI for all my TVs, and it works fine. I don’t make it a habit to mess with the cables, for laptops I tend to try to use docks so I’m only plugging in one cable while I’m stationary, and my displays are always connected to the dock.
The example rant I provided had no basis in reality. Just something I came up with because I thought it would be funny. The only point that had any actual real world relevance is the fact that HDMI can become unplugged if not properly seated, or if it’s pulled at all, or if the friction fit is generally loose from wear&tear. That’s all. I’m just trying to be funny beyond that.
Either way, I’m not going to tell you how to live your life; so if you prefer HDMI, that’s fine. You use what you want to use. I’m not about to tell you that your choices are invalid because I don’t prefer it. Your decision doesn’t affect me, so you can do as you wish. I won’t try to change your mind.
Have a good day.
Oh dude, you totally misinterpreted my intent. I’m a DisplayPort only household. I’ve got DP cables going from PCs and docks into KVMs, and from KVMs to way too many monitors, and all of them are DisplayPort with active adapters when necessary. I refuse to buy any cable that isn’t DisplayPort at this point. I guess except for my TV but that shit is in the wall and if rats start tugging on that shit or the TV falls off the wall we got bigger problems.
I was just genuinely confused about the apparent frequency of these cable mishaps, like monitor video cables are as frequently ripped out as n64/psx/ps2 controllers lol.
All good. Working in IT support has its set of challenges. I’m not sure what users are doing with their equipment, but they keep getting in dumb situations where a complaint of “my computer doesn’t work” has about a 50% chance of the problem being an HDMI cable that’s either damaged or unplugged. Every once in a while it’s a powered off PC, and the user just thinks that the power button on the display “turns off [their] PC”. Those are fun.
For people who make a living working in some computer program, some people are so willfully ignorant of how a computer functions… Usually they simply state that they’re “not very techy” and think that’s an acceptable excuse for why they haven’t learned the basics of operating a PC in the past two decades.
My point is, I have no idea how it keeps happening, all I can say is that since DP became the default standard for workstations, those calls have all but completely stopped happening. Calls like that on VGA/DVI were rare, usually because the install tech was too lazy to actually screw in the connector, then it was the HDMI hellscape, now it’s displayport bliss. Hard to be a lazy installer when you only need to push in the connector to have it properly latched into the system.
It still happens, usually when someone breaks the DP connector, but like I said, that’s pretty rare.
Oh, in case you thought I worked with complete idiots, most of the people I support are professional white collar workers. Office drones in lawyers offices, accounting offices… Even dental practices. These are people with certificates and diplomas representing 4+ years of education per person, and yeah, they still can’t figure out that the button on the screen doesn’t power off the computer.
I do have this problem with the monitor I hook up to my laptop for gaming occasionally. It’s looser because it gets plugged and unplugged more commonly, and it can occasionally slip out if I move my laptop to my lap so I can lean back when my back starts to ache.
But this is not a common situation I think
VGA has outlived DVI… I can buy a new monitor with VGA and get a new VGA cable at almost any store… for DVI it’s hard to find anything but a DVI-to-VGA adapter.
Sad but true.
Would you hate me if I said I think the correct screw in port won… mutters in hating DVI for no good reason
You’re entitled to that opinion. I don’t hate you for it. I would be lying to say I understood.
DVI could operate in three modes: DVI-A, which was basically just VGA adapted to the DVI connector; DVI-D, which is the primary digital mode; and dual link, which doubled the bandwidth of the DVI digital mode, allowing higher resolutions and higher refresh rates.
By comparison, HDMI can only do a single digital link.
DVI is great IMO.
I am not going to lie, while I appreciate the 3 modes, that is the part I think I ended up hating. Not that it could do that, but that so many times you would get either a cable or a port that would only accept -A, or more often -D, which made it incredibly hard.
I can also appreciate that on paper DVI is amazing and should still be around (also, DisplayPort should be more popular than HDMI… HDMI should be the port in the grave), but that does not mean I do not have this irrational hatred for DVI that makes no sense at all…
I’m not judging. I just wanted to detail a couple of my favorite things about it.
I’m not foolish enough to think I’m going to change your mind about it. Your criticisms are valid, and you are free to like or dislike anything you wish.
Have a good day.
I’ve recently plugged and unplugged a lot of monitors, and the way DP keeps itself attached is with those little claws, and you have to push a button to release it. But when there are 4 monitors plugged into the same GPU, you can’t access those buttons. The struggle was real.
In comparison the DVI connector just needed a screwdriver
Someone gets me.
DP is great until you actually want to unplug it.
I’d rather the little latch than nothing at all.
‘just’ lol. Never mind that you have to reach that fucker.
At least they had screws? I don’t trust HDMI or, even worse, USB-C. Still using VGA monitors with adapters, never broke a single plug.
I sort of miss the screws too but it’s so much better when a cable accidentally gets yanked and it just comes right out instead of transmitting the force into whatever it’s attached to.
Tell that to the USB ports on my laptop.
Good news: USB-C has two formats with screws, either 1 on each side like VGA or 1 on top. Though I’ve never seen them in real life.
Why are you using VGA when DVI-D exists? Or Displayport for that matter.
Because VGA used to be a standard and all monitors I had lying around are VGA only
Kudos for not just trashing them.
Why should I? Full HD and working well, no reason to do so. New displays are 100€+, which is freaking expensive for that improvement.
Because there’s plenty of used monitors to be had out there that have DVI on them in some capacity for very reasonable prices.
For instance I just purchased 4 x 24inch Samsung monitors for $15 USD each.
All those new video standards are pointless. VGA supports 1080p at 60Hz just fine, anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, keeping prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
VGA is analog. You ever look at an analog-connected display next to an identical one that’s connected with HDMI/DP/DVI? Also, a majority of modern systems are running at around 2–4× 1080p, and that’s hardly unnecessary for someone who spends 8+ hours in front of one or more monitors.
I look at my laptop’s internal display side-by-side with an external VGA monitor at my desk nearly every day. Not exactly a one-to-one comparison, but I wouldn’t say one is noticeably worse than the other. I also used to be under the impression that lack of error correction degrades the image quality, but in reality it just doesn’t seem to be perceptible, at least over short cables with no strong sources of interference.
I think you are speaking about some very different use cases than most people. Really, what “normal people” use cases are there for a resolution higher than 1080p? It’s perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you’re sitting too close to your monitor and hurting your eyes. Any higher than 1080p and, at best, you don’t notice any real difference; at worst, you have to use hacks like UI scaling or non-native resolution to get UI elements to display at a reasonable size.
Sharper text for reading more comfortably, and viewing photos at nearly full resolution. You don’t have to discern individual pixels to benefit from either of these. And stuff you wouldn’t think of, like small thumbnails and icons can actually show some detail.
You had 30Hz when I read your comment. Which is why I said what I said. Still, there’s a lot of benefit for having a higher refresh rate. As far as user comfort goes.
Okay, fair point, sorry for ninja-editing that.
I think a 1440p monitor is a good compromise between additional desktop real estate on an equivalently sized monitor and dealing with the UI being so small you have to scale back the vast majority of that usable space.
People are getting fucking outrageous with their monitor sizes now. There’s monitors that are 38”, 42”+, and some people are using monstrous 55” TVs as monitors on their fucking desks. While I personally think putting something that big on your desk is asinine, the pixel density of even a 27” 1080p monitor is pushing the boundary of acceptable, regardless of how close to the monitor you are.
Also just want to point out that the whole “sitting too close to the screen will hurt your eyes” thing is bullshit. For people with significant far-sightedness it can cause discomfort in the moment, mostly due to difficulty focusing and the resulting blurriness. For people with “normal” vision or people with near-sightedness it won’t cause any discomfort. In any case, no long-term or permanent damage will occur. Source from an edu here
It’s unneeded perfectionism that you get used to. And it’s expensive and makes big tech rich. Know where to stop.
I have a 2560x1080 monitor, and while I want to upgrade to a 1440p since the monitor’s control joystick nub recently broke off, I can’t really justify it. I have a 4080s and just run all my games with DLDSR, so they render in-engine at 1440p or 4K, then I let Nvidia AI magic downsample and output the 1080p image to my monitor. Shit looks crispy, no aliasing to speak of so I can turn off the often abysmal in-game AA, and I have no real complaints. A higher-resolution monitor would look marginally better I’m sure, but it’s not worth the cost of a new one to me yet. When I can get a good 21:9 HDR OLED without forced OLED care cycles, or another screen technology that has as good blacks and spot brightness, I’ll make the jump.
From what people have told me, 144Hz is definitely noticeable in games. I can see it feeling better in an online FPS, but I recently had a friend tell me that Cyberpunk with maxed-out settings and ray tracing enabled was “unplayable” on a 4080s, and “barely playable” on a 4090, just because the frame rate wasn’t solidly 144 fps. I’m more inclined to agree with your take on this and chalk his opinion up to trying to justify his monitor purchase to himself.
All that said, afaik you can’t do VRR over VGA/DVI-D. If you play games on your PC, Freesync or G-Sync compatibility is absolutely necessary in my own opinion.
do you live ON train tracks? how often is shit just falling out around you? usually a pretty cozy fit on most things imo 🤔
do you like the display port push tab? I feel like many of those are a PITA for real
Hate it. Though there is one that’s worse.
The mini-DP retention clip. There seems to be either wide and narrow variations or simply on-/off-spec variants.
Those clips just jam right in the back plate of the video card.
My display port cable has a clip that you have to press to remove.
I’m still waiting for the other shoe to drop on USB-C/Thunderbolt. Don’t get me wrong - I think it’s a massive improvement for standardization and peripheral capability everywhere. But I have a hard-used Thinkpad that’s on and off the charging cable all day, constantly getting tugged in every possible direction. I’m afraid the physical port itself is going to give up long before the rest of the machine does. I’m probably going to need Louis Rossmann level skills to re-solder it when the time comes.
Edit: I’m also wondering if the sudden fragility of peripheral connections (e.g. headphones, classic iPod, USB mini/micro) and the emergence of the RoHS standard (lead-free solder) is not a coincidence.
On my Thinkpad the ports were both soldered to the mobo, unlike some random other USB daughterboard. Really annoying; on my T430 the port is a separate piece and can be easily replaced with a cable.
But no, USB-C is pretty tough for me, when done right. But it’s still too small for no reason in laptops.