Left side: Unscrews from the standoff. Right side: Unscrews the standoff from the IO plate.
Every fucking time. Why do I keep buying monitors with a VGA port 🤣😭
But seriously, why would you buy a VGA monitor in 2024?
(Edit: typo)
Fighting games and the absolute lowest possible video latency with the tech available. VGA puts you literally a frame or two ahead of the opponent. For players at their peak, this is a pretty big advantage.
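To put "a frame or two" in rough numbers (assuming a 60 Hz signal, which is typical for these setups):

```python
# Back-of-the-envelope frame-time math: at 60 Hz each frame lasts
# 1000 ms / 60 ≈ 16.7 ms, so a one-to-two frame advantage is ~17-33 ms.
for frames in (1, 2):
    print(f"{frames} frame(s) at 60 Hz = {frames * 1000 / 60:.1f} ms")
```

That's the margin a CRT's near-zero processing delay can buy over a display that buffers and processes frames before showing them.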
Don't think the best VGA monitors will beat current OLEDs in response times.
Using a GPU that still has VGA will be a problem as well.
Apologies, I should have clarified. VGA plus CRT. I only ever used VGA with LED panels for a very short time, so I tend to conflate all the old tech as a package deal.
They’re just so cheap at goodwill :(
I got 4 1680x1050 monitors for 5€ each like 5 years ago and they’re still working perfectly fine
Cheap multi monitor setup gang here we go
I wonder if I can get an adapter to mount an old massive CRT to a monitor arm, and if any monitor arm has pistons that can support the thing in the first place.
An undermount and a passive hydraulic arm will do the job, I think. As a fellow CRT enjoyer, I approve of your idea. I have a 32-inch CRT TV stored, waiting for a PS3 and an Xbox 360 to be bought.
Get some HDMI to VGA adapters, the kind that screw into the VGA port and then have an HDMI port. I have a bunch of old VGA monitors I use with Raspberry Pis and as test displays when working on PCs and never have to deal with the annoyances of VGA since they’re basically HDMI displays now.
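One wrinkle worth flagging if anyone copies this Pi setup: some of those adapters fail EDID detection at boot, so the Pi decides no display is attached. On the legacy (pre-KMS) firmware you can force the output in /boot/config.txt. A sketch, with the mode line as an example value to match to your own panel:

```
# /boot/config.txt (legacy Raspberry Pi firmware) -- force HDMI output
# for an HDMI-to-VGA adapter that isn't detected at boot.
hdmi_force_hotplug=1   # drive HDMI even if no display is detected
hdmi_group=2           # DMT timings (computer monitors)
hdmi_mode=82           # 1920x1080 @ 60 Hz; pick the mode your panel supports
config_hdmi_boost=4    # raise signal strength if the image looks noisy
```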
CRTs are highly desirable for retro gaming.
I used to use Loctite on my standoffs.
Now there’s an idea.
Every. Fucking. Time.
Video card manufacturers, why u no threadlocker?
Every time
And when pulling it out from the mess of cables
Or when you're trying to feed that fucker back through the passthrough on a desk.
I do tech support in a school filled with old computers all connected with VGA. One day I’ll hang myself with one of those.
hey would you mind using an impact to screw one in… just to mess with someone
You have to tighten the loose one to loosen the tight one. My fingers hurt just looking at it
you a director yet? that’s gandalf level wisdom
Amen to this…or just say fuck it and break out the screwdriver
You mean that thing I set down right there but has somehow transitioned into a different dimension?
You can only find a screwdriver by first releasing your intent to use it.
You also got ADHD?
Absolutely!!
Usually the VGA connector's screws have slots for a screwdriver.
Best part is when this sucker unscrews from the port and comes off with the cable.
Ugh this stresses me out just thinking about it
The actual retro problem was when those tighty boys would start unscrewing the port instead of themselves
Pretty sure the little slit was so that you could use a flathead screwdriver. Had to do that a couple of times.
clatter click … oh no…
I learned not to do that with the system plugged in. Only lost one expansion slot somehow.
Then one side of the driver notch shears off
Those slots were near useless.
Edit to say: one trick was to use the blank expansion-slot plates to gently break the vise-like grip the screw had on the hex standoff. The metal used on the cheap "digit remover" cases was sometimes soft enough that you could loosen the thumbscrews via the driver slot without the thumbscrew breaking.
Still nearly useless though.
We referred to those blank expansion slot metal pieces as “keys.” They were useful lockpicks.
TIL!!
I mean, I could’ve been doing it wrong lol
This happens because the connector is at an angle. Since it’s at an angle, the screw presses against the side and jams itself in place. All you have to do is tilt the connector the other direction and the tight screw loosens right up. Easy peasy.
This would have been really good for me to know about 20 years ago.
Holy Diver!
Retro problem? I used a DVI connector on my monitor until December last year.
Yep, you are retro
I still do. If it works, it works. Until my video card self-immolates, I’ll keep using it. Damn these modern infants and their cable endings! shakes fist at sky
>.>
DVI-D is basically HDMI with a large connector, so nothing wrong with it
¯\_(ツ)_/¯ it’s just thicc HDMI
Retro problem? All of our monitors at work use VGA… not to mention pretty much all servers
Am I the only one that never tightened them?
I tightened them and it saved my monitor! Robbers broke into our house and stole a bunch of stuff. The computer monitor was still there, connected to the computer, dangling from the table.
How do I know they tried to steal it? Because they tried to cut through the cable with PAPER SCISSORS, because they didn’t know how to unscrew the cables.
I feel sorry for the dumb robbers. I hope they didn't pawn it and are still enjoying playing Wii Fit without the balance board, which they neglected to take with the console.
Oh wow, I didn't see that coming. Screw-retained cable connectors are now the manual transmission of computer parts.
Probably not, there are plenty of people in the world
Right? Just put it in the VGA port like any other cable?
Also, both stripped somehow?
GPIB users and instrument-automation folks know the problem is very modern.
Other than niche Keysight gear that has three layers of nameplates because it's '90s-vintage NOS, LXI and USB-TMC have replaced GPIB.
You would think that, but where I work we are still manufacturing NEW equipment with GPIB. Industry moves at a glacial pace, and plenty of companies will still pay to have GPIB as an option.
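For anyone who hasn't touched that stack: the nice part is that VISA papers over the transport, so GPIB, LXI, and USB-TMC all look the same in code. A minimal sketch with pyvisa; the addresses and the instrument are made up:

```python
import pyvisa  # pip install pyvisa, plus a backend such as pyvisa-py

rm = pyvisa.ResourceManager()

# Hypothetical GPIB instrument at primary address 12 -- only the
# resource string changes between transports.
inst = rm.open_resource("GPIB0::12::INSTR")
print(inst.query("*IDN?"))  # standard SCPI identification query

# The LXI equivalent of the same instrument would be e.g.:
# inst = rm.open_resource("TCPIP0::192.168.1.50::INSTR")
```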
All I can say is that we are fortunate that the overlap between "VGA ports everywhere" and "battery-operated impact drivers" is almost zero on the timeline. Imagine trying to unscrew a VGA plug by hand that was tightened down to ugga-dugga-foot-pounds of torque. Of course, that assumes the impact didn't shear the screws off first.
You know, you have given me a wonderful idea. I have a few friends who are in VGA-heavy places.
“Tightened down to ugga-dugga-foot-pounds of torque” sent me into an absolute gigglefit.
At least they had screws? I don't trust HDMI or, even worse, USB-C. Still using VGA monitors with adapters, and I've never broken a single plug.
I sort of miss the screws too but it’s so much better when a cable accidentally gets yanked and it just comes right out instead of transmitting the force into whatever it’s attached to.
Tell that to the USB ports on my laptop.
Good news: USB-C has two formats with screws, one on either side like VGA, or one on top. Though I've never seen them in real life.
Why are you using VGA when DVI-D exists? Or DisplayPort, for that matter.
Because VGA used to be a standard and all monitors I had lying around are VGA only
Kudos for not just trashing them.
Why should I? They're Full HD and working well, so there's no reason to. New displays are 100€+, which is freaking expensive for that improvement.
Because there’s plenty of used monitors to be had out there that have DVI on them in some capacity for very reasonable prices.
For instance, I just purchased four 24-inch Samsung monitors for $15 USD each.
All those new video standards are pointless. VGA supports 1080p at 60Hz just fine; anything more than that is unnecessary. Plus, VGA is easier to implement than HDMI or DisplayPort, keeping prices down. Not to mention the connector is more durable (well, maybe DVI is comparable in terms of durability).
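For what it's worth, the 1080p60 claim checks out on paper. Required pixel clock is total pixels per frame (including blanking) times refresh rate, and late-era VGA DACs were specced to roughly 400 MHz:

```python
# Standard CEA-861 timing for 1080p60: 2200 x 1125 total (incl. blanking).
h_total, v_total, refresh_hz = 2200, 1125, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(f"1080p60 needs a ~{pixel_clock_mhz:.1f} MHz pixel clock")  # ~148.5 MHz
# Comfortably under the ~400 MHz RAMDACs of late VGA-era cards; it's
# 1440p and beyond where analog starts running out of headroom.
```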
VGA is analog. You ever look at an analog-connected display next to an identical one that's connected with HDMI/DP/DVI? Also, a majority of modern systems are running at around 2-4x the pixel count of 1080p, and that's hardly unnecessary for someone who spends 8+ hours in front of one or more monitors.
I look at my laptop’s internal display side-by-side with an external VGA monitor at my desk nearly every day. Not exactly a one-to-one comparison, but I wouldn’t say one is noticeably worse than the other. I also used to be under the impression that lack of error correction degrades the image quality, but in reality it just doesn’t seem to be perceptible, at least over short cables with no strong sources of interference.
I think you are speaking on some very different use cases than most people. Really, what "normal people" use cases are there for a resolution higher than 1080p? It's perfectly fine for writing code, editing documents, watching movies, etc. If you are able to discern the pixels, it just means you're sitting too close to your monitor and hurting your eyes. Any higher than 1080p and, at best, you don't notice any real difference; at worst, you have to use hacks like UI scaling or a non-native resolution to get UI elements to display at a reasonable size.
Sharper text for reading more comfortably, and viewing photos at nearly full resolution. You don’t have to discern individual pixels to benefit from either of these. And stuff you wouldn’t think of, like small thumbnails and icons can actually show some detail.
You had 30Hz when I read your comment, which is why I said what I said. Still, there's a lot of benefit to having a higher refresh rate, as far as user comfort goes.
Okay, fair point, sorry for ninja-editing that.
It's unneeded perfectionism that you get used to. And it's expensive and makes big tech rich. Know where to stop.
I think a 1440p monitor is a good compromise between additional desktop real estate on an equivalently sized monitor and dealing with the UI being so small you have to scale back the vast majority of that usable space.
People are getting fucking outrageous with their monitor sizes now. There are monitors that are 38”, 42”+, and some people are using monstrous 55” TVs as monitors on their fucking desks. While I personally think putting something that big on your desk is asinine, the pixel density of even a 27” 1080p monitor is pushing the boundary of acceptable, regardless of how close to the monitor you are.
Also, I just want to point out that the whole "sitting too close to the screen will hurt your eyes" thing is bullshit. For people with significant far-sightedness it can cause discomfort in the moment, mostly due to difficulty focusing and the resulting blurriness. For people with "normal" vision or people with near-sightedness, it won't cause any discomfort. In any case, no long-term or permanent damage will occur. Source from an edu here
I have a 2560x1080 ultrawide monitor, and while I want to upgrade to 1440p since the monitor's control joystick nub recently broke off, I can't really justify it. I have a 4080S and just run all my games with DLDSR, so they render in-engine at 1440p or 4K, then I let Nvidia's AI magic downsample and output the 1080p image to my monitor. Shit looks crispy, there's no aliasing to speak of so I can turn off the often-abysmal in-game AA, and I have no real complaints. A higher-resolution monitor would look marginally better, I'm sure, but it's not worth the cost of a new one to me yet. When I can get a good 21:9 HDR OLED without forced OLED care cycles, or another screen technology with blacks and spot brightness that good, I'll make the jump.
From what people have told me, 144Hz is definitely noticeable in games. I can see it feeling better in an online FPS, but I recently had a friend tell me that Cyberpunk with maxed-out settings and ray tracing enabled was "unplayable" on a 4080S and "barely playable" on a 4090, just because the frame rate wasn't solidly 144 fps. I'm more inclined to agree with your take on this and chalk his opinion up to trying to justify his monitor purchase to himself.
All that said, afaik you can’t do VRR over VGA/DVI-D. If you play games on your PC, Freesync or G-Sync compatibility is absolutely necessary in my own opinion.
Do you live ON train tracks? How often is shit just falling out around you? Usually a pretty cozy fit on most things imo 🤔
Do you like the DisplayPort push tab? I feel like many of those are a PITA for real.
Hate it. Though there is one that’s worse.
The mini-DP retention clip. There seem to be either wide and narrow variations, or simply on-spec and off-spec variants.
Those clips just jam right into the backplate of the video card.
My display port cable has a clip that you have to press to remove.
I’m still waiting for the other shoe to drop on USB-C/Thunderbolt. Don’t get me wrong - I think it’s a massive improvement for standardization and peripheral capability everywhere. But I have a hard-used Thinkpad that’s on and off the charging cable all day, constantly getting tugged in every possible direction. I’m afraid the physical port itself is going to give up long before the rest of the machine does. I’m probably going to need Louis Rossmann level skills to re-solder it when the time comes.
Edit: I'm also wondering if the sudden fragility of peripheral connections (e.g. headphones, classic iPod, USB mini/micro) and the emergence of the RoHS standard (lead-free solder) are not a coincidence.
On my ThinkPad the ports were both soldered to the mobo, rather than on some separate USB daughterboard. Really annoying; on my T430 the port is a separate piece and can be easily replaced, since it connects with a cable.
But no, USB-C is pretty tough for me when done right. But it's still too small, for no reason, in laptops.
I dealt with this yesterday
I’m sorry to tell you that was 20 years ago.
Nah, some of us still deal with these style connectors. Not so much for video, but they're still used for RS-232 (control signals) and other data. They're great when you don't want the connector to ever fall out.
I've seen plenty of medical devices with RS-232 ports. And I'm sure there's a lot of legacy machinery out there that requires them.
If you have a factory and your computer-controlled machinery was installed in 1995 but still works just fine, you’re probably not going to invest in newer equipment until it becomes a problem.
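It's also a stack that's trivial to keep talking to from a modern machine. A minimal sketch with pyserial; the port name, settings, and query string are placeholders, so match them to the device's manual:

```python
import serial  # pip install pyserial

# Typical legacy control link: 9600 baud, 8N1, over a thumbscrew-retained
# DE-9 -- which is exactly why those connectors never fall out.
with serial.Serial("/dev/ttyUSB0", baudrate=9600,
                   bytesize=serial.EIGHTBITS,
                   parity=serial.PARITY_NONE,
                   stopbits=serial.STOPBITS_ONE,
                   timeout=2) as port:
    port.write(b"STATUS?\r\n")  # hypothetical command; protocols vary per device
    print(port.readline())      # read one newline-terminated response
```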
Screw loosie in as tight as you can by hand, give the plug a moderate side-to-side jiggle, then loosen tighty first, then loosie.