Okay, so all they’re saying is that they won’t certify any monitor under 144Hz for FreeSync. Okay? That basically changes nothing. If you are using a 60Hz monitor, basic vsync is all you need.
Basic vsync worsens response time, often by a lot. I’d take screen tearing over vsync.
True. For FPS games I often do that too. For top-down games it’s often nicer with vsync, even triple buffered.
I’d take response time over screen tearing.
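To put a number on the vsync latency complaint above: with double-buffered vsync, a finished frame has to wait for the next vertical blank before it can be shown. A rough back-of-envelope sketch (the function name and timings are mine, assuming the swap always waits for the next vblank):

```python
import math

def vsync_present_delay_ms(render_ms: float, refresh_hz: float) -> float:
    """Extra wait (ms) between a frame finishing and the next vblank,
    assuming the previous frame was presented at t = 0."""
    interval = 1000.0 / refresh_hz              # one refresh period in ms
    vblanks_passed = math.ceil(render_ms / interval)
    return vblanks_passed * interval - render_ms

# A frame finished in 17 ms just misses the 16.67 ms vblank at 60 Hz,
# so it waits almost a full extra refresh before being displayed.
print(round(vsync_present_delay_ms(17.0, 60.0), 2))   # ~16.33 ms wasted
print(round(vsync_present_delay_ms(17.0, 144.0), 2))  # ~3.83 ms at 144 Hz
```

This is why barely missing a vblank at 60Hz hurts so much, and why the same miss at 144Hz is far less noticeable.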
Their FreeSync is probably mostly ineffective at 60Hz but better at higher refresh rates, and the marketing department just made a good spin out of that.
I have a monitor that locks to 33-92Hz when FreeSync is enabled (144Hz otherwise). It’s way more useful at lower fps values than higher ones.
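Tangent on why a 33-92Hz window helps even below 33fps: AMD drivers use Low Framerate Compensation, repeating each frame until the effective refresh lands back inside the VRR window. A simplified illustration (not driver code; the 33-92 range matches the monitor described above, and this naive version assumes the window is wide enough that repeating never overshoots the max):

```python
def lfc_refresh_hz(fps: float, vrr_min: float = 33.0, vrr_max: float = 92.0) -> float:
    """Pick an effective refresh rate inside the VRR window by repeating frames."""
    if fps > vrr_max:
        return vrr_max        # above the window: cap at max refresh
    hz = fps
    while hz < vrr_min:
        hz += fps             # present each frame one extra time
    return hz

print(lfc_refresh_hz(50))     # in range: panel just runs at 50 Hz
print(lfc_refresh_hz(20))     # below range: frames doubled -> 40 Hz
print(lfc_refresh_hz(10))     # frames quadrupled -> 40 Hz
```

So a game chugging at 20fps still gets tear-free, judder-free pacing at 40Hz, which is exactly the “more useful at lower fps” effect.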
Interesting. Yeah my comment was just a shitpost. But what you are saying makes it seem completely nonsensical what AMD is doing.
But then maybe they just want to push faster video cards, and FreeSync is something that people care about. Then they can still sell better cards to people who don’t know that FreeSync is less useful at higher refresh rates.
4K 60Hz until I die. Or until I can afford 4K 144Hz. Which’ll probably be around the same date, give or take.
I’m very much a 1440p 144hz guy.
I’d like 4k, but would take this compromise for now.
I want my next screen to be a 4K 144Hz OLED.
1440p high refresh rate gamers unite. I have an Alienware AW3423DWF and boy are those new OLED panels beautiful. Expensive but beautiful. I still remember playing Left 4 Dead right after I got it and even without HDR I was baffled by the credits at the end of the match. Just white text floating in nothingness.
They also recently released the AW3225QF, which is 4K@240Hz.
I just bought a 4k 144hz screen, and let me say, it is worth the price.
I have a 2.5K ultrawide 144Hz. Even when this PC was new, it struggled on games of that era :(. We need better graphics cards that don’t cost the price of a mortgage.
Yeah, I invested in a 4070; I wish I could get a 4090 for that price.
FreeSync is for variable refresh rates, which 60Hz monitors generally don’t support anyway. So this headline is nothing but clickbait.
Also, I don’t know of any sub-120Hz VRR monitors that are still being made, but if they exist, they’re not aimed at anyone who cares about FreeSync branding.
So this whole article is a pointless waste of time.
I recently got a 75Hz 1080p monitor with FreeSync branding from Costco, so yeah, they are still made.
Provided the 60Hz monitors still work, who cares… if they do some arbitrary bullshit to stop stuff from working just for profit, then get fucked. I personally don’t care about their certification or claims.
FreeSync is open source, so it wouldn’t be profit motivated.
Open source is the best. That doesn’t mean the recommendation to move off 60Hz isn’t profit motivated, especially when driving displays above 60Hz means selling more graphics cards, since your older one may not go far beyond 60fps.
That’s a good point, but FreeSync isn’t mandatory to use, I thought? Couldn’t you still use vsync at 120Hz and lower?
I’d prefer screen tearing over the delay vsync causes.
In a roundabout way? Maybe. But no.
The first commercially available variable refresh monitor came out about a decade ago and needed expensive bespoke hardware to drive it. Now? We’re reaching commodity-level costs. And yet we still have piles and piles of bottom-tier and crap-tier products being shoved onto the market.
Sooner or later, the machines and production lines for making those monitors will need an overhaul, and at that point it would 100% make sense to just go variable refresh.
The reality is, the beneficiary is you. If you get a GPU upgrade, you get more frames. If you don’t, variable refresh can still provide a smoother, better game experience. This is especially true as frame generation and upscaling techniques have gotten extremely good in the last few years.
you don’t need to upgrade the GPU to benefit
I want to spell that out clearly: AMD doesn’t need you to buy a new GPU to benefit. Neither does NVIDIA. But it also means that if you buy a variable refresh monitor today, you get to really take advantage when you do upgrade your GPU.
Where my perspective comes from
I did the monitor upgrade before a GPU upgrade a few years ago. Variable refresh is king. HDR, when the content supports it, is amazing, provided the monitor has decent HDR support (low-end monitors… don’t).
Given that I had my previous multi-monitor setup for over a decade and went through 3 system builds with it: your monitor is going to hang around and have more impact on your overall experience than you realize. Same with the keyboard and mouse. Unironically, the part you can most likely get away with cheaping out on in your first build is… the GPU. A decent CPU will last a good 5-6 years at least these days.

So get a decent monitor and good peripherals; those will carry over when you upgrade the GPU. Then start the upgrade cycle: CPU, then GPU, then GPU, then back to the CPU. Once you have a base system, storage carries over, the PSU can cycle over a build, and the case can be reused.
So I guess what I am saying is: Spend the money on the things liable to hang around the longest. It will lead to a better overall experience.
People can just patch it
What does “certification” mean? Does it mean it won’t work?
It’s basically ATI AMD (lol) saying “this product meets or exceeds the required hardware standards to be granted this label”.
ATI is a name I haven’t seen in a while 🙂
It means that they are allowed to put another sticker on the monitor.
To be “FreeSync certified”, a monitor has to have certain minimum specs and must pass some tests regarding its ability to handle Variable Refresh Rate (VRR). In exchange for meeting the minimum spec and passing the tests, the monitor manufacturer gets to put the FreeSync logo on the box and include FreeSync support in its marketing. If a consumer buys an AMD graphics card and a FreeSync certified monitor then FreeSync (AMD’s implementation of VRR) should work out of the box. The monitor might also be certified by Nvidia as GSync compatible, in which case another customer with an Nvidia graphics card should have the same experience with Gsync.
What does this mean for standard TVs that people use for gaming? LG/Sony/Samsung OLEDs tend to be able to do 4K@120, having native 120Hz panels. Maybe this only covers “monitors” getting FreeSync certified.
Those handle VRR through the HDMI 2.1 hardware spec, which is a little different from the traditional method of VRR.
It’s the main reason current-gen consoles have VRR (through the HDMI 2.1 spec).
Rtings says that the LG TVs (the B2, at least) support VRR via several standards: HDMI 2.1, FreeSync, and G-Sync. I have a console hooked up, but no GPU good enough in a PC.
It’s FreeSync/G-Sync over the HDMI 2.1 standard. Nvidia does not have G-Sync over the standard (non-2.1) HDMI connection; no non-2.1 HDMI monitor/TV will accept VRR over HDMI from an Nvidia card. Only AMD had FreeSync over HDMI (on very low-end budget monitors).
G-Sync Compatible is basically G-Sync over the DisplayPort standard. G-Sync Ultimate runs on the dedicated FPGA module, which uses DisplayPort as a medium.
I would love to learn more about this. Know of any technical papers or references?
Idk about technical documents per se, but here’s a news article from when AMD introduced VRR over HDMI way back, noting how VRR over HDMI wasn’t a thing yet, so AMD partnered with monitor makers to use a different scaler that would make it compatible with FreeSync.
VRR over DisplayPort is in the DisplayPort 1.2a specification sheet. VRR over HDMI (officially) is under the HDMI 2.1b sheet.
To this day, I will never understand why people want their media to look absolutely perfect in super hyper 4k definition.
You’re right, maybe we should go back to 12" monochrome CRTs.
/s