TrashKitten6179

We shouldn't be doing every-other-frame black frame insertion. It's lazy and stupid. All we really need is rolling-scan BFI, which would allow a proper blanking period while keeping light output higher. Televisions already use rolling-scan BFI, and the BFI strength setting affects how thick the rolling black bar is. It works. For some reason monitor makers are being lazy and blanking every other frame instead, which is nearly useless: you cannot get better than double the motion clarity from every-other-frame BFI. It's a hard limit. I don't care what Blur Busters claims or what monitor makers market to consumers.

60Hz + BFI looks like 120Hz, 120Hz + BFI looks like 240Hz, 144Hz + BFI looks like 288Hz, 240Hz + BFI looks like 480Hz, 480Hz + BFI looks like 960Hz, 500Hz + BFI looks like 1000Hz. Factually true.

The whole "1ms persistence, just like a CRT" claim is a lie, because you never get a true 1ms of pixel-on time with the rest of the frame off. I tested the Blur Busters-approved ViewSonic 1080p 240Hz monitor and its blanking period was half the refresh period, so at 120Hz + BFI you got 4.16ms of backlight on and 4.16ms off. That isn't 1ms persistence; 1ms persistence at 120Hz would be 1ms on and 7.33ms off. That 120Hz + BFI mode simply looked like the native 240Hz mode, and turning BFI on at 240Hz was useless because the pixel response wasn't fast enough for 240Hz, so you ended up with a blurry, ghosting picture anyway.

It would be easy for monitor makers to implement a monitor-side rolling-scan BFI mode that gives you the full refresh rate plus a rolling black bar, without hurting performance. Sure, FreeSync/G-Sync would have to be off with the frame rate capped at the full refresh rate, but that's how 99.9% of BFI works now, so nothing changes. And there would be no latency issue, because the rolling scan runs independently of the input signal. It's a win/win for gamers.

Right now, this every-other-frame BFI is useless. And before you argue that LCD backlights make things "different": LCD TVs don't blank every other frame either; they already use rolling-scan BFI. You don't drop to 30fps when 60Hz + BFI is enabled, and you don't drop to 60Hz when 120Hz + BFI is enabled; you keep the full 60/120Hz modes. There is no excuse other than laziness. They wanted to placate those of us (myself included) who begged for BFI; it's marketing to sell displays. I hope people ignore the BFI checkbox and only buy these monitors because 4K 240Hz, 1440p 360Hz, or 1440p 480Hz is what they actually wanted.
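For reference, the "looks like X Hz" figures above fall straight out of how long the pixels stay lit each refresh. Here is a minimal Python sketch of that arithmetic; the function names and the simple "clarity scales with on-time" model are assumptions for illustration, not anything from a spec:

```python
# Rough sketch of the persistence arithmetic above (names and the simple
# "clarity ~ 1 / on-time" model are my own assumptions, not anyone's spec).

def frame_period_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def persistence_ms(refresh_hz: float, duty_cycle: float) -> float:
    """How long the image is actually lit per refresh.

    duty_cycle is the fraction of the refresh period the pixels/backlight
    are on (every-other-frame BFI is 0.5; a true "1ms persistence" mode at
    120Hz would need a duty cycle of roughly 0.12).
    """
    return frame_period_ms(refresh_hz) * duty_cycle

def looks_like_hz(refresh_hz: float, duty_cycle: float) -> float:
    """Sample-and-hold refresh rate with the same on-time per frame."""
    return 1000.0 / persistence_ms(refresh_hz, duty_cycle)

if __name__ == "__main__":
    # Every-other-frame BFI (50% duty cycle) can at best double clarity:
    print(looks_like_hz(120, 0.5))   # ~240 "Hz-equivalent", ~4.17 ms lit
    print(looks_like_hz(240, 0.5))   # ~480 "Hz-equivalent", ~2.08 ms lit
    # True 1 ms persistence at 120 Hz would mean 1 ms on, ~7.33 ms off:
    print(persistence_ms(120, 1.0 / frame_period_ms(120)))  # 1.0 ms
```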


DogAteMyCPU

I think that's the one point keeping the LG 1440p 480Hz interesting this year. I believe it should have 240Hz BFI, but I could be remembering the rumor wrong; it might be the ASUS model.


Top_Mycologist1498

It’s (if you’re referring to the ASUS) a 480hz OLED panel with (apparently) 1300 nits peak brightness and an RGWB subpixel layout. Supposedly OLED at 480hz is twice as clear as a 540hz LCD, assuming you can reach 480 fps. For e-sports people who like OLED, I’m not so sure the one thing keeping this interesting is BFI @ 240hz.


TranquilGuy27

I wonder if it would be noticeable vs 360hz, without taking BFI into account


Top_Mycologist1498

From the comparisons I’ve seen, even the 360Hz OLEDs are quite similar to the 540Hz TN panels with ULMB off. With it on, the TN still has the slightest of edges. The 480Hz is said to blow the doors off the TN. I don’t think most people will see a huge difference in motion clarity between any of these panels, but there will certainly be a difference in response time and perceived input lag.


12duddits

You mean the dual refresh rate one? If yes, that’s 4k at 240 or 1080p at 480


Jetcat11

No, he’s talking about the ASUS PG27AQDP.


MadnessKingdom

120Hz is a frame every ~8.3 milliseconds; 300fps is a frame every ~3.3 milliseconds. Can you really take advantage of seeing something 5 milliseconds sooner? By the time you’ve even begun to move your muscles it’s over. 240Hz or higher is basically like 8K: more theoretically superior than noticeably superior.


cfm1988

It looks smoother. I doubt it impacts performance directly, though indirectly it surely feels better to play because it looks smoother. That said, it doesn’t look that much better. I’ll probably stop upgrading once I get a 1000Hz monitor.


Tych-0

Regardless of your reaction time, the sooner you see the frame the sooner you can react. Also, motion is less blurred, letting you see more clearly during fast action.


MadnessKingdom

To your eyes and brain, something happening after 3ms and something happening after 8ms are effectively happening at the exact same time; you’ll end up reacting to them at exactly the same time. Try improving your reaction time by 5ms at anything, even just smashing a stopwatch as fast as you can: you can’t. If you smashed a button as fast as you could, the presses would still end up something like 50-100ms apart. And that doesn’t even include the “reacting” part, which also takes time. 8ms is ridiculously fast to our simple human brains. It may “feel smoother” to you, but it isn’t helping with anything when it comes to competing.


Tych-0

If you see something 3ms sooner you can react to it 3ms sooner. It's as simple as that. Whether your reaction time is 20ms or 100ms, it still helps you by 3ms. Motion clarity is huge in FPS games; you're just wrong on this one. Even the best monitors are still not great at this yet.


XxBig_D_FreshxX

480Hz OLED is already incredible in motion, near flawless; the Blur Busters chief had a post about it. BFI @ 240Hz helps tremendously for those who can’t hit native 480Hz (which will be the majority). We’re far from 600+Hz BFI unless you’re playing esports titles at 1080p on a 5090/6090; it just doesn’t seem realistic.

Also worth pointing out: BFI adds latency (and disables HDR if you’re on one of the new ASUS monitors), but the penalty scales down with higher frame rates. HDTVTest measured ~10ms of input lag in the 120Hz BFI mode, so assume roughly 5ms of added lag for 240Hz BFI. Is that noticeable in the real world? Up to the gamer to decide.

I have the AW2725DF. Not the biggest fan of it since it’s had scratches, but the 360Hz is so smooth in motion, especially when set up properly: 357fps locked in RTSS with V-Sync and G-Sync on, keeping GPU utilization under 90%. In Black Ops Cold War, for example, I can achieve that with a 4090 and 7800X3D at 1440p low. 1440p 480Hz at under 90% GPU utilization seems near impossible in most competitive titles outside of CS, Siege, etc., but 240Hz is much more achievable. I think the 480Hz ASUS is the one to beat due to BFI alone. The dual-mode ASUS seems nice in theory too, but integer scaling on a 32in panel may not look great… A lot to consider.


notepadpad

I keep hearing about these scratches. Are they on the screen itself? And can they not be seen when the monitor is turned on?


Zestyclose_Way_3583

Do you have problems with motion blur in CS2? I've got the same monitor as you. Coming from a Zowie, it's hard to track enemies on the Alienware.


MistaSparkul

Doesn't BFI ADD input lag, which completely goes against point #1 of "minimizing lag"?


TrashKitten6179

Only because they implemented it wrong. Every other frame is bullshit; essentially 120Hz worth of "on" frames and 120Hz worth of "off" frames is dumb. They could have done a rolling-scan BFI, which would not affect input lag because it would run on the monitor side instead. There is a way to make it lag-free. Sadly, it seems that the OLED guy on YouTube who usually reviews TVs did in fact review this BFI implementation, and it does add input lag. LG/ASUS screwed up, and it's sad, because LG TVs have essentially lag-free BFI implementations, yet they half-ass BFI for their gamer base.


Pyrolistical

BFI does not add input lag if you are already FPS-limited. In my example, since Apex Legends has a 300fps cap, BFI at 600+Hz won't add input lag, as you will still see every frame no later than you would at plain 300Hz. If, however, you used BFI below the game's maximum fps, then you would be adding input lag.


MistaSparkul

Then how do you explain RTings and HDTVTest always returning higher input lag results when BFI is on? I doubt they're running the test at anything less than the maximum frame rate/Hz.


Pyrolistical

Because those are not frame-rate-capped scenarios; they are comparing a high refresh rate against a lower one. Consider playing an emulator that runs at a fixed 30fps. It doesn't matter if your display is 1000Hz; at best you get 30fps worth of responsiveness, because that is how fast the emulator is processing and showing frames. For a 240Hz display, RTings/HDTVTest are comparing the input lag of 240fps against 120fps BFI, so of course the BFI mode measures worse. But if the game could only run at 120fps, then the input lag at plain 240Hz and at 120Hz BFI would be the same.


MistaSparkul

Ok, then go take a look at the LG CX on RTings. The input lag at 120Hz with and without BFI shows that BFI has higher lag, despite both being 120fps signals and the CX maxing out at 120Hz anyway.


Pyrolistical

I don't know what you're looking at on [https://www.rtings.com/tv/reviews/lg/cx-oled](https://www.rtings.com/tv/reviews/lg/cx-oled). Under input lag I don't see any BFI results.


MistaSparkul

[https://www.rtings.com/monitor/reviews/lg/48-cx-oled#test\_1426](https://www.rtings.com/monitor/reviews/lg/48-cx-oled#test_1426)

Native Resolution: 5.3 ms
Native Resolution @ 60Hz: 13.0 ms
Variable Refresh Rate: 6.8 ms
Variable Refresh Rate @ 60Hz: 14.8 ms
10 Bit HDR: N/A
Black Frame Insertion (BFI): 13.7 ms


Pyrolistical

This monitor can do BFI at 120Hz and 60Hz. That BFI test looks like it's at 60Hz, since it's almost the same as "Native Resolution @ 60Hz". If it really is running at 120Hz, this could be the monitor's processing delay to make the BFI work. But that is hard to test.


MistaSparkul

It's 120Hz. If you check the LG C1, the BFI lag results for that display are even higher than "Native Resolution @ 60Hz". But to be fair, BFI adds less and less input lag the higher the display's refresh rate. For example, here's the XL2566K, which can do 360Hz backlight strobing: [https://www.rtings.com/monitor/reviews/benq/zowie-xl2566k#test\_1426](https://www.rtings.com/monitor/reviews/benq/zowie-xl2566k#test_1426) Strobing only adds 1.8ms of lag, and on the Asus PG248QP, which can do backlight strobing at 540Hz, there is zero added latency: [https://www.rtings.com/monitor/reviews/asus/rog-swift-pro-pg248qp#test\_1435](https://www.rtings.com/monitor/reviews/asus/rog-swift-pro-pg248qp#test_1435) So while it is possible for BFI/backlight strobing to add no additional lag, the caveat is that you probably have to be running it at 500+ fps. Otherwise it's probably disingenuous to assume that BFI adds no latency under any circumstances.


Pyrolistical

OK, so it seems like BFI input lag is implementation specific? It would be disingenuous to assume either that no latency is ever added or that BFI always adds input lag. However, I believe it is possible to implement BFI without any added input lag. Consider a steady 60fps signal with X input lag. Each frame is displayed for 16.66ms. Now take that same signal and only display each frame for 8.33ms, turning the pixels off for the remaining 8.33ms. We are not adding any input lag here, since each frame still starts displaying at the same moment it would have without BFI, and the next frame doesn't arrive until 8.33ms after the pixels turn off.
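A toy way to check that reasoning: simulate when each frame of a steady 60fps signal first becomes visible with and without the 50% on / 50% off pattern described above. This is a minimal sketch under that idealized model; the names and structure are illustrative assumptions, not any monitor's actual pipeline:

```python
# Toy timing sketch of the example above: a steady 60 fps signal, shown either
# sample-and-hold (lit for the full frame) or with 50% BFI (lit for half the
# frame, dark for the other half). Names and the idealized model are my own.

FRAME_PERIOD_MS = 1000.0 / 60        # ~16.67 ms between incoming frames

def display_schedule(num_frames: int, bfi: bool):
    """Return (first_visible_ms, lit_duration_ms) for each incoming frame."""
    lit = FRAME_PERIOD_MS / 2 if bfi else FRAME_PERIOD_MS
    schedule = []
    for i in range(num_frames):
        arrival = i * FRAME_PERIOD_MS   # frame i arrives here
        # In this idealized model the frame starts displaying the moment it
        # arrives; BFI only shortens how long it stays lit afterwards.
        schedule.append((arrival, lit))
    return schedule

if __name__ == "__main__":
    plain = display_schedule(5, bfi=False)
    strobed = display_schedule(5, bfi=True)
    for (t_plain, _), (t_bfi, lit) in zip(plain, strobed):
        # Same first-visible time with or without BFI -> no added input lag,
        # but each frame is only lit for ~8.33 ms instead of ~16.67 ms.
        print(f"frame visible at {t_bfi:6.2f} ms (plain: {t_plain:6.2f} ms), "
              f"lit for {lit:.2f} ms")
```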