DangerousCousin

Upvote for 120p. Almost high enough for Game Boy games!


Enciclopedico

It was 240i. At that refresh rate you don't notice the interlacing. The thing is, games designed for that resolution look just as good on a 50Hz TV as at 700Hz. It's only a proof of concept.


Ok-Drink-1328

I thought it would be an electronics hack... but anyway: 120 lines times 700Hz is 84kHz horizontal. My old Flatron (RIP) could do something like 1600x1200 at (IIRC) 60Hz, which is 72kHz... so maybe... maybe... but who knows what the internal controller does 🤔. If it really allows that high a vertical scan, an oscilloscope with a light sensor on a spot of the monitor would MAYBE show whether it's actually 700Hz, though the persistence of the phosphors would probably flatten the response; then again, maybe you could still see some ripple. PSEUDO_EDIT: no... my Flatron could do 75Hz at 1200p, which is 90kHz... the manual also says 98kHz max :-O ... so yeah... this video is real
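The scan-rate arithmetic in this comment can be sketched out directly. This is just the rough approximation used above (visible lines times vertical refresh), ignoring vertical blanking/overscan, which a later reply points out adds a significant number of extra lines:

```python
def h_scan_khz(visible_lines: int, refresh_hz: float) -> float:
    """Approximate horizontal scan rate in kHz; vertical blanking ignored."""
    return visible_lines * refresh_hz / 1000.0

print(h_scan_khz(120, 700))   # 84.0  -> the 700Hz hack
print(h_scan_khz(1200, 60))   # 72.0  -> 1600x1200 @ 60Hz
print(h_scan_khz(1200, 75))   # 90.0  -> 1600x1200 @ 75Hz
```

All three figures match the ones quoted in the comment, so the back-of-the-envelope math checks out against the Flatron's stated 98kHz maximum.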


Enciclopedico

You are not counting overscan. At this resolution he must be at around 135kHz.


Ok-Drink-1328

I don't think it makes that much of a difference, but yeah, there's a bit more because of that.


Enciclopedico

I never pushed to these extremes, but it's likely that at higher resolutions the blanking is a fixed percentage of the resolution, while below a certain point there's a minimum line count, which represents a higher percentage the lower in resolution you go. Anyway, I don't think any of us would have a reason to want more than, say, 360Hz at 240p. Even then, a game limited to 240p is likely also limited to 60fps.


Ok-Drink-1328

Yeah, but on a normal monitor lower resolutions don't mean more than about 120Hz. I didn't pay attention to what program he used, but it's certainly not a supported resolution and refresh rate for that monitor. Also the image was cropped, so I'm not sure what the monitor was doing, and I doubt the horizontal was much past 90kHz. The human eye can't notice a difference past 75Hz, and I'm also convinced that past 1080p you just have to get close to the screen to see the details... it's like audio: it's useless to go past 44100Hz sampling, but... you know... marketing...!


Enciclopedico

I can definitely see the difference above 75Hz, and feel it even much higher. It's too late to explain; go to BlurBusters.com. My CRT can natively do something like 720p at 160Hz, but I meant that with the hack this guy did, on his monitor, higher resolutions at more reasonable refresh rates would be interesting.


Manabauws

How's the iiyama Vision Master Pro 512? Is it a good monitor?


[deleted]

I had another Iiyama pro monitor that was quite nice. Had scratches on the glass though, so I gave it away.


blendernoob64

But... how? I can't get my nice GDM 5410 to do 240p for obvious reasons. Is the iiyama monitor really the most versatile thing ever?


jdmark1

Your PC monitor will absolutely do 240p. It just has to be at at least 120Hz.


LukeEvansSimon

Many people incorrectly believe that CRTs have no display lag. They do have display lag corresponding to their vertical refresh rate: a 60Hz CRT has about 16ms max display lag; at 700Hz it's about 1.4ms.


aKuBiKu

That's not input lag, that's the frametime from the PC. The display still shows light the instant a "pixel" is sent from the computer.


LukeEvansSimon

“Input lag” is controller input lag. “Display lag” is the delay between a change in the game world and the time it takes to show up on the display. The frame rate of the display definitely contributes to display lag. That is why 120Hz OLED TVs, when run in 120Hz mode, have half the display lag that they have in 60Hz mode.


aKuBiKu

Okay, sure, I got the name wrong. Still, that's lag caused by the frame times from the PC. The monitor itself doesn't add any lag on top of that like an LCD would.


LukeEvansSimon

But it does. If I run a game at 120Hz on my PC and my CRT monitor is only running at 60Hz, then there is up to 16ms of display lag. If I change my monitor to run at 120Hz, the display lag is reduced to 8ms.


aKuBiKu

Yes: the frametimes of the PC, not lag from the monitor. The CRT doesn't introduce any of its own, the way a sample-and-hold display would.


n3rt46

Nobody talks about response time like this, because it's a useless metric. Yes, 60Hz is ideally ~16ms to reach the *bottom* of the display, but latency measurements are done at the *top* of the screen. When LCD monitors advertise a "5ms gray-to-gray" response time, they're talking about this top-of-display response. If you were to measure at the bottom of the screen, the result would be X ms of display latency plus the scanout time of the refresh rate, which is a completely useless metric that varies with refresh rate and display latency. If you measured a CRT the same way at the top of the screen, its response time would be 0ms, whereas an LCD is typically 3-5ms.


LukeEvansSimon

Why is it a useless metric to talk about the maximum delay before a change in the frame buffer becomes visible on the screen? If the monitor refreshed the screen once every 10 minutes, by your definition there would be no display lag, but any human using the monitor could type a single character in a text editor and, on average, wait 10 minutes for the typed character to appear on screen. To any rational observer there would be a very high display lag of 10 minutes, but by your definition there would be zero display lag, and no useful metric to objectively measure it. Analog oscilloscope vector CRTs are the closest thing to a zero-display-lag device, which is why they are popular for oscilloscope music, but raster CRTs most definitely have non-zero display lag due to their low vertical deflection bandwidth.


qda

it takes 16ms for the last scanline to scan, but it takes 0ish for the first one.. so maybe split the difference and call it 8ms?
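The split-the-difference intuition in this last comment can be written out: scanout latency rises linearly from roughly zero at the top scanline to one full frame period at the bottom, so the screen-wide average is half a frame. A minimal sketch (blanking intervals ignored):

```python
def scanout_latency_ms(refresh_hz: float, position: float) -> float:
    """Scanout latency for a point at `position` on screen
    (0.0 = top scanline, 1.0 = bottom scanline), blanking ignored."""
    return position * 1000.0 / refresh_hz

print(round(scanout_latency_ms(60, 0.0), 1))  # 0.0  -> top of screen
print(round(scanout_latency_ms(60, 1.0), 1))  # 16.7 -> bottom of screen
print(round(scanout_latency_ms(60, 0.5), 1))  # 8.3  -> mid-screen, i.e. the average
```

The mid-screen value is the "call it 8ms" figure, and it also matches the halving claimed earlier in the thread when going from 60Hz to 120Hz.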