why do VR games look bad? how to achieve VR in real life?

Here’s an interesting video that discusses the present-day (2021) reality of VR headsets. I find the concept of “pixel density per degree field of view” fascinating.

You might want to skip to about 3:00 for the good stuff. The key takeaway here is that human visual acuity requires a minimum of about 60 pixels per degree of FOV (field of view). Less than that leads to muddy, blurry results, plus the problem of the “screen door effect” in which individual pixels become obvious. 4K total pixels is not really enough; 90 FPS is not really enough; plus you really want ray tracing for accurate lighting simulation. So right now, even starting with something vaguely “acceptable” for view quality requires an insanely high-spec’d workstation with ridiculously high-end graphics cards and VR headsets that aren’t even on the market yet. Which is why the host of the video says in the end “…just wait…” when it comes to VR.

…but how about VR IRL????

The science behind human vision is incredibly complex, but roughly what I found is that humans have a total horizontal field of view of about 210 degrees, with a binocular field of roughly 120 degrees and a high-acuity central (foveal) view of about 6 degrees.

So in using digital media/signage/lighting in architectural scale physical spaces, let’s say you’re standing about one meter away from a surface. Here’s a quick sketch to scale I made looking at someone standing 1-meter away from a wall.

Their binocular field of view is 120 degrees, which means they can see 3,464 mm of horizontal width (2 × 1,000 mm × tan 60° ≈ 3,464 mm). At the required 60 pixels per degree of field of view, that works out to 60 pixels × 120 degrees = 7,200 pixels across 3,464 mm of width, or a pixel pitch of 3,464 mm / 7,200 pixels ≈ 0.48 mm.
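The arithmetic above can be sketched in a few lines of Python. The function names here (`visible_width_mm`, `pixel_pitch_mm`) are just illustrative, not from any library:

```python
import math

def visible_width_mm(distance_mm: float, fov_deg: float) -> float:
    """Horizontal width of a flat wall covered by a centred field of view."""
    return 2 * distance_mm * math.tan(math.radians(fov_deg / 2))

def pixel_pitch_mm(distance_mm: float, fov_deg: float, ppd: float) -> float:
    """Pixel pitch needed so the wall delivers `ppd` pixels per degree."""
    return visible_width_mm(distance_mm, fov_deg) / (fov_deg * ppd)

width = visible_width_mm(1000, 120)    # ~3,464 mm seen at 1 m across 120 degrees
pitch = pixel_pitch_mm(1000, 120, 60)  # ~0.48 mm pitch at 60 pixels/degree
```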

So a general rule of thumb is that once a direct-view LED, LCD or projection screen achieves a real-life pixel pitch of about half a millimeter, the screen has achieved “retina” quality (to steal Apple’s marketing parlance).

Or in other words, from a meter away, a person starts to lose the ability to distinguish the digital surface from reality. You have achieved a basic “VR IRL”. There are a host of other factors that will lead to the illusion of “reality” – the FPS rate, ambient lighting, the physical quality of the pixels and interstitial space between the pixels, perspective and parallax effects, etc.

I also find it interesting that the occupant can be precisely focused on only about 105mm of horizontal width of that surface at any one time – leading to the importance of designing for “ambient communications”.
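That ~105mm figure follows from the same geometry, applied to the 6-degree foveal view at one meter; a quick sanity check:

```python
import math

# Width of the 6-degree high-acuity (foveal) view on a wall 1,000 mm away:
foveal_width_mm = 2 * 1000 * math.tan(math.radians(6 / 2))  # ~105 mm
```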

Just to double check a bit, I found one video presentation from several years ago looking at the future of VR tech. The presenter shows that humans can discern an even higher pixel density per degree, up to ~120 pixels/degree. IRL, at 1 meter, that would require a pixel pitch of 0.24mm. But still, in a real-life situation, the closest a person would practically get to a large-scale wall surface is probably going to be 1 meter, so the 0.5mm pixel pitch is a solid rule of thumb for a minimum VR-like experience.
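Re-running the same calculation at the higher 120 pixels/degree acuity figure gives the 0.24mm pitch mentioned above:

```python
import math

width_mm = 2 * 1000 * math.tan(math.radians(60))  # ~3,464 mm seen across 120 degrees at 1 m
pitch_mm = width_mm / (120 * 120)                 # 120 deg x 120 px/deg -> ~0.24 mm pitch
```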

So when you start to create crazy constructions like dome-shaped direct-view LED screens, or the “volume” stage used during the filming of The Mandalorian, you have to start to wonder whether VR headsets stand any chance of being adopted in commercial retail/entertainment applications. They have already been surpassed technologically by direct-view LED screens, which are routinely available in the ultra-fine pixel pitches needed to achieve in-real-life VR/AR/XR experiences. Plus IRL VR experiences have the added benefit of being a shared, social experience.