image-based lighting

I’ve recently become quite enamored with a new term: IMAGE-BASED LIGHTING

I heard the term mentioned in a recent conference presentation by Jeremy Hochman, Co-Founder of Megapixel (and my new boss!), in which Jeremy chronicled the evolution from ye-olde LED pixel tubes to sophisticated film/production special effects lighting to the rapid innovation currently happening with highly immersive virtual production studios.

Image-based lighting is nothing new: If you’ve ever worked with even the most rudimentary theatrical lighting, surely you’ve used gobos to break up the light and add texture to a scene on a stage set. What you are doing is adding layers of visual richness onto the geometry of the space. Similarly, on film sets, stagehands would manually shake or sweep a light around a set to replicate the movement of light or create motion effects on actors.

But we are entering a new world where we will be surrounded by digital surfaces and the threshold between the physical world and the virtual world will become increasingly blurred. Virtual production technologies are radically and rapidly changing the nature of film and broadcast production.

And those virtual production technologies are proving so powerful that they will rapidly transform traditional architectural placemaking. That is what my team is working on with the launch of Ventana, a product line of stunningly high-fidelity LED display tiles.

These technologies will result in two radical changes to the concept of “architectural lighting”:

First, in a space dominated by large-format digital display screens, pixels become the dominant light source. So fundamentally, the media images generated by the walls that surround us become the key source of lighting in a space. Literally “image-based lighting” that can create any scenographic concept desired. Pixel-based lighting will define the fundamental feeling of the space, as digital surfaces make up a growing percentage of the actual physical “stuff” that constitutes the built environment.
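To put a rough number on that claim, here’s a back-of-the-envelope sketch in Python using the standard view-factor formula for a uniform Lambertian rectangular source. The wall size, viewing distance, and panel luminance are assumed values for illustration, not measurements from any real installation:

```python
import math

def wall_illuminance(luminance_cd_m2: float, width_m: float,
                     height_m: float, distance_m: float) -> float:
    """Illuminance (lux) at a point on the axis through one corner of a
    uniform Lambertian rectangular source, on a plane parallel to it.
    Standard view-factor result: E = pi * L * F."""
    a = width_m / distance_m
    b = height_m / distance_m
    f = (1 / (2 * math.pi)) * (
        a / math.sqrt(1 + a * a) * math.atan(b / math.sqrt(1 + a * a))
        + b / math.sqrt(1 + b * b) * math.atan(a / math.sqrt(1 + b * b))
    )
    return math.pi * luminance_cd_m2 * f

# Treat a 6 m x 3 m LED wall showing mid-bright content (~300 cd/m^2,
# an assumed value) as four corner-aligned rectangles around a viewer
# facing the wall's center from 3 m away:
quadrant = wall_illuminance(300, 3.0, 1.5, 3.0)
print(f"~{4 * quadrant:.0f} lux on a surface facing the wall")
```

That lands in the few-hundred-lux range, the same order as typical interior ambient light levels, which is the sense in which the pixels become the primary light source.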

Second, we will increasingly need to match the experience we are seeing in the virtual world (on the digital surfaces that surround us) with the experience we have in the physical world (in our IRL rooms, lobbies, museums, etc.). Large-format LED display surfaces are inherently Lambertian in their optical distribution – the emitted intensity falls off with the cosine of the viewing angle, so they read as flat, soft surfaces of light. No light projects from the surface in tightly focused, hard-edged beams. So to replicate the feeling of light beams projecting into a space, new forms of light fixtures will become prominent.
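A quick way to see what “Lambertian” means in practice is to compare the cosine emission of an LED tile against a tight spot. The cosine-power profile used for the spot here is a common textbook approximation, and m = 100 is an assumed value, not the spec of any real fixture:

```python
import math

# Relative intensity vs. off-axis angle for a Lambertian emitter
# (I ~ cos(theta), what an LED tile does) versus a hard-edged spot,
# modelled with a cosine-power approximation (cos^m; m = 100 is an
# assumed value, giving roughly a 7-degree half-power angle).
for deg in (0, 10, 30, 60):
    t = math.radians(deg)
    lambertian = math.cos(t)
    spot = math.cos(t) ** 100
    print(f"{deg:>2} deg  panel: {lambertian:5.2f}   spot: {spot:5.2f}")
```

At 60° off-axis the panel still emits half its peak intensity, while the spot is effectively dark past a few degrees. That is exactly why a wall of pixels cannot fake a hard beam.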

Such natural-looking lighting! Must be shot on location at a pier in sunny California?
Just kidding. Shot on a small virtual-production setup in a brick room.

Traditional digital projection (or older theatrical gobos, spotlights, etc.) can obviously be used to augment a space and add layers of visual texture. But in the virtual production world, entirely new concepts for light fixtures are being developed, such as the Kino Flo MIMIK: a pixelated “softbox” that helps match what you expect the lighting in a space to actually be, based on the imagery/architecture/distant landscape that surrounds that space. In film, this is especially important for the reflections on objects and characters in the scene.

What makes the Kino Flo MIMIK especially interesting for architectural innovation is that Kino Flo uses RGBWW sources for each pixel, providing outstanding color rendition. And behind the scenes, the MIMIK is tied to Megapixel’s HELIOS system to perfectly match the surrounding digital displays – even synced frame-perfect for amazing technologies like GhostFrame.

I think it is obvious that tighter optics will be married to such pixel-grid lighting fixtures, projecting gorgeous patterns and textures of light into spaces. They will fill the gap between “lights” and “digital projectors,” with the advantages of both.

One more thing:

As video becomes the dominant control methodology for architectural spaces (as opposed to hopelessly outdated preset-scene lighting control systems), adding fixtures that “sample” the virtual world and project the correct pixelated effects into the space becomes quite simple in concept. Live-rendered 3D game engines increasingly drive virtual production studios, so placing a sampling “virtual light” in the 3D scene that drives the “real light” in the physical space instantly gives you the spatially accurate lighting needed to match the media content, with no jarring disconnects between the surface media and the volumetric lighting.
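As a concrete (if heavily simplified) sketch of that loop in Python: average the pixels a virtual probe sees in the engine’s render target, then push that color to a real RGB fixture over Art-Net DMX. The get_probe_pixels() hook, the fixture’s IP address, and the channel layout are all hypothetical placeholders; only the Art-Net packet framing follows the published spec:

```python
import socket
import struct

def average_rgb(pixels):
    """pixels: iterable of (r, g, b) tuples, 0-255. Returns one (r, g, b)."""
    n = r = g = b = 0
    for pr, pg, pb in pixels:
        r, g, b, n = r + pr, g + pg, b + pb, n + 1
    if n == 0:
        return (0, 0, 0)
    return (r // n, g // n, b // n)

def send_artnet_dmx(rgb, universe=0, ip="192.168.1.50"):
    """Send one ArtDmx packet; fixture IP and patch (R/G/B on the first
    three DMX slots) are assumptions for this sketch."""
    data = bytes(rgb) + bytes(509)           # 512-slot DMX frame
    packet = (b"Art-Net\x00"
              + struct.pack("<H", 0x5000)    # OpCode: ArtDmx, little-endian
              + struct.pack(">H", 14)        # protocol version, big-endian
              + bytes([0, 0])                # sequence, physical
              + struct.pack("<H", universe)  # 15-bit port address
              + struct.pack(">H", len(data)) # data length, big-endian
              + data)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (ip, 6454))         # Art-Net's standard UDP port

# Each rendered frame, hypothetically:
#   send_artnet_dmx(average_rgb(get_probe_pixels()))
```

A real pipeline would add color-space conversion and temporal smoothing, but the core idea is just this: the probe samples the virtual scene, and the fixture reproduces what it saw.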

So perhaps we should call this SPATIAL IMAGE-BASED LIGHTING?