Why have images not evolved past two dimensions yet? Why are we satisfied with century-old technology? What if the technology already exists that’s ready to push the content world forward and only requires a camera? That technology is neural radiance fields (NeRFs).
Although this technology was created only a few years ago (2020), the pace at which it has continued to evolve cannot be overstated. NeRF is poised to have a DALL-E moment this year. It is disrupting longstanding processes in industries such as production (both virtual and live), live events, architecture and construction, and general content creation and memory capture.
NeRFs are the disruptive technology that’s causing a stir in the world of image rendering. NeRFs have already made significant strides in generating highly detailed and realistic images of complex 3D objects. In this post, I dive deep into the current state of NeRFs and discuss their potential impact on various industries.
While the name sounds intimidating, a neural radiance field is exactly what it describes. In summary, a NeRF takes a series of 2D images and creates a photorealistic 3D scene from them. A neural network is trained to predict the color and density at every point in the scene as a function of position and viewing direction (the radiance), yielding a volumetric field. After training, the model can render new views of the object or scene from any viewpoint, even ones that were not captured during training.
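Rendering a new view boils down to a volume-compositing step: the trained network is queried at sample points along each camera ray, and the returned colors are blended according to density. Here's a minimal numpy sketch of that compositing for a single ray; the function name and sampling scheme are illustrative, not taken from any particular NeRF codebase, and the densities and colors are assumed to have already been produced by the network.

```python
import numpy as np

def render_ray(densities, colors, deltas):
    """Composite a final RGB color along one ray.

    densities: (N,) nonnegative density (sigma) at each sample
    colors:    (N, 3) RGB predicted at each sample
    deltas:    (N,) distance between consecutive samples
    """
    # alpha_i = 1 - exp(-sigma_i * delta_i): how opaque this segment is
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i: probability the ray reaches sample i unblocked
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # Each sample contributes its color weighted by "visible opacity"
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)
```

Intuitively, a dense (opaque) sample near the camera dominates the pixel and blocks everything behind it, while empty space contributes nothing; this is why a NeRF can produce consistent views from angles it never saw during training.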
To create a NeRF before 2022, you would have needed a deep understanding of neural networks, AI, and a lot of patience. Now, with platforms such as Instant NGP, Luma AI, nerfstudio, and TurboNeRF, the barrier to entry is nearly nonexistent.
NeRFs have the potential to revolutionize the field of image rendering due to their ability to create highly detailed and realistic images of complex objects. They can capture subtle variations in lighting and surface properties, such as reflections and shadows, that traditional rendering techniques struggle to reproduce.
From my viewpoint, people have become complacent with the technology set before them and look to additional megapixels to make images more immersive. That approach is inherently flawed and outdated. It is time to take advantage of the breakthroughs that enable us to experience life and document our current state of existence for every generation to come.
My generation was the first to have access to high-definition photos and videos documented through social media. I firmly believe that the next generation will be the first to have access to photorealistic NeRFs that enable them to truly show the entirety of a moment in life.
These are all massive claims, but NeRF artists have made undeniable progress toward this goal, from Luma AI making it easy to generate a NeRF from an iPhone to papers that for the first time enable people to edit NeRFs in a 3D space. There are even plug-ins and platforms that enable NeRFs to be used in Unreal Engine and Blender.
NeRFs are gradually becoming more accessible to the wider public.
The “camera of the future” might look the same, but how the content is manipulated and structured will look different. People are no longer bound to a composition that was set at one point in time by the original photographer. They can now infinitely explore a NeRF and find the exact angle and perspective that has the most meaning to them.
Recently, I found myself exploring a NeRF that I took at a former partner’s apartment. While walking around the scene, the most moving part of it was not the subject but rather a pair of my socks on the floor. At once, all the times of her pleading with me to pick up my socks came back, and I was reminded of a time completely forgotten.
While this was a cherry-picked example, I can’t help but feel that for each person, a moment in time can represent more than what is displayed on the surface. Enabling people to interpret and make their own meaning out of a space is the true value of a memory.
It is not difficult to imagine how NeRF technology can help improve consumer experiences across industries. A more difficult question is what will not be disrupted.
We got a glimpse of how much NeRFs will shake up real estate this week with the introduction of Zip-NeRF. Several friends who work in real estate, along with property owners, have reached out to me about how they can use NeRFs as a better product offering than Matterport. There are thousands of Matterport-certified people throughout the US, and all of them should realize that their world is about to change. I expect several NeRF companies to emerge to streamline the vertical within the next 18 months, so start developing.
This is another vertical where NeRFs have already begun to shake things up. As I mentioned earlier, there are now ways to get NeRFs into Unreal Engine and Blender, opening up new possibilities. It’s possible that, by the end of 2023, NeRFs will have dramatically lowered both the barrier to entry and the budget required for virtual production.
People all have different educational preferences. Some absorb lectures, others prefer group learning. Some are motivated by experiencing and understanding the context of what they’re taught.
This is where NeRFs come in:
- History can be shown in stunning photorealistic detail.
- Medical instruction can be shown in VR for dissections and surgical procedures.
- Museums can provide educational tours for items they don’t have room to display.
The wave of learning management systems using NeRFs is coming, and it will electrify learning.
The recent winner of the #InstantNGPVR Contest, Harrison Forsyth, used Instant NGP to show off an archaeological dig. It’s easy to imagine professors and institutions using NeRFs as part of their lesson plans and for on-location review.
Google recently rolled out Immersive View in a handful of cities, with rapid expansion coming. Take a guess as to what’s powering it: NeRFs. The use cases range from tourism and wayfinding to planning out what you want to experience on your trip.
If you find yourself in a new city for a work trip and need a place to get coffee, you can get a photorealistic understanding of what the place’s vibe looks like. No more looking at questionable phone photos from 2013. Now you can finally experience exactly what you’ll get. (Expedia, if you read this, please call me.)
One standout use case is the live ticketing and event industry, where it becomes possible to accurately preview the seating and atmosphere of a stadium.
I went to a Mets game (an interactive NeRF if you would like to check it out) last week with my father. We spent 20+ minutes debating which seat to buy and what we expected the view to be. Quite frankly, the technology in place is subpar. This was further cemented when I got an email from StubHub asking me to upload photos from our seats to help other users.
How can this be the standard? NeRFs can immediately provide a better user experience for fans and those heading to a new stadium.
The same can be said for other live venues. Be honest, how often do you look at a restaurant’s menu before going? The same should be true for a venue. What is the seat’s proximity to Section 132, where there’s a two-foot-long hot dog? What are the unique features of each venue and how do consumers make a plan before heading to a stadium?
By giving people the ability to explore and plan out their events, stadiums and vendors can upsell and also create a memorable experience for their customers.
Other examples include product visuals and augmented reality. Vendors such as Crate & Barrel and Nike have long been using AR to help buyers see what their products could look like in their space. However, they have been lacking the photorealism that NeRF brings to the table. Now consumers can accurately see what they will get and how it will fit into their space.
Okay, here’s the big one that has captivated the minds of tech companies and venture capitalists over the past three years. While the metaverse has continued to evolve, there are still questions about how it will be implemented and by whom.
In my opinion, NVIDIA is best positioned to translate the metaverse into reality (no pun intended). As I sit here, it’s hard to imagine the metaverse existing without the presence of NeRFs.
This post describes just a small subsection of the industries that will be disrupted by NeRF.
NeRF is a game-changing technology with the potential to revolutionize the field of image rendering. Its ability to create realistic and intricate images of complex 3D objects has far-reaching implications for various industries such as gaming, entertainment, and advertising. Although there are still limitations to be addressed, NeRF’s possibilities are endless, and its impact is only beginning to be felt.
What story will you tell and how will you use NeRFs? There are no rules or best practices yet. Find them, create them. Explore the world of NeRFs and push the way that humans document history forward.
Nothing is holding you back from starting. Start capturing, make your own new memories, and realize how NeRFs will affect not only your work, but everyday life. Already, we are seeing artists adapt the medium and create stories with NeRFs.
Now that you’ve caught a glimpse of the thrilling world of NeRFs, we hope you’re eager to learn more about the incredible impact they’ll have across industries. If you’re hungry for more knowledge about NeRFs, don’t hesitate to stay tuned to Neuralradiancefields.io or subscribe to Radiant, the only newsletter dedicated to NeRFs.