Wednesday, March 21, 2018

What is ray tracing? Everything you need to know about the next big graphical leap

Modern-day video games can look pretty darn amazing, and in the past, we saw seismic shifts between each console and graphics card generation. That hasn't been the case in recent years, however, as games focus less on pumping in more polygons and more on small-but-meaningful upgrades to things like texture quality, resolution, lighting, and visual effects.

Ray tracing looks to be another one of those seemingly modest but potentially significant upgrades, and it's set to hit the gaming landscape in the near future.

What is ray tracing, you might ask? It's a much more advanced and lifelike way of rendering light and shadows in a scene. It's what movies and TV shows use to create amazing CG work and blend it convincingly with real-life scenes. The drawback is that ray tracing often requires extensive server farms to pre-render graphics. That's been too much to ask of a real-time, interactive video game running on a compact box in your home.

Well, at least until now. At the Game Developers Conference, Nvidia, Microsoft, and AMD announced initiatives that will finally make ray tracing possible in real-time games, which means dazzling effects and much more immersive game worlds. Here's a look at what to expect, who's involved in this new push, and what they're bringing to the table.

What is ray tracing?

Ray tracing is a rendering technique that can produce incredibly realistic lighting effects. Essentially, an algorithm can trace the path of light, and then simulate the way that the light interacts with the virtual objects it ultimately hits in the computer-generated world. 

We've seen in-game lighting effects become more and more realistic over the years, but the benefits of ray tracing are less about the light itself and more about how it interacts with the world. 

Ray tracing allows for dramatically more lifelike shadows and reflections, along with much-improved translucence and scattering. The algorithm takes into account where the light hits and calculates the interaction and interplay much as the human eye would perceive real light, shadows, and reflections. The way light hits objects in the world also affects which colors you see.
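To make that idea concrete, here's a minimal sketch of what "tracing a ray" means: for each pixel, cast a ray from the camera, find the nearest surface it hits, then fire a second ray toward the light to decide whether that point sits in shadow. This is purely illustrative code (a couple of spheres and one light), not anything from DXR, RTX, or a real engine.

```python
import math

# Minimal illustrative ray tracer: a small sphere above a huge "floor" sphere,
# lit by one point light. A conceptual sketch only, not production code.
SCENE = [((0.0, -0.5, 3.0), 0.5),        # small sphere: (center, radius)
         ((0.0, -101.0, 3.0), 100.0)]    # huge sphere acting as the floor
LIGHT = (2.0, 2.0, 0.0)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a):
    length = math.sqrt(dot(a, a))
    return tuple(x / length for x in a)

def hit_sphere(origin, direction, center, radius):
    """Distance along the ray to the sphere surface, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c                 # direction is unit length, so a = 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None         # small threshold avoids self-hits

def nearest_hit(origin, direction):
    """Find the closest object the ray strikes, if any."""
    best = None
    for center, radius in SCENE:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (best is None or t < best[0]):
            best = (t, center)
    return best

def trace(origin, direction):
    """Primary ray: find what it hits, then fire a shadow ray toward the light."""
    hit = nearest_hit(origin, direction)
    if hit is None:
        return 0.0                                        # background
    t, center = hit
    point = tuple(o + t * d for o, d in zip(origin, direction))
    normal = norm(sub(point, center))
    to_light = norm(sub(LIGHT, point))
    # Anything between the point and the light blocks it (the light sits outside all geometry here).
    shadowed = nearest_hit(point, to_light) is not None
    diffuse = max(0.0, dot(normal, to_light))
    return 0.2 if shadowed else 0.2 + 0.8 * diffuse       # ambient + diffuse brightness

if __name__ == "__main__":
    # Render a tiny ASCII image: one primary ray per character.
    for row in range(20):
        line = ""
        for col in range(40):
            direction = norm((col / 40.0 - 0.5, 0.5 - row / 20.0, 1.0))
            line += " .:-=+*#%"[int(trace((0.0, 0.0, 0.0), direction) * 8.999)]
        print(line)
```

Even in this toy version, every pixel costs several intersection tests, which hints at why doing it for millions of pixels, many bounces deep, at 60 frames per second is so demanding.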

With enough computational power available, it's possible to produce incredibly realistic CG images that are nearly indistinguishable from life. But that's the problem: even a well-equipped gaming PC only has so much GPU power to work with, let alone a modern game console.

Ray tracing is used extensively when developing computer-generated imagery for films and TV shows, but that's because studios can harness the power of an entire server farm (or cloud computing) to get the job done. And even then, it can be a long, laborious process. Doing it on the fly has been far too taxing for existing gaming hardware.

Instead, video games use rasterization, which is a much speedier way to render computer graphics. It converts the 3D scene into 2D pixels to display on your screen, and then relies on shaders to approximate reasonably lifelike lighting effects.
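Here's a rough sketch of that rasterization step, to contrast with the ray tracing example above: a triangle's 3D vertices are projected into 2D screen coordinates, and the pixels inside that 2D footprint are filled in. A real GPU does this in hardware, with depth buffering and programmable shaders layered on top; this is only the core idea.

```python
# Illustrative rasterization sketch: project 3D triangle vertices to 2D screen
# space, then fill the pixels covered by the resulting 2D triangle.
WIDTH, HEIGHT = 40, 20

def project(vertex):
    """Simple pinhole projection: divide x and y by depth, map to the pixel grid."""
    x, y, z = vertex
    sx = int((x / z * 0.5 + 0.5) * WIDTH)
    sy = int((0.5 - y / z * 0.5) * HEIGHT)
    return sx, sy

def edge(a, b, p):
    """Signed-area test: which side of edge a->b the point p falls on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize(triangle):
    """Return the set of pixel coordinates covered by the projected triangle."""
    a, b, c = [project(v) for v in triangle]
    covered = set()
    for py in range(HEIGHT):
        for px in range(WIDTH):
            p = (px + 0.5, py + 0.5)
            w0, w1, w2 = edge(b, c, p), edge(c, a, p), edge(a, b, p)
            # Inside the triangle if the pixel is on the same side of all three edges.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((px, py))
    return covered

if __name__ == "__main__":
    # One triangle floating in front of the camera.
    pixels = rasterize([(-0.8, -0.4, 2.0), (0.8, -0.4, 2.0), (0.0, 0.7, 2.0)])
    for y in range(HEIGHT):
        print("".join("#" if (x, y) in pixels else "." for x in range(WIDTH)))
```

Notice that nothing here follows a light path: any shadows, reflections, or bounce lighting have to be approximated afterwards by shader tricks, which is exactly the gap ray tracing aims to close.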

The results just don't look quite as natural or realistic as they would with ray tracing. The benefits of this technology probably won't seem individually mind-blowing, but the collective enhancements could really elevate the realism of interactive game worlds.

Who is working on ray tracing?

Microsoft is the biggest fish in this new video game ray tracing pond, as the company announced DirectX Raytracing (DXR) as part of the DirectX 12 API. They've created the structure for generating and computing rays in a game world, and have made it possible for developers to begin experimenting with the technology to see what's possible in their game engines.

And they're not alone: Microsoft has been working with several of the world's biggest game makers and game engine creators to help introduce ray tracing into PC games. Electronic Arts' Frostbite engine and its SEED research division are on board, along with the ubiquitous Unreal Engine and Unity engine seen throughout the industry.

Creators can get started right away, too, thanks to an experimental DXR SDK available now. Microsoft will share further insight at GDC 2018 this week.

Bringing ray tracing to life in games requires incredible GPU power, so unsurprisingly, Nvidia is also leading the charge. The company's RTX technology leverages a decade of work on graphics algorithms and GPUs, and it works alongside Microsoft's DXR API to get developers up to speed quickly.

According to Nvidia, "film-quality algorithms" and updates to their GameWorks API will deliver lighting, reflections, shadows, and related effects with a previously unseen level of fidelity. And Nvidia's incoming Volta-class GPUs will be compatible, of course.

And AMD won't be left behind, either. They haven't shared as much as Nvidia as of this writing, but they have announced "real-time ray tracing" capabilities via their ProRender rendering engine and Radeon GPU Profiler 1.2. However, AMD's announcement seems less focused on games at this point and more on improving developers' workflows and results with a blend of ray tracing and rasterization.

When will we see the benefits?

While AMD's efforts don't seem targeted at games just yet, that same sort of mixed approach is probably what we'll see in the gaming world at first. As Microsoft's official blog post suggested, DirectX Raytracing will "supplement current rendering techniques."

In other words, it'll make some improvements over rasterization, but not fully replace it. Even tomorrow's GPUs probably aren't fully up to that task. But ray tracing will be another tool in game developers' toolkits, and one that will become more and more important over time. 
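One purely conceptual way to picture that hybrid approach: rasterize the whole frame cheaply, as games do today, then spend a limited ray budget only where accurate shadows or reflections would visibly help. The sketch below is hypothetical, with made-up stand-in functions; it is not how DXR, RTX, or any engine actually structures a frame.

```python
# Conceptual sketch of "supplement, don't replace": rasterize everything first,
# then refine a budgeted number of pixels with (stand-in) ray-traced shadows.
def rasterize_pass(width, height):
    """Stand-in for today's fast rasterized pass: a flat, evenly lit image."""
    return [[0.5 for _ in range(width)] for _ in range(height)]

def trace_shadow_ray(x, y):
    """Stand-in for a real shadow ray: darken a fixed region as if occluded."""
    return 0.2 if 10 <= x < 20 and 5 <= y < 10 else 1.0

def render_frame(width=40, height=20, ray_budget=300):
    image = rasterize_pass(width, height)          # step 1: rasterize everything
    rays_used = 0
    for y in range(height):                        # step 2: refine with rays
        for x in range(width):
            if rays_used >= ray_budget:            # stop when the ray budget runs out
                return image
            image[y][x] *= trace_shadow_ray(x, y)
            rays_used += 1
    return image

if __name__ == "__main__":
    for row in render_frame():
        print("".join("#" if v > 0.3 else "." for v in row))
```

The point of the budget is the trade-off developers will actually face: rays are expensive, so early hybrid renderers will likely ration them for the effects rasterization handles worst.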

Microsoft suggests that ray tracing will gain more focus "over the next several years" for things that rasterization just doesn't excel in, including global illumination. "Eventually, ray tracing may completely replace rasterization as the standard algorithm for rendering 3D scenes," the post concludes.

That's a far-off possibility, but these are important steps in the right direction. Nvidia's new tech demos show that companies like Remedy Entertainment (Quantum Break) and Epic Games (Fortnite) are already learning the ins and outs of ray tracing and delivering dazzling results. 

It seems possible that we'll see ray tracing start to roll out in games running on high-end GPUs (like Nvidia's Volta series) in the near future, perhaps later in 2018. Nothing is concrete yet, however.

For now, though, it's exciting to think that this long-awaited ability is finally on the horizon – and it will only help bridge the gap in graphical fidelity between interactive worlds and the kind of incredible CGI work seen on the big screen. 

We're on the ground at the Game Developers Conference (GDC) in San Francisco this week covering the latest in gaming, from mobile and consoles to VR headsets. Catch up on all the latest from GDC 2018 so far!

  • Turing could be the name of Nvidia's next generation graphics cards

