GPU Path Tracer

During undergrad at Penn, I worked on a GPU-powered path tracer.  I learned quite a bit along the way, and below I'll outline some of the features I implemented.

The first major part of the project was to get the framework set up properly.  The basis of any ray-tracing-based renderer is determining intersections between the rays you fire and the scene itself.  To keep things simple during testing, I had the renderer return an object's surface color as soon as a ray hit it.
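
To give a flavor of what that stage boils down to, here's a minimal CUDA sketch of a ray-sphere intersection test.  The `Ray` and `Sphere` structs are illustrative stand-ins, not the project's actual types.

```cpp
// A sketch for a .cu file compiled with nvcc; float3 is a CUDA built-in.
struct Ray    { float3 origin, dir; };            // dir assumed normalized
struct Sphere { float3 center; float radius; };

// Returns the distance t along the ray to the nearest hit, or -1.0f on a miss.
__device__ float intersectSphere(const Ray& r, const Sphere& s) {
    float ox = r.origin.x - s.center.x;
    float oy = r.origin.y - s.center.y;
    float oz = r.origin.z - s.center.z;
    float b = ox * r.dir.x + oy * r.dir.y + oz * r.dir.z;      // dot(oc, dir)
    float c = ox * ox + oy * oy + oz * oz - s.radius * s.radius;
    float disc = b * b - c;            // discriminant of the quadratic (a == 1)
    if (disc < 0.0f) return -1.0f;     // the ray misses the sphere entirely
    float t = -b - sqrtf(disc);        // nearer of the two roots
    return (t > 0.0f) ? t : -1.0f;     // a hit behind the origin counts as a miss
}
```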


My initial test scene was a Cornell box with some spheres placed in it.  Once I had the intersection testing finished, I started working on a simple integration scheme: direct lighting.


Since I knew I eventually wanted a path tracing renderer, I kept the BRDFs limited to diffuse during this iteration.  Looking back, this step wasn't too important in the grand scheme of things, but I was comfortable with simple direct-lighting ray tracing and wanted to get my feet wet in CUDA with something familiar.
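
As a rough sketch of what that direct-lighting estimate looks like for a diffuse surface: fire a shadow ray toward the light and, if nothing is in the way, weight the light by Lambert's cosine term.  Here `sceneOccluded` is a hypothetical helper standing in for the scene intersection test, and the float3 operators come from the CUDA samples' helper_math.h.

```cpp
#include "helper_math.h"   // float3 operators, dot(), length() from the CUDA samples

// Hypothetical helper: returns true if anything blocks the segment starting
// at p, traveling along wi, up to distance maxDist.
__device__ bool sceneOccluded(float3 p, float3 wi, float maxDist);

__device__ float3 directDiffuse(float3 hitPoint, float3 normal, float3 albedo,
                                float3 lightPos, float3 lightColor) {
    const float kPi = 3.14159265f;
    float3 toLight = lightPos - hitPoint;
    float dist = length(toLight);
    float3 wi = toLight / dist;                    // normalized direction to light
    if (sceneOccluded(hitPoint + normal * 1e-4f, wi, dist))
        return make_float3(0.f, 0.f, 0.f);         // the point is in shadow
    float cosTheta = fmaxf(dot(normal, wi), 0.f);  // Lambert's cosine term
    // Diffuse BRDF is albedo / pi; a point light falls off with squared distance.
    return albedo * lightColor * (cosTheta / (kPi * dist * dist));
}
```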

Next, I switched the integration scheme to a full path tracer.



The first thing you'll notice is that the shadows in this image are quite a bit brighter than in the ray-traced image.  This is because light rays are not limited to a single bounce in this integration scheme.  Rays can bounce around an arbitrary number of times, until they hit either a light or nothing.  This allows surfaces to interact with each other via light transport.
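
In loop form, the integrator looks roughly like the sketch below: keep bouncing until the ray escapes, hits a light, or exceeds a depth cap.  `Hit`, `intersectScene`, and `sampleBRDF` are hypothetical stand-ins for the real scene code, and the float3 operators again come from helper_math.h.

```cpp
#include "helper_math.h"      // float3 operators from the CUDA samples
#include <curand_kernel.h>    // per-thread RNG state

struct Ray { float3 origin, dir; };
struct Hit {
    float3 point, normal, emission;
    bool   isLight;
};

// Hypothetical helpers standing in for the real scene and BRDF code.
__device__ bool intersectScene(const Ray& r, Hit* hit);
__device__ void sampleBRDF(const Hit& hit, float3 wo, curandState* rng,
                           float3* wi, float3* weight);

__device__ float3 tracePath(Ray ray, curandState* rng, int maxDepth) {
    float3 throughput = make_float3(1.f, 1.f, 1.f);   // product of bounce weights
    for (int depth = 0; depth < maxDepth; ++depth) {
        Hit hit;
        if (!intersectScene(ray, &hit))               // ray escaped the scene
            return make_float3(0.f, 0.f, 0.f);
        if (hit.isLight)                              // path terminates at a light
            return throughput * hit.emission;
        float3 wi, weight;
        sampleBRDF(hit, ray.dir, rng, &wi, &weight);  // pick the next bounce
        throughput *= weight;                         // fold BRDF * cos / pdf in
        ray.origin = hit.point + hit.normal * 1e-4f;  // offset to avoid self-hits
        ray.dir    = wi;
    }
    return make_float3(0.f, 0.f, 0.f);                // depth cap reached
}
```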

With the simple path tracer up and running, I started working on some BRDFs to make the scene more interesting.


The walls still use a diffuse BRDF, while the spheres have some that are a bit more interesting.  The red sphere has Cook-Torrance microfacet specular highlights, which come from a model that pretends the surface of the object is covered in tiny microfacets; this gives a rougher highlight than Blinn or Phong shading.  The two other spheres have Fresnel reflections applied to them.  That BRDF mixes pure reflection and pure refraction based on the incident angle of the incoming ray, as sketched below.
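
The sketch uses Schlick's approximation, a common stand-in for the full Fresnel equations; I won't claim it's the exact form this renderer used, but it captures the angle-dependent split between reflection and refraction.

```cpp
// Schlick's approximation: reflectance rises toward 1 at grazing angles.
// etaI/etaT are the indices of refraction on the incident/transmitted sides.
__device__ float schlickFresnel(float cosTheta, float etaI, float etaT) {
    float r0 = (etaI - etaT) / (etaI + etaT);
    r0 = r0 * r0;                                  // reflectance at normal incidence
    float m = 1.0f - cosTheta;
    return r0 + (1.0f - r0) * m * m * m * m * m;   // (1 - cos theta)^5 falloff
}
```

In practice the result is used as a probability: draw a uniform random number, reflect the ray if it falls below the Fresnel value, and refract it otherwise.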

At the time, I was extremely happy with this image.  But, in hindsight, there are some problems.  The caustic on the red wall is much too bright to be physically realistic.  This problem was present throughout development, and it took quite a bit of digging to find the root cause.  As it turns out, the random seed generator I was using was rolling over if the renderer ran too long.  Here's an extreme example of the problem and its fix:

When I got that top render back, needless to say, I was pretty concerned.  After some digging I eventually narrowed the problem down to the seed generator and was finally able to fix it.
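
For illustration, one common way to avoid this kind of rollover is to hash the pixel index, iteration, and bounce depth together for each sample, rather than relying on a long-running counter.  The Wang hash below is a popular choice; this is a sketch of the general technique, not necessarily the exact fix I shipped, and the mixing constants in `makeSeed` are illustrative.

```cpp
// Wang hash: a cheap integer hash with good enough mixing for RNG seeding.
__device__ unsigned int wangHash(unsigned int seed) {
    seed = (seed ^ 61u) ^ (seed >> 16);
    seed *= 9u;
    seed ^= seed >> 4;
    seed *= 0x27d4eb2du;
    seed ^= seed >> 15;
    return seed;
}

// Combine pixel, iteration, and depth so seeds stay decorrelated across
// frames and bounces, with no long-running counter left to roll over.
__device__ unsigned int makeSeed(unsigned int pixel, unsigned int iteration,
                                 unsigned int depth) {
    return wangHash(pixel * 9781u + iteration * 6271u + depth * 26699u);
}
```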

One other utility that I implemented for a performance boost is something called stream compaction.  I talk about it in this blog post.  Basically, after each bounce the rays that have terminated are removed from the working set, so later kernel launches only spend threads on rays that are still alive.  It really speeds things up.
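
As a sketch of the idea, here's how compaction might look with Thrust: after each bounce, live paths are partitioned to the front of the buffer, and the next kernel is launched over just that prefix.  `PathSegment` and its `alive` flag are illustrative names, not the project's actual layout.

```cpp
#include <thrust/device_ptr.h>
#include <thrust/partition.h>

struct PathSegment {
    // ray, throughput, pixel index, ... (omitted for brevity)
    bool alive;   // false once the path hit a light or escaped the scene
};

struct IsAlive {
    __host__ __device__ bool operator()(const PathSegment& p) const {
        return p.alive;
    }
};

// Moves live paths to the front of the device buffer and returns how many
// remain, so the next bounce only launches threads for rays still going.
int compactPaths(PathSegment* d_paths, int numPaths) {
    thrust::device_ptr<PathSegment> begin(d_paths);
    auto mid = thrust::partition(begin, begin + numPaths, IsAlive());
    return static_cast<int>(mid - begin);
}
```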

Overall I am pretty happy with this project.  I learned a lot about both rendering and GPU programming, which is what I set out to do.  The renderer itself is still woefully simple.  The images are extremely noisy, because rendering is still purely brute force with no importance sampling.  Furthermore, the only acceleration structure is simple per-object bounding boxes.  Adding support for a kd-tree or BVH would speed the renderer up quite a bit.