Real-time GPU path tracing, denoising with OIDN (Intel Open Image Denoise)

  c++, glsl, graphics, opengl, raytracing

I’m experimenting with path tracing and denoising, trying to get a noise-free image in real time. The path tracer itself runs on the GPU (OpenGL/GLSL). I decided to try the OIDN denoiser, but I ran into a problem: the denoiser runs on the CPU only, and I don’t understand how to switch it to the GPU. Is this even possible?

This is how I initialize the denoiser:

// Init device
g_denoiserDevice = oidnNewDevice(OIDN_DEVICE_TYPE_DEFAULT);
oidnCommitDevice(g_denoiserDevice);

// Calculate buffer size: 3 floats (RGB) x 4 bytes per pixel
unsigned frameSizeBytes = 4 * 3 * width * height;

// Create buffers
g_denoiserBufferColor  = oidnNewBuffer(g_denoiserDevice, frameSizeBytes);
g_denoiserBufferAlbedo = oidnNewBuffer(g_denoiserDevice, frameSizeBytes);
g_denoiserBufferNormal = oidnNewBuffer(g_denoiserDevice, frameSizeBytes);
g_denoiserBufferResult = oidnNewBuffer(g_denoiserDevice, frameSizeBytes);

// Create filter
g_denoiserFilter = oidnNewFilter(g_denoiserDevice, "RT");
oidnSetFilterImage(g_denoiserFilter, "color",  g_denoiserBufferColor,  OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
oidnSetFilterImage(g_denoiserFilter, "albedo", g_denoiserBufferAlbedo, OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
oidnSetFilterImage(g_denoiserFilter, "normal", g_denoiserBufferNormal, OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
oidnSetFilterImage(g_denoiserFilter, "output", g_denoiserBufferResult, OIDN_FORMAT_FLOAT3, width, height, 0, 0, 0);
oidnSetFilter1b(g_denoiserFilter, "hdr", false);
oidnCommitFilter(g_denoiserFilter);

// Create OpenGL texture for denoiser output
g_denoisedFrameTexture = new gl::Texture2D(nullptr, width, height, gl::Texture2D::ColorSpace::eRgb, gl::Texture2D::eBilinear, false, GL_FLOAT);
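
For reference, this is roughly how I would expect to request a GPU device explicitly. It's only a sketch based on my reading of the OIDN 2.x headers (the OIDN_DEVICE_TYPE_CUDA / SYCL / HIP device types and GPU execution were added in 2.x), and createGpuDenoiserDevice is just a name for the sketch; I haven't verified that my precompiled binaries expose any of this:

#include <OpenImageDenoise/oidn.h>
#include <cstdio>

// Sketch (OIDN 2.x only): ask for a GPU device explicitly instead of DEFAULT.
// OIDN_DEVICE_TYPE_CUDA targets NVIDIA; SYCL/HIP variants exist for Intel/AMD.
OIDNDevice createGpuDenoiserDevice()
{
    OIDNDevice device = oidnNewDevice(OIDN_DEVICE_TYPE_CUDA);

    // Device creation errors can be queried through a NULL device handle.
    const char* message = nullptr;
    if (oidnGetDeviceError(nullptr, &message) != OIDN_ERROR_NONE)
        std::printf("OIDN device creation failed: %s\n", message);

    oidnCommitDevice(device);
    if (oidnGetDeviceError(device, &message) != OIDN_ERROR_NONE)
        std::printf("OIDN device commit failed: %s\n", message);

    return device;
}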

This is how I use the denoiser:

// Calculate buffer size: 3 floats (RGB) x 4 bytes per pixel
unsigned frameSizeBytes = 4 * 3 * frameBufferPtr->getWidth() * frameBufferPtr->getHeight();

// Map input image buffers
void* colorBufferPtr  = oidnMapBuffer(g_denoiserBufferColor,  OIDN_ACCESS_READ_WRITE, 0, frameSizeBytes);
void* normalBufferPtr = oidnMapBuffer(g_denoiserBufferNormal, OIDN_ACCESS_READ_WRITE, 0, frameSizeBytes);
void* albedoBufferPtr = oidnMapBuffer(g_denoiserBufferAlbedo, OIDN_ACCESS_READ_WRITE, 0, frameSizeBytes);

// Bind framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, frameBufferPtr->getId());
// Copy data from framebuffer attachments to denoiser buffers
glReadBuffer(GL_COLOR_ATTACHMENT0);
glReadPixels(0, 0, frameBufferPtr->getWidth(), frameBufferPtr->getHeight(), GL_RGB, GL_FLOAT, colorBufferPtr);
glReadBuffer(GL_COLOR_ATTACHMENT1);
glReadPixels(0, 0, frameBufferPtr->getWidth(), frameBufferPtr->getHeight(), GL_RGB, GL_FLOAT, normalBufferPtr);
glReadBuffer(GL_COLOR_ATTACHMENT2);
glReadPixels(0, 0, frameBufferPtr->getWidth(), frameBufferPtr->getHeight(), GL_RGB, GL_FLOAT, albedoBufferPtr);

// Unmap input image buffers
oidnUnmapBuffer(g_denoiserBufferColor, colorBufferPtr);
oidnUnmapBuffer(g_denoiserBufferNormal, normalBufferPtr);
oidnUnmapBuffer(g_denoiserBufferAlbedo, albedoBufferPtr);

// Denoise
oidnExecuteFilter(g_denoiserFilter);

// Map output image buffer
void* resultBufferPtr = oidnMapBuffer(g_denoiserBufferResult, OIDN_ACCESS_READ_WRITE, 0, frameSizeBytes);
// Copy to texture
g_denoisedFrameTexture->setTextureData(resultBufferPtr, GL_RGB32F, GL_RGB, GL_FLOAT);
// Unmap
oidnUnmapBuffer(g_denoiserBufferResult, resultBufferPtr);

And here’s what I got

https://youtu.be/Bc0AFGilpdM

As you can see, the denoiser is very slow. I tried changing OIDN_DEVICE_TYPE_DEFAULT to OIDN_DEVICE_TYPE_CPU and nothing changed, so it looks like the CPU is being used either way. There are also questions about quality, but that can wait.
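
Part of the cost may also be on my side rather than OIDN's: the glReadPixels calls above read straight into mapped CPU memory, which stalls until the GPU has finished the frame. Below is a sketch of what I'm considering, routing the readback through a pixel pack buffer so the copy is at least driver-managed. pboColor is a hypothetical buffer I would create once at startup, and colorBufferPtr / frameSizeBytes / frameBufferPtr are the same variables as in the code above:

#include <cstring> // std::memcpy

GLuint pboColor = 0; // hypothetical PBO, created once at startup

// One-time setup: a pixel pack buffer sized for one RGB float image.
glGenBuffers(1, &pboColor);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pboColor);
glBufferData(GL_PIXEL_PACK_BUFFER, frameSizeBytes, nullptr, GL_STREAM_READ);

// Per frame: with a PBO bound to GL_PIXEL_PACK_BUFFER, glReadPixels writes into
// the buffer (offset 0) instead of client memory, so the call can return early.
glBindFramebuffer(GL_FRAMEBUFFER, frameBufferPtr->getId());
glReadBuffer(GL_COLOR_ATTACHMENT0);
glBindBuffer(GL_PIXEL_PACK_BUFFER, pboColor);
glReadPixels(0, 0, frameBufferPtr->getWidth(), frameBufferPtr->getHeight(), GL_RGB, GL_FLOAT, nullptr);

// Later (ideally a frame behind), map the PBO and copy into the OIDN buffer.
void* src = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
std::memcpy(colorBufferPtr, src, frameSizeBytes);
glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);

The same would apply to the normal and albedo attachments, each with its own PBO.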

I didn’t build OIDN from source; I just took the precompiled binaries for Windows.
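
To sanity-check what those binaries actually are, I plan to print the version and any pending error right after committing the device. This uses the 1.x-style oidnGetDevice1i getter to match the rest of my code; "versionMajor" etc. are the device parameter names from the docs, as far as I can tell:

#include <cstdio>

// Report which OIDN version the precompiled binaries actually are.
int major = oidnGetDevice1i(g_denoiserDevice, "versionMajor");
int minor = oidnGetDevice1i(g_denoiserDevice, "versionMinor");
int patch = oidnGetDevice1i(g_denoiserDevice, "versionPatch");
std::printf("OIDN version: %d.%d.%d\n", major, minor, patch);

// Check for any error left pending by oidnCommitDevice or the filter setup.
const char* errorMessage = nullptr;
if (oidnGetDeviceError(g_denoiserDevice, &errorMessage) != OIDN_ERROR_NONE)
    std::printf("OIDN error: %s\n", errorMessage);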
Is it even possible to use OIDN for real-time path tracing? If so, how?

