Amplify Virtual Texturing for Unity Pro

After a year or so of on/off development, this project finally reached a reasonably mature alpha state. What I’ve been working on is a Sparse Virtual Texturing extension for Unity.

Official Product Page

Amplify for Unity Pro Demo from Insidious Technologies on Vimeo.

Current features:
- Virtual textures up to 512K x 512K.
- Seamless integration with Unity Editor.
- Real-time WYSIWYG editing.
- Per-material diffuse+coverage, normal and glossiness textures.
- Per-material textures larger than 4K x 4K.
- Texture repeat / tiling.
- Trilinear filtering.

This project was built in partnership with Zona Paradoxal, Lda.

Fluid Simulation

Just came across a very cool-looking fluid simulation using DX11/DirectCompute, by Jan Vlietinck. It solves the Navier-Stokes differential equations to simulate an incompressible fluid, using either a Semi-Lagrangian scheme or the second-order MacCormack technique.
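For context, here is a minimal CPU-side sketch of the semi-Lagrangian advection step at the heart of such a solver. This is my own illustration, not Vlietinck's code; the grid layout, helper names and time step are assumptions for the example.

```cpp
#include <vector>
#include <cmath>
#include <algorithm>

// Illustrative grid resolution; the demo uses a 200^3 volume.
const int N = 200;

struct Vec3 { float x, y, z; };

inline int idx(int i, int j, int k) { return (k * N + j) * N + i; }

// Trilinear sampling of a vector field at an arbitrary grid-space position.
Vec3 sampleTrilinear(const std::vector<Vec3>& field, float x, float y, float z)
{
    x = std::clamp(x, 0.0f, N - 1.001f);
    y = std::clamp(y, 0.0f, N - 1.001f);
    z = std::clamp(z, 0.0f, N - 1.001f);
    int i = (int)x, j = (int)y, k = (int)z;
    float fx = x - i, fy = y - j, fz = z - k;

    auto lerp = [](const Vec3& a, const Vec3& b, float t) {
        return Vec3{ a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t };
    };

    Vec3 c00 = lerp(field[idx(i, j, k)],         field[idx(i + 1, j, k)],         fx);
    Vec3 c10 = lerp(field[idx(i, j + 1, k)],     field[idx(i + 1, j + 1, k)],     fx);
    Vec3 c01 = lerp(field[idx(i, j, k + 1)],     field[idx(i + 1, j, k + 1)],     fx);
    Vec3 c11 = lerp(field[idx(i, j + 1, k + 1)], field[idx(i + 1, j + 1, k + 1)], fx);
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz);
}

// Semi-Lagrangian advection: trace each cell backwards along the velocity
// field and fetch the value found there. Unconditionally stable but diffusive;
// MacCormack adds a forward/backward pass to recover second-order accuracy.
void advect(const std::vector<Vec3>& vel, const std::vector<Vec3>& src,
            std::vector<Vec3>& dst, float dt)
{
    for (int k = 0; k < N; ++k)
        for (int j = 0; j < N; ++j)
            for (int i = 0; i < N; ++i) {
                Vec3 v = vel[idx(i, j, k)];
                dst[idx(i, j, k)] = sampleTrilinear(src, i - dt * v.x,
                                                         j - dt * v.y,
                                                         k - dt * v.z);
            }
}
```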

On the rendering side, ray marching through the 200x200x200 volume visualizes the magnitude of the maximum velocity vectors:

A demo, including source code, can be found at the author’s website:
http://users.skynet.be/fquake/

CryEngine 3 – Global Illumination

Crytek finally revealed details about their diffuse global illumination technique. Papers and videos can be found here.

It seems to be loosely based on irradiance volumes, instant radiosity and photon mapping. Part of this clever approach was presented by Alex Evans back at Siggraph 2006, where he discussed fast lighting approximations using irradiance slices. Anton Kaplanyan, who developed this technique at Crytek, went even further by improving quality and tackling scalability issues.

Instant radiosity works by using virtual point lights to approximate global illumination. To get decent quality out of IR, hundreds (if not thousands) of virtual point lights need to be generated. Kaplanyan opted for reflective shadow maps, a GPU-friendly way of generating VPLs. Due to high fillrate demands, however, even deferred lighting wasn't fast enough for the huge number of VPLs required. This is where light propagation volumes (or radiance volumes) came in very handy.

VPL SH-based radiance information is injected into the radiance volumes using point-based rendering. Light is then propagated through the volume using the computed outgoing radiance flux. During the lighting pass the volume textures can be sampled directly, anywhere in the scene, to generate the lighting contribution at each point from the SH coefficients. Check the paper for a detailed explanation of the process.
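As a rough illustration of the injection and evaluation steps, here is a simplified sketch of my own (not Crytek's code). It uses the standard two-band SH projection of a clamped cosine lobe; a real implementation works per colour channel and renders the VPLs as points into the volume textures.

```cpp
struct Vec3 { float x, y, z; };
struct SH4  { float c[4]; }; // two-band (4-coefficient) spherical harmonics

// Project a clamped cosine lobe oriented along 'n' into two-band SH.
// 0.886227 = sqrt(pi)/2 and 1.023327 = sqrt(pi/3) are the usual lobe constants.
SH4 cosineLobeSH(const Vec3& n)
{
    return SH4{ { 0.886227f, 1.023327f * n.y, 1.023327f * n.z, 1.023327f * n.x } };
}

// Inject one VPL (e.g. a reflective shadow map texel) into its grid cell by
// accumulating its flux-weighted cosine lobe.
void injectVPL(SH4& cell, const Vec3& vplNormal, float vplFlux)
{
    SH4 lobe = cosineLobeSH(vplNormal);
    for (int i = 0; i < 4; ++i)
        cell.c[i] += vplFlux * lobe.c[i];
}

// During the lighting pass, irradiance at a point with normal 'n' is
// approximated by the dot product of the cell's SH vector with the cosine
// lobe for 'n' (again, per colour channel in practice).
float evaluateCell(const SH4& cell, const Vec3& n)
{
    SH4 lobe = cosineLobeSH(n);
    float e = 0.0f;
    for (int i = 0; i < 4; ++i)
        e += cell.c[i] * lobe.c[i];
    return e > 0.0f ? e : 0.0f;
}
```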

Normal-mapped surfaces, and even glossy reflections, are supported. Cascaded volumes are used when dealing with larger scenes. To improve quality for local, higher-frequency detail, this approach is combined with screen-space global illumination.

The performance, even on consoles, is very good and fairly stable due to the nature of the technique. Quality should scale very well with memory/hardware.

All in all, a very fast, current-gen, console-friendly approach to diffuse GI for dynamic scenes.

Last day at Splash Damage

Yesterday was my last day at Splash Damage. It’s been a great ride, but it’s time for me to move on. I miss my girlfriend, my family and the warmth of the motherland, and I feel an infinite urge to dedicate my peak productive years to personal endeavours. I’ve learned a lot over the past year and I would like to personally thank everyone at SD: I’m sure Brink will become a kick-ass game, thanks to all your effort.

That said, I am currently “on holiday”, arranging my move back to Portugal. My personal work is on hold; I’ll get back to it as soon as I have some stability.

Feel free to contact me if you’re interested in discussing business opportunities.

Raytracing

Over the last couple of weeks I began researching high-quality global illumination rendering. I finally started reading “Physically Based Rendering” by Pharr et al., which was kindly donated by Luis Alvarado. A few months ago I devoured “Realistic Image Synthesis Using Photon Mapping” by Jensen; it’s an excellent book. I’ve been wanting to do this for a while now because GI is becoming relevant to real-time rendering at an accelerating pace.

The first step was to build a framework to read and process scene information. I decided to go with FBX; it turns out the SDK is a bit dodgy but functional. The next step was to build an acceleration structure and a raytracing core, which is what I’m working on right now. Here’s the first image, with basic diffuse lighting and shadows from a point light:

cornellbox_basic

It took around 3.8 seconds at 140K rays/second (including shading). I have my Centrino Duo mobile CPU downclocked to 1 GHz because of overheating; this will force me to optimize the raytracing core before I can start investing time in something like path tracing. The next step is to replace the octree with a kd-tree or BVH, with both mono (single-ray) and packet traversal. My goal is to achieve a minimum of 2 million rays/second per core on the same scene.
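For reference, the per-hit shading being measured above boils down to something like the following. This is a simplified sketch of my own; the intersect() stub stands in for whatever acceleration structure sits underneath, and the names are illustrative.

```cpp
#include <cmath>
#include <algorithm>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return { x - o.x, y - o.y, z - o.z }; }
    Vec3 operator*(float s)       const { return { x * s, y * s, z * s }; }
};
inline float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline float length(const Vec3& v)             { return std::sqrt(dot(v, v)); }

struct Ray { Vec3 origin, dir; };
struct Hit { Vec3 position, normal, albedo; };

// Stub: in the real tracer this traverses the acceleration structure
// (currently an octree, later a kd-tree or BVH) and returns the nearest
// intersection closer than maxT.
bool intersect(const Ray& ray, float maxT, Hit* hit)
{
    (void)ray; (void)maxT; (void)hit;
    return false;
}

// Basic Lambertian shading with a hard shadow from a single point light.
Vec3 shade(const Hit& hit, const Vec3& lightPos, const Vec3& lightIntensity)
{
    Vec3 toLight = lightPos - hit.position;
    float dist   = length(toLight);
    Vec3 L       = toLight * (1.0f / dist);

    // Shadow ray: offset the origin slightly along the normal to avoid
    // self-intersection ("shadow acne").
    Ray shadowRay{ { hit.position.x + hit.normal.x * 1e-4f,
                     hit.position.y + hit.normal.y * 1e-4f,
                     hit.position.z + hit.normal.z * 1e-4f }, L };
    if (intersect(shadowRay, dist - 1e-4f, nullptr))
        return { 0.0f, 0.0f, 0.0f };

    float nDotL = std::max(0.0f, dot(hit.normal, L));
    float k     = nDotL / (dist * dist); // point light falloff
    return { hit.albedo.x * lightIntensity.x * k,
             hit.albedo.y * lightIntensity.y * k,
             hit.albedo.z * lightIntensity.z * k };
}
```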

Real-time photorealistic rendering

Randomcontrol, the makers of fryrender, recently announced “fryrenderRT”, the first commercial unbiased render engine with real-time visualization capabilities. They provide a player app that lets you move the camera freely within a pre-computed scene while view-dependent glossy surfaces still behave accurately; pre-computing a scene with unbiased lighting may take hours or even days.

Here’s a sneak peek showing off glossy materials:

From what I’ve gathered, it seems to work by storing global lighting information at each point (vertex/texel), much like the Precomputed Radiance Transfer techniques currently used in games. These tend to focus on spherical harmonic encoding due to its efficient representation of low-frequency lighting; it’s possible that RC’s technique is similar but based instead on encoding nonlinear wavelets or Gaussians, which have been shown to work well for all-frequency relighting.
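If it really is PRT-like, the runtime cost per texel would be on the order of a small dot product. Here is a toy sketch of that idea, purely speculative on my part and in no way fryrenderRT's actual format: baked two-band SH coefficients reconstructed at runtime.

```cpp
#include <array>
#include <algorithm>

// Two-band SH coefficients stored per vertex or texel at bake time, one set
// per colour channel. Purely illustrative layout.
struct BakedSH {
    std::array<float, 4> r, g, b;
};

// Evaluate the two-band SH basis in direction (x, y, z).
std::array<float, 4> shBasis(float x, float y, float z)
{
    return { 0.282095f,        // Y0,0
             0.488603f * y,    // Y1,-1
             0.488603f * z,    // Y1,0
             0.488603f * x };  // Y1,1
}

// Runtime reconstruction: a dot product between the baked coefficients and
// the basis evaluated in the normal (or view) direction, per channel.
void reconstruct(const BakedSH& sh, float x, float y, float z, float rgb[3])
{
    std::array<float, 4> basis = shBasis(x, y, z);
    rgb[0] = rgb[1] = rgb[2] = 0.0f;
    for (int i = 0; i < 4; ++i) {
        rgb[0] += sh.r[i] * basis[i];
        rgb[1] += sh.g[i] * basis[i];
        rgb[2] += sh.b[i] * basis[i];
    }
    for (int c = 0; c < 3; ++c)
        rgb[c] = std::max(0.0f, rgb[c]);
}
```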

What does this mean for games?
- The video above runs on a mid-range ATI 4850 at 20 frames per second, which is VERY promising.
- The apparent restriction to static geometry is a drawback, but it is also a limitation of light maps.
- High offline rendering times are already a reality when developing games that rely on high-quality directional or SH-based lightmaps. However, a biased version of the offline renderer could help reduce hardware costs and still generate good-enough results for most games.
- Local memory requirements are certainly much higher than with lightmaps. However, more compact representations trading quality for memory footprint could be used, e.g. partial wavelet coefficients.
- Local memory on GPUs will eventually outpace or be merged with system RAM. A virtual page-based approach could also be used to keep most of the data on disk.

I believe this technique has potential for use in games in the near future. It could very well be the next generation of light maps.

Instant Radiosity

I’ve been following the state of real-time Instant Radiosity for a while now. I see it as one of the best options for a GPU-friendly transition to more complex global illumination techniques like photon mapping. Unfortunately, I never got around to implementing it myself to verify its feasibility. Today I stumbled upon one of the best examples so far:

Flavien Brebion, aka Ysaneya, is the guy behind Infinity; he is also behind the bold attempt at Instant Radiosity in the shots above. In his journal he talks about the technique and provides some statistics on its effectiveness when combined with deferred lighting. Follow this link to read all about it.
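The core of any such deferred Instant Radiosity pass is a gather loop like the one below, over G-buffer samples and VPLs. This is my own hedged sketch, not Ysaneya's implementation; the structure names and distance clamp are illustrative.

```cpp
#include <vector>
#include <cmath>
#include <algorithm>

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return { x - o.x, y - o.y, z - o.z }; }
};
inline float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// A virtual point light, e.g. generated by sampling a reflective shadow map.
struct VPL { Vec3 position, normal, flux; };

// One G-buffer sample (position and normal reconstructed in the deferred pass).
struct Surfel { Vec3 position, normal, albedo; };

// Accumulate the indirect diffuse contribution of all VPLs at one pixel.
// Real implementations render the VPLs as light volumes on the GPU and clamp
// the distance term to avoid the classic IR singularity near each VPL.
Vec3 indirectLighting(const Surfel& s, const std::vector<VPL>& vpls)
{
    Vec3 result{ 0.0f, 0.0f, 0.0f };
    for (const VPL& vpl : vpls) {
        Vec3 d = vpl.position - s.position;
        float dist2   = std::max(dot(d, d), 0.01f);   // clamp near-field blowup
        float invDist = 1.0f / std::sqrt(dist2);
        Vec3 L{ d.x * invDist, d.y * invDist, d.z * invDist };

        float cosSurf = std::max(0.0f,  dot(s.normal, L));
        float cosVpl  = std::max(0.0f, -dot(vpl.normal, L)); // VPL emits over its hemisphere

        float w = cosSurf * cosVpl / dist2;
        result.x += s.albedo.x * vpl.flux.x * w;
        result.y += s.albedo.y * vpl.flux.y * w;
        result.z += s.albedo.z * vpl.flux.z * w;
    }
    return result;
}
```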

Compilation woes

These past few days I’ve been patiently experimenting with Mono. Getting libmono to compile in Visual Studio was straightforward. However, if you’re compiling from the trunk, be aware that the Win32 build gets broken every once in a while.

Compiling the class libraries, however, is a different kind of challenge. After installing Cygwin and following all the steps in this tutorial, I get compiler errors for undefined symbols in a few libraries. Of course, if the Mono daily builds were operational I wouldn’t have to compile them myself.

In the meantime I’ve just downloaded the Unity 2.5 trial to see what all the fuss is about. It uses Mono as a scripting platform, something I’m looking to achieve for my development framework.
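For anyone curious what “Mono as a scripting platform” boils down to on the native side, here is the gist of the embedding API. This is a minimal sketch; the assembly, namespace, class and method names ("Scripts.dll", Game.Entry.Update) are made up for the example.

```cpp
#include <mono/jit/jit.h>
#include <mono/metadata/assembly.h>
#include <mono/metadata/class.h>
#include <mono/metadata/object.h>

int main()
{
    // Initialise the runtime; the string names the root application domain.
    MonoDomain* domain = mono_jit_init("scripting");

    // Load a compiled script assembly ("Scripts.dll" is just an example name).
    MonoAssembly* assembly = mono_domain_assembly_open(domain, "Scripts.dll");
    if (!assembly)
        return 1;
    MonoImage* image = mono_assembly_get_image(assembly);

    // Look up Game.Entry.Update() and invoke it (static method, no arguments).
    MonoClass*  klass  = mono_class_from_name(image, "Game", "Entry");
    MonoMethod* method = klass ? mono_class_get_method_from_name(klass, "Update", 0) : nullptr;
    if (method)
        mono_runtime_invoke(method, nullptr, nullptr, nullptr);

    mono_jit_cleanup(domain);
    return 0;
}
```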

Final Skin shader

As promised by O2, my broadband internet was activated on Friday. I’m done with the skin shader but still haven’t given up on Mono.

I took the liberty of capturing screenshots of the several steps required for the final composition. The only step I didn’t bother to implement was translucent shadow maps, which are used to simulate light transfer through thin regions such as ears. The RenderMonkey project is available here; it includes the media files created by johny.

The diffusion profiles:

diffusion_profiles1

The composition of diffuse, sum of diffusion profiles and specular:

composition_steps
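In code terms, that composition step amounts to a weighted sum of the pre-blurred irradiance maps (one per diffusion profile), modulated by albedo, with specular added on top. The sketch below is a simplified CPU-style illustration of mine; the weights are placeholders, not the values used in the RenderMonkey project.

```cpp
#include <array>

struct RGB { float r, g, b; };

// Per-profile blend weights: each blurred irradiance map (one per diffusion
// profile / blur width) contributes differently to each colour channel.
// These numbers are placeholders, not the ones in the actual shader.
const int kNumProfiles = 4;
const std::array<RGB, kNumProfiles> kProfileWeights = { {
    { 0.40f, 0.45f, 0.55f },
    { 0.30f, 0.30f, 0.25f },
    { 0.20f, 0.15f, 0.12f },
    { 0.10f, 0.10f, 0.08f },
} };

// Final composition for one texel: weighted sum of the diffusion profiles,
// modulated by the diffuse albedo, plus the specular term.
RGB composeSkin(const std::array<RGB, kNumProfiles>& blurredIrradiance,
                const RGB& albedo, const RGB& specular)
{
    RGB diffuse{ 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < kNumProfiles; ++i) {
        diffuse.r += kProfileWeights[i].r * blurredIrradiance[i].r;
        diffuse.g += kProfileWeights[i].g * blurredIrradiance[i].g;
        diffuse.b += kProfileWeights[i].b * blurredIrradiance[i].b;
    }
    return { albedo.r * diffuse.r + specular.r,
             albedo.g * diffuse.g + specular.g,
             albedo.b * diffuse.b + specular.b };
}
```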

Comparison of a basic shader (left) with the skin shader (right):

comparison21 comparison3 comparison1

Final result in high resolution:

skinbigspec

Note that the seams are back: adding support for shadow maps basically killed my initial workaround. This issue could be addressed through the use of a spherical parameterization.

In the meantime

It’s been a few days now since I finally purchased an iPhone. Even after six months in the UK, for some reason, I wasn’t eligible for an iPhone contract with O2, but I still managed to set up a phone line with BT and broadband with... yes... O2. I had no choice but to purchase a Pay and Go version, and I have to say it is worth every penny.

I haven’t been working much at home lately, but the skin shader is practically done. If I get broadband by the end of the week, I shall roll out the RenderMonkey project (yes, I decided to stick with it) this weekend.
