I’m a rendering engineer and have been working on AAA game graphics for many years.
However, this project’s focus is not rendering quality at all. We want to make it fun and comfortable first. Everything is WIP, stay tuned.
[Play test download] Currently, only Wave 1 - Wave 4 are playable.
[Control]
We don’t know what the VR controller will look like yet; we just want to avoid traditional controls as much as we can.
Only one button is required to play this game: Mouse LB or Xbox controller A.
Use your head rotation for aiming.
Mouse rotation works, but it is only for debug purposes. Don’t use it to play the game; it will feel uncomfortable.
We have a random variable $X$ with pdf (probability density function) $p(x)$, and we know

$$E[f(X)] = \int f(x)\,p(x)\,dx$$

According to the Monte Carlo estimator, if we want to evaluate $E[f(X)]$ by samples from $p(x)$, we get

$$F_N = \frac{1}{N}\sum_{i=1}^{N} f(x_i)$$

where $x_i$ is sampled from $p(x)$.
Let’s say we want to (or can only) sample from another distribution function $q(x)$, but we still want to get the value by the Monte Carlo method. What should we do?
For clarity, we use another random variable $Y$ with pdf $q(x)$.
A little trick:

$$E[f(X)] = \int f(x)\,p(x)\,dx = \int \frac{f(x)\,p(x)}{q(x)}\,q(x)\,dx$$

It looks like we want to evaluate the function $g(x) = \frac{f(x)\,p(x)}{q(x)}$ by samples from $q(x)$. Using the new random variable $Y$, we have

$$E[g(Y)] = \int \frac{f(x)\,p(x)}{q(x)}\,q(x)\,dx = E[f(X)]$$

According to Monte Carlo like above, we should have

$$G_N = \frac{1}{N}\sum_{i=1}^{N} g(y_i)$$

so we have

$$E[f(X)] \approx \frac{1}{N}\sum_{i=1}^{N} \frac{f(y_i)\,p(y_i)}{q(y_i)}$$

where $y_i$ is sampled from $q(x)$.
This is what importance sampling does: it uses samples from another distribution function, but still evaluates the old function for the old random variable.
2. Why importance sampling can do better
Let’s calculate the variance of $G_N$:

$$\mathrm{Var}[G_N] = \frac{1}{N}\left(\int \frac{f^2(x)\,p^2(x)}{q(x)}\,dx - E[f(X)]^2\right)$$

So if we select $q(x)$ smartly (ideally $q(x) \propto f(x)\,p(x)$), we can get a much smaller variance.
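As a sanity check (a toy example of my own, not from the derivation above): below we estimate $\int_0^1 x^2\,dx = 1/3$ two ways, with plain Monte Carlo sampling from the uniform pdf $p(x)=1$, and with importance sampling from $q(y)=2y$, whose samples can be drawn by inverse-CDF as $y=\sqrt{U}$.

```python
import math
import random

random.seed(42)

def f(x):
    return x * x                      # integrand; E_p[f(X)] = 1/3 for p(x) = 1 on [0,1]

N = 100_000

# Plain Monte Carlo: x_i sampled from p(x) = 1 on [0,1]
plain = [f(random.random()) for _ in range(N)]

# Importance sampling: y_i sampled from q(y) = 2y on [0,1] via inverse CDF,
# y = sqrt(U); using 1 - random() keeps y strictly positive.
# Each sample is weighted by p(y) / q(y) = 1 / (2y).
weighted = []
for _ in range(N):
    y = math.sqrt(1.0 - random.random())
    weighted.append(f(y) / (2.0 * y))

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

est_p, var_p = mean(plain), variance(plain)        # estimate ~1/3, per-sample variance ~4/45
est_q, var_q = mean(weighted), variance(weighted)  # estimate ~1/3, per-sample variance ~1/72
```

With the same number of samples, both estimators agree with the analytic value $1/3$, but the importance-sampled one has a per-sample variance roughly 6x smaller, because $q(y)=2y$ puts more samples where $f(x)\,p(x)$ is large.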
In the original VSM paper, William Donnelly and Andrew Lauritzen derived the MAGIC result from probability theory (a Chebyshev-style inequality):

$$P(z \ge d) \le p_{max}(d) = \frac{\sigma^2}{\sigma^2 + (d-\mu)^2}, \qquad \mu = E[z],\; \sigma^2 = E[z^2] - E[z]^2$$

The derivation looks very beautiful, but it is not so easy to understand, especially the idea of treating the depth $z$ as a random variable. So here I’ll try to explain it with middle-school math; then we can kill the uncomfortable MAGIC feeling.
Firstly, let’s go back to the problem VSM was invented for: soft shadows.
In the real world, soft shadows are generated with a physical explanation (e.g. area lights),
but visually, a soft shadow just means a smooth transition between the shadowed area and the unshadowed area.
So there are many methods to fake it in computer graphics, and VSM is just one of them (note: even with beautiful math, VSM is not a physically correct algorithm at all).
In the above graph, we have 2 neighboring texels in the shadow map corresponding to 2 light rays; their shadow depth values are $Z_1$ and $Z_2$.
According to the shadow map algorithm, we know pixel A is in shadow and pixel B is lit. What we care about is the area between A and B.
If we apply the regular shadow test, a sudden change in shadow density must happen somewhere between A and B. Even if we use a bilinear filter to fetch the shadow depth, the smooth input value doesn’t generate a smooth output value, because the shadow test result is a binary value. That is why we say a shadow map is not filterable.
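A tiny sketch of the problem (the depth values here are made up): even if we linearly interpolate the stored depth between the two texels, the binary comparison still snaps every output to 0 or 1:

```python
def lerp(a, b, t):
    return (1.0 - t) * a + t * b

def shadow_test(receiver_depth, map_depth):
    # the regular shadow test: a binary comparison
    return 0.0 if receiver_depth > map_depth else 1.0

Z1, Z2 = 0.3, 0.7                    # made-up depths of the two texels
d = 0.5                              # receiver depth between A and B

# bilinear-style filtering of the depths, then the binary test:
results = [shadow_test(d, lerp(Z1, Z2, i / 10.0)) for i in range(11)]
# every entry is exactly 0.0 or 1.0; filtering the inputs did not soften the edge
```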
Let’s see what will happen if we apply the MAGIC shadow testing formula from VSM.
First, let’s introduce an interpolation variable $t$; $t$ changes linearly from 0 to 1 as we move from pixel A to pixel B.
Then with linear filtering, between A and B, we can write:

$$M_1(t) = (1-t)Z_1 + tZ_2, \qquad M_2(t) = (1-t)Z_1^2 + tZ_2^2$$

So

$$\sigma^2(t) = M_2(t) - M_1(t)^2 = t(1-t)(Z_2 - Z_1)^2$$

Here, we define:

$$D(t) = d(t) - M_1(t), \qquad P_{shadow}(t) = \frac{\sigma^2(t)}{\sigma^2(t) + D(t)^2}$$

where $d(t)$ is the receiver depth at interpolation position $t$, and $M_1$, $M_2$ are the two linearly filtered moments stored in the variance shadow map.
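The filtered moments and the closed form for $\sigma^2(t)$ above can be verified numerically (the depth values here are arbitrary):

```python
def moments(Z1, Z2, t):
    # linear filtering of the two values stored in the variance shadow map
    M1 = (1.0 - t) * Z1 + t * Z2            # filtered depth E[z]
    M2 = (1.0 - t) * Z1 ** 2 + t * Z2 ** 2  # filtered squared depth E[z^2]
    return M1, M2

Z1, Z2 = 0.3, 0.7                           # made-up depths of the two texels
for i in range(11):
    t = i / 10.0
    M1, M2 = moments(Z1, Z2, t)
    sigma2 = M2 - M1 * M1
    # matches the closed form sigma^2(t) = t(1-t)(Z2-Z1)^2
    assert abs(sigma2 - t * (1.0 - t) * (Z2 - Z1) ** 2) < 1e-12
```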
Still looks like a mess, right?
Let’s see 2 special cases :
When the receiver surface is perpendicular to the light direction (see receiver Rp):
For this scenario, we can easily know $d(t) = Z_2$, a constant (between $t=0$ and $t=1$), so

$$D(t) = Z_2 - M_1(t) = (1-t)(Z_2 - Z_1)$$

Then the final result:

$$P_{shadow}(t) = \frac{t(1-t)(Z_2-Z_1)^2}{t(1-t)(Z_2-Z_1)^2 + (1-t)^2(Z_2-Z_1)^2} = t$$
So for this case, the shadow result follows the simplest smooth transition as $t$ changes from 0 to 1: a linear transition.
When the shadow receiver lies on interp(z), i.e. on $M_1(t)$ itself (see the red dashed line in the above picture):
According to

$$D(t) = d(t) - M_1(t) = 0$$

we can know immediately:

$$P_{shadow}(t) = \frac{\sigma^2(t)}{\sigma^2(t) + 0} = 1$$
Combining the two special cases, and the property that $P_{shadow}$ is a monotonically decreasing function of $D$ (see Appendix):
For the area between interp(z) (the red dashed line) and receiver Rp, we have:

$$0 < D(t) < (1-t)(Z_2 - Z_1) \quad\Rightarrow\quad t < P_{shadow}(t) < 1$$

We can say $P_{shadow}$ changes monotonically between $t$ and 1 as $D$ decreases; it’s a smooth transition too.
For the portion below Rp, the situation is the same, but with $P_{shadow} < t$.
From this result, we can say shadow map filtering starts to make sense, since an interpolated shadow depth can generate a shadow value between 0 and 1. That’s why we say VSM is filterable.
We can go one step further: if we blur the shadow map texture, the same transition effect expands from 2 neighboring shadow depth texels to more adjacent texels, which results in an even smoother shadow transition. That’s why we say VSM is pre-filterable.
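Putting the formula together, a small sketch (with made-up depths $Z_1$, $Z_2$) reproduces both special cases: a receiver at constant depth $Z_2$ gets an exactly linear transition, and a receiver lying on the interpolated depth stays fully lit:

```python
def p_shadow(Z1, Z2, t, d):
    # VSM shadow test on linearly filtered moments (result 1 = fully lit)
    M1 = (1.0 - t) * Z1 + t * Z2
    M2 = (1.0 - t) * Z1 ** 2 + t * Z2 ** 2
    var = M2 - M1 * M1
    D = d - M1
    if var == 0.0:                        # degenerate case at t = 0 or t = 1
        return 1.0 if D <= 0.0 else 0.0
    return var / (var + D * D)

Z1, Z2 = 0.3, 0.7                         # made-up depths of the two texels
ts = [i / 10.0 for i in range(1, 10)]

# receiver perpendicular to the light, at constant depth Z2: exactly linear in t
linear = [p_shadow(Z1, Z2, t, Z2) for t in ts]

# receiver lying on the interpolated depth M1(t): D = 0, so P_shadow = 1
on_line = [p_shadow(Z1, Z2, t, (1.0 - t) * Z1 + t * Z2) for t in ts]
```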
Appendix. Check the original definition:

$$P_{shadow} = \frac{\sigma^2}{\sigma^2 + D^2}$$

If $t$ is fixed, then $\sigma^2(t)$ and $M_1(t)$ are both fixed, so if $d$ is increasing, then $D = d - M_1$ is increasing, and $P_{shadow}$ is decreasing (for $D > 0$).
Then we know $P_{shadow}$ is a monotonically decreasing function of $D$.
PS. The original VSM paper (“Variance Shadow Maps”) can be downloaded at:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.104.2569&rep=rep1&type=pdf
Actually, artists (especially shader/FX artists) need a profiler too. Unlike programmers, who always want to find the reason behind bad performance, what artists need is a tool that can verify their ideas quickly: if the performance doesn’t improve, they go for another idea.
Obviously, tools like PIX can’t help them here.
When I was working on Tom Clancy’s EndWar, there was a big complaint raised by the particle effect artists and shader artists: it isn’t easy to know how content performs on the console, and a profiler like PIX isn’t powerful enough and isn’t suitable for artists. There are several big problems with the current profiling workflow:
Step 1. Artist modifies data (shader/particle system)
Step 2. Cook the data into Xbox 360 format (usually slow)
Step 3. Launch the game on the console and find the content we want to profile (slow)
Step 4. Capture the scene
Step 5. Analyze the capture and find the draw call (slow)
Then if the artist changes the data again, we need to repeat steps 1-5. The whole process takes so long that no artist wants to do it.
However, this is very important for artists optimizing data. Both the Shader Editor and the Particle Editor are highly flexible; there are too many parameters and options, and after combining them, even an engineer can’t predict what the performance is going to be. The only way is to profile it, but sadly, that is too painful.
For these reasons, even when artists want to optimize the data, it’s too difficult, so all optimization/analysis tasks went to the programmer side. Apparently, engineers can’t do much of it either; the efficiency is low…
For some reasons, we never had a chance to do it in EndWar, so I put the idea into the next project. I’ll show some of the design here. The profiler has two parts.
1) On-The-Fly Profiler: a separate app that we run on the console (Xbox 360) while we are editing content. It supports:
Actually, there is no real tech blocker; in the end, my shader profiler looks like this:
With this method, artists really get “editing on the fly and profiling on the fly”: the profiler automatically shows the result on the console screen.
2) In-Engine Profiler for batch processing: create test cases in the game engine and finish the batch testing in the nightly build, so we can find the most expensive/problematic content quickly.
In my implementation, the particle profiling results look like this:
The shader log tool output looks like this. Basically, it’s a txt file containing performance data; you can use any tool to sort and analyze it.
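Since the log is just a text file, sorting it needs only a few lines. A minimal sketch, assuming a hypothetical CSV layout (the column names `shader`, `draws`, `gpu_ms` are made up for illustration; the real layout depends on your exporter):

```python
import csv
from io import StringIO

# hypothetical log contents; the real column names depend on your exporter
SAMPLE_LOG = """\
shader,draws,gpu_ms
particle_add.fx,120,3.42
skin_spec.fx,45,1.10
water.fx,8,5.73
"""

def top_expensive(log_text, n=2):
    # parse the txt/csv dump and return the n most expensive shaders
    rows = list(csv.DictReader(StringIO(log_text)))
    rows.sort(key=lambda r: float(r["gpu_ms"]), reverse=True)
    return [(r["shader"], float(r["gpu_ms"])) for r in rows[:n]]

worst = top_expensive(SAMPLE_LOG)   # [("water.fx", 5.73), ("particle_add.fx", 3.42)]
```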
There is nothing very difficult about the techniques, but there are 2 points that may save you some time.
Unlike a traditional particle system, which calculates everything on the fly, we have a pre-calculation stage (baking).
In the baking stage, since we don’t need to worry about CPU performance, we can do a bunch of things that the real-time system can’t.
After baking, we record the necessary information into a vertex stream and play it back at runtime.
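The baking idea can be sketched in a few lines (a toy 1D bounce of my own, not the real system; the real implementation packs the baked frames into a vertex stream rather than a Python list):

```python
def bake_bounce(frames, dt=1.0 / 30.0, gravity=-9.8, restitution=0.6):
    # offline simulation: we can afford per-frame collision tests here
    positions = []
    y, vy = 5.0, 0.0
    for _ in range(frames):
        vy += gravity * dt
        y += vy * dt
        if y < 0.0:                  # hit the ground plane
            y = 0.0
            vy = -vy * restitution   # bounce; this cost is paid once, at bake time
        positions.append(y)
    return positions

BAKED = bake_bounce(300)             # one float per frame; the real system packs
                                     # position/rotation/etc. into a vertex stream

def playback(frame):
    # runtime "simulation" is just an indexed read
    return BAKED[min(frame, len(BAKED) - 1)]
```

At runtime, however complex the baked behavior was (collisions, multi-bounce), the per-frame cost is constant: a stream fetch.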
Here is an example video of complex collision effects.
[demo video: Crazy bounce]
[demo video: Multi-sprite lighting]
The following is the PPT from when I introduced this tech to my team internally. It gives a briefing on how the tech works; I’d like to share it with you guys.
PPT download
What I share here is for an open-space shooter engine, especially designed for Xbox 360.
It supports:
I share the PPT here; it comes from one of the internal presentations I did before. The target audience is game developers (it also introduces some basic concepts to help non-graphics programmers understand).
Hope it can help with your own shadow system.
Download PPT Cascade VSM Shadow
For the current generation of game consoles, all the vendors chose PowerPC as the CPU architecture.
Even though the PowerPC architecture is clearer and more understandable than x86, it’s still complex if you want to know everything.
Fortunately, for most cases, all we need to know is: the CPU CACHE.
This is a presentation I did internally in our team; it explains some terms and concepts:
Click the following image to see the slides. (http://portal.sliderocket.com/AOQDX/PowerPC-CPU-cache)