Fast direct and deferred atmosphere rendering with GLSL 4.1+
Volumetric rendering is typically slow. Sending raycasts through voxel fields, upsampling the results, and merging them with the final image is costly. I present two methods with reasonable computation times and believable results.
Direct Rendering
The direct rendered atmosphere is based on billboard sprites. This means that many small 2D images are drawn to fake the effect of smoke particles. I’m using several different textures to create these images.
(Compressed) Irradiance maps
Irradiance maps are used to receive the emissions from indirect (non-scene-graph) light sources. They store the Lambertian part of a physically based calculation.
Noise
Noise maps are used to simulate randomness. However, the values (the shades of gray 🙂 ) aren’t completely independent and can be used to simulate chaotic phenomena like clouds or terrain.
Gauss
To calculate the likelihood of entangled events, the Gauss curve can be used.
Texture Creation
Usually all textures are created at runtime by my library. For example, the irradiance map is part of a light probe and is created with:

renderLoop.addLightProbe(probePosition); // probePosition: Vector3f
Light probes are part of the scene graph and are bound to the nearest objects.
All images are also rendered in the HDR color range (therefore, the images presented in this post are gamma corrected).
I added a small plotter library for the pattern generation. The gauss curve was created like this:

for (uint y = 0; y < h; ++y) {
    for (uint x = 0; x < w; ++x) {
        const float boundary = 3.16f;
        const float laplaceMax = 0.707107f;
        const float blr = boundary / laplaceMax;
        const float xR = YAProbability.mapToProbRange(x, 0.0f, w);
        const float yR = YAProbability.mapToProbRange(y, 0.0f, h);
        Vector2f dp = Vector2f(xR, yR);
        Vector2f cp = Vector2f(0.5f, 0.5f);
        const float d = dp.distanceTo(cp) * blr;
        const float gSA = YAProbability.normcdf(d, 0.0f, 1.0f); // mean, deviation
        const float gSB = YAProbability.normcdf(d, 0.0f, 0.7f);
        const float gSC = YAProbability.normcdf(d, 0.0f, 0.5f);
        // HDR Stuff
        Rgba &p = px[y][x];
        p.r = gSA;
        p.g = gSB;
        p.b = gSC;
        p.a = 0;
    }
}
It is also possible to use programming languages like Julia or GNU Octave for texture generation.
The Vertex Shader
As already mentioned, the atmosphere particles are rendered as camera-view-aligned planes. I learned a nice trick from Datenwolf to get rid of the annoying normal/eye-direction alignment:

// undo the rotation
vec3 undoRot =
    (clientPosition.x * clientRatio) * vec3(vsMVP[0][1], vsMVP[1][1], vsMVP[2][1]) +
    clientPosition.y * vec3(vsMVP[0][0], vsMVP[1][0], vsMVP[2][0]);
vsPosition = (vsModel * vec4(undoRot, 1.0)).xyz;
The mesh rotation and scale are extracted from the model-view-projection matrix and reversed.
Alternatively, if your engine gives you screen-space access, you can use those coordinates directly:

gl_Position = vec4(
    (((clPosition.x + clTranslation.x) / clScreenSize.x) - 0.5) * 2,
    (((clPosition.y + clTranslation.y) / clScreenSize.y) - 0.5) * 2,
    0, 1);
The Normal
The plane normal can immediately be derived from the texture coordinates:

vec3 fakedNorm = vec3(-1.0f * (vsTextureCoord.y - 0.5f), (vsTextureCoord.x - 0.5f), 0);
The Fragment Shader
First, I limit the surface to a sphere:

float r = length(vsTextureCoord - 0.5f);
if (r > 0.5f)
    discard;
With the surface normal and camera position it is already possible to render a virtual sphere (the volume where the atmosphere simulation starts):
To calculate the likelihood of a light-scattering event, the Gauss value is used:

float gauss = texture(tMAP_00, vsTextureCoord).r; // pick the channel with the desired deviation
Finally, the color is created with a Phong ambient/diffuse calculation:

FragColor = vec4(ambient + diffuse * (1.0f - (gauss + noiseInfluence)), eta * gauss * noiseInfluence);
Deferred Rendering
The advantage of deferred over direct rendering is the possibility of screen-space raycasting, and with it the use of depth fields and ray/object intersections. This allows the atmosphere to interact directly with all other visible scene elements: the depth and distances of objects can be taken into account.
Example of a deferred rendered atmosphere
In this example the clouds are partly occluded by the sphere. Light scattering is visible at the sphere borders:
The Vertex Shader
The vertex shader just covers the screen plane. The texture coordinates are normalized:

void main() {
    vsTextureCoord = clientPosition * 0.5f + 0.5f;
    gl_Position = vec4(clientPosition, 1, 1);
}
If necessary, the inverse projection matrix (GLSL: inverse()) can be used to calculate the world-space positions of screen pixels.
The Fragment Shader
Instead of aligned planes, the fragment shader now calculates the density of a depth field. Spheres are described by their center position and dimensions.
If you know the PBRT Raytracer you will immediately recognize my implementation of a sphere:

void sphereLight(vec3 org, vec3 dir, out float near, out float far) {
    near = 0;
    far = 0;
    float b = dot(dir, org);
    float c = dot(org, org) - 0.25;
    float delta = b*b - c;
    if (delta < 0.0)
        return;
    float deltasqrt = sqrt(delta);
    far = -b + deltasqrt;
    far *= float(far > 0.0f);
    near = clamp(-b - deltasqrt, 0.0f, far);
}
Screen-Space Refractions
If a ray hits an atmosphere, its properties can be used to calculate a refraction:

vec3 bendRay = refract(workEyeDir, vsNormal, material.glass);
Finally, this result can be mixed with a volume noise function.