https://oseiskar.github.io/webgl-raytracer/
The idea is to create a framework for composing different WebGL raytracers from shared pieces of GLSL fragment shader code. Different combinations of rendering methods, material models, cameras and data structures can be used.
Utilizes my glsl-bench library (included as a submodule) and TWGL for the WebGL boilerplate, that is, everything not related to "worlds with two triangles".
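To illustrate the composition idea (the function names below are hypothetical, not the actual interface of this repository): the renderer core can be written against a few fixed GLSL signatures, and each selected module, such as camera, geometry or material, supplies one implementation; the chosen snippets are then concatenated into a single fragment shader.

```glsl
// Hypothetical module interface, for illustration only.

// Camera module: map a film-plane position to a primary ray.
void get_camera_ray(vec2 film_pos, out vec3 ray_origin, out vec3 ray_dir);

// Geometry / data-structure module: closest intersection along a ray.
bool find_intersection(vec3 ray_origin, vec3 ray_dir,
                       out float t, out vec3 normal, out int material_id);

// Material module: sample the next ray direction and return its weight.
vec3 sample_material(int material_id, vec3 normal, vec3 in_dir, out vec3 out_dir);

// A rendering-method module (e.g., a path tracer) is written only in terms
// of these prototypes, so any implementation can be swapped in.
```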
    npm install
    npm run build
    # npm run watch # for hot reloading
    python3 -m http.server 8000 --bind 127.0.0.1 # or similar
    # then go to http://localhost:8000/
    # Deployment to GitHub Pages (TODO: hacky)
    git checkout gh-pages
    git reset --hard A_COMMIT_SHA_WITHOUT_BUNDLE_JS
    git rebase main
    npm run build
    git add .
    git commit -m "Deploy"
    git push -f
- reflection, refraction, Fresnel... the usual stuff (Fresnel term sketched below)
- some basic surface types: box, sphere, plane
- distance fields (sphere tracing sketched below)
- 1-light-vertex bidirectional path tracer
- GGX microfacet model for specular highlights (sketched below)
- thin-lens, pinhole and orthographic camera models (thin-lens sketched below)
- tent reconstruction filter (sketched below)
- image-based and procedural textures
- fog
- tone mapping / gamma correction (sRGB, sketched below)
- faster subsurface scattering
- cylinders and cones
- constructive solid geometry
- triangle meshes & octrees
- interval arithmetic implicit surfaces
- spectral color model (enables dispersion)
- reconstruction filter for blurring highlights
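A sketch of the Fresnel part of the first feature bullet: the Schlick approximation is the usual shortcut (whether this project uses Schlick or the full Fresnel equations is not specified here).

```glsl
// Schlick's approximation of Fresnel reflectance.
// f0 = reflectance at normal incidence, cos_theta = dot(normal, view direction).
vec3 fresnel_schlick(vec3 f0, float cos_theta) {
    float c = clamp(1.0 - cos_theta, 0.0, 1.0);
    return f0 + (1.0 - f0) * (c * c * c * c * c);
}
```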
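Distance fields are typically rendered by sphere tracing: step along the ray by the scene's distance estimate until a surface is close enough. A minimal generic version with a single hard-coded unit sphere (not this project's actual scene code):

```glsl
// Signed distance to a unit sphere at the origin.
float scene_distance(vec3 p) {
    return length(p) - 1.0;
}

// Sphere tracing: advance by the distance bound at each step.
bool sphere_trace(vec3 ray_origin, vec3 ray_dir, out float t) {
    t = 0.0;
    for (int i = 0; i < 128; ++i) {
        float d = scene_distance(ray_origin + t * ray_dir);
        if (d < 1e-4) return true; // close enough to count as a hit
        t += d;                    // safe step: no surface closer than d
        if (t > 100.0) break;      // give up: ray escaped the scene
    }
    return false;
}
```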
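The GGX bullet refers to the standard GGX/Trowbridge-Reitz distribution of microfacet normals. A common parameterization (assuming alpha = roughness squared, which may or may not match this code base):

```glsl
// GGX / Trowbridge-Reitz normal distribution function D(h).
float ggx_ndf(vec3 normal, vec3 halfway, float roughness) {
    float alpha = roughness * roughness; // one common convention
    float a2 = alpha * alpha;
    float n_dot_h = max(dot(normal, halfway), 0.0);
    float denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0;
    return a2 / (3.14159265358979 * denom * denom);
}
```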
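Of the camera models, the thin-lens one produces depth of field by jittering the ray origin over the aperture while keeping points at the focus distance sharp. A simplified generic sketch (parameter names invented; the focal surface is approximated as a sphere of radius focus_distance):

```glsl
// Thin-lens primary ray. pinhole_dir is the normalized pinhole ray direction
// for this pixel; aperture_sample is a uniform sample of the unit disk.
void thin_lens_ray(
    vec3 cam_pos, vec3 cam_right, vec3 cam_up, vec3 pinhole_dir,
    float focus_distance, float aperture_radius, vec2 aperture_sample,
    out vec3 ray_origin, out vec3 ray_dir)
{
    // Point that must remain in focus for this pixel.
    vec3 focus_point = cam_pos + focus_distance * pinhole_dir;
    // Offset the origin on the lens aperture.
    ray_origin = cam_pos + aperture_radius *
        (aperture_sample.x * cam_right + aperture_sample.y * cam_up);
    // Aim at the in-focus point.
    ray_dir = normalize(focus_point - ray_origin);
}
```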
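The tent reconstruction filter: sub-pixel sample offsets are drawn from a triangular distribution instead of a uniform one, which gives a less blocky reconstruction. The classic inverse-CDF warp (as in smallpt) maps a uniform random number to a tent-distributed offset in (-1, 1):

```glsl
// Map u uniform in [0, 1) to a tent-distributed sub-pixel offset in (-1, 1).
float tent_filter_offset(float u) {
    float r = 2.0 * u;
    return r < 1.0 ? sqrt(r) - 1.0 : 1.0 - sqrt(2.0 - r);
}
```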
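Tone mapping / gamma correction (sRGB) involves at least the standard linear-to-sRGB encoding applied to the accumulated linear radiance before display; any additional tone-mapping curve used by this project is not shown. The sRGB transfer function itself:

```glsl
// Standard linear-to-sRGB transfer function, per channel.
// Assumes non-negative input (clamp beforehand if needed).
vec3 linear_to_srgb(vec3 linear_rgb) {
    vec3 lo = 12.92 * linear_rgb;
    vec3 hi = 1.055 * pow(linear_rgb, vec3(1.0 / 2.4)) - 0.055;
    return mix(lo, hi, step(vec3(0.0031308), linear_rgb));
}
```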
Miscellaneous interesting free online material on raytracing
- Physically Based Rendering: a book that won an Oscar (really)
- Eric Veach's PhD thesis from 1997, which covers bidirectional path tracing and multiple importance sampling
- SIGGRAPH course material, e.g., from 2012 and 2013
- https://agraphicsguy.wordpress.com/
- http://www.codinglabs.net/article_physically_based_rendering_cook_torrance.aspx