
Show HN: Emotive Engine – I wrote 8 elemental shaders to prove one pattern works

Posted by emotiveengine | 3 hours ago | 1 comment

emotiveengine 3 hours ago

Started this about 6 months ago. Wanted an animated character that could react to conversation, not just sit there as a static avatar.

First version was Canvas 2D. Shape morphing and particles. Worked fine for basic emotions but felt limited, so I added a WebGL 3D renderer with custom shaders, bloom, AO, and a post-processing chain.

About 6 weeks ago I started the elemental system. Fire first. FBM noise, decoupled color from alpha so additive stacking looks natural instead of washing out. Then water (screen-space refraction, spray particles). By the third element (ice — Snell's law, Voronoi crack lines, chromatic dispersion), the architecture was clearly repeating: factory + instanced GPU material + overlay shader + gesture configs + registration hook. Same five pieces every time.
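The fire trick of driving color and alpha from separate noise samples can be sketched roughly like this. This is a minimal JS/CPU illustration of the idea, not the engine's actual shader code: a hash-based value noise, a few FBM octaves, and a fire sample whose brightness and opacity come from different lookups so additive blending stacks heat without stacking opacity.

```javascript
// Cheap deterministic 2D hash -> [0, 1), in the style of common shader hashes.
function hash(x, y) {
  const s = Math.sin(x * 127.1 + y * 311.7) * 43758.5453123;
  return s - Math.floor(s);
}

// Bilinear value noise with smoothstep interpolation.
function valueNoise(x, y) {
  const xi = Math.floor(x), yi = Math.floor(y);
  const xf = x - xi, yf = y - yi;
  const u = xf * xf * (3 - 2 * xf);
  const v = yf * yf * (3 - 2 * yf);
  const a = hash(xi, yi),     b = hash(xi + 1, yi);
  const c = hash(xi, yi + 1), d = hash(xi + 1, yi + 1);
  return (a * (1 - u) + b * u) * (1 - v) + (c * (1 - u) + d * u) * v;
}

// FBM: sum octaves of noise with halving amplitude, doubling frequency.
function fbm(x, y, octaves = 4) {
  let sum = 0, amp = 0.5, freq = 1;
  for (let o = 0; o < octaves; o++) {
    sum += amp * valueNoise(x * freq, y * freq);
    amp *= 0.5;
    freq *= 2;
  }
  return sum; // stays in [0, ~0.94) for 4 octaves
}

// Color decoupled from alpha: heat drives the color ramp, a *different*
// noise sample gates opacity, so additive stacking brightens rather
// than washing out into a uniform blob.
function fireSample(x, y, t) {
  const heat = fbm(x, y - t);                              // scrolls upward
  const alpha = fbm(x * 2, y - t * 1.5) * Math.max(0, heat - 0.2);
  return { r: Math.min(1, heat * 2), g: heat * heat, b: heat * 0.2, alpha };
}
```

In a real fragment shader the same structure applies, just with the alpha term kept out of the additive color so overlapping flames sum in brightness only.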

So I kept going. Electricity, earth, nature, light, void. Eight total in about 6 weeks. Wanted to see if the pattern held or if I'd been fooling myself.
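The repeating five-piece pattern could be wired up something like this. All names here are hypothetical, invented for illustration, not the engine's real API:

```javascript
// Sketch of the five repeating pieces: factory, instanced GPU material,
// overlay shader, gesture configs, registration hook. Illustrative only.

const elementRegistry = new Map();

// Registration hook: each element registers its factory under a name.
function registerElement(name, factory) {
  elementRegistry.set(name, factory);
}

// Factory: bundles the element's pieces and binds them to a renderer.
function makeElementFactory({ material, overlayShader, gestures }) {
  return function create(renderer) {
    return {
      material,       // instanced GPU material shared across particles
      overlayShader,  // full-screen overlay pass (e.g. heat distortion)
      gestures,       // gesture configs composing from archetypes
      renderer,
    };
  };
}

// One element going through the whole pattern:
registerElement("fire", makeElementFactory({
  material: { shader: "fire.frag", instanced: true, blend: "additive" },
  overlayShader: "heatDistortion.frag",
  gestures: { flicker: { archetype: "pulse", speed: 2.0 } },
}));

const fire = elementRegistry.get("fire")({ name: "webgl" });
```

Once the registry and factory exist, adding element number eight is mostly filling in the two shader slots and the gesture table.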

161 gestures later it held. Each gesture is about 20 lines of config composing from archetypes. The factory does the rest.
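"20 lines of config composing from archetypes" could look roughly like the following. Again, a hypothetical sketch with made-up names: an archetype supplies defaults, each gesture is just the delta from it, and a tiny helper merges the two.

```javascript
// Shared archetypes: defaults that many gestures compose from.
const archetypes = {
  pulse: { duration: 600, easing: "easeInOut", particles: 40, loop: false },
  burst: { duration: 250, easing: "easeOut",  particles: 120, loop: false },
};

// A gesture definition is archetype defaults + a handful of overrides.
function defineGesture(archetypeName, overrides) {
  const base = archetypes[archetypeName];
  if (!base) throw new Error(`unknown archetype: ${archetypeName}`);
  return { archetype: archetypeName, ...base, ...overrides };
}

// An entire gesture, expressed as the delta from its archetype:
const fireFlicker = defineGesture("pulse", {
  element: "fire",
  particles: 60,
  colorRamp: ["#ff6600", "#ffcc00"],
});
```

The appeal of this shape is that 161 gestures stay reviewable: each file is the interesting differences, and the factory owns everything mechanical.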

Shuffle button in the demo shows all 161. Press G for the GPU monitor. `window.profiler` is exposed if you want to poke around.

Happy to talk about any of the shaders, the rendering pipeline, or things I got wrong along the way.