LEADIVO
Mission Briefing LEADIVO.TECHNICAL.CRAFT

Code as Paint. Constraints as Canvas. Our Technical Stack.

We don't just use tools; we build them. This is a technical dossier on the rendering pipelines, physics systems, and procedural algorithms that generate Leadivo's signature visual language—engineered for performance, designed for feel.

Inspect The Stack
Abstract wireframe and organic geometry render
RENDER: 0x7F2A POLYS: 10,240 SHADER: Custom PBR
Core Philosophy

We enforce strict technical budgets to force creative solutions, turning limitations into defining visual signatures.

Performance Target

60 FPS stability on mid-tier hardware (2023+) is the non-negotiable baseline for every core feature.

Integration Path

Our tools are designed as modular SDKs, not monolithic engines, ensuring minimal disruption to existing pipelines.

The Leadivo Stack

A modular breakdown of the proprietary subsystems powering our projects.

Request Full Documentation →

Decision Lens: How We Choose Our Tools

Optimization Targets

  • Sub-16ms render frame time
  • Asset load under 500ms
  • Memory footprint under 2GB

Sacrifice Points

  • Dynamic shadows (baked preferred)
  • Complex LOD transitions
  • Per-pixel physics accuracy
MODULE 01 CORE

Phantom Engine (PBR+)

A modified Physically Based Rendering pipeline with custom BRDF for subsurface scattering. This simulates light penetrating fog and atmospheric dust, creating depth without real-time ray tracing. Built for Unity HDRP, it reduces draw calls by 40% via hybrid batching.

16ms
Frame Time
12
Passes
30k
Draws/Batch
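In production this scattering lives in HLSL inside Unity HDRP; as an illustrative sketch only, the core attenuation idea (Beer-Lambert falloff blending surface radiance toward a fog color, approximating light penetrating atmospheric dust without ray tracing) can be expressed like this in Python. Function names and default densities here are hypothetical, not the shipping shader:

```python
import math

def fog_transmittance(distance_m: float, density: float = 0.05) -> float:
    """Beer-Lambert transmittance: fraction of light surviving `distance_m`
    through a medium of the given density (higher density = thicker fog)."""
    return math.exp(-density * distance_m)

def scattered_radiance(base_radiance: float, distance_m: float,
                       fog_color: float = 0.8, density: float = 0.05) -> float:
    """Blend surface radiance toward the fog color as transmittance drops,
    faking depth from scattering without any real-time ray tracing."""
    t = fog_transmittance(distance_m, density)
    return base_radiance * t + fog_color * (1.0 - t)
```

The same blend runs per-fragment in the real pipeline; the one-channel version above just makes the math inspectable.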
MODULE 02 SYSTEM

Resonance Suite

Audio-driven animation and input compensation logic. Uses real-time FFT analysis to drive vertex shaders on objects (e.g., reactive force fields). Paired with a 16ms input buffer to compensate for OS-level latency, ensuring combo execution feels instant even on older mobile devices.

FFT Analysis Input Buffer Haptic Mapping

“We don't sync audio to animation; we drive animation with audio. This creates a unified, felt system.”

— Technical Lead, Audio Systems
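The "drive animation with audio" idea reduces to: extract a frequency band's amplitude from the current audio frame, then map that amplitude to vertex displacement. A minimal sketch, using a naive single-bin DFT in place of the real FFT (names and gain values are hypothetical):

```python
import math

def band_amplitude(samples, bin_index):
    """Naive single-bin DFT: amplitude of one frequency band in this
    audio frame. A real implementation would use a full FFT."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * bin_index * i / n)
             for i, s in enumerate(samples))
    im = -sum(s * math.sin(2 * math.pi * bin_index * i / n)
              for i, s in enumerate(samples))
    return math.hypot(re, im) / n

def displacement(amplitude, gain=0.5, max_offset=1.0):
    """Map band amplitude to a clamped vertex offset along the normal."""
    return min(amplitude * gain, max_offset)
```

In the shipping system the amplitude is computed on the audio thread and uploaded as a shader uniform each frame, so the mesh reacts within one render frame of the sound.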

Atlas Generator & Builder CLI

A custom C# command-line tool that automates sprite sheet generation, texture compression, and asset bundling for deployment. Integrates directly with our CI/CD pipeline, reducing build times by 70%.

70%
Build Time ↓
CLI
Interface
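The atlas step of the CLI is conceptually simple: place fixed-size sprites on a grid inside one texture, then emit their pixel origins (from which UVs follow). A toy sketch of that placement logic, assuming uniform cell sizes (the real C# tool also handles variable sizes, compression, and bundling):

```python
def pack_grid(sprite_count, cell_px=64, atlas_px=1024):
    """Place fixed-size sprites on a grid; return (x, y) pixel origins
    in row-major order. Raises if the atlas cannot hold them all."""
    cols = atlas_px // cell_px
    if sprite_count > cols * cols:
        raise ValueError("atlas too small")
    return [((i % cols) * cell_px, (i // cols) * cell_px)
            for i in range(sprite_count)]
```

Deterministic placement like this is what makes the tool CI-friendly: the same inputs always yield byte-identical atlases, so build caching works.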
⚠️ AVOID: USING HIGH-POLY MODELS AS A CRUTCH FOR WEAK SILHOUETTES.

Constraint as a Catalyst

We weaponize limitations. Our most defining styles are born from a single, strict technical rule.

The 10k Triangle Rule

Constraint: All environments under 10,000 triangles.
Result: Forced stylization. We use baked normal maps for high-frequency detail, creating a low-poly aesthetic that reads clearly at distance and scales perfectly to mobile.

Limited Palette (4 Hues)

Constraint: Base scene uses only 4 hues.
Result: UI clarity is amplified. Contrast is readable in all lighting. Narrative signals are carried by light intensity, not hue variation.

Single-Source Lighting

Constraint: One dynamic light per scene.
Result: Dramatic, readable compositions. Shadows are a design element, not a technical overhead. Performance cost is predictable.

Vertex Displacement Only

Constraint: No particle systems or post-processing effects.
Result: All weather (rain, smoke) uses vertex shaders. This keeps geometry count low and is GPU-friendly, but requires solving motion purely with math.
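"Solving motion purely with math" means each vertex offset is a closed-form function of position and time, evaluated in the vertex shader. As an illustrative sketch (the real version is HLSL; the wave parameters here are made up), a smoke column's lateral sway might be:

```python
import math

def smoke_offset(y, t, speed=1.2, freq=3.0, amp=0.15):
    """Per-vertex lateral offset for a smoke column: a travelling sine
    wave whose amplitude grows with height. Pure function of vertex
    height `y` and time `t` -- no particles, no simulation state."""
    return amp * y * math.sin(freq * y - speed * t)
```

Because the function is stateless, it costs nothing on the CPU and scales to any vertex count the GPU can rasterize.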

The "Glitch" as Feature

Constraint: Intentional floating-point errors in mesh import.
Result: We get organic, unpredictable topological deformations that feel "lived-in" and narrative-appropriate for a corrupted digital world.

Texture Atlas Lock

Constraint: All objects share one texture atlas (4K).
Result: Forces texture reuse and UV coordinate creativity. Zero texture swaps. The trade-off is a more "painterly" look as we blend areas of the atlas.

CASE

Field Note: Project Chimera

Scenario: A freelance indie dev (Unity) needs to prototype a 3D platformer in 3 months with a team of two. The initial photorealistic assets looked promising but tanked frame rate on their target device (iPhone 12). They had the budget for one custom shader, not an engine rewrite.

Our Diagnosis

High-poly models with baked lighting. No dynamic elements. Weak silhouettes hidden by detail.

Constraint Applied

Enforced a 5k triangle budget. Switched to a 4-color palette. Used our Phantom Engine PBR+ atmospheric scattering to mask the low polygon count.

Trade-Off

Visual complexity sacrificed for readable silhouettes. Performance locked at 60 FPS. Development speed increased by focusing the team on optimization early.

Systems Analysis

The Physics of 'Feel'

Invisible timing and response systems that define game feel. We don't design what you see; we design what you perceive.

Abstract sound wave visualization
Audio-Driven Vertex Shader

Signal amplitude maps to displacement amplitude in real-time.

1

Hit-Stop & Screen Shake

A 2-frame freeze (33ms) on impact sells weight without disrupting flow. Screen shake is tied to camera-relative velocity, not just events, preventing nausea in fast-paced scenes.
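Both halves of this are small pieces of state. A minimal sketch of the pattern (class and parameter names are illustrative, not the shipping API): the hit-stop zeroes the time scale for N frames, and shake magnitude is derived from velocity relative to the camera rather than from the event alone:

```python
class HitStop:
    """Freeze simulation for a fixed number of frames on impact."""
    def __init__(self, freeze_frames=2):
        self.freeze_frames = freeze_frames
        self._remaining = 0

    def on_impact(self):
        self._remaining = self.freeze_frames

    def time_scale(self):
        """Call once per frame: 0.0 while frozen, 1.0 otherwise."""
        if self._remaining > 0:
            self._remaining -= 1
            return 0.0
        return 1.0

def shake_magnitude(cam_velocity, impact_velocity, gain=0.02, cap=0.5):
    """Scale shake by velocity relative to the camera, not the raw
    event, so a camera already tracking the impact adds little shake."""
    relative = abs(impact_velocity - cam_velocity)
    return min(gain * relative, cap)
```

The cap matters: uncapped shake on extreme impacts is exactly what induces nausea at high speed.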

2

Input Buffering & Cancels

A 16ms input buffer on mobile allows for "ghost inputs" during complex animations. This lowers the skill floor but drastically increases perceived control precision. The trade-off is accessibility vs. hardcore mastery.
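An input buffer is just a timestamped queue with an expiry window: presses that land during an uncancellable animation are remembered, and the action fires if the cancel window opens before they expire. A sketch under those assumptions (names illustrative):

```python
class InputBuffer:
    """Remember presses for a short window so inputs that land during
    an uncancellable animation still fire when the window opens."""
    def __init__(self, window_s=0.016):
        self.window_s = window_s
        self._events = []  # list of (timestamp, action)

    def press(self, t, action):
        self._events.append((t, action))

    def consume(self, t, action):
        """Return True (once) if `action` was pressed within the window."""
        self._events = [(ts, a) for ts, a in self._events
                        if t - ts <= self.window_s]
        for i, (ts, a) in enumerate(self._events):
            if a == action:
                del self._events[i]
                return True
        return False
```

Consuming the event on success is the important detail: without it, one buffered press could trigger the same move twice.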

3

Audio Latency Compensation

On mobile, audio output can lag by 50-100ms. We schedule audio events 2 frames ahead using a 'pre-emptive' mixer, so the player hears the impact as it lands rather than after the animation completes.
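Compensation amounts to subtracting a lookahead from the event's timestamp when scheduling the sound. A sketch of that arithmetic, assuming 60 FPS frames and a lookahead capped at the stated 2 frames (the function name and cap policy are illustrative):

```python
def audio_trigger_time(impact_time_s, output_latency_s, frame_s=1 / 60):
    """Schedule the sound early so that, after the device's output
    latency, it reaches the ear near the impact frame. Lookahead is
    capped at 2 frames of prediction."""
    lookahead = min(output_latency_s, 2 * frame_s)
    return impact_time_s - lookahead
```

The cap is the honest part of the design: predicting further than 2 frames risks playing a sound for an impact the player cancels.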

4

UI Micro-Interactions

Every button press is a 12-frame (200ms) animation. Scale, overshoot, settle. This cost is budgeted per interaction. A non-interactive UI feels dead; a reactive UI feels "expensive" and trustworthy.
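The scale-overshoot-settle shape is a standard "back" ease-out curve sampled over the 12-frame budget. As an illustrative sketch (the overshoot constant below is the classic Penner value, not necessarily ours):

```python
def press_scale(frame, total_frames=12, overshoot=1.70158):
    """Scale factor for a button press animation: ease past 1.0
    (overshoot) then settle back, via the 'back' ease-out curve."""
    t = min(max(frame / total_frames, 0.0), 1.0)
    u = t - 1.0
    return 1.0 + (overshoot + 1.0) * u ** 3 + overshoot * u ** 2
```

Sampled per frame, this starts at 0, swells past 1.0 mid-animation, and settles at exactly 1.0 on frame 12 -- the overshoot is what makes the press read as physical rather than instant.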