Modeling & Simulation

Obsidian Labs pioneers AI/ML-enhanced simulations and real-time decision-support environments that replicate complex operational realities. We build high-fidelity immersive training systems and digital twins to boost readiness across multi-domain operations, while adaptive interfaces, intelligent alerting, and situational cueing provide cognitive augmentation for operators.

Our agent-based modeling enables operational planning, multi-threat wargaming, and optimized responses in dynamic, high-stakes battlespaces, supporting superior decisions in complex, contested environments.
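As a minimal illustration of what agent-based modeling involves (a simplified sketch, not Obsidian Labs' production models; the Agent class and movement rules here are invented for this example), each simulated entity senses its surroundings and acts independently every timestep:

```python
import random

# Minimal agent-based wargaming sketch. Each agent finds the nearest
# adversary and moves toward it (red) or away from it (blue) each
# timestep. The class and rules are illustrative stand-ins only.
class Agent:
    def __init__(self, side, x, y):
        self.side, self.x, self.y = side, x, y

    def step(self, agents):
        foes = [a for a in agents if a.side != self.side]
        if not foes:
            return
        # Nearest adversary by Manhattan distance.
        foe = min(foes, key=lambda a: abs(a.x - self.x) + abs(a.y - self.y))
        pursue = 1 if self.side == "red" else -1   # red closes, blue evades
        self.x += pursue * (1 if foe.x > self.x else -1)
        self.y += pursue * (1 if foe.y > self.y else -1)

agents = [Agent(side, random.randint(0, 20), random.randint(0, 20))
          for side in ("red", "blue") for _ in range(5)]
for _ in range(10):                # ten simulation timesteps
    for agent in agents:
        agent.step(agents)
```

Emergent, population-level behavior in models like this, scaled up to realistic agent counts and decision rules, is what supports the wargaming and course-of-action analysis described above.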

AI-Powered Simulations for Multi-Domain Readiness

We develop advanced AI-driven simulation and decision-support solutions, creating immersive training environments and digital models that help organizations prepare for complex operations. Adaptive interfaces, intelligent alerts, and modeling tools enhance insight, planning, and performance in dynamic environments.

Obsidian Labs advances medical readiness through VITAL—a mission-ready mixed-reality manikin system purpose-built to elevate Tactical Combat Casualty Care (TCCC) and emergency response training. Engineered with a modular architecture and open systems framework, VITAL enables rapid reconfiguration to support mission-specific requirements, diverse physiological profiles, and complex casualty scenarios. The platform delivers high-fidelity training aligned to operational realities across military and civilian domains.

By integrating tactile feedback with immersive mixed-reality overlays, VITAL replicates complex injuries, treatment protocols, and dynamic physiological responses. Embedded gesture recognition, pressure sensing, voice command capability, and adaptive learning modules enhance realism and enable scalable, performance-driven training. From austere battlefield care to large-scale emergency response, VITAL strengthens force readiness, resilience, and lifesaving capability in demanding operational environments.

Obsidian Labs delivers advanced AR/VR-enabled training and simulation solutions to enhance operational readiness across mission-critical domains. Our immersive platforms support specialized instruction in firefighting, aircraft incident response, munitions handling, and emergency medical operations—equipping personnel to execute effectively in high-risk, time-sensitive environments.

At the core of this capability is VERTEX (Virtual Emergency Response for Tactical Exercises), an agent-based simulation platform engineered for high-consequence scenarios. Designed to support the Department of the Air Force and mission partners, VERTEX provides dynamic, scenario-driven environments for technical, tactical, and strategic-level training. By integrating immersive simulation with data-driven insights, Obsidian Labs strengthens coordination, decision-making, and mission effectiveness across defense and homeland security operations.

This video demonstrates a mixed-reality training prototype developed for nuclear emergency responders, highlighting how immersive technologies can transform preparedness and response capabilities. The prototype integrates virtual and physical elements to simulate high-risk nuclear incident scenarios in a safe, controlled environment, allowing trainees to practice critical procedures without real-world danger. Through interactive simulations, users engage with realistic hazards, equipment, and environmental conditions that mirror the complexities of an actual nuclear emergency. The system enhances situational awareness, decision-making, and coordination among response teams while supporting repeated practice, performance feedback, and skills refinement. By combining advanced visualization, real-time interaction, and scenario-based learning, this mixed-reality approach represents an innovative step forward in training methodologies for nuclear emergency response.

This video demonstrates a high-fidelity UAS flight simulator, rapidly prototyped in a 10-day sprint to address urgent operational needs. The system provides precise, real-time replication of drone flight dynamics—supporting operator training, mission rehearsal, tactics refinement, and platform assessment in contested environments.

Core to its performance is hardware-in-the-loop integration with a production flight controller operating at 1000 Hz, yielding one-millisecond control-loop resolution that accurately models vehicle response, sensor fusion, stabilization, and autopilot behavior under realistic stress.
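As a hedged illustration of how such an exchange can be structured (a minimal sketch, not the delivered system; dynamics and fc_link are hypothetical placeholders for the physics model and the controller link), the simulator integrates vehicle state one millisecond at a time, writes synthetic sensor data to the flight controller, and reads back its actuator commands:

```python
import time

SIM_RATE_HZ = 1000                 # matches the flight controller's 1 kHz loop
DT = 1.0 / SIM_RATE_HZ             # one-millisecond simulation step

def hil_loop(dynamics, fc_link):
    """Fixed-rate hardware-in-the-loop exchange (illustrative sketch).

    dynamics: rigid-body model exposing step(dt, actuators) -> sensor data
    fc_link:  serial/UDP bridge to the physical flight controller
    """
    actuators = fc_link.read_actuators()        # last commanded motor outputs
    next_tick = time.perf_counter()
    while True:
        sensors = dynamics.step(DT, actuators)  # integrate vehicle state 1 ms
        fc_link.write_sensors(sensors)          # feed simulated IMU/baro/GPS to FC
        actuators = fc_link.read_actuators()    # FC stabilization/autopilot output
        next_tick += DT                         # absolute deadline, not relative sleep
        delay = next_tick - time.perf_counter()
        if delay > 0:
            time.sleep(delay)                   # hold the 1 kHz cadence
```

Scheduling against an absolute next_tick deadline, rather than sleeping a fixed interval each pass, keeps timing error from accumulating across cycles and bounds loop jitter.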

Seamless interfacing with the TX16S transmitter ensures authentic pilot inputs, tactile feel, and timing—preserving muscle memory and procedural discipline for direct transition to live operations. The environment leverages optimized high-resolution Google Tiles geospatial data in a gaming engine, balancing visual fidelity and spatial accuracy for nap-of-the-earth, urban, ISR, and multi-asset scenarios while retaining headroom for future enhancements like sensor simulation, degraded visuals, EW effects, or swarming.
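For a sense of how the transmitter side can be wired up (a minimal sketch assuming the TX16S is enumerated in its USB joystick mode; the pygame usage is generic HID handling and the axis mapping is an assumption, not the fielded integration), stick positions can be read as standard joystick axes:

```python
import pygame

AXES = {"roll": 0, "pitch": 1, "throttle": 2, "yaw": 3}   # assumed channel order

pygame.init()
pygame.joystick.init()
tx16s = pygame.joystick.Joystick(0)     # TX16S presented as a standard HID joystick

def read_sticks():
    """Return normalized stick positions in [-1.0, 1.0]."""
    pygame.event.pump()                 # refresh joystick state
    return {name: tx16s.get_axis(idx) for name, idx in AXES.items()}

while True:
    inputs = read_sticks()              # feed these into the simulated vehicle
    print(inputs)
    pygame.time.wait(20)                # ~50 Hz polling for this demo
```

Because the transmitter delivers real stick deflections rather than keyboard or gamepad approximations, the same grips, rates, and switch positions used in the simulator carry over directly to live flight.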

Delivered under aggressive defense timelines, this effort combines disciplined systems engineering, agile execution, and rigorous verification to field a robust, warfighter-grade training asset that drives operator qualification, risk reduction, and mission readiness in high-threat theaters.