MAK ONE
A powerful and flexible suite of applications that lets you model, simulate, visualize, and participate in whole-earth, multi-domain simulations. Its Computer Generated Forces (CGF) platform fills your synthetic environments with urban, battlefield, maritime, and airspace activity, while VR-Engage lets users play the role of a first-person human character; a ground-vehicle driver, gunner, or commander; or the pilot of a fixed-wing aircraft or helicopter. A high-performance image generator delivers game-like visual quality, and physically accurate sensors model the physics of light in any wavelength to represent electro-optical, night-vision, and infrared sensors. Designed by modeling & simulation experts for training and simulation projects, MAK ONE comprises:
- Multi-domain computer-generated forces
- Multi-role virtual simulator
- Image generator & battlefield visualization
- EO, IR, and NVG imaging sensors
- Synthetic aperture radar simulation
Learn more
Ansys VRXPERIENCE Driving Simulator
As passenger automobiles become more digitalized and more autonomous, they require a wide range of advanced technologies, including sensors such as cameras, radar, and lidar, as well as embedded software supporting automated control systems. Ansys VRXPERIENCE Driving Simulator, powered by SCANeR™, is an open, scalable, and modular virtual driving simulator that enables testing against a variety of objectives and performance requirements. It lets you assemble scenarios, test software, account for vehicle dynamics, and exercise sensor models within a virtual driving environment, giving you a fully virtual driving lab for analyzing performance results. The simulator offers an immersive test-drive experience set within a representative world, so you can perform exhaustive safety assessments, driving millions of virtual miles in days and accelerating development by 1,000x compared with physical road testing.
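The product's actual interfaces are not reproduced here. Purely as a minimal, hypothetical sketch of the closed-loop pattern such a simulator supports, where a virtual sensor feeds a control algorithm whose commands feed the vehicle dynamics each time step, consider the following; every name and the one-dimensional vehicle model are invented for illustration and are not the SCANeR or VRXPERIENCE API.

```python
from dataclasses import dataclass

# Toy closed-loop harness: perceive -> decide -> actuate -> integrate.
@dataclass
class VehicleState:
    position_m: float = 0.0
    speed_mps: float = 0.0

def simulated_range_sensor(state: VehicleState, obstacle_m: float) -> float:
    """Stand-in for a virtual sensor: range to a static obstacle."""
    return obstacle_m - state.position_m

def braking_controller(range_m: float, speed_mps: float) -> float:
    """Very simple AEB-style logic: brake hard inside a speed-scaled
    threshold, otherwise coast. Returns commanded acceleration (m/s^2)."""
    return -6.0 if range_m < max(10.0, 2.0 * speed_mps) else 0.0

def step_dynamics(state: VehicleState, accel_mps2: float, dt: float) -> None:
    """Crude 1D point-mass integration; real simulators model full
    vehicle dynamics here."""
    state.speed_mps = max(0.0, state.speed_mps + accel_mps2 * dt)
    state.position_m += state.speed_mps * dt

state, dt = VehicleState(speed_mps=20.0), 0.05
for _ in range(400):  # 20 s of simulated time
    rng = simulated_range_sensor(state, obstacle_m=120.0)
    step_dynamics(state, braking_controller(rng, state.speed_mps), dt)
print(f"stopped {120.0 - state.position_m:.1f} m short of the obstacle")
```

A real run would sweep scenario parameters (initial speed, obstacle distance, friction) across thousands of such loops, which is what makes "millions of virtual miles in days" feasible.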
Learn more
MuSES
Electro-optic and infrared renderings in MuSES achieve benchmark accuracy through a stepwise workflow: begin with heat sources such as engines, exhaust, bearings, and electronics, then compute an in-band diffuse radiosity solution. Next, place your sensor at range and render multi-bounce, spectrally summed radiance values, with ΔT-RSS contrast metrics. Sensor response curve handy? Import it and see what you've been missing. With MuSES, you have reality at your fingertips. Because MuSES covers the physics all the way back to heat sources and environmental loads, you can manage thermal signature contrast and evaluate control kits for low-observable design in any global location. Heat shields, cooling schemes, and even camouflage surface treatments can be tested and evaluated for in-band radiance, complete with atmospheric attenuation along the sensor line of sight. Triage and prioritize your engineering with MuSES early in the development cycle.
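MuSES exposes this workflow through its own interface, which is not reproduced here. As a rough illustration of the underlying physics only, the sketch below integrates Planck blackbody radiance over a sensor band under an assumed constant emissivity and atmospheric transmittance, then computes the widely used root-sum-square temperature contrast, ΔT_RSS = sqrt((μ_tgt − μ_bkg)² + σ_tgt²). All names and values are illustrative.

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(
        H * C / (wavelength_m * KB * temp_k))

def in_band_radiance(temp_k, band=(8e-6, 12e-6), emissivity=0.9,
                     transmittance=0.8, samples=500):
    """Spectrally summed radiance over a sensor band (LWIR by default),
    attenuated along the line of sight. MuSES resolves emissivity and
    transmittance spectrally; constants keep this sketch short."""
    lam = np.linspace(band[0], band[1], samples)
    spectral = emissivity * transmittance * planck_radiance(lam, temp_k)
    # Trapezoidal integration over wavelength -> W / (m^2 * sr)
    return float(np.sum(0.5 * (spectral[1:] + spectral[:-1]) * np.diff(lam)))

def delta_t_rss(target_temps_k, background_temps_k):
    """Root-sum-square contrast: sqrt((mean_tgt - mean_bkg)^2 + var_tgt)."""
    mu_t = np.mean(target_temps_k)
    mu_b = np.mean(background_temps_k)
    return float(np.hypot(mu_t - mu_b, np.std(target_temps_k)))

# Hypothetical facet temperatures: warm exhaust panel vs. cooler terrain.
panel = np.array([310.0, 315.0, 312.0])    # K
terrain = np.array([288.0, 289.0, 287.5])  # K
print(in_band_radiance(np.mean(panel)))    # target in-band radiance
print(delta_t_rss(panel, terrain))         # contrast metric, K
```

Driving ΔT_RSS toward zero, by cooling hot facets or tuning surface emissivity, is the essence of the signature-management trade studies the paragraph above describes.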
Learn more
Parallel Domain Replica Sim
Parallel Domain Replica Sim enables the creation of high-fidelity, fully annotated, simulation-ready environments from users' own captured data (photos, videos, scans). With PD Replica, you can generate near-pixel-perfect reconstructions of real-world scenes, transforming them into virtual environments that preserve visual detail and realism. PD Sim provides a Python API through which perception, machine learning, and autonomy teams can configure and run large-scale test scenarios and simulate sensor inputs (camera, lidar, radar, etc.) in either open- or closed-loop mode. These simulated sensor feeds come with full annotations, so developers can test their perception systems under a wide variety of conditions (lighting, weather, object configurations, and edge cases) without collecting real-world data for every scenario.
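Parallel Domain's actual SDK is not shown here. Purely as a hypothetical sketch of what configuring and streaming such a scenario might look like, the snippet below models a scenario config and an open-loop, annotated-frame iterator; every class, field, and value is invented for illustration.

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for a scenario-configuration API; none of
# these names come from Parallel Domain's actual SDK.
@dataclass
class SensorRig:
    cameras: list = field(default_factory=lambda: ["front_wide"])
    lidars: list = field(default_factory=list)
    radars: list = field(default_factory=list)

@dataclass
class Scenario:
    environment: str          # e.g. a PD Replica reconstruction ID
    weather: str = "clear"
    time_of_day: str = "noon"
    rig: SensorRig = field(default_factory=SensorRig)

def run_open_loop(scenario: Scenario, num_frames: int):
    """Yield (frame_index, sensor_name, annotations) tuples, standing
    in for annotated sensor frames streamed from the simulator."""
    for i in range(num_frames):
        for cam in scenario.rig.cameras:
            # Real output would carry imagery plus 2D/3D boxes,
            # segmentation, depth, etc.; placeholders are used here.
            yield i, cam, {"boxes_2d": [], "segmentation": None}

scene = Scenario(environment="replica/downtown_scan_01",
                 weather="rain", time_of_day="dusk",
                 rig=SensorRig(cameras=["front_wide"], lidars=["roof"]))
for frame, sensor, ann in run_open_loop(scene, num_frames=3):
    print(frame, sensor, sorted(ann))
```

Closed-loop use would differ in that the autonomy stack's outputs feed back into the simulator each step, rather than frames being streamed from a fixed script.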
Learn more