Our VR solutions let you record and analyze physiological, behavioral, and subjective response data in realistic, immersive environments that would be impossible or prohibitively expensive to create in the real world. Systems are available to suit specific research needs and lab spaces, for single or multiple users: participants can be seated, standing, walking, or viewing a projection.
VizMove is a turnkey hardware & software solution that can be customized to fit specific use cases.
VizMove is available on its own or with a biofeedback utility that tightly synchronizes VR world events with real-time physiological data.
Psychophysiology, consumer research
Rehabilitation, ergonomics
Biomechanics & kinesiology
Neuromarketing & neuroeconomics
Seated VR | Compact & Precise: Computer, Vizard software, VR headset, Gamepad controller
Standing VR | Precision tracking: Computer, Vizard software, VR headset, 2-sensor PPT motion tracking with wand (up to 3 m²)
Walking VR | Wide-area motion tracking: Computer, Vizard software, VR headset, 4-sensor PPT motion tracking with wand (up to 7 m²)
Projection VR | Stereoscopic multi-user experience: Computer, Vizard software, two projectors & five pairs of 3D glasses, 3-sensor PPT motion tracking with wand
Get all the benefits of wireless with the quality and integrity of a wired system. The MP160 data acquisition and analysis platform with AcqKnowledge software provides a complete wireless in-lab solution that supports advanced analysis. The MP160 can record up to 16 channels of data for multi-subject or multi-parameter protocols across a wide range of applications. These VR systems get you started with two dual-channel BioNomadix transmitter-receiver sets, providing four channels of your selected biometrics.
The biofeedback link uses AcqKnowledge software with Network Data Transfer and a proprietary utility to tightly synchronize VR world events with real-time physiological response data, allowing you to change the environment in real time, based on the participant’s responses.
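A closed biofeedback loop of this kind can be illustrated with a short sketch. The mapping functions and thresholds below are illustrative assumptions, not part of the actual AcqKnowledge or VizMove software; the physiological samples are simulated here, standing in for values that would arrive live via Network Data Transfer.

```python
# Hedged sketch: map a streamed physiological value (e.g. heart rate in BPM)
# to a parameter of the virtual environment. Function names, thresholds, and
# the fog-density mapping are hypothetical; the sample stream is simulated.

def stress_level(heart_rate_bpm, baseline=70.0, ceiling=120.0):
    """Normalize heart rate into a 0..1 arousal value used to drive the scene."""
    span = ceiling - baseline
    return min(max((heart_rate_bpm - baseline) / span, 0.0), 1.0)

def scene_fog_density(stress, max_density=0.05):
    # Example environment change: higher arousal -> denser fog in the scene.
    return stress * max_density

# Simulated stream of samples standing in for the live physiological feed:
samples = [68, 75, 90, 118, 130]
for bpm in samples:
    s = stress_level(bpm)
    print(f"{bpm} BPM -> stress {s:.2f}, fog density {scene_fog_density(s):.4f}")
```

In a real deployment the loop body would update the Vizard scene each frame instead of printing, so the environment tracks the participant's responses continuously.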
With our available VR products, researchers can obtain and analyze many types of stimulus-response data across multiple disciplines and run automated routines that extract measurements in tabular format for statistical analysis. The Stim-Response analysis package supports stimulus-response studies in psychophysiology, exercise science, communications, human factors, and more.
Presents visual, auditory, somatosensory, or electrical stimulation
Creates feedback loops between VR world and physiological data
Controls virtual environment based on physiological responses
Automated Stim-Response analysis
Full subject immersion
Digital Input to Stim Events—Identify and label stimulus events corresponding to any combination of digital inputs. The Digital Input to Stim Events function works with TTL trigger information coming from applications including E-Prime®, SuperLab®, DirectRT®, MediaLab®, Inquisit®, Vizard, and Presentation, converting TTL data acquired on the digital channels of an MP device into specific stimulus events. For each stimulus, a light bulb icon is placed in the global events bar, the event is labeled with the stimulus event type, and a mouse-over tag shows the event time. All event information is accessible and exportable from the Event Palette. The Digital Input to Stim Events function is especially useful when a protocol involves many different groups of stimulus events.
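The underlying idea—turning TTL line states into labeled, time-stamped events—can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not AcqKnowledge's actual implementation: each sample is a tuple of digital-line states, and each onset of a new nonzero bit pattern is recorded as an event whose code combines the lines.

```python
# Illustrative sketch (not the AcqKnowledge implementation): decode sampled
# TTL digital-channel states into (time, code) stimulus events. Line 0 is
# treated as the least significant bit of the event code.

def ttl_to_events(samples, sample_rate_hz=1000.0):
    """Return (time_s, code) pairs for each onset of a nonzero TTL pattern."""
    events = []
    prev_code = 0
    for i, lines in enumerate(samples):
        # Combine the digital lines into a single integer event code.
        code = sum(bit << n for n, bit in enumerate(lines))
        # A new nonzero pattern marks the onset of a stimulus event.
        if code != 0 and code != prev_code:
            events.append((i / sample_rate_hz, code))
        prev_code = code
    return events

# Two digital lines; pattern 0b01 fires at sample 1, 0b10 at sample 4:
samples = [(0, 0), (1, 0), (1, 0), (0, 0), (0, 1), (0, 1)]
print(ttl_to_events(samples))  # [(0.001, 1), (0.004, 2)]
```

In AcqKnowledge these onsets correspond to the light bulb icons placed in the global events bar, with the code determining the stimulus event type label.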