Markerless human gait tracking for research video
ConductVision Human Gait extends the same pose-estimation workflow used in behavioral research to human mobility protocols. Capture motion with standard cameras, estimate body landmarks, and export reviewable gait metrics without markers, treadmills, or wearable sensors.
Quantify mobility without changing the protocol
ConductVision Human Gait is intended for research workflows, not clinical diagnosis. Teams can review extracted coordinates and overlays before moving metrics into downstream statistical analysis.
Gait speed
Walking velocity derived from landmark displacement over time, reviewed as frame-level or session-level outputs.
Step length
Distance between successive foot placements, estimated from tracked foot landmarks.
Cadence
Steps per minute, counted from foot-contact events detected in the landmark trajectories.
Stance and swing timing
Durations of the ground-contact and airborne phases of each gait cycle, segmented from foot landmark motion.
Trunk sway
Side-to-side displacement of trunk landmarks across the walking bout.
Limb asymmetry
Left-right differences in step timing and length, compared across tracked limbs.
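To make the metrics above concrete, here is a minimal sketch of how gait speed and cadence can be derived from per-frame pose landmarks. The landmark arrays, frame rate, pixel calibration, and heel-strike heuristic are illustrative assumptions for this sketch, not ConductVision's actual implementation.

```python
# Illustrative sketch: gait metrics from hypothetical per-frame landmarks.
# Frame rate, calibration, and strike detection are assumptions.
import numpy as np

def heel_strikes(heel_y, fps=30.0):
    """Return heel-strike times (seconds) as local minima of a heel
    landmark's vertical trajectory, assuming the foot is lowest in the
    image at ground contact."""
    times = []
    for i in range(1, len(heel_y) - 1):
        if heel_y[i] < heel_y[i - 1] and heel_y[i] <= heel_y[i + 1]:
            times.append(i / fps)  # frame index -> seconds
    return times

def cadence_steps_per_min(left_strikes, right_strikes):
    """Cadence = total steps divided by walking duration, in steps/min."""
    strikes = sorted(left_strikes + right_strikes)
    duration = strikes[-1] - strikes[0]
    return 60.0 * (len(strikes) - 1) / duration

def gait_speed(hip_x, fps=30.0, px_per_meter=100.0):
    """Session-level gait speed (m/s) from a hip landmark's horizontal
    displacement; px_per_meter stands in for camera calibration."""
    meters = abs(hip_x[-1] - hip_x[0]) / px_per_meter
    return meters * fps / (len(hip_x) - 1)
```

The same strike times feed step-level metrics: stance and swing durations come from segmenting the interval between consecutive strikes of the same foot, and limb asymmetry from comparing left and right step timings.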
From video to exportable gait data
Record standard video
Use fixed cameras in a hallway, lab walkway, rehabilitation setup, or controlled motion protocol.
Track body keypoints
Pose models estimate body landmarks frame by frame without reflective markers or wearable sensors.
Export reproducible metrics
Coordinates, confidence values, overlays, and gait metrics can be reviewed and exported for statistics.
Built for translational motion studies
Human gait tracking sits alongside rodent gait and animal behavior analysis, so teams keep a common video and export workflow across translational programs.
Bring human gait into your ConductVision workflow
Send a sample video or schedule a consultation to map the outputs your protocol needs.