Visual X Maze
Overview
The Visual X Maze is a four-arm cross-shaped maze developed in collaboration with Cedars-Sinai Medical Center and described by Vit et al. in Scientific Reports. Each arm of the X maze terminates in a visual stimulus display — either a backlit panel or small monitor — that presents distinct color, contrast, or pattern cues. The task requires the animal to navigate from a central start zone to the arm displaying a designated target stimulus, testing visual-spatial discrimination, color vision, and contrast sensitivity. Unlike traditional maze tasks that rely on distal spatial cues, the Visual X Maze directly challenges the animal's ability to discriminate between proximal visual stimuli, making it particularly sensitive to retinal pathology, optic nerve damage, and early-stage visual cortex dysfunction.
In the standard protocol, the animal is placed in the center of the X maze facing a neutral wall. Four arms extend outward, each displaying a different visual stimulus — for example, vertical green gratings (target S+), horizontal red gratings, blue checkerboard, and gray uniform background. The target stimulus arm contains a food reward or escape platform, while entry into non-target arms results in a brief confinement penalty. Across trials, the position of the target stimulus is pseudorandomly rotated among the four arms to prevent spatial bias, forcing the animal to rely on visual cue identification rather than allocentric spatial memory. The task has been extensively validated in Alzheimer's disease transgenic mouse models (5xFAD, APP/PS1), where it detects subtle visual discrimination deficits months before conventional memory impairments emerge.
ConductMaze provides complete control over the Visual X Maze through per-arm stimulus display management, automated arm entry detection via infrared beam-break sensors at each arm entrance, motorized guillotine doors for controlled access, and food well delivery in the target arm. The software pseudorandomizes target arm position across trials, enforces inter-trial intervals, and computes trial-by-trial accuracy, error patterns, and learning curves. Integration with overhead video tracking enables simultaneous analysis of path efficiency and decision-point hesitation behavior.
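ConductMaze's internal scheduler is not published; as a minimal sketch of the pseudorandomization described above (all names hypothetical), a balanced schedule can be built from shuffled blocks so each arm is the target equally often and never on two consecutive trials:

```python
import random

def target_arm_schedule(n_trials, n_arms=4, seed=None):
    """Illustrative balanced pseudorandom schedule (not ConductMaze's
    actual algorithm): within every block of n_arms trials each arm is
    the target exactly once, and the same arm never repeats across a
    block boundary, preventing spatial bias."""
    rng = random.Random(seed)
    schedule = []
    while len(schedule) < n_trials:
        block = list(range(n_arms))
        rng.shuffle(block)
        # reshuffle if the new block would repeat the previous target
        while schedule and block[0] == schedule[-1]:
            rng.shuffle(block)
        schedule.extend(block)
    return schedule[:n_trials]
```

Block-balancing (rather than fully independent draws) guarantees equal target counts per arm within a session, which keeps chance performance at exactly 25%.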
Trial Flow
Stimulus Configuration
Load visual stimuli on 4 arm displays, pseudorandomize target arm position
Central Placement
Subject placed in center zone, doors closed briefly for orientation
Doors Open
All four guillotine doors open simultaneously
Arm Entry Detection
IR sensor registers first arm entry at decision boundary
Choice Evaluation
Did the animal enter the target stimulus arm (S+)?
Outcome Delivery
Correct: food reward delivered. Incorrect: 15s confinement, doors close
Trial Data Logging
Record arm chosen, latency, error type, stimulus positions
Session Completion
All trials complete — compute accuracy, save learning curve data
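The trial flow above amounts to a small state machine plus a scoring rule. A sketch of the choice-evaluation and outcome steps, with hypothetical names and the default 120 s trial limit assumed:

```python
from enum import Enum, auto

class TrialState(Enum):
    # Mirrors the trial-flow steps listed above (illustrative only)
    STIMULUS_CONFIG = auto()
    CENTRAL_PLACEMENT = auto()
    DOORS_OPEN = auto()
    ARM_ENTRY = auto()
    OUTCOME = auto()
    LOGGING = auto()

def score_trial(target_arm, chosen_arm, latency_s, max_trial_s=120.0):
    """Score one trial: chosen_arm is the first arm whose IR beam was
    broken, or None if no entry occurred within the time limit."""
    if chosen_arm is None or latency_s > max_trial_s:
        return {"outcome": "omission", "correct": False}
    correct = chosen_arm == target_arm
    return {
        "outcome": "reward" if correct else "confinement",
        "correct": correct,
        "latency_s": latency_s,
    }
```

A correct choice would trigger food-well delivery; an incorrect one starts the 15 s confinement timer before the doors reset for the inter-trial interval.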
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| Stimulus Type | enum | Color | Visual stimulus category for discrimination (Color, Grating, Contrast, Shape) |
| Target Stimulus | string | Green Vertical | The rewarded visual pattern the animal must identify across arm positions |
| Trials per Session | integer | 20 | Number of choice trials per session (each with randomized target arm position) |
| Confinement Penalty | seconds | 15 | Duration of confinement in incorrect arm before return to center |
| Inter-Trial Interval | seconds | 30 | Delay between trial end and next doors-open event |
| Max Trial Duration | seconds | 120 | Maximum time allowed per trial before scoring as omission |
| Stimulus Brightness | integer | 200 | Stimulus panel luminance in cd/m², calibrated per arm |
| Reward Type | enum | Sucrose Pellet | Food reward delivered in correct arm (sucrose pellet, chocolate pellet, condensed milk) |
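The parameter table maps naturally onto a session configuration object. A sketch with hypothetical field names whose defaults mirror the table:

```python
from dataclasses import dataclass

@dataclass
class VisualXMazeConfig:
    """Illustrative session configuration; field names are assumptions,
    defaults are taken from the parameter table above."""
    stimulus_type: str = "Color"            # Color, Grating, Contrast, Shape
    target_stimulus: str = "Green Vertical" # rewarded pattern (S+)
    trials_per_session: int = 20
    confinement_penalty_s: int = 15         # time in incorrect arm
    inter_trial_interval_s: int = 30
    max_trial_duration_s: int = 120         # beyond this, trial is an omission
    stimulus_brightness_cd_m2: int = 200    # calibrated per arm
    reward_type: str = "Sucrose Pellet"
```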
Metrics
| Metric | Unit | Description |
|---|---|---|
| Percent Correct | % | Proportion of trials where the animal chose the target stimulus arm |
| Error Distribution | count | Number of entries to each non-target arm — reveals systematic bias or confusion patterns |
| Choice Latency | seconds | Time from doors opening to first arm entry — decision speed |
| Trials to Criterion | count | Number of trials to reach 80% correct in a block of 10 — learning speed |
| Perseverative Errors | count | Repeated entries to the same incorrect arm across consecutive trials |
| Learning Curve Slope | %/block | Rate of accuracy improvement across session blocks — acquisition rate |
| Omissions | count | Trials where the animal failed to enter any arm within the time limit |
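Most of these metrics reduce to simple passes over the per-trial log. A sketch, assuming a hypothetical trial-record format and one common convention for trials-to-criterion (first trial at which the trailing 10-trial window reaches 80% correct):

```python
def session_metrics(trials, criterion=0.8, block=10):
    """Compute summary metrics from trial records of the form
    {'chosen_arm': int|None, 'target_arm': int, 'omitted': bool}.
    Record format and criterion convention are illustrative."""
    choices = [t for t in trials if not t["omitted"]]
    correct = [t["chosen_arm"] == t["target_arm"] for t in choices]
    pct_correct = 100.0 * sum(correct) / len(choices) if choices else 0.0

    # Perseverative error: same incorrect arm entered on consecutive trials
    perseverative, prev_wrong_arm = 0, None
    for t in choices:
        wrong = t["chosen_arm"] != t["target_arm"]
        if wrong and t["chosen_arm"] == prev_wrong_arm:
            perseverative += 1
        prev_wrong_arm = t["chosen_arm"] if wrong else None

    # Trials to criterion via a sliding window over completed trials
    trials_to_criterion = None
    for i in range(block, len(correct) + 1):
        if sum(correct[i - block:i]) / block >= criterion:
            trials_to_criterion = i
            break

    return {
        "pct_correct": pct_correct,
        "omissions": sum(t["omitted"] for t in trials),
        "perseverative_errors": perseverative,
        "trials_to_criterion": trials_to_criterion,
    }
```

Error distribution and learning-curve slope follow the same pattern: tally non-target entries per arm, or regress block accuracy against block index.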
Sample Data
| Subject | Group | Session | Pct_Correct | Choice_Latency_s | Perseverative_Errors | Omissions |
|---|---|---|---|---|---|---|
Representative data for illustration purposes. Actual values will vary by species, strain, and experimental conditions.
Applications
- Alzheimer's disease research — detecting early visual-spatial discrimination deficits in 5xFAD and APP/PS1 transgenic mice before hippocampal memory decline
- Retinal and optic nerve pathology — functional assessment of color and contrast vision after retinal degeneration, optic neuritis, or glaucoma induction
- Visual cortex lesion studies — mapping discrimination deficits to specific cortical regions (V1, extrastriate areas)
- Aging research — characterizing age-related visual discrimination decline as distinct from spatial memory impairments
- Therapeutic screening — evaluating neuroprotective or vision-restoring treatments using behavioral visual endpoints