Visual X Maze

Overview

The Visual X Maze is a four-arm cross-shaped maze developed in collaboration with Cedars-Sinai Medical Center and described by Vit et al. in Scientific Reports. Each arm of the X maze terminates in a visual stimulus display — either a backlit panel or small monitor — that presents distinct color, contrast, or pattern cues. The task requires the animal to navigate from a central start zone to the arm displaying a designated target stimulus, testing visual-spatial discrimination, color vision, and contrast sensitivity. Unlike traditional maze tasks that rely on distal spatial cues, the Visual X Maze directly challenges the animal's ability to discriminate between proximal visual stimuli, making it particularly sensitive to retinal pathology, optic nerve damage, and early-stage visual cortex dysfunction.

In the standard protocol, the animal is placed in the center of the X maze facing a neutral wall. Four arms extend outward, each displaying a different visual stimulus — for example, vertical green gratings (target S+), horizontal red gratings, blue checkerboard, and gray uniform background. The target stimulus arm contains a food reward or escape platform, while entry into non-target arms results in a brief confinement penalty. Across trials, the position of the target stimulus is pseudorandomly rotated among the four arms to prevent spatial bias, forcing the animal to rely on visual cue identification rather than allocentric spatial memory. The task has been extensively validated in Alzheimer's disease transgenic mouse models (5xFAD, APP/PS1), where it detects subtle visual discrimination deficits months before conventional memory impairments emerge.
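The pseudorandom rotation of the target arm can be sketched as a balanced sequence generator: each arm appears equally often across the session, and long runs of the same arm are rejected so the animal cannot succeed through spatial bias. This is an illustrative sketch, not the actual ConductMaze randomization routine; the function name and parameters are assumptions.

```python
import random

def pseudorandom_target_arms(n_trials=20, n_arms=4, max_repeat=2, seed=None):
    """Generate a target-arm sequence that balances arm frequency and
    limits consecutive repeats (hypothetical helper, not ConductMaze API)."""
    rng = random.Random(seed)
    # Balanced pool: each arm appears roughly n_trials / n_arms times.
    per_arm = -(-n_trials // n_arms)  # ceiling division
    pool = [arm for arm in range(n_arms) for _ in range(per_arm)][:n_trials]
    while True:
        rng.shuffle(pool)
        # Reject any shuffle containing more than max_repeat identical
        # arms in a row, then try again.
        run, ok = 1, True
        for prev, cur in zip(pool, pool[1:]):
            run = run + 1 if cur == prev else 1
            if run > max_repeat:
                ok = False
                break
        if ok:
            return pool
```

Rejection sampling keeps the arm counts exactly balanced (unlike per-trial draws with repeat constraints, which can skew frequencies toward the end of the session).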

ConductMaze provides complete control over the Visual X Maze through per-arm stimulus display management, automated arm entry detection via infrared beam-break sensors at each arm entrance, motorized guillotine doors for controlled access, and food well delivery in the target arm. The software pseudorandomizes target arm position across trials, enforces inter-trial intervals, and computes trial-by-trial accuracy, error patterns, and learning curves. Integration with overhead video tracking enables simultaneous analysis of path efficiency and decision-point hesitation behavior.

Trial Flow

1. Stimulus Configuration: load visual stimuli on the four arm displays and pseudorandomize the target arm position.
2. Central Placement: the subject is placed in the center zone with doors closed briefly for orientation.
3. Doors Open: all four guillotine doors open simultaneously.
4. Arm Entry Detection: an IR sensor registers the first arm entry at the decision boundary.
5. Choice Evaluation: did the animal enter the target stimulus arm (S+)?
6. Outcome Delivery: correct choices trigger food reward delivery; incorrect choices trigger a 15 s confinement and door closure.
7. Trial Data Logging: record the arm chosen, latency, error type, and stimulus positions.
8. Session Completion: once all trials are complete, compute accuracy and save learning curve data.
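The trial flow above can be sketched as a single-trial controller. The `hardware` and `detect_entry` interfaces below are stand-ins (assumptions) for the real door/feeder drivers and IR beam-break polling; they are not the ConductMaze API.

```python
def run_trial(target_arm, detect_entry, hardware, max_duration=120.0,
              confinement_s=15.0):
    """Run one Visual X Maze trial (illustrative sketch only).
    `detect_entry` blocks until an arm entry or timeout, returning an
    object with .arm and .latency, or None on timeout."""
    hardware.load_stimuli(target_arm)            # 1. Stimulus Configuration
    hardware.close_doors()                       # 2. Central Placement
    hardware.open_all_doors()                    # 3. Doors Open
    entry = detect_entry(timeout=max_duration)   # 4. Arm Entry Detection
    if entry is None:                            # timeout -> score omission
        return {"arm": None, "latency": None,
                "correct": False, "omission": True}
    correct = entry.arm == target_arm            # 5. Choice Evaluation
    if correct:                                  # 6. Outcome Delivery
        hardware.deliver_reward(entry.arm)
    else:
        hardware.confine(entry.arm, seconds=confinement_s)
    return {"arm": entry.arm, "latency": entry.latency,   # 7. Data Logging
            "correct": correct, "omission": False}
```

A session loop would call this once per trial, stepping through the pseudorandomized target sequence and waiting out the inter-trial interval between calls.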

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| Stimulus Type | enum | Color | Visual stimulus category for discrimination (Color, Grating, Contrast, Shape) |
| Target Stimulus | string | Green Vertical | The rewarded visual pattern the animal must identify across arm positions |
| Trials per Session | integer | 20 | Number of choice trials per session (each with randomized target arm position) |
| Confinement Penalty | seconds | 15 | Duration of confinement in incorrect arm before return to center |
| Inter-Trial Interval | seconds | 30 | Delay between trial end and next doors-open event |
| Max Trial Duration | seconds | 120 | Maximum time allowed per trial before scoring as omission |
| Stimulus Brightness | integer | 200 | Display brightness in cd/m² for stimulus panels (calibrated per arm) |
| Reward Type | enum | Sucrose Pellet | Food reward delivered in correct arm (sucrose pellet, chocolate pellet, condensed milk) |
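The parameter set above maps naturally onto a session configuration object. This is a minimal sketch of such a container with the documented defaults; the field names and the class itself are assumptions, not the actual ConductMaze configuration schema.

```python
from dataclasses import dataclass

@dataclass
class VisualXMazeConfig:
    """Illustrative session configuration mirroring the parameter table
    (hypothetical structure, not the ConductMaze export format)."""
    stimulus_type: str = "Color"              # Color | Grating | Contrast | Shape
    target_stimulus: str = "Green Vertical"   # rewarded pattern (S+)
    trials_per_session: int = 20
    confinement_penalty_s: float = 15.0
    inter_trial_interval_s: float = 30.0
    max_trial_duration_s: float = 120.0
    stimulus_brightness_cd_m2: int = 200      # per-arm calibrated brightness
    reward_type: str = "Sucrose Pellet"
```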

Metrics

| Metric | Unit | Description |
| --- | --- | --- |
| Percent Correct | % | Proportion of trials where the animal chose the target stimulus arm |
| Error Distribution | count | Number of entries to each non-target arm — reveals systematic bias or confusion patterns |
| Choice Latency | seconds | Time from doors opening to first arm entry — decision speed |
| Trials to Criterion | count | Number of trials to reach 80% correct in a block of 10 — learning speed |
| Perseverative Errors | count | Repeated entries to the same incorrect arm across consecutive trials |
| Learning Curve Slope | %/block | Rate of accuracy improvement across session blocks — acquisition rate |
| Omissions | count | Trials where the animal failed to enter any arm within the time limit |
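Several of these metrics can be derived from per-trial records alone. The sketch below assumes a simple record layout (`choice`, `target`, `latency_s` keys); this is an illustration of the definitions in the table, not the ConductMaze analysis pipeline.

```python
def session_metrics(trials):
    """Summarize a session from a list of trial dicts, each with
    'choice' (arm index, or None for an omission), 'target', and
    'latency_s'. Record layout is assumed, not ConductMaze's format."""
    completed = [t for t in trials if t["choice"] is not None]
    correct = [t for t in completed if t["choice"] == t["target"]]
    # Perseverative errors: the same incorrect arm chosen on
    # consecutive trials (per the metric definition above).
    persev, prev_error_arm = 0, None
    for t in trials:
        arm = t["choice"]
        if arm is not None and arm != t["target"]:
            if arm == prev_error_arm:
                persev += 1
            prev_error_arm = arm
        else:
            prev_error_arm = None  # correct choice or omission breaks the run
    return {
        "percent_correct": (100.0 * len(correct) / len(completed)
                            if completed else 0.0),
        "omissions": len(trials) - len(completed),
        "mean_choice_latency_s": (sum(t["latency_s"] for t in completed)
                                  / len(completed) if completed else None),
        "perseverative_errors": persev,
    }
```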

Sample Data

| Subject | Group | Session | Pct_Correct | Choice_Latency_s | Perseverative_Errors | Omissions |
| --- | --- | --- | --- | --- | --- | --- |

Representative data for illustration purposes. Actual values will vary by species, strain, and experimental conditions.

Applications

  1. Alzheimer's disease research: detecting early visual-spatial discrimination deficits in 5xFAD and APP/PS1 transgenic mice before hippocampal memory decline
  2. Retinal and optic nerve pathology: functional assessment of color and contrast vision after retinal degeneration, optic neuritis, or glaucoma induction
  3. Visual cortex lesion studies: mapping discrimination deficits to specific cortical regions (V1, extrastriate areas)
  4. Aging research: characterizing age-related visual discrimination decline as distinct from spatial memory impairments
  5. Therapeutic screening: evaluating neuroprotective or vision-restoring treatments using behavioral visual endpoints

Compatible Products

ME-VXM, ME-VXM-DISPLAY, CS-958344

Ready to Automate Your Behavioral Protocols?

Contact us for a demo and pricing information.