
Puzzle Box Problem-Solving

Automated problem-solving assessment in the puzzle box

Quantify latency to solve, strategy type, and learning curves across sequential trials — replacing continuous manual observation with frame-accurate automated scoring.

  • 5 difficulty levels supported
  • 4 strategy types classified
  • 33 ms latency timing precision
  • 100% automated scoring pipeline
The problem

Puzzle box scoring demands continuous undivided attention

The puzzle box test requires an observer to watch every second of every trial, simultaneously coding strategy type and timing solution latency. Strategy coding is subjective, and multi-trial experiments with increasing difficulty levels multiply the scoring burden.

  • Continuous observation for 5-10 minutes per trial across 7+ difficulty levels per animal
  • Strategy coding (e.g., direct entry vs. exploration vs. perseveration) varies across scorers
  • Learning curves require consistent scoring across all trials — observer fatigue degrades late-trial accuracy
The solution

Event detection and strategy classification from video

ConductVision detects key puzzle box events — door manipulation, entry into goal zone, orientation toward barrier — and classifies solution strategy from the behavioral sequence. Multi-trial learning curves are computed automatically.

  • Frame-accurate latency from trial start to successful entry
  • Automated strategy classification: direct, exploratory, perseverative, and novel
  • Multi-trial learning curves with strategy transitions across difficulty levels
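As an illustration of the frame-accurate latency computation described above, here is a minimal sketch. It assumes 30 fps video (hence the 33 ms precision figure) and detected event frames for trial start and goal entry; the function name and event-frame inputs are hypothetical, not the product's actual API.

```python
# Hypothetical sketch: per-trial solution latency from detected event
# frames, assuming 30 fps video (one frame = ~33 ms).
FPS = 30

def solution_latency(trial_start_frame: int, goal_entry_frame: int,
                     fps: int = FPS) -> float:
    """Latency in seconds from trial start to successful goal entry."""
    if goal_entry_frame < trial_start_frame:
        raise ValueError("goal entry precedes trial start")
    return (goal_entry_frame - trial_start_frame) / fps

# Example: goal entry detected 4,500 frames after trial start.
print(solution_latency(120, 4620))  # -> 150.0 seconds
```

Because latency is derived from frame indices rather than a stopwatch, timing precision is bounded only by the frame interval.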
Endpoints

Problem-solving dependent variables

Solution latency per trial

Time from trial start to successful goal entry for each trial and difficulty level.

Formats: CSV, JSON
Strategy classification log

Strategy type assigned to each trial with confidence score and supporting behavioral features.

Format: CSV
Learning curve summary

Latency and strategy progression across trials and difficulty levels, with regression fit for learning rate quantification.

Formats: CSV, PNG
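One common way to quantify a learning rate from per-trial latencies is the least-squares slope of log-latency against trial number (an exponential-decay approximation). The sketch below assumes this convention; it is illustrative, not the shipped regression pipeline.

```python
import math

def learning_rate(latencies_s):
    """Least-squares slope of log(latency) vs. trial number.

    A negative slope indicates latencies shrinking across trials,
    i.e. learning; magnitude reflects how fast.
    """
    n = len(latencies_s)
    xs = range(1, n + 1)
    ys = [math.log(l) for l in latencies_s]
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Illustrative latencies (seconds) across five trials:
lat = [300.0, 180.0, 110.0, 70.0, 45.0]
print(learning_rate(lat))  # negative -> the animal is learning
```

Fitting on the log scale keeps early long trials from dominating the fit, which is why exponential fits are a common choice for solution-latency curves.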
Applications

Cognitive paradigms

Cognitive flexibility

Rule-shift and reversal learning

Puzzle box difficulty increases require strategy shifts. Automated scoring detects perseverative errors and successful rule adaptation.

Measures
  • Perseverative error count
  • Trials to criterion
  • Strategy shift latency
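Two of the measures above can be derived directly from the per-trial strategy log. The sketch below assumes a list of strategy labels per trial and a criterion of three consecutive direct solutions; both the function and the criterion value are hypothetical choices for illustration.

```python
# Hypothetical sketch: trials to criterion and perseverative error count
# from per-trial strategy labels (e.g., from the classification log).
def trials_to_criterion(strategies, criterion="direct", run=3):
    """First trial on which `run` consecutive criterion trials complete."""
    streak = 0
    for i, s in enumerate(strategies, start=1):
        streak = streak + 1 if s == criterion else 0
        if streak == run:
            return i
    return None  # criterion never reached

trials = ["exploratory", "perseverative", "direct", "direct", "direct"]
print(trials_to_criterion(trials))    # -> 5
print(trials.count("perseverative"))  # perseverative error count -> 1
```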
Aging and neurodegeneration

Executive function decline

Puzzle box performance declines with age and in AD/PD models. Automated scoring enables longitudinal tracking with consistent criteria.

Measures
  • Maximum difficulty reached
  • Learning rate by age
  • Strategy complexity score
Enrichment

Environmental enrichment effects

Animals housed in enriched environments solve puzzle box tasks faster and use more sophisticated strategies; both effects are quantified automatically.

Measures
  • Enriched vs. standard latency
  • Strategy type distribution
  • Learning rate comparison
Drug effects

Cognitive enhancer screening

Puzzle box is sensitive to cognitive enhancers and impairing drugs. Automated scoring enables high-throughput dose-response.

Measures
  • Dose-dependent latency
  • Strategy sophistication by dose
  • Error rate
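For dose-response screening, a first-pass summary is mean latency per dose group computed from the exported latency records. The sketch below assumes (dose, latency) pairs drawn from the solution-latency CSV; the record layout is hypothetical.

```python
from statistics import mean

# Sketch: dose-dependent latency summary, assuming (dose_mg_kg, latency_s)
# pairs exported from the per-trial latency output.
def latency_by_dose(records):
    groups = {}
    for dose, latency in records:
        groups.setdefault(dose, []).append(latency)
    return {dose: mean(vals) for dose, vals in sorted(groups.items())}

records = [(0.0, 210.0), (0.0, 190.0), (1.0, 150.0), (1.0, 130.0),
           (3.0, 90.0), (3.0, 110.0)]
print(latency_by_dose(records))  # -> {0.0: 200.0, 1.0: 140.0, 3.0: 100.0}
```

With batch processing, tables like this can be regenerated across an entire dose-response study without re-scoring any video by hand.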
Compared to typical systems

How ConductVision differs

Feature                     | ConductVision                  | Typical systems
Strategy classification     | Automated, 4 types             | Manual subjective coding
Latency precision           | Frame-accurate (33 ms)         | Manual stopwatch
Multi-trial learning curves | Automatic with regression fit  | Manual spreadsheet calculation
Difficulty level tracking   | Integrated protocol management | Manual trial-by-trial notes
Batch processing            | Unlimited sessions overnight   | Real-time manual observation

Automated cognitive assessment from your puzzle box recordings

Upload puzzle box videos and get strategy-classified learning curves without manual scoring.