
Scanning Tunneling Microscope

Introduction

Scanning Tunneling Microscopy, or STM, is a scanning probe technique. This means that the specimen is imaged by scanning a very sharp tip over its surface, while information about the shape or composition of the surface is collected based on interactions between the tip and the sample. This is unlike optical or electron microscopy, where a beam of light or electrons either passes through or reflects off the specimen.

The main advantage of scanning probe techniques is that they are extremely sensitive to height or composition variations on the surface, with some capable of atomic-level resolution (< 0.1 nanometer). In fact, STM was the first technique able to distinguish individual atoms on a surface.[1] Its inventors were awarded the Nobel Prize in Physics in 1986.

In STM, the interaction between the probe tip and the specimen consists of quantum tunneling of electrons. This is a process where electrons can cross over a barrier, in this case the empty space between the specimen and the probe. 

This is because the location of an electron is described not by a fixed position but by a probability cloud, within which the electron may be found anywhere. If the cloud of an electron extends across a barrier, the electron can cross over it by existing simultaneously on both sides of the barrier.

Electron tunneling can happen at distances < 3 nm, with a probability that is strongly dependent on distance, which gives STM its extreme sensitivity at short length scales. 
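This distance sensitivity can be made concrete with a short numerical sketch. In the simplest one-dimensional model, the tunneling probability falls off as exp(−2κd), where κ = √(2mφ)/ħ and φ is the barrier height; the 4.5 eV barrier below is an assumed, typical work function for a metal, not a value from this article.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def decay_constant(barrier_eV=4.5):
    """Inverse decay length kappa = sqrt(2*m*phi)/hbar for a barrier phi (in 1/m)."""
    return math.sqrt(2 * M_E * barrier_eV * EV) / HBAR

def transmission(gap_nm, barrier_eV=4.5):
    """1-D tunneling probability ~ exp(-2*kappa*d), ignoring prefactors."""
    return math.exp(-2 * decay_constant(barrier_eV) * gap_nm * 1e-9)

kappa_per_nm = decay_constant() * 1e-9
ratio = transmission(0.5) / transmission(0.6)
print(f"kappa ~ {kappa_per_nm:.1f} per nm")
print(f"moving 0.1 nm closer raises the current by a factor of {ratio:.1f}")
```

With these numbers, a 0.1 nm (one ångström) change in gap changes the tunneling probability by nearly an order of magnitude, which is the origin of STM's extreme height sensitivity.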

This is a simplified explanation, and a full analysis of tunneling in specific cases requires determining the quantum wavefunction of the electron in the system,[2] through use of the Schrödinger equation or other formulations.

Electrons can tunnel either from the tip to the specimen or from the specimen to the tip, depending on the applied voltage. In either case, the rules of quantum tunneling require that each electron tunnel from an occupied energy state on one side to an unoccupied state of the same energy on the other.

A central idea in quantum physics is that electrons exist in materials in such discrete (or quantized) energy states. The process of tunneling is shown schematically in the figure below.

Scanning tunneling microscope tunneling process

Figure: Schematic of the tunneling process.

The rate that electrons cross the barrier (or the tunneling current) depends on:

  • Voltage applied between the probe tip and specimen
  • Density of occupied and unoccupied electron energy states in the specimen and tip (sometimes called the local density of states or LDOS)
  • Distance between probe tip and specimen

Tunneling current is roughly proportional to the first two, and depends exponentially on the third.[3]
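These three dependencies can be illustrated with a toy model (a sketch, not a quantitative expression): current linear in voltage and LDOS, exponential in gap, with an assumed decay constant of 11 per nm for a typical metallic barrier.

```python
import math

KAPPA_PER_NM = 11.0  # assumed inverse decay length for a ~4.5 eV barrier

def tunneling_current(voltage, ldos, gap_nm):
    """Toy model: I ~ V * LDOS * exp(-2*kappa*z), in arbitrary units."""
    return voltage * ldos * math.exp(-2 * KAPPA_PER_NM * gap_nm)

base = tunneling_current(0.1, 1.0, 0.5)
print(tunneling_current(0.2, 1.0, 0.5) / base)  # doubling V doubles I
print(tunneling_current(0.1, 2.0, 0.5) / base)  # doubling LDOS doubles I
print(tunneling_current(0.1, 1.0, 0.4) / base)  # 0.1 nm closer: ~9x more current
```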

In this article, we will first describe the various types of analysis that can be done with STM, based on the dependence of tunneling on these three factors. We will then discuss practical aspects of STM, including the parts of a typical STM setup, and considerations when performing experiments with an STM.

Types of STM analysis

The three parameters that the STM analyst can vary and/or measure are the applied probe to specimen voltage, the probe height, and the tunneling current.[4] A number of different techniques can be built by varying these parameters. They are described below, from simplest to most complex.[5]

Constant tunneling current

In this simple mode, the probe is scanned (or rastered) in the x and y directions across the specimen. While the tip is being rastered, a voltage is applied to the specimen and the resulting tunneling current is measured.

An electronic feedback loop then attempts to keep this tunneling current constant during the x-y scanning by adjusting the z-position (height) of the probe tip. For example, if the tunneling current is higher than the set value, the feedback loop will retract the tip. Typical scan areas are in the ~100 to 10,000 nm² range.

The resulting information is a two-dimensional x-y map of the probe height required to maintain a constant tunneling current, sometimes called a “topographic” map. 

This height is a combination of the physical height variation of the surface, and the local density of electron states. The very strong exponential dependence of tunneling current on probe-specimen distance, however, results in high sensitivity to surface topography.
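The feedback idea can be sketched in a few lines. This is a simplified, hypothetical simulation (exponential current model, log-error feedback) rather than a description of any real controller; the decay constant and target gap are assumed values.

```python
import math

KAPPA = 11.0    # assumed inverse decay length, per nm
SET_GAP = 0.5   # target tip-surface gap, nm

def current(gap_nm):
    """Toy tunneling current vs. gap, arbitrary units."""
    return math.exp(-2 * KAPPA * gap_nm)

def scan_constant_current(surface_heights, gain=0.5, steps=50):
    """Constant-current mode: at each pixel, feedback nudges the tip height
    until the current matches the setpoint; the recorded tip heights form
    the 'topographic' map."""
    setpoint = current(SET_GAP)
    z_tip = surface_heights[0] + SET_GAP
    topo = []
    for z_surf in surface_heights:
        for _ in range(steps):
            # Positive log-error means too much current (tip too close): retract.
            error = math.log(current(z_tip - z_surf) / setpoint)
            z_tip += gain * error / (2 * KAPPA)
        topo.append(z_tip)
    return topo

# A flat surface with a single 0.2 nm atomic step
surface = [0.0] * 5 + [0.2] * 5
topo = scan_constant_current(surface)
print(round(topo[-1] - topo[0], 3))  # the height map recovers the 0.2 nm step
```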

Constant height

This mode is similar to the constant tunneling current mode, except that the tunneling current is measured, and the height of the probe tip is held constant. 

The result in this case is a map of the tunneling current at constant height. As with constant current mode, the magnitude of the tunneling current depends on the shape of the surface, and the density of electron states.

The main advantage to constant height STM is that the tip can be scanned faster, since no feedback is needed. However, it requires an atomically flat specimen, and can only be used over small areas, since without feedback there is danger of having the probe contact the specimen.

Barrier height and density of states imaging

A more sophisticated form of STM analysis involves varying or modulating either the probe to sample distance, or the applied voltage, during an x-y scan.

In the first case, the change in the tunneling current with probe to sample distance (dI/dz) is proportional to the local work function of the specimen. 

The work function is the amount of energy required to remove an electron from a material, and depends on the composition and structure of the surface just under the probe tip.

In the second case, the change in tunneling current with applied voltage (dI/dV) is proportional to the local density of states in the material. So scanning in x-y while modulating the applied voltage can be used to map the density of states in the material, another useful diagnostic tool.
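The dI/dV relationship can be sketched numerically. The model below is an assumption-laden illustration: a made-up LDOS with a Gaussian peak, and the low-temperature approximation that I(V) is the integral of the LDOS up to energy eV, so that numerical differentiation recovers the LDOS.

```python
import math

def ldos(E):
    """Assumed model density of states: flat background plus a peak at 0.3 eV."""
    return 1.0 + math.exp(-((E - 0.3) / 0.05) ** 2)

def current(V, n=2000):
    """Low-temperature approximation: I(V) ~ integral of the LDOS from 0 to eV."""
    dE = V / n
    return sum(ldos((k + 0.5) * dE) for k in range(n)) * dE

def didv(V, dV=0.01):
    """Numerical dI/dV, which should recover the LDOS at energy eV."""
    return (current(V + dV) - current(V - dV)) / (2 * dV)

print(didv(0.3), ldos(0.3))  # the conductance peak tracks the LDOS peak
```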

Spectroscopic modes

In the techniques of barrier height and density of states mapping, the height or voltage are continuously modulated during the x-y sweep. A more comprehensive analysis can be done where the probe stops at each point in the x-y scan, and runs a full sweep of probe height or supply voltage, measuring the resulting tunneling current response. 

These are full I vs. Z and I vs. V curves. This type of scanning is much slower, but results in a full spectroscopic data set at each point, which can be useful when the response of the specimen is not easy to predict.

Parts of the STM

Scanning Tunneling Microscope System

Figure: Schematic of an STM system.

Source: http://www.iap.tuwien.ac.at/www/surface/stm_gallery/stm_schematic

1. Probe tip

The probe tip is a critical part of the STM because it directly interacts with the specimen. The most critical aspect of the probe is the sharpness of the tip itself, which effectively determines the resolution of the image.

STM probes are traditionally metallic; tungsten and platinum-iridium are common choices. They are formed either by mechanical methods, such as shearing, or by electrochemical etching.[6] More recently, carbon nanotubes have also been used.

2. Tip positioning hardware

The position of the tip in all three dimensions is typically controlled using piezoelectric transducers,[7] or simply “piezos”. These consist of crystals that expand or contract by minute amounts when voltage is applied. So by applying variable voltages to crystals controlling the x and y positions, the position of the probe tip can be scanned across the specimen.

Conversely, if the piezo crystals are compressed or put in tension, a voltage is generated across the crystal. So the position of the probe tip can also be sensed by measuring the voltage.

In an STM setup, the piezo crystals responsible for x and y motion are directly controlled by a computer. Z motion, which determines the probe height, has the additional capability of being controlled by a feedback loop in constant tunneling current measurements.
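The voltage-to-motion conversion can be sketched as follows. The 2 nm/V sensitivity is a hypothetical round number, not a property of any specific scanner; real piezo sensitivities vary widely and are calibrated per instrument.

```python
SENSITIVITY_NM_PER_V = 2.0  # hypothetical piezo sensitivity, nm of motion per volt

def raster_voltages(n_pixels, pixel_nm):
    """Drive voltages (vx, vy) for an n x n raster scan of the x-y piezos."""
    step_V = pixel_nm / SENSITIVITY_NM_PER_V
    return [(x * step_V, y * step_V)
            for y in range(n_pixels)   # slow scan axis
            for x in range(n_pixels)]  # fast scan axis

scan = raster_voltages(4, 1.0)  # 4 x 4 grid with 1 nm pixels
print(scan[0], scan[-1])        # from (0.0, 0.0) to (1.5, 1.5) volts
```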

3. Electronics

The electronics in an STM are straightforward but very sensitive and precise. First, a voltage source applies a small voltage to the specimen. When the probe is at a typical scanning distance (< 1 nm, or 10⁻⁹ m), a small tunneling current, on the order of 10⁻⁹ A, is generated. An amplifier converts this small current to a voltage, and this voltage signal is sent to control electronics.
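The scale of this conversion is easy to illustrate. The 10⁸ V/A transimpedance gain below is an assumed, typical order of magnitude for such preamplifiers, not a value stated in this article.

```python
def preamp_output(current_A, gain_V_per_A=1e8):
    """Transimpedance stage: tunneling current in, voltage out (assumed 1e8 V/A gain)."""
    return current_A * gain_V_per_A

# A 1 nA tunneling current becomes an easily measured 0.1 V signal
print(preamp_output(1e-9))
```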

4. Environmental chamber

Unlike electron microscopy, STM does not inherently require special environmental conditions. However, since STM is surface-sensitive, a highly controlled environment is required to maintain surface cleanliness.

Consider that at 10⁻⁶ Torr, a moderately high vacuum, a gas molecule impacts every atom on a surface roughly once per second. For that reason, STM analysis is normally performed under ultra-high vacuum conditions, which greatly reduces contamination from background gases.
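This impact-rate estimate follows from kinetic theory: the impingement flux is P/√(2πmkT). The sketch below assumes nitrogen gas, room temperature, a surface site density of 10¹⁹ per m², and a sticking coefficient of 1 — all simplifying assumptions.

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg
TORR_TO_PA = 133.322

def impingement_flux(pressure_torr, mass_amu=28.0, temp_K=300.0):
    """Kinetic-theory flux P / sqrt(2*pi*m*k*T), molecules per m^2 per second."""
    p = pressure_torr * TORR_TO_PA
    m = mass_amu * AMU
    return p / math.sqrt(2 * math.pi * m * K_B * temp_K)

def monolayer_time_s(pressure_torr, sites_per_m2=1e19):
    """Seconds to deposit one monolayer, assuming every molecule sticks."""
    return sites_per_m2 / impingement_flux(pressure_torr)

print(monolayer_time_s(1e-6))   # high vacuum: a monolayer in seconds
print(monolayer_time_s(1e-10))  # UHV: a monolayer takes hours
```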

However, STM can be performed in other controlled environments, including under the surface of liquids.[4]

5. Vibration isolation

Because the distances being measured in STM are extremely small, at times < 0.1 nm, the technique is very sensitive to vibrations, even more so than other microscopy techniques. 

Even the imperceptible vibrations from a nearby road or other lab equipment can severely limit or even prevent the operation of an STM. For that reason, all components in the STM are designed to be rigid. 

Furthermore, STM equipment must be isolated from vibrations by mounting it on dampening air springs or similar hardware. The placement of the STM in a building is also critical for this reason.

Analyzing a Specimen with a Scanning Tunneling Microscope

Analysis using STM is normally done with the assistance of a trained expert since these instruments are extremely sensitive and complicated to operate. Similar to TEM, one of the most difficult and critical aspects of the process is sample preparation. The quality of the analysis often depends on how well the sample was prepared.

Analysis by STM is highly dependent on the type of sample and the instrument being used. The steps given here are a rough generalization of the analysis method.

  1. Making a plan for analysis. As with any lab analysis, the first step in using STM is to develop a plan and set of goals for the analysis. One way of doing this is to list the questions you hope to answer using this technique.
  2. Sample preparation. In STM, sample preparation will vary widely depending on the situation. Some examples include:
    • Cleaning a metallic sample using solvents to remove any environmental contamination, then polishing the sample to ensure it is macroscopically flat.
    • Immobilizing a biological macromolecule on a substrate, using a self-assembled monolayer of functionalized alkanethiols on gold.[4]
  3. Loading the sample into the environmental chamber, using an established procedure that will prevent contamination of the chamber.
  4. Performing additional in situ cleaning or preparation. For some kinds of analysis, additional steps, such as sputter-cleaning and/or annealing, will be required. These techniques are used to generate a clean and flat surface.
  5. Positioning the sample. Before being able to scan, the specimen and probe must be positioned close to one another, and the location in the x-y plane is set by the operator.
  6. Collecting scans and analyzing data. The types of scans will depend on the questions that are formulated in step 1, and may vary depending on initial findings. 

Care of an STM

Scanning tunneling microscopes are expensive and sensitive pieces of equipment, so in most cases, they are owned and maintained by trained staff. For the individual users of an STM, there are a number of guidelines that will help to prevent damage to the equipment:

  • Handle any materials that will be introduced to the environmental chamber with gloves that leave no residue, particularly if the environmental chamber is an ultra-high vacuum chamber. Touching any vacuum components with bare hands can contaminate the vacuum environment.
  • Carefully follow established procedures to introduce the sample into the microscope.
  • Do not introduce any samples that are wet, oily, or will otherwise release vapors.
  • Do not bump the piezo motors while manipulating items in the STM. These are very sensitive components and can be broken by even light contact.
  • Likewise, be careful during analysis not to crash the probe either in the x-y or height directions. Crashes risk damaging the probe tip as well as the piezo drives.

Comparisons With Other Microscopes

Scanning tunneling microscopes are quite different from the more common optical and electron microscopes, and the information they provide is different and often complementary to those instruments. 

STM differs from those techniques mainly in its ability to provide detailed information on the atomic-level, local structure of the surface of a material, and extremely good resolution in the vertical (height) dimension. Because of this, STM is considered a surface science tool as well as a microscope.

STM is closely related to atomic force microscopy (AFM), which is also a scanning probe technique with good height resolution. The key difference between STM and AFM is that AFM uses intermolecular or interatomic forces, rather than quantum tunneling, to scan the surface.

The main drawbacks to STM are:

  • STM requires a significant amount of careful sample/probe preparation, and effort in vibration isolation to collect quality images. For this reason, it can be a difficult measurement to execute.
  • Because specimens must be immobilized on the atomic scale, and because the electronic properties of the specimen are being probed, it has limited versatility in terms of the types of specimens that can be analyzed.
  • It can only access the outer surface of a material.
  • Because the tunneling current signal in STM is a combination of probe to specimen distance and local density of states, the interpretation of STM data can be difficult.

References

  1. See, for example, Hebenstreit, E. L. D., Hebenstreit, W., Schmid, M., & Varga, P. (1999). Pt25Rh75 (111),(110), and (100) studied by scanning tunnelling microscopy with chemical contrast. Surface Science, 441(2-3), 441-453.
  2. Atkins, P. W. (1997). Physical Chemistry, 6th ed. New York, NY: W. H. Freeman & Co.
  3. Park Systems, “Scanning Tunneling Microscopy (STM): Probing the Local Electronic Structure of a Sample’s Surface” https://www.parksystems.com/images/spmmodes/electrical/Scanning-Tunneling-Microscopy-(STM).pdf
  4. Chen, J. (2008). Introduction to Scanning Tunneling Microscopy, 2nd ed. Oxford University Press.
  5. NT-MDT Spectrum Instruments “STM Techniques”, https://www.ntmdt-si.com/resources/spm-principles/stm-techniques
  6. Zeljkovic Lab, “STM Tip Preparation”, https://capricorn.bc.edu/wp/zeljkoviclab/research/scanning-tunneling-microscopy-stm/stm-tip-preparation/
  7. Phillips, J. R. (2000). Piezoelectric technology primer. CTS Wireless Components, 4800.


Introduction

In behavioral neuroscience, the Open Field Test (OFT) remains one of the most widely used assays to evaluate rodent models of affect, cognition, and motivation. It provides a non-invasive framework for examining how animals respond to novelty, stress, and pharmacological or environmental manipulations. Among the test’s core metrics, the percentage of time spent in the center zone offers a uniquely normalized and sensitive measure of an animal’s emotional reactivity and willingness to engage with a potentially risky environment.

This metric is calculated as the proportion of time spent in the central area of the arena—typically the inner 25%—relative to the entire session duration. By normalizing this value, researchers gain a behaviorally informative variable that is resilient to fluctuations in session length or overall movement levels. This makes it especially valuable in comparative analyses, longitudinal monitoring, and cross-model validation.
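The calculation itself is simple, as this minimal sketch shows. It assumes frame-by-frame tracker output as booleans and a hypothetical 10 Hz sampling rate; the function names are illustrative, not from any particular tracking package.

```python
def percent_center_time(center_flags, dt=0.1):
    """Percentage of the session spent in the center zone.
    center_flags: per-frame booleans from the tracker; dt: seconds per frame."""
    total_s = len(center_flags) * dt
    center_s = sum(center_flags) * dt
    return 100.0 * center_s / total_s

# A 300 s session with 60 s spent in the center -> 20 %
flags = [True] * 600 + [False] * 2400
print(percent_center_time(flags))
```

Because the result is a proportion, two sessions of different lengths or frame rates remain directly comparable, which is exactly the normalization property discussed above.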

Unlike raw center duration, which can be affected by trial design inconsistencies, the percentage-based measure enables clearer comparisons across animals, treatments, and experimental setups, with greater interpretability and reproducibility. It plays a key role in identifying trait anxiety, avoidance behavior, risk-taking tendencies, and environmental adaptation in both acute and longitudinal designs, making it indispensable in basic and translational research contexts.

What Does Percentage of Time in the Center Measure?

This metric reflects the relative amount of time an animal chooses to spend in the open, exposed portion of the arena—typically defined as the inner 25% of a square or circular enclosure. Because rodents innately prefer the periphery (thigmotaxis), time in the center is inversely associated with anxiety-like behavior. As such, this percentage is considered a sensitive, normalized index of:

  • Exploratory drive vs. risk aversion: High center time reflects an animal’s willingness to engage with uncertain or exposed environments, often indicative of lower anxiety and a stronger intrinsic drive to explore. These animals are more likely to exhibit flexible, information-gathering behaviors. On the other hand, animals that spend little time in the center display a strong bias toward the safety of the perimeter, indicative of a defensive behavioral state or trait-level risk aversion. This dichotomy helps distinguish adaptive exploration from fear-driven avoidance.

  • Emotional reactivity: Fluctuations in center time percentage serve as a sensitive behavioral proxy for changes in emotional state. In stress-prone or trauma-exposed animals, decreased center engagement may reflect hypervigilance or fear generalization, while a sudden increase might indicate emotional blunting or impaired threat appraisal. The metric is also responsive to acute stressors, environmental perturbations, or pharmacological interventions that impact affective regulation.

  • Behavioral confidence and adaptation: Repeated exposure to the same environment typically leads to reduced novelty-induced anxiety and increased behavioral flexibility. A rising trend in center time percentage across trials suggests successful habituation, reduced threat perception, and greater confidence in navigating open spaces. Conversely, a stable or declining trend may indicate behavioral rigidity or chronic stress effects.

  • Pharmacological or genetic modulation: The percentage of time in the center is widely used to evaluate the effects of pharmacological treatments and genetic modifications that influence anxiety-related circuits. Anxiolytic agents—including benzodiazepines, SSRIs, and cannabinoid agonists—reliably increase center occupancy, providing a robust behavioral endpoint in preclinical drug trials. Similarly, genetic models targeting serotonin receptors, GABAergic tone, or HPA axis function often show distinct patterns of center preference, offering translational insights into psychiatric vulnerability and resilience.
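All of these readouts rest on a simple geometric definition of the zone. As a sketch (assuming a hypothetical 100 × 100 cm square arena), note that a center zone covering 25% of the area is an inner square with half the arena's side length:

```python
import math

def in_center(x, y, arena_side=100.0, center_fraction=0.25):
    """True if (x, y) lies in a centered square covering `center_fraction` of
    a square arena's area. For a 25 % zone, the inner square spans half the
    side length (since 0.5 squared = 0.25)."""
    half = arena_side * math.sqrt(center_fraction) / 2
    c = arena_side / 2
    return abs(x - c) <= half and abs(y - c) <= half

print(in_center(50, 50))  # arena midpoint: inside
print(in_center(10, 10))  # near a corner: outside (thigmotaxis territory)
```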

Critically, because this metric is normalized by session duration, it accommodates variability in activity levels or testing conditions. This makes it especially suitable for comparing across individuals, treatment groups, or timepoints in longitudinal studies.

A high percentage of center time indicates reduced anxiety, increased novelty-seeking, or pharmacological modulation (e.g., anxiolysis). Conversely, a low percentage suggests emotional inhibition, behavioral avoidance, or contextual hypervigilance.

Behavioral Significance and Neuroscientific Context

1. Emotional State and Trait Anxiety

The percentage of center time is one of the most direct, unconditioned readouts of anxiety-like behavior in rodents. It is frequently reduced in models of PTSD, chronic stress, or early-life adversity, where animals exhibit persistent avoidance of the center due to heightened emotional reactivity. This metric can also distinguish between acute anxiety responses and enduring trait anxiety, especially in longitudinal or developmental studies. Its normalized nature makes it ideal for comparing across cohorts with variable locomotor profiles, helping researchers detect true affective changes rather than activity-based confounds.

2. Exploration Strategies and Cognitive Engagement

Rodents that spend more time in the center zone typically exhibit broader and more flexible exploration strategies. This behavior reflects not only reduced anxiety but also cognitive engagement and environmental curiosity. High center percentage is associated with robust spatial learning, attentional scanning, and memory encoding functions, supported by coordinated activation in the prefrontal cortex, hippocampus, and basal forebrain. In contrast, reduced center engagement may signal spatial rigidity, attentional narrowing, or cognitive withdrawal, particularly in models of neurodegeneration or aging.

3. Pharmacological Responsiveness

The open field test remains one of the most widely accepted platforms for testing anxiolytic and psychotropic drugs. The percentage of center time reliably increases following administration of anxiolytic agents such as benzodiazepines, SSRIs, and GABA-A receptor agonists. This metric serves as a sensitive and reproducible endpoint in preclinical dose-finding studies, mechanistic pharmacology, and compound screening pipelines. It also aids in differentiating true anxiolytic effects from sedation or motor suppression by integrating with other behavioral parameters like distance traveled and entry count (Prut & Belzung, 2003).

4. Sex Differences and Hormonal Modulation

Sex-based differences in emotional regulation often manifest in open field behavior, with female rodents generally exhibiting higher variability in center zone metrics due to hormonal cycling. For example, estrogen has been shown to facilitate exploratory behavior and increase center occupancy, while progesterone and stress-induced corticosterone often reduce it. Studies involving gonadectomy, hormone replacement, or sex-specific genetic knockouts use this metric to quantify the impact of endocrine factors on anxiety and exploratory behavior. As such, it remains a vital tool for dissecting sex-dependent neurobehavioral dynamics.

Methodological Considerations

  • Zone Definition: Accurately defining the center zone is critical for reliable and reproducible data. In most open field arenas, the center zone constitutes approximately 25% of the total area, centrally located and evenly distanced from the walls. Software-based segmentation tools enhance precision and ensure consistency across trials and experiments. Deviations in zone parameters—whether due to arena geometry or tracking inconsistencies—can result in skewed data, especially when calculating percentages.

     

  • Trial Duration: Trials typically last between 5 to 10 minutes. The percentage of time in the center must be normalized to total trial duration to maintain comparability across animals and experimental groups. Longer trials may lead to fatigue, boredom, or habituation effects that artificially reduce exploratory behavior, while overly short trials may not capture full behavioral repertoires or response to novel stimuli.

     

  • Handling and Habituation: Variability in pre-test handling can introduce confounds, particularly through stress-induced hypoactivity or hyperactivity. Standardized handling routines—including gentle, consistent human interaction in the days leading up to testing—reduce variability. Habituation to the testing room and apparatus prior to data collection helps animals engage in more representative exploratory behavior, minimizing novelty-induced freezing or erratic movement.

     

  • Tracking Accuracy: High-resolution tracking systems should be validated for accurate, real-time detection of full-body center entries and sustained occupancy. The system should distinguish between full zone occupancy and transient overlaps or partial body entries that do not reflect true exploratory behavior. Poor tracking fidelity or lag can produce significant measurement error in percentage calculations.

     

  • Environmental Control: Uniformity in environmental conditions is essential. Lighting should be evenly diffused to avoid shadow bias, and noise should be minimized to prevent stress-induced variability. The arena must be cleaned between trials using odor-neutral solutions to eliminate scent trails or pheromone cues that may affect zone preference. Any variation in these conditions can introduce systematic bias in center zone behavior.

Interpretation with Complementary Metrics

Temporal Dynamics of Center Occupancy

Evaluating how center time evolves across the duration of a session—divided into early, middle, and late thirds—provides insight into behavioral transitions and adaptive responses. Animals may begin by avoiding the center, only to gradually increase center time as they habituate to the environment. Conversely, persistently low center time across the session can signal prolonged anxiety, fear generalization, or a trait-like avoidance phenotype.
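The early/middle/late breakdown can be sketched directly from the tracker's boolean trace. The habituation pattern in the example data is synthetic, constructed only to illustrate a rising trend:

```python
def center_time_by_thirds(center_flags):
    """Fraction of time in the center for the early, middle, and late thirds."""
    n = len(center_flags) // 3
    thirds = [center_flags[:n], center_flags[n:2 * n], center_flags[2 * n:3 * n]]
    return [sum(t) / len(t) for t in thirds]

# Synthetic habituation pattern: center occupancy rises across the session
flags = ([False] * 90 + [True] * 10 +
         [False] * 70 + [True] * 30 +
         [False] * 50 + [True] * 50)
early, middle, late = center_time_by_thirds(flags)
print(early, middle, late)  # 0.1 0.3 0.5
```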

Cross-Paradigm Correlation

To validate the significance of center time percentage, it should be examined alongside results from other anxiety-related tests such as the Elevated Plus Maze, Light-Dark Box, or Novelty Suppressed Feeding. Concordance across paradigms supports the reliability of center time as a trait marker, while discordance may indicate task-specific reactivity or behavioral dissociation.

Behavioral Microstructure Analysis

When paired with high-resolution scoring of behavioral events such as rearing, grooming, defecation, or immobility, center time offers a richer view of the animal’s internal state. For example, an animal that spends substantial time in the center while grooming may be coping with mild stress, while another that remains immobile in the periphery may be experiencing more severe anxiety. Microstructure analysis aids in decoding the complexity behind spatial behavior.

Inter-individual Variability and Subgroup Classification

Animals naturally vary in their exploratory style. By analyzing percentage of center time across subjects, researchers can identify behavioral subgroups—such as consistently bold individuals who frequently explore the center versus cautious animals that remain along the periphery. These classifications can be used to examine predictors of drug response, resilience to stress, or vulnerability to neuropsychiatric disorders.

Machine Learning-Based Behavioral Clustering

In studies with large cohorts or multiple behavioral variables, machine learning techniques such as hierarchical clustering or principal component analysis can incorporate center time percentage to discover novel phenotypic groupings. These data-driven approaches help uncover latent dimensions of behavior that may not be visible through univariate analyses alone.
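As an illustrative sketch rather than a production pipeline (where a library such as scikit-learn would normally be used), a minimal 2-means clustering on synthetic (center-time %, distance traveled) feature pairs recovers bold versus cautious subgroups. All data here are invented for illustration:

```python
def kmeans2(points, iters=10):
    """Minimal 2-means clustering with deterministic initialization
    at the lexicographic extremes of the data."""
    centers = [min(points), max(points)]
    groups = ([], [])
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# Hypothetical phenotypes: (center-time %, distance traveled in m)
cautious = [(5 + i, 48 + i) for i in range(5)]   # strong periphery bias
bold = [(30 + i, 50 + i) for i in range(5)]      # high center occupancy
centers, groups = kmeans2(cautious + bold)
print(centers)  # cluster means near (7, 50) and (32, 52)
```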

Total Distance Traveled

Total locomotion helps contextualize center time. Low percentage values in animals with minimal movement may reflect sedation or fatigue, while similar values in high-mobility subjects suggest deliberate avoidance. This metric helps distinguish emotional versus motor causes of low center engagement.

Number of Center Entries

This measure indicates how often the animal initiates exploration of the center zone. When combined with percentage of time, it differentiates between frequent but brief visits (indicative of anxiety or impulsivity) versus fewer but sustained center engagements (suggesting comfort and behavioral confidence).
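The entry count and its combination with time in the center can be sketched from the same boolean trace used for the percentage metric (again assuming hypothetical 10 Hz tracking):

```python
def center_entries(flags):
    """Count outside-to-inside transitions of the center zone."""
    prev = False
    entries = 0
    for cur in flags:
        if cur and not prev:
            entries += 1
        prev = cur
    return entries

def mean_visit_s(flags, dt=0.1):
    """Average duration of a single center visit, in seconds."""
    n = center_entries(flags)
    return sum(flags) * dt / n if n else 0.0

# Same total center time, different microstructure
brief = ([True] * 5 + [False] * 5) * 3   # three 0.5 s dips into the center
sustained = [False] * 15 + [True] * 15   # one 1.5 s stay
print(center_entries(brief), mean_visit_s(brief))
print(center_entries(sustained), mean_visit_s(sustained))
```

The two traces have identical center percentages, yet the entry counts separate frequent brief visits from a single sustained engagement, which is exactly the distinction described above.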

Latency to First Center Entry

The delay before the first center entry reflects initial threat appraisal. Longer latencies may be associated with heightened fear or low motivation, while shorter latencies are typically linked to exploratory drive or low anxiety.

Thigmotaxis Time

Time spent hugging the walls offers a spatial counterbalance to center metrics. High thigmotaxis and low center time jointly support an interpretation of strong avoidance behavior. This inverse relationship helps triangulate affective and motivational states.

Applications in Translational Research

  • Drug Discovery: The percentage of center time is a key behavioral endpoint in the development and screening of anxiolytic, antidepressant, and antipsychotic medications. Its sensitivity to pharmacological modulation makes it particularly valuable in dose-response assessments and in distinguishing therapeutic effects from sedative or locomotor confounds. Repeated trials can also help assess drug tolerance and chronic efficacy over time.
  • Genetic and Neurodevelopmental Modeling: In transgenic and knockout models, altered center percentage provides a behavioral signature of neurodevelopmental abnormalities. This is particularly relevant in the study of autism spectrum disorders, ADHD, fragile X syndrome, and schizophrenia, where subjects often exhibit heightened anxiety, reduced flexibility, or altered environmental engagement.
  • Hormonal and Sex-Based Research: The metric is highly responsive to hormonal fluctuations, including estrous cycle phases, gonadectomy, and hormone replacement therapies. It supports investigations into sex differences in stress reactivity and the behavioral consequences of endocrine disorders or interventions.
  • Environmental Enrichment and Deprivation: Housing conditions significantly influence anxiety-like behavior and exploratory motivation. Animals raised in enriched environments typically show increased center time, indicative of reduced stress and greater behavioral plasticity. Conversely, socially isolated or stimulus-deprived animals often show strong center avoidance.
  • Behavioral Biomarker Development: As a robust and reproducible readout, center time percentage can serve as a behavioral biomarker in longitudinal and interventional studies. It is increasingly used to identify early signs of affective dysregulation or to track the efficacy of neuromodulatory treatments such as optogenetics, chemogenetics, or deep brain stimulation.
  • Personalized Preclinical Models: This measure supports behavioral stratification, allowing researchers to identify high-anxiety or low-anxiety phenotypes before treatment. This enables within-group comparisons and enhances statistical power by accounting for pre-existing behavioral variation.

Enhancing Research Outcomes with Percentage-Based Analysis

By expressing center zone activity as a proportion of total trial time, researchers gain a metric that is resistant to session variability and more readily comparable across time, treatment, and model conditions. This normalized measure enhances reproducibility and statistical power, particularly in multi-cohort or cross-laboratory designs.

For experimental designs aimed at assessing anxiety, exploratory strategy, or affective state, the percentage of time spent in the center offers one of the most robust and interpretable measures available in the Open Field Test.


References

  • Prut, L., & Belzung, C. (2003). The open field as a paradigm to measure the effects of drugs on anxiety-like behaviors: a review. European Journal of Pharmacology, 463(1–3), 3–33.
  • Seibenhener, M. L., & Wooten, M. C. (2015). Use of the open field maze to measure locomotor and anxiety-like behavior in mice. Journal of Visualized Experiments, (96), e52434.
  • Crawley, J. N. (2007). What’s Wrong With My Mouse? Behavioral Phenotyping of Transgenic and Knockout Mice. Wiley-Liss.
  • Carola, V., D’Olimpio, F., Brunamonti, E., Mangia, F., & Renzi, P. (2002). Evaluation of the elevated plus-maze and open-field tests for the assessment of anxiety-related behavior in inbred mice. Behavioural Brain Research, 134(1–2), 49–57.
