Frequently Asked Questions (FAQ)
This page addresses common questions about the OpenScope Community Predictive Processing project, its data, and how to get involved.
General Project Questions
What is the OpenScope Community Predictive Processing project?
The OpenScope Community Predictive Processing project is a collaborative effort to investigate the neural mechanisms underlying predictive processing in the brain. Through carefully designed experiments using in-vivo two-photon imaging and electrophysiological recordings, the project aims to test whether different types of mismatch stimuli (temporal, motor, and omission) engage shared or distinct neural mechanisms.
Who is involved in this project?
The project involves researchers from multiple institutions, including the Allen Institute and various collaborating laboratories (Bastos lab, Najafi lab, Ruediger lab, and Oweiss lab). We also welcome contributions from the broader research community.
What are the main research questions?
The project addresses several key questions:
- Do temporal, motor, and omission mismatch stimuli engage shared or distinct neural mechanisms?
- How do these mechanisms differ across species (mice vs. primates)?
- What computational primitives (stimulus adaptation, dendritic computation, E/I balance, hierarchical processing) are central to predictive processing?
Data and Resources
How can I access the experimental data?
Our data are publicly hosted on Amazon S3 in the aind-open-data bucket, and no AWS account is required to access them. You can browse the data with Quilt, stream NWB files directly from S3 with Python, or download files using the AWS CLI.
To find a session, copy its identifier from the tracking spreadsheet and search for it on Quilt.
For full instructions, see:
- Data Access Guide — step-by-step guide to finding and opening files
- Stream NWB from S3 notebook — open any NWB from S3 in one line
- Examine Ophys NWB notebook — guided walkthrough of ophys NWB contents
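As a rough sketch of the streaming approach described above, the following Python helper opens an NWB file straight from the public bucket using anonymous S3 access via fsspec and pynwb. The exact code in the Stream NWB from S3 notebook may differ; the S3 key shown in the usage comment is a placeholder, so copy a real session path from the tracking spreadsheet or Quilt.

```python
def open_nwb_from_s3(s3_uri):
    """Stream an NWB file directly from S3 without downloading it.

    Requires: pip install fsspec s3fs h5py pynwb
    """
    import fsspec
    import h5py
    import pynwb

    # The aind-open-data bucket is public, so anonymous access works.
    fs = fsspec.filesystem("s3", anon=True)
    h5_file = h5py.File(fs.open(s3_uri, "rb"), "r")
    return pynwb.NWBHDF5IO(file=h5_file, load_namespaces=True).read()


# Placeholder key; substitute a real session path from the tracking
# spreadsheet or Quilt:
# nwb = open_nwb_from_s3("s3://aind-open-data/<session>/<file>.nwb")
```

Because the file is read lazily over HTTP range requests, only the datasets you actually index are transferred, which makes this practical even for large sessions.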
What types of data are being collected?
The project collects several types of data:
- Two-photon calcium imaging data from pan-excitatory and pan-inhibitory lines
- Neuropixels recordings with SST-optotagging
- Voltage imaging recordings of pyramidal cell somata and dendrites
- Behavioral data (running speed, eye movements)
In what format are the data stored?
Data are standardized in Neurodata Without Borders (NWB) format to ensure interoperability and ease of use across the research community.
Are there code samples for working with the data?
Yes — we provide several notebooks on this website under Analysis:
- Stream NWB from S3 — utility to open any NWB file directly from S3
- Examine Ophys NWB — walkthrough of ophys NWB contents (ROIs, ΔF/F, events)
- Intro to Ephys NWBs — explore spike-sorted Neuropixels NWBs
Additional scripts are in the code/data-access directory of our GitHub repository. The OpenScope Databook also has extensive NWB examples, though our latest NWBs may differ slightly in key names or organization.
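Since key names and organization can vary between NWB files, a useful first step with any session is to walk only the generic pynwb containers rather than assuming specific keys. The helper below is an illustrative sketch (not part of the provided notebooks) that works for both ophys and ephys files.

```python
def summarize_nwb(nwb):
    """Print a quick overview of an open NWB file's contents.

    Only generic NWB containers are inspected, so this tolerates
    project-specific differences in key names and organization.
    """
    print("session:", nwb.session_description)
    print("acquisition:", list(nwb.acquisition))
    for name, module in nwb.processing.items():
        print(f"processing/{name}:", list(module.data_interfaces))
    if nwb.units is not None:
        # Spike-sorted ephys files expose a units table.
        print("units columns:", nwb.units.colnames)
```

Running this on an ophys file typically reveals where the ROI and ΔF/F interfaces live, and on an ephys file it lists the columns of the spike-sorted units table.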
Getting Involved
How can I contribute to the project?
There are several ways to contribute:
- Analyze existing datasets and share your findings
- Develop or validate computational models using our data
- Contribute to the codebases for data analysis or visualization
- Conduct complementary experiments in your own lab
How do I report issues or suggest improvements?
Issues can be reported on our GitHub Issues page. For discussions and suggestions, please use our GitHub Discussions.
How do I cite this project in my publications?
Please cite our arXiv paper when using data or code from this project.
For details on authorship in future publications, please review our Collaboration Policy.
Technical Questions
What stimulus paradigms are used in the experiments?
Four main experimental paradigms are used:
- Standard Mismatch: Drifting gratings with occasional orientation changes
- Sensorimotor Mismatch: Closed-loop visuo-motor interactions with occasional mismatches
- Sequence Mismatch: Learned sequences with occasional disruptions
- Temporal Mismatch: Stimuli with unexpected timing changes
What hardware is used for the recordings?
The project uses three primary recording systems:
- SLAP2 (Scanned Line Angular Projection) for high-speed subcellular imaging
- Neuropixels probes for high-density electrophysiological recordings
- Mesoscope for wide-field calcium imaging
See the Hardware Documentation for more details.
How are the stimuli implemented?
All stimuli are implemented using the Bonsai framework. The stimulus code is available in the code/stimulus-control/src directory of our GitHub repository. See the Bonsai Instructions for more information.