In a nutshell: A virtual experiment lets you stimulate visual neurons in different ways, then record and listen to how they respond.


Hands-on experiments — ones where students get to touch, feel and listen — can make neuroscience make sense: the theoretical becomes tangible.

But they are getting increasingly difficult to run. Technical advances mean that huge amounts of data, collected from multiple recordings, are becoming the norm. The average student prac cannot cater for this. And neither ethics committees nor students want to use animals in their experiments unless it’s strictly necessary.

Enter Visual Neuroscience Experiments, a virtual monkey (or human) brain cell ready for probing, and backed by a computer model that generates large datasets for analysis. It was designed and developed by Maria del Mar Quiroga and Nicholas Price of the ARC Centre for Integrative Brain Function and the Department of Physiology at Monash University in Melbourne.

Visual Neuroscience Experiments allows students to “record” the electrical activity in cells that respond to motion, found in a part of the brain called MT or V5. The cells respond differently depending on the direction of the motion in the visual field — some have their biggest response to a movement at 15 degrees from vertical, some at 37.5 degrees from horizontal, and so on. Without these cells your tennis game would be shot.

Visual Neuroscience Experiments can be used as a demonstration, or more advanced students can run experiments to work out a particular cell’s preferred direction of movement, or to get a handle on noise — how cells vary their responses to identical signals.
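The kind of experiment students run can be illustrated with a minimal sketch: a direction-tuned cell whose mean firing rate peaks at its preferred direction, with Poisson spiking so that identical stimuli give different responses on each trial. This is not the published model — the tuning shape (a von Mises bump), the preferred direction, firing rates and trial counts below are all illustrative assumptions.

```python
import math
import random

rng = random.Random(42)  # fixed seed so the sketch is reproducible

def tuning_curve(direction_deg, preferred_deg=37.5, peak_rate=50.0,
                 baseline=5.0, kappa=2.0):
    """Mean firing rate (spikes/s), modelled as a von Mises bump around
    the cell's preferred direction. All parameter values are illustrative."""
    delta = math.radians(direction_deg - preferred_deg)
    return baseline + (peak_rate - baseline) * math.exp(kappa * (math.cos(delta) - 1.0))

def record_trial(direction_deg, duration_s=1.0):
    """One simulated 'recording': spike times drawn as a Poisson process
    at the tuning-curve rate, so identical stimuli yield different spike
    counts from trial to trial (noise)."""
    rate = tuning_curve(direction_deg)
    count, t = 0, rng.expovariate(rate)
    while t < duration_s:
        count += 1
        t += rng.expovariate(rate)
    return count

# Sweep motion directions, averaging 20 noisy trials per direction,
# then read off the direction with the largest mean response.
directions = range(0, 360, 15)
mean_counts = {d: sum(record_trial(d) for _ in range(20)) / 20 for d in directions}
estimated_pref = max(mean_counts, key=mean_counts.get)
```

Because the true preferred direction (37.5 degrees here) falls between the sampled directions, the estimate lands on a nearby grid point — the same kind of inference students make from the virtual recordings.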

“The coolest part is that you see the response, the electrical spikes, and you hear them too, just as you would in an actual experiment. Pop. Pop. Pop…” says Quiroga.


The two researchers released their simulation under a Creative Commons Attribution-NonCommercial 4.0 International license, meaning that it can be used at no cost and modified by anyone for non-commercial purposes with attribution.

Quiroga, with other colleagues at Monash University, has developed a suite of virtual experiments, including several more neuroscience simulations, such as a virtual brain cell that responds to the location of a sound stimulus.

Next steps:
The code underlying this simulation is being adapted to other types of sensory brain cells.

Quiroga, M., & Price, N.S. (2016). Simulated in vivo Electrophysiology Experiments Provide Previously Inaccessible Insights into Visual Physiology. Journal of Undergraduate Neuroscience Education, 15(1), A11.

Republish this article:

We believe in sharing knowledge. We use a Creative Commons Attribution 4.0 International License, which allows unrestricted use of this content, subject only to appropriate attribution. So please use this article as is, or edit it to fit your purposes. Referrals, mentions and links are appreciated.