Our skull protects our brain, but it also serves as a barrier to the direct observation of cognitive activity and our brain’s organization and development. Scientists who studied brain properties and functions 20 years ago were thus forced to experiment on animal brains, to study autopsied brains of people who had various cognitive and/or motor impairments, and to compare the behavior of people with normal and abnormal brains. It was a difficult and principally inferential process.

Recent advances in computerized imaging technology have made it possible to see through the skull and brain tissue and to observe, amplify, record, rapidly analyze, and graphically display the substances and signals that reflect activity in very specific brain regions. This technology has revolutionized brain and mind research, as well as the diagnosis and treatment of many brain-related diseases and malfunctions.

The first imaging technologies, the X-ray and EEG (electroencephalogram), were primitive by today’s standards, but both have been considerably improved—and they provided the conceptual base of the other amazing imaging technologies that have recently emerged.

Computerized brain imaging technologies now measure and display the variations in chemical composition, blood-flow patterns, and electromagnetic fields that occur in normal and/or abnormal brains. Each of the several current forms of brain imaging technology has strengths and weaknesses, and new developments are continually making the technology faster, more powerful, less invasive, and less expensive. Initially, imaging technology was used primarily in medical diagnosis, but it is increasingly being used in basic neuroscience and psychological research.

Educational researchers are just beginning to use imaging technologies, but this use will dramatically increase in the coming years. It will revolutionize educational research and many elements of educational policy and practice.

The various computerized imaging technologies differ, but the following analogy demonstrates how someone can examine internal differences in something (a brain) that has a protective cover (a skull): Imagine that you’re looking for similarities, differences, and imperfections in successive slices of a thinly sliced apple or potato. A brain-imaging machine is basically a camera that can rapidly and successively change its focus as it photographs and digitally stores successive thin slices of a brain in order to create a comprehensive three-dimensional image of selected properties of the entire brain.
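The slicing analogy above can be sketched in a few lines of code. This is a hypothetical illustration, not any scanner's actual software: each "slice" is a small two-dimensional grid of numbers, and stacking the slices produces a three-dimensional volume, much as an imaging machine assembles successive thin slices into a whole-brain image.

```python
import numpy as np

# Hypothetical data: six thin "slices," each a 4x4 grid of values
# standing in for the measurements taken at one depth.
rng = np.random.default_rng(0)
slices = [rng.random((4, 4)) for _ in range(6)]

# Stacking the 2D slices along a new depth axis yields one 3D volume,
# analogous to reassembling the sliced apple into a whole.
volume = np.stack(slices, axis=0)

print(volume.shape)  # (6, 4, 4): depth, height, width
```

Once the slices are combined into a single volume, any individual slice, or a cross-section cut at a different angle, can be read back out of the array.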

The graphic displays in imaging technology typically use gradations of the color spectrum to represent the activity levels of the various brain areas in a scan (the red end of the spectrum representing a high level of activity in a brain area, the purple end representing low activity, and the other colors representing intermediate levels). Topographical maps similarly use different colors to represent elevations (mountains, valleys, etc.). A scan of a slice of brain thus graphically indicates which brain areas were active and inactive during the time interval of the scan.
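The color-coding convention described above amounts to a simple mapping from a number to a color. The sketch below is a minimal, hypothetical version of that mapping (the color names and the division of the range are assumptions for illustration): an activity level between 0 and 1 is assigned a color running from purple at the low end of the spectrum to red at the high end.

```python
# The spectrum from low activity (purple) to high activity (red).
SPECTRUM = ["purple", "blue", "green", "yellow", "orange", "red"]

def activity_color(level):
    """Map an activity level in [0, 1] to a spectrum color name."""
    if not 0.0 <= level <= 1.0:
        raise ValueError("activity level must be between 0 and 1")
    # Divide [0, 1] into equal bands, one per color; clamp 1.0 to the top band.
    index = min(int(level * len(SPECTRUM)), len(SPECTRUM) - 1)
    return SPECTRUM[index]

print(activity_color(0.05))  # a quiet brain area -> "purple"
print(activity_color(0.95))  # a highly active area -> "red"
```

Real scan displays use continuous colormaps rather than six discrete bands, but the principle is the same: each measured value in each slice is translated into a position along the spectrum.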

Functional Magnetic Resonance Imaging (commonly written fMRI) measures brain blood-flow patterns and metabolic changes. Although almost a dozen different kinds of imaging technologies exist, fMRI is currently perhaps the most important of the group for cognitive neuroscience researchers. fMRI permits them to identify the specific brain regions that are active while a subject carries out a task, such as reading a text, making a decision, or moving a finger. Scientists can thus compare the brain anatomy and activity of people who read well and poorly, or who make appropriate and inappropriate decisions. Much of the recent research on cognition has been carried out with fMRI technology.

Positron Emission Tomography (PET) is another important imaging technology. Scientists using PET inject a small amount of radioactively tagged glucose (or another compound) into the bloodstream of the experimental subject. Since glucose is the brain’s principal food, the PET scans of subjects will reveal the brain areas that are the most active (those with the most glucose) when, for example, the subjects are asked to say the first verb that comes to mind when they hear a specific noun—such as cut or slice when they hear the noun knife.

Emerging advances in EEG (electroencephalogram) technology may provide the most promising initial avenue for educational researchers. EEG is the least invasive, cheapest, and most portable of the imaging technologies. Because fMRI uses powerful magnets and PET uses radioactive isotopes, and both require expensive equipment in specialized laboratory settings, their use in educational research has been limited by ethical and financial considerations. By contrast, EEG measures the brain’s electrical activity via electrodes placed on the scalp, and so it’s no more invasive than a blood pressure cuff. Further, the electrodes can now be placed inside a cap that sends wireless signals to a nearby computer, so a researcher could eventually observe brain activity in non-laboratory settings, such as a classroom.

Although we’re excited about the potential of imaging technologies to resolve many current cognitive mysteries and to aid in the diagnosis and treatment of learning disabilities, worrisome issues exist. Imaging technologies could also become superb lie detectors, and so their use and possible misuse in government, business, and our criminal justice system raise serious questions about the nature of privacy. For example, if police can’t enter a suspect’s house without a warrant, can they technologically enter a suspect’s skull in search of evidence without the equivalent of a warrant?

Each recent technological and biological advance has brought both promise and concern. Brain imaging technology thus joins recent advances in genetics, medicine, and computer technology as another element of what promises to be an intellectually stimulating and challenging era.