Touching atoms


Have you ever felt an atom? Being composed of atoms ourselves, we are always in contact with them, both in our own bodies and in every aspect of the physical world. But we don’t feel them per se. Even if you place your palm on a table, you don’t actually feel atoms – you feel the repulsion of the electrostatic field created by the electrons whizzing around the periphery of each atom. Those electrons give each atom a negatively charged exterior that prevents other atoms – which also carry negative charges – from getting too close. At this level of detail, the whole world of “hard” surfaces starts to resemble an unimaginable number of tiny like-poled magnets pressing against one another. You can get close – but never too close.

Even if you put your palm on a table, you don’t actually feel atoms.

The physics of the “untouchable” atom opened the door to the first real attempts to “feel” matter at the atomic level. In 1981, Gerd Binnig and Heinrich Rohrer, researchers at IBM Zurich, developed the “scanning tunneling microscope” (STM). Building on one of the fundamental effects of quantum mechanics, the STM brings what is essentially the very sharp point of a needle extremely close to a material of interest. When a voltage is applied, electrons “jump” off the probe tip and “tunnel” across the tiny gap to the material. The pattern of this tunneling – where and how strongly the electrons jump from tip to material as the tip scans across the surface – builds up a picture of the material, atom by atom. Although atoms can never quite touch, Binnig and Rohrer used quantum tunneling to let them brush against one another ever so gently – research that earned them the 1986 Nobel Prize in Physics.

The IBM logo in atoms, taken in 1989. Source: IBM via Wikimedia Commons.

In 1985, Binnig – working with Christoph Gerber and Calvin Quate – created the first real improvement on the STM: the “atomic force microscope”, or AFM, which mounts its probe tip on a tiny vibrating cantilever. As the cantilever vibrates back and forth, the tip scans a region of material at close to atomic resolution. This tip – just a few millionths of a meter long – could both “read” the material beneath it and (with the appropriate electrical charge) even be used to push that material around, gently moving individual atoms into new positions. In 1989, to demonstrate what these scanning probes could do, IBM researchers used an STM tip to nudge a row of xenon atoms into place, spelling out the IBM logo in a now-famous image. It wasn’t an easy task – the same gentle forces that let the tip lure the atoms into position also made it incredibly easy for them to drift away again.

When a voltage is applied, electrons “jump” off the probe tip and “tunnel” across the gap to the material.

Atomic force microscopy made it possible to both “read” and “write” atoms, but it took a very clever graduate student at the University of North Carolina, USA, to figure out how to touch them. Russell M. Taylor fed the information generated by an atomic force microscope into a multimillion-dollar graphics supercomputer (which, considering it was 1993, was almost certainly less powerful than the average smartphone), and used that data to generate a three-dimensional “contour” of the material under the probe tip. Although images generated from AFM scans gave a rough picture of the “shape” of atoms, Taylor’s visualizations conveyed a sense of depth, placement, and orientation – not just a single atom, but each atom in relation to its neighbors, revealing the structures that atoms create when they chemically bond into molecules. Projected onto a table-sized surface and viewed with special 3D glasses, these atoms and molecules looked as real as apples and oranges.
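To make the idea concrete, here is a minimal sketch in Python of the kind of step Taylor’s system performed: turning a grid of height readings from a scanning probe into a 3D surface you can view from any angle. Everything in it is illustrative – synthetic Gaussian “bumps” standing in for atoms – and is not a reconstruction of Taylor’s actual pipeline.

```python
# Minimal sketch: rendering an AFM-style height map as a 3D surface.
# The data here is synthetic (Gaussian "bumps" standing in for atoms);
# Taylor's real system fed live microscope data into custom graphics hardware.
import numpy as np
import matplotlib.pyplot as plt

# A 2D grid representing the scanned region (arbitrary units).
x = np.linspace(0, 10, 200)
y = np.linspace(0, 10, 200)
X, Y = np.meshgrid(x, y)

# Synthetic "atoms": each contributes a Gaussian bump to the height map.
atom_positions = [(3, 3), (5, 6), (7, 4), (6, 8)]
Z = np.zeros_like(X)
for cx, cy in atom_positions:
    Z += np.exp(-((X - cx) ** 2 + (Y - cy) ** 2) / 0.5)

# Render the height map as a 3D contour -- a landscape you can look at
# (and, with a haptic interface, eventually touch).
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(X, Y, Z, cmap="viridis")
ax.set_xlabel("x (scan)")
ax.set_ylabel("y (scan)")
ax.set_zlabel("height")
plt.show()
```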

Russell M. Taylor’s “nanomanipulator” enabled humans to “feel” atoms and molecules. Credit: Todd Gaul / University of North Carolina, Chapel Hill.

Taylor then added the finishing touch to his research device: his VR system included a haptic interface – that is, it could lend a simulated sense of “touch” to the objects displayed on the table in its virtual world. You could (virtually) run your hand over the surface of atoms, even push them around and feel them snap back into place. This nanomanipulator, as Taylor dubbed it, became one of the seminal works of the first age of virtual reality. When Taylor shared his work with research chemists, they were amazed to find they could “feel” their way across chemical bonds and molecular structures that had always been theoretical abstractions – and they discovered things about these substances they could never have known otherwise, because their sense of touch revealed details that intuition alone had never suggested. By engaging multiple senses, the nanomanipulator made the atomic scale tangible, giving chemists an incredible tool for thinking about their work.

The nanomanipulator made the atomic scale tangible.

But the nanomanipulator was large, expensive, and delicate. STMs and AFMs require a level of precision and support that makes them among the rarest pieces of lab kit – and even if you could get access to one, you’d still need a million-dollar-plus graphics supercomputer to transform it into a nanomanipulator. Taylor had developed a groundbreaking but one-of-a-kind tool. Even preparing a sample for an AFM scan required significant work: subjects of AFMs and STMs typically must be placed in an isolated vacuum chamber – which immediately rules out observing anything even remotely alive at the atomic scale. With the possible exception of tardigrades, vacuums and life do not mix.

A chance discovery by researcher Christopher Bolton in a lab at the University of Melbourne opened a less hostile window onto the nanoscale. Working with lasers, Bolton saw something he had never seen or heard of before: illuminating a microscopic object from multiple angles produced multiple views of the same object, and he found he could combine those images, with a fairly simple bit of math, into a single view of this very small thing.
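The article doesn’t spell out Bolton’s math, so the sketch below is only a toy version of the general idea – “many views, one picture” – in which several noisy, co-registered images taken under different illumination angles are averaged into a single, cleaner composite. It is not Tiny Bright Things’ actual algorithm; every number and variable here is illustrative.

```python
# Toy illustration of "many views, one picture": average several noisy,
# co-registered views of the same object, taken under different
# illumination angles, into a single cleaner composite.
# (Purely illustrative -- not Tiny Bright Things' actual method.)
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth "object": a small bright disc on a dark background.
size = 64
yy, xx = np.mgrid[0:size, 0:size]
truth = ((xx - size / 2) ** 2 + (yy - size / 2) ** 2 < 8 ** 2).astype(float)

# Simulate views from different illumination angles: each view is the
# object plus angle-dependent shading and random noise.
n_views = 12
views = []
for k in range(n_views):
    angle = 2 * np.pi * k / n_views
    shading = 0.2 * np.cos(angle) * (xx - size / 2) / size
    views.append(truth + shading + rng.normal(0, 0.3, truth.shape))

# The "fairly simple bit of math": averaging the views cancels the
# angle-dependent shading and beats down the noise.
composite = np.mean(views, axis=0)

print("single-view error:", np.abs(views[0] - truth).mean().round(3))
print("composite error:  ", np.abs(composite - truth).mean().round(3))
```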

Example of the visualization of a silica molecule using Tiny Bright Things’ nanoscopic technique. Credit: Tiny Bright Things.

How small? Optical microscopes hit the physical limit of what they can see at about half a micrometer (a micrometer is one-millionth of a meter) – at that point, an object is smaller than the wavelength of the light being used to view it. Bolton found that his method of capturing a subject with beams of light from many angles allowed him to image objects just one-twentieth that size – only 25 nanometers (billionths of a meter) across.
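For the record, the arithmetic behind that comparison uses only the figures quoted above:

```latex
\frac{0.5\,\mu\text{m}}{20} = \frac{500\,\text{nm}}{20} = 25\,\text{nm}
```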

“We put a live bacterium on a slide and watched it struggle.”

Christopher Bolton, Tiny Bright Things

Better still, this technique worked with pretty much any sample you wanted to throw onto a slide – no vacuum required. “We put a live bacterium on a slide,” Bolton reported, “and watched it struggle. As it died, it spilled its nanoscale guts on the slide – and we got to see them too!” These were the kinds of events biologists had theorized about but never seen happen. Bolton’s discovery – which he partnered with research advisor Ray Dagastine to transform into the startup Tiny Bright Things – looks like it could give both medicine and biology the eyes they need to see bacteria, viruses, and the deep but poorly understood interactions between our bodies and our environment.

Four hundred years ago, the first microscopes gave us a glimpse of a world we had never imagined. These latest microscopes open a new perspective on a world we understand in theory but have never visited in practice. How much more will we learn by watching nanoscopic creatures dance? And how long before an enterprising graduate student clips a haptic interface onto this new microscope, so we can touch a virus’s surface, feel its spike proteins, and perhaps learn better how to defend ourselves against it?


