Vibration Sensing with Laser Speckle

OPN Staff

How to record individual musical instruments playing in an ensemble, using laser light.

Optical microphones—in which laser light and interferometry directly sense the vibration of the microphone membrane—are undergoing rapid commercial development. But there’s another way lasers can pick up sound: through changes in the speckle patterns of beams bounced directly off of sound-producing objects, such as musical instruments.

Changes in speckle patterns of laser light can be used to reconstruct sound.

Such “speckle-based vibrometry” can, in principle, sense sound directly from individual musical instruments in an ensemble or orchestra, recording each instrument as a separate track. OPN talked with Mark Sheinin, a researcher at Carnegie Mellon University (CMU), USA, who, along with several colleagues, has developed a setup for doing just that—without the expensive high-speed cameras required in previous setups to record sound with laser speckle.

1. From source to speckle

The CMU setup begins with a 532-nm laser, split by a diffraction grating into multiple beams that are routed via a polarizing beam splitter to bounce off of individual sound sources—such as the pair of guitars shown here.

If the camera objective lens focuses directly on the instrument surface, all it sees are laser dots. But defocusing to a point in front of the actual object lets the camera pick up not the laser dot, but its speckle—the random self-interference pattern that results from the laser’s interaction with microstructures on the object surface. And defocusing to just the right plane captures the speckle from each instrument as a separate spot.

2. Stretching the speckle

It’s long been known that tiny tilts in an object surface due to acoustic or other vibrations lead to similarly tiny shifts in the speckle pattern from an incident laser [Z. Zalevsky et al., Opt. Express 17, 21566 (2009)]. These shifts can be used to reconstruct the original sound—but that has required high-speed cameras capable of recording thousands of frames per second.
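The core idea, tracking tiny shifts of a speckle pattern between frames and treating the shift sequence as an audio signal, can be illustrated with a toy sketch. This is not the CMU pipeline; it just demonstrates recovering known pattern shifts from a 1D stand-in for speckle by cross-correlation, which is the principle the reconstruction relies on.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D stand-in for a speckle pattern: random intensities, lightly smoothed.
speckle = np.convolve(rng.random(512), np.ones(5) / 5, mode="same")

def shifted(pattern, shift):
    """Circularly shift the pattern by an integer number of pixels
    (standing in for a tiny tilt of the vibrating surface)."""
    return np.roll(pattern, shift)

def recover_shift(reference, frame, max_shift=20):
    """Find the pixel shift that best aligns `frame` with `reference`
    by maximizing cross-correlation over candidate shifts."""
    shifts = list(range(-max_shift, max_shift + 1))
    scores = [np.dot(reference, np.roll(frame, -s)) for s in shifts]
    return shifts[int(np.argmax(scores))]

# A toy "sound": a sequence of frame-to-frame shifts.
true_shifts = [0, 2, -3, 1, 0, -1, 3]
recovered = [recover_shift(speckle, shifted(speckle, s)) for s in true_shifts]
print(recovered)  # the recovered shift sequence is the reconstructed signal
```

In a real vibrometry system the shifts are subpixel and the pattern is 2D, but the same correlation idea applies; it is the need to sample these shifts thousands of times per second that normally forces the use of high-speed cameras.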

The CMU group got past that limitation with a two-sensor setup requiring frame rates no higher than 134 fps. As a first step, a cylindrical lens is placed in front of the camera objective lens; this stretches the speckle spot from each instrument into a column. Spreading the speckle vertically is crucial for the system’s ability to sense high frequencies using “slow” cameras (see below). And, since each column takes up only a fraction of the camera’s imaging area, multiple vibration locations can be sampled simultaneously.


3. Dual-shutter imaging

The speckle column then passes through a series of relay optics to be imaged by two different sensors. One, a 134-fps global-shutter sensor, captures a single, “frozen” image of the complete speckle pattern—an undistorted reference image of the pattern at a given time. The other, a 65-fps rolling-shutter sensor (similar to the shutter used on a smartphone camera), captures slices of the speckle pattern as a series of rapidly exposed sequential rows, top to bottom.

In effect, the rolling shutter serves as a very fast 1D sensor, with each sequential row capturing the shifts in the speckle pattern in a tiny fraction of the time implied by the sensor’s nominal 65-fps frame rate. Comparing the rapidly acquired, row-by-row shifts from the rolling-shutter sensor with the reference speckle captured by the global-shutter sensor lets the system analyze acoustic vibrations with frequencies as high as 63 kHz, despite the use of two “slow” camera sensors.
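The arithmetic behind that claim is worth spelling out: with a rolling shutter, the effective temporal sampling rate is the rate at which *rows* are read out, not the frame rate. The sketch below uses an assumed vertical resolution (not a figure from the paper) chosen to show how a 65-fps sensor can reach a ~63 kHz Nyquist limit.

```python
# Rough arithmetic with illustrative numbers (rows_per_frame is assumed,
# not taken from the CMU paper).
fps = 65                # rolling-shutter frame rate
rows_per_frame = 1940   # assumed vertical resolution of the sensor

row_rate = fps * rows_per_frame   # rows sampled per second = effective sample rate
nyquist = row_rate / 2            # highest recoverable vibration frequency

print(f"{row_rate} rows/s -> Nyquist limit ~{nyquist / 1000:.0f} kHz")
```

The exact ceiling depends on the sensor's real row count and readout timing, but the principle holds: each row is a fresh 1D sample of the stretched speckle column, so the speckle needs to span many rows vertically for this to work, which is why the cylindrical lens matters.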

4. Off to the computer

Pulling sound from the two instruments back out of the speckle shifts requires sophisticated computing to correlate the static and rolling-shutter signals and reconstruct the sounds that created the speckle shift—as well as application of a high-pass filter to screen out low-frequency motions attributable, for example, to movement of the musicians. Using the setup, the CMU team was able to separately record the individual tunes played simultaneously by multiple musical instruments, to sense vibrations at multiple points on a single struck tuning fork, and more.
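The high-pass filtering step can be sketched in a few lines. The cutoff, sample rate and signal here are all illustrative assumptions, not the team's actual processing: a slow drift (standing in for musician movement) is removed from a synthetic "speckle shift" trace while an audio-band tone survives.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 126_000              # assumed effective row sampling rate, samples/s
t = np.arange(fs) / fs    # one second of samples

# Synthetic speckle-shift trace: a 440 Hz tone plus a large, slow 2 Hz
# drift standing in for low-frequency motion of the musician.
tone = np.sin(2 * np.pi * 440 * t)
drift = 5 * np.sin(2 * np.pi * 2 * t)
trace = tone + drift

# 4th-order Butterworth high-pass at 20 Hz: screens out the drift,
# passes the audio-band vibration nearly untouched.
sos = butter(4, 20, btype="highpass", fs=fs, output="sos")
cleaned = sosfiltfilt(sos, trace)
```

Zero-phase filtering (`sosfiltfilt`) is used so the surviving audio signal is not phase-distorted; edge samples are less reliable, as with any forward-backward filter.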

The team believes its system could have uses ranging from musical recording to industrial applications, such as monitoring large collections of factory-floor machinery using a single camera.

OPN thanks Mark Sheinin, CMU, for assistance with this tutorial.

Publish Date: 01 September 2022
