A new “squeezed light” method improves the sensitivity and accuracy of the interferometers used to measure gravitational waves.

Observing the gravitational waves produced by supernova explosions and other cosmic events requires extremely sensitive metrology. A gravitational wave’s signal is so small that it is usually dwarfed by noise from the quantum mechanical fluctuations of the light beams themselves; this shot noise limits the accuracy of the interferometer.

[Figure: A highly complex laser system produces particularly quiet light in the gravitational wave detector GEO600. Courtesy of Max Planck Institute for Gravitational Physics.]

A new “squeezing” process developed at the Max Planck Society and Leibniz University Hannover minimizes this quantum uncertainty, producing laser light with almost no fluctuations. The result is better measuring accuracy and greater sensitivity in gravitational wave detectors such as GEO600 at the university’s Center for Quantum Engineering and Space-Time Research. The squeezed light method generates a completely new quality of laser light, which the team used to raise the measurement sensitivity of GEO600 to 150 percent of its previous value. This is an important step toward the direct detection of gravitational waves, the researchers say.

[Figure: 3-D visualization of gravitational waves produced by two orbiting black holes. Courtesy of Henze, NASA.]

The scientists fed the squeezed light into the interferometer alongside the normal laser light. When the two light fields superimpose, the resulting laser beam has a more uniform intensity than the original signal beam; the method thus smooths out the irregularities that quantum physical effects cause in the detector’s signal.

The results appeared online Sept. 11 in Nature Physics (doi: 10.1038/nphys2083). The squeezed laser light has been through an extended test phase at GEO600 since April of last year and is currently being used in the search for gravitational waves.
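To put the reported numbers in perspective, here is a minimal back-of-the-envelope sketch (not GEO600’s actual analysis): shot noise in photon counting scales as the square root of the detected photon number, and raising sensitivity to 150 percent corresponds to suppressing that noise by a factor of 1.5. The photon count used below is an illustrative assumption.

```python
import math

# Shot noise in an interferometer scales as sqrt(N), where N is the
# mean number of detected photons per measurement (value assumed here).
N = 10**6
shot_noise = math.sqrt(N)  # standard deviation of Poissonian counting noise

# The article reports sensitivity raised to 150 percent of its former
# value, i.e. the effective noise amplitude drops by a factor of 1.5.
improvement = 1.5
squeezed_noise = shot_noise / improvement

# The same suppression expressed in decibels, the unit usually quoted
# for squeezing strength:
squeezing_db = 20 * math.log10(improvement)

print(f"unsqueezed shot noise: {shot_noise:.0f} photons")
print(f"squeezed noise:        {squeezed_noise:.0f} photons")
print(f"equivalent squeezing:  {squeezing_db:.1f} dB")
```

A 1.5× gain in amplitude sensitivity thus corresponds to roughly 3.5 dB of noise suppression below the shot-noise limit.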