Alignment of an interferometric gravitational wave detector

Peter Fritschel, Massachusetts Institute of Technology
Nergis Mavalvala, Massachusetts Institute of Technology
David Shoemaker, Massachusetts Institute of Technology
Daniel Sigg, Massachusetts Institute of Technology
Michael Zucker, Massachusetts Institute of Technology
Gabriela González, Pennsylvania State University

Abstract

Interferometric gravitational wave detectors are designed to detect small perturbations in the relative lengths of their kilometer-scale arms that are induced by passing gravitational radiation. An analysis of the effects of imperfect optical alignment on the strain sensitivity of such an interferometer shows that achieving maximum strain sensitivity at the Laser Interferometer Gravitational Wave Observatory requires that the angular orientations of the optics be held within 10−8 rad rms of the optical axis and that the beam be kept centered on the mirrors to within 1 mm. In addition, fluctuations in the input laser beam direction must be less than 1.5 × 10−14 rad/√Hz in angle and less than 2.8 × 10−10 m/√Hz in transverse displacement for frequencies f > 150 Hz in order that they not produce spurious noise in the gravitational wave readout channel. We show that seismic disturbances limit the use of local reference frames for angular alignment at a level approximately an order of magnitude worse than required. A wave-front sensing scheme that uses the input laser beam as the reference axis is presented that successfully discriminates among all angular degrees of freedom and permits the implementation of a closed-loop servo control to suppress the environmentally driven angular fluctuations sufficiently. © 1998 Optical Society of America.