Explaining the gamma-ray burst lag/luminosity relation
The spectral lag of a gamma-ray burst (i.e., the cross-correlation delay between the light curves as measured at ∼200 and 30 keV) is inversely related to the burst's peak luminosity. This lag/luminosity relation has passed three stiff tests, so we have good empirical evidence for its validity. The relation is of high importance, since it provides a means to obtain distances to bursts from the gamma-ray light curves alone, thus opening up the entire BATSE database to demographic studies, as well as making bursts available as tools for cosmology, even out to very high redshifts. Nevertheless, it would be good to have some theoretical understanding of why the lag is related to the luminosity. In this paper the lag/luminosity relation is shown to be a simple and forced consequence of the empirical and general Liang-Kargatis relation (which describes how the peak photon energy in the spectrum changes with time). In short, the lag is related to the time it takes for a pulse to cool somewhat (by ∼6 keV); if the burst is highly luminous, then the cooling time (and hence the lag) will be short, while if the burst has a low luminosity, then the cooling time and lag will be long. With this understanding I propose three methods to reduce the scatter in the lag/luminosity relation, and I note that there is no apparent cause for evolution to distort the gamma-ray burst Hubble diagram at high redshift.
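The inverse lag/luminosity dependence described above can be sketched numerically. The snippet below is an illustration only, using an approximate power-law form of the relation (roughly L ∝ lag^−1.15, with a normalization of the kind quoted by Norris et al. 2000); the specific coefficients here are assumptions for illustration, not values taken from this paper.

```python
# Illustrative sketch of the lag/luminosity relation: a short spectral lag
# (fast pulse cooling) implies a bright burst, a long lag a faint one.
# Power-law index and normalization are assumed for illustration.

def peak_luminosity_erg_s(lag_s, norm=1.3e53, ref_lag_s=0.01, index=-1.15):
    """Estimate peak luminosity (erg/s) from spectral lag (s),
    assuming a single power law anchored at a 0.01 s reference lag."""
    return norm * (lag_s / ref_lag_s) ** index

# Shorter lag -> higher inferred luminosity, as the cooling argument predicts.
for lag in (0.001, 0.01, 0.1):
    print(f"lag = {lag:6.3f} s -> L ~ {peak_luminosity_erg_s(lag):.2e} erg/s")
```

Because the relation is monotonic, a measured lag maps to a unique luminosity, which (with the observed flux) yields a luminosity distance from the gamma-ray light curve alone.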
Schaefer, B. (2004). Explaining the gamma-ray burst lag/luminosity relation. Astrophysical Journal, 602 (1 I), 306-311. https://doi.org/10.1086/380898