We calculate the observed luminosity and spectrum following the emergence of a relativistic shock wave from a stellar edge. Shocks propagating at Γshβsh > 0.6, where Γsh is the shock Lorentz factor and βsh its dimensionless velocity, heat the stellar envelope to temperatures exceeding ∼50 keV, allowing vigorous production of electron-positron pairs. The pairs significantly increase the electron-scattering optical depth and regulate the temperature through photon generation, producing distinct observational signatures in the escaping emission. Assuming Wien equilibrium, we derive analytic expressions for the temperature and pair-density profiles in the envelope immediately after shock passage, and compute the emission during the subsequent expansion phase. Our analysis shows that, in pair-loaded regions, photons are produced at a roughly uniform rest-frame energy of ∼200 keV, and it reinforces previous estimates that the shock breakout signal will be detected as a short burst of energetic γ-ray photons, followed by a longer phase of X-ray emission. We test our model on a sample of low-luminosity gamma-ray bursts using a closure relation between the γ-ray burst duration, the radiation temperature, and the γ-ray isotropic-equivalent energy, and find that some of the events are consistent with the relativistic shock breakout model. Finally, we apply our results to explosions in white dwarfs and neutron stars, and find that typical Type Ia supernovae emit ∼10⁴¹ erg in the form of ∼1 MeV photons.
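As a minimal sketch of what the breakout criterion Γshβsh > 0.6 means kinematically, the snippet below converts the proper (four-)velocity Γβ into the ordinary velocity β = v/c and the Lorentz factor Γ, using only special relativity; the threshold value 0.6 is taken from the text, and the helper names are illustrative, not from the paper.

```python
import math

def beta_from_four_velocity(u):
    """Convert proper velocity u = Γβ to the dimensionless velocity β = v/c."""
    return u / math.sqrt(1.0 + u * u)

def lorentz_factor(beta):
    """Lorentz factor Γ = 1/sqrt(1 - β²)."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

# Threshold quoted in the abstract: Γ_sh β_sh > 0.6
u_threshold = 0.6
beta = beta_from_four_velocity(u_threshold)
gamma = lorentz_factor(beta)
print(f"β ≈ {beta:.4f}, Γ ≈ {gamma:.4f}")  # shocks faster than ≈0.51c qualify
```

This shows that the criterion corresponds to shock velocities above roughly half the speed of light (β ≈ 0.51, Γ ≈ 1.17), i.e., trans-relativistic rather than only ultra-relativistic shocks.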
Publisher Copyright: © 2023. The Author(s). Published by the American Astronomical Society.