Predicting Absorption with Relativistically Induced Transparency in Thin Foils
ORAL
Abstract
Relativistically induced transparency (RIT) occurs when a sufficiently strong electric field oscillates plasma electrons at relativistic speeds, modifying the plasma frequency by the Lorentz gamma factor and producing an intensity-dependent relativistic critical density; a plasma that is classically overdense becomes relativistically transparent to the incident laser light. Theoretical treatments of RIT have typically assumed negligible absorption of laser energy into the plasma, so that energy conservation simplifies to 1 = T + R, where T and R are the intensity transmission and reflection coefficients. Presented here is an analysis of particle-in-cell simulations that yields a simple prediction for the absorption fraction of thin foils that is relevant to RIT theory.
This work was motivated by experimental results collected at the Scarlet Laser Facility through a LaserNetUS experiment using 8CB liquid crystal films. The predicted absorption coefficient is compared to results from 2D and 3D particle-in-cell (PIC) OSIRIS 4.0 simulations.
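The intensity-dependent transparency threshold described above can be sketched numerically. This is an illustrative estimate only, not the analysis presented here: the cycle-averaged scaling gamma ~ sqrt(1 + a0^2/2) for linear polarization and the function names are assumptions for the sketch.

```python
import math

def critical_density(wavelength_um):
    """Classical critical density in cm^-3 for a laser wavelength in microns."""
    return 1.1e21 / wavelength_um**2

def relativistic_gamma(a0, linear_polarization=True):
    """Cycle-averaged Lorentz factor of electrons oscillating in the laser field
    with normalized vector potential a0 (assumed scaling for this sketch)."""
    return math.sqrt(1 + a0**2 / 2) if linear_polarization else math.sqrt(1 + a0**2)

def relativistic_critical_density(a0, wavelength_um):
    """Intensity-dependent threshold: a classically overdense plasma with
    electron density below gamma * n_c becomes relativistically transparent."""
    return relativistic_gamma(a0) * critical_density(wavelength_um)

def absorption_fraction(T, R):
    """Energy conservation generalized beyond 1 = T + R to include absorption:
    T + R + A = 1, so A = 1 - T - R."""
    return 1.0 - T - R
```

For example, at a0 = 10 and a 0.8 micron wavelength, the relativistic critical density is roughly seven times the classical value, so targets well above classical critical density can still transmit light.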
Presenters
-
Brendan L Stassel
Authors
-
Brendan L Stassel
-
Hongmei Tang
Lawrence Berkeley National Laboratory
-
Paul T Campbell
-
Brandon K Russell
Princeton University
-
Alec G. R. Thomas
University of Michigan
-
Pedro Spingola
The Ohio State University
-
German Tiscareno
The Ohio State University
-
Ali Rahimi
The Ohio State University
-
Rebecca L Daskalova
The Ohio State University
-
Douglass W Schumacher
The Ohio State University
-
Louise Willingale