Helium Flash and Surface Effects
Aug 17, 2022 11:42 · 523 words · 3 minute read
I sometimes have research ideas that I think are cool, but that don’t make sense for me to pursue. I generally just make a note of them and move on. This is the 26th post in a series describing some of the ideas I’ve accumulated. This idea is based on work I did with Jim Fuller.
Helium Flash and Surface Effects
What’s the idea?
Stars below around $3 M_\odot$ exhibit a helium flash, in which degenerate helium in their cores ignites. This ignition runs away until degeneracy is lifted, producing enormous luminosities ($10^{9-10}L_\odot$). Despite the enormous energies involved, there are thought to be no immediate effects of the helium flash that can be observed at the surface, primarily because the thermal diffusion time from the core to the surface is very long compared with the flash itself.
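As a rough sanity check on that timescale argument, here’s a back-of-envelope estimate in Python. Every number in it is an illustrative placeholder rather than a value taken from a stellar model, so treat the output as an order-of-magnitude guide only.

```python
# Rough order-of-magnitude estimate of the radiative diffusion time from the
# core to the surface of a red giant. All parameter values are illustrative
# placeholders, not numbers from any particular stellar model.

sigma_sb = 5.67e-5   # Stefan-Boltzmann constant [erg cm^-2 s^-1 K^-4]

# Assumed characteristic interior conditions (purely illustrative):
T = 1e6       # temperature [K]
rho = 1e-3    # density [g cm^-3]
kappa = 1.0   # opacity [cm^2 g^-1]
c_p = 3e8     # specific heat capacity [erg g^-1 K^-1]
ell = 1e13    # core-to-surface distance [cm]

# Radiative thermal diffusivity: chi = 16 sigma T^3 / (3 kappa rho^2 c_p)
chi = 16 * sigma_sb * T**3 / (3 * kappa * rho**2 * c_p)

# Diffusion time over the distance ell
t_diff = ell**2 / chi

year = 3.15e7  # seconds per year
print(f"chi ~ {chi:.1e} cm^2/s, t_diff ~ {t_diff / year:.1e} yr")
```

Even with these crude numbers the diffusion time comes out at millions of years, which is the sense in which the flash can’t show up directly at the surface.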
Internal gravity waves are a strong candidate to cause surface effects, though. At the peak of the flash the helium-driven convection zone emits a wave luminosity of order $10^{6-7}L_\odot$ in internal gravity waves, and these can travel ballistically to the surface. Along the way they damp, depositing large quantities of heat much higher up than thermal diffusion alone could manage. This heat deposition could cause effects at the surface, potentially including luminosity and radial velocity changes and mass loss.
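Here’s a minimal sketch of how one might parameterize that heat deposition: launch an assumed wave luminosity near the core and let it attenuate with some damping length, with the local heating rate given by the gradient of the surviving flux. The radius grid and damping-length profile are arbitrary placeholders, not calibrated results.

```python
import numpy as np

# Toy model of wave heat deposition: a wave luminosity L_wave launched near
# the core damps as it travels outward, and the heating rate per unit radius
# is the (negative) gradient of the surviving wave luminosity.
# All parameter choices below are illustrative assumptions.

L_sun = 3.8e33           # erg/s
L_wave = 1e6 * L_sun     # assumed wave luminosity at launch (illustrative)

r = np.logspace(9, 13, 2000)   # radius grid [cm] (illustrative)
damping_length = r             # assumed local damping length [cm] (placeholder)

# Accumulated damping "optical depth": tau(r) = int dr' / l_damp(r')
tau = np.concatenate(([0.0], np.cumsum(np.diff(r) / damping_length[:-1])))

L_r = L_wave * np.exp(-tau)    # surviving wave luminosity at each radius
heating = -np.gradient(L_r, r)  # heat deposited per unit radius [erg/s/cm]

print(f"Wave luminosity surviving to the outer boundary: {L_r[-1] / L_sun:.2e} L_sun")
```

The real question, of course, is what the damping-length profile actually looks like in a star near the flash; that’s where the detailed wave physics comes in.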
Why is this important?
I’m not sure that it’s important, but it’s definitely cool. It’s just a really interesting combination of physics (waves, degenerate ignition, convection). The event rate in our galaxy is likely low, but if we knew what we were looking for, we might be able to catch a star right as it goes through the helium flash.
How can I get started?
There have been proposals that the helium flash causes mass loss and lithium depletion, though as best we can tell the latter doesn’t happen. Jim and colleagues have worked on wave-driven outbursts in more massive stars. When he and I tried to investigate the possibility of mass loss we ran into a bunch of challenges:
- The wave heat that gets deposited can modulate convection in the outer envelope.
- The wave heat can drive shocks in the outer envelope, and these can be challenging to resolve numerically.
If I were continuing to study this problem, I’d start by trying to get a more solid handle on how internal gravity waves deposit heat in evanescent regions. That seems core to the problem, because even a small fraction of the wave luminosity being deposited in the outer envelope can cause large changes to the thermal structure, shut off convection there (which may cause more wave heating), and so on.
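For concreteness, here’s a sketch of the standard WKB-style attenuation of a gravity wave’s flux across an evanescent zone, where the buoyancy frequency $N$ falls below the wave frequency $\omega$. The $N(r)$ profile and wave parameters are made-up placeholders; the point is only the structure of the calculation.

```python
import numpy as np

# Evanescent decay of an internal gravity wave's flux. In the low-frequency
# limit the radial wavenumber satisfies k_r^2 = k_h^2 (N^2/omega^2 - 1), so
# where N < omega the wave is evanescent and its flux attenuates roughly as
#   F ~ F_0 * exp(-2 * integral |k_r| dr).
# All parameter values below are illustrative assumptions.

omega = 1e-4                       # wave angular frequency [rad/s] (assumed)
k_h = 2 * np.pi / 1e11             # horizontal wavenumber [cm^-1] (assumed)

r = np.linspace(1.0e12, 1.1e12, 1000)   # radius grid across the evanescent zone [cm]
N = 1e-5 * np.ones_like(r)              # buoyancy frequency, N < omega here (assumed)

k_r = k_h * np.sqrt(np.clip(1 - (N / omega) ** 2, 0.0, None))  # |k_r| where evanescent

# Trapezoid-rule estimate of the attenuation integral
integral = np.sum(0.5 * (k_r[1:] + k_r[:-1]) * np.diff(r))
attenuation = np.exp(-2.0 * integral)

print(f"Fraction of the wave flux tunneling through the evanescent zone: {attenuation:.2e}")
```

The attenuation itself doesn’t say where the blocked flux goes, between reflection back down and local (radiative) damping into heat, and that split is exactly the part that needs a more careful treatment.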
It might also be useful to study this problem with hydrodynamic simulations. If I were doing this, I would set up a convection zone on top of a radiative zone. I would then pump a large luminosity of waves through the radiative zone from the lower boundary and see what that does to the convection zone. It could be tricky to get the damping scaled to match what happens in stars, but if it’s doable this seems like the most direct way to study the phenomenon.
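If I were setting that up, the lower boundary condition might look something like the sketch below: a monochromatic velocity perturbation whose amplitude is chosen to carry a target wave luminosity via the schematic flux estimate $L \sim 4\pi r^2 \rho v^2 c_g$. The function name and every parameter value are assumptions for illustration; nothing here is tied to a particular hydro code.

```python
import numpy as np

# Sketch of a lower-boundary wave driver for a toy setup: a radiative layer
# with a convection zone above it, forced from below by a single-frequency
# radial velocity perturbation. The amplitude is set so the boundary carries
# a target wave luminosity via the schematic estimate
#   L_wave ~ 4 pi r^2 * rho * v^2 * c_g.
# All values and the function name are illustrative assumptions.

def boundary_velocity(t, L_wave=1e39, r=1e10, rho=1e2, c_g=1e6, omega=1e-3):
    """Radial velocity [cm/s] to impose at the lower boundary at time t [s].

    L_wave : target wave luminosity [erg/s]
    r      : radius of the lower boundary [cm]
    rho    : density at the boundary [g/cm^3]
    c_g    : wave group velocity at the boundary [cm/s]
    omega  : forcing angular frequency [rad/s]
    """
    v_amp = np.sqrt(L_wave / (4.0 * np.pi * r**2 * rho * c_g))
    return v_amp * np.sin(omega * t)

# Peak forcing velocity (at a quarter period, where sin = 1)
print(f"Peak forcing velocity: {boundary_velocity(0.5 * np.pi / 1e-3):.2e} cm/s")
```

The hard part mentioned above, scaling the damping to match a real star, would then show up in how the radiative diffusivity of the simulated radiative zone is chosen.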