Geant4 Version: geant4-v11.3.1
Operating System: Fedora 31
Compiler/Version: 12.2.1 20221121
CMake Version: 3.27.7
Hi all. I’m working on using GEANT4 to simulate an “alpha sail” spacecraft propulsion system driven by the decay of 232U, with alphas captured by a backing film in one direction but free to escape in the other (plus some complications, such as allowing for partial radon escape - see Note 1). The goal is to calculate the heat deposition and momentum change. To do this, I simulate a large number of events, each with a primary particle randomly chosen in terms of location and type (based on the isotopic distribution within the active material of the sail), and let GEANT4 simulate it for a fixed period (say, a day):
```cpp
void UserSteppingAction(const G4Step* step) override {
  if (!step || !fRunAction) return;
  G4Track* track = step->GetTrack();
  if (!track) return;
  G4double time = track->GetGlobalTime();
  if (time > Config::SimTimePerEvent) {
    track->SetTrackStatus(fStopAndKill);
    return;
  }
}
```
… which should allow for primary decays and, often (if there is a primary decay), secondaries. I then have to normalize the recorded energy and momentum to the calculated number of decays during that time period, so that I can compute heating, equilibrium temperature, and acceleration. But this has proven very difficult. My core approach to getting a scaling factor was:
```cpp
const G4double calculatedDecaysOverSimTime = calculatedDecaysPerSecond * simSeconds;
const G4double calculatedDecaysPerAtomOverSimTime = calculatedDecaysOverSimTime / atoms;
const G4double expectedGEANT4DecayEventsOccurred = nEventsProcessed * calculatedDecaysPerAtomOverSimTime;
const G4double scaleFactor = calculatedDecaysOverSimTime / expectedGEANT4DecayEventsOccurred;
```
However, not only were GEANT4’s numbers not in the ballpark of my manually calculated numbers, but they didn’t scale linearly with simulation time. With a 1d simulation time:
```
Average Heating Power: 5079826.411588 W [2405.454463 W/kg, 507.982641 W/m²]
```
But with a 10d simulation time:
```
Average Heating Power: 3109996.728445 W [1472.679360 W/kg, 310.999673 W/m²]
```
The relevant code for the above:
```cpp
G4double scaledEnergyDeposit = fMasterData.energyDeposit * scaleFactor;
G4double heatingPower = (scaledEnergyDeposit / CLHEP::joule / simSeconds) * CLHEP::watt;
G4cout << "Average Heating Power: " << G4BestUnit(heatingPower, "Power") << " ["
       << ((heatingPower / CLHEP::watt) / (Config::ActiveMass / CLHEP::kg)) << " W/kg, "
       << ((heatingPower / CLHEP::watt) / (sailArea / CLHEP::m2)) << " W/m²]\n";
```
My expectation was that if my code were correct, changing the time from 1d to 10d should make no difference; and if it were wrong, the result should grow or shrink by 10x. But the heating power roughly halving makes little sense to me. I track the number of primary decays, and they rise from 287 to 1178. So the increase in primary decays is not linear, but that's expected: we're talking primaries here, and short-lived primaries are pretty much guaranteed to decay whether the window is 1d or 10d, with the change coming only from the long-lived primaries. However, I'd also expect that with a longer simulation time each primary averages more secondary decays, so that the total number of simulated decays (primary + secondary) rises ~10x. Yet the energy deposited seems to rise only ~4x, and the total energy deposited ~6x, not 10x.
I tried finding a way to directly count secondary decays, but nothing has worked. For example, in my user stepping action I tried:
```cpp
G4StepPoint* postStepPoint = step->GetPostStepPoint();
if (postStepPoint) {
  const G4VProcess* process = postStepPoint->GetProcessDefinedStep();
  // Check whether the process that *ended* this step was a decay
  if (process && process->GetProcessName() == "RadioactiveDecay") {
    // Increment the total decay counter associated with RunAction
    fRunAction->AddTotalDecay();
  }
}
```
But that doesn't work: GetProcessName() always returns "NoProcess". I tried a number of other things, but nothing worked out. And I can't see why my original approach wouldn't work anyway, which makes me wonder whether I'm making a conceptual mistake. I've tried removing my radon special-casing (both the handling of radon within GEANT4 and the equilibrium decay distribution starting point), and while that changes the numbers, the same problem persists:
```
1d:  Average Heating Power: 7072060.753646 W [2900.978239 W/kg, 707.206075 W/m²]
10d: Average Heating Power: 3482075.117464 W [1428.356528 W/kg, 348.207512 W/m²]
```
Ideas?
Full current (very rough and ugly!) code here: Paste.gd: Copy and Paste Text Tool
*** Note 1: the design attempts to minimize 232U's problem of intense gammas from 208Tl decay by using a urania aerogel (as per Reibold et al 2002) and maximizing the equilibrium temperature, which my calculations suggest should allow the lion's share of the radon to diffuse out (pre-deployment: into helium coolant and then a cold trap; post-deployment: into space). This eliminates a lot of the decay-chain energy, but has the offsetting side bonus of not retaining the lead mass. In GEANT4 I model this, first, by adjusting the equilibrium distribution, and second (to account for the possibility that decay near the sail sends alphas, polonium, lead, etc. back into the sail or payload) by adding special cases for radon handling, both as a daughter product (in the user stepping action: `if (track->GetDefinition()->GetParticleName() == "Rn220")`) and when 220Rn is chosen in GeneratePrimaries (`if (Z == 86 && A == 220)`). In each case, my approach is to pick a random thermal velocity at escape, a random escape trajectory from the surface, and a random time to decay (minus the time to escape the aerogel), then to position the particle at the projected decay location with zero velocity, so that, as long as the simulation time is long enough, GEANT4 should almost certainly decay it there. Does this sound like a reasonable approach? Either way, the problem above exists whether I special-case radon or not.