Normalization of Net Peak Area vs Expected cps using G4RadioactiveDecay

Dear all,

I am working on validating a Geant4 simulation of a detector against experimental data. I simulated a 152Eu source using the G4RadioactiveDecay module to generate the primary particles.

My goal is to compare the experimental count rate (cps) of specific gamma lines (e.g., 121.78 keV, 344.3 keV) with the simulated expected count rate.

For the experiment, I did the calculation in this way:
Experimental cps = (Experimental Net Peak Area) / (Live Time)

For the simulation, I generated 60,000,000 primary events and extracted the Net Peak Area for each specific energy line from the simulated spectrum. However, I don't know how to properly normalize this simulated Net Peak Area to get the expected simulated cps. The source activity is 15.6 kBq.

My question is:
Since I am using G4RadioactiveDecay, does Geant4 intrinsically account for the emission yield of each specific gamma line during the event generation?

Consequently, which of the following post-processing formulas is the correct one to calculate the simulated cps?

Option A (No extra yield needed):
Simulated cps = (Simulated Net Peak Area / Number of Generated Decays) * Activity (Bq)

Option B (Multiplying by an Emission Yield):
Simulated cps = (Simulated Net Peak Area / Number of Generated Decays) * Activity (Bq) * Emission Yield

Also, if Option B is correct, should the Emission Yield be the sum of the emission yields of all the emitted gammas, or only the emission yield of the gamma peak that I am analysing?

I have attached a screenshot of my spreadsheet. The bold column represents this factor (currently set to 1, as in Option A). Thank you very much for your time, and please let me know if there is a flaw in my reasoning.

Simulation:
Net_Peak_Area_Sim   Activity (Bq)   Generated Particles   Emission Yield   cps_Sim
112395.5            15600           60000000              1                29.22283
9914.5              15600           60000000              1                2.57777
15747.5             15600           60000000              1                4.09435
1304                15600           60000000              1                0.33904
87                  15600           60000000              1                0.02262
596                 15600           60000000              1                0.15496
Experiment:
Net_Peak_Area_Exp   Live Time (s)   cps_Exp
2123433.8           61417.8         34.57358942
165890.4            61417.8         2.701015015
281430.4            61417.8         4.582228605
18271.1             61417.8         0.297488676
13543.1             61417.8         0.220507736
10503.2             61417.8         0.171012312
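For reference, here is a minimal Python sketch of the two quantities being compared, using only the numbers from the tables above (it assumes the rows of the two tables correspond to the same gamma lines, in the same order):

```python
# Option A normalization: scale the simulated peak area by the ratio
# of the real source activity to the number of generated decays.
# All numbers are taken from the tables in the post.

activity_bq = 15_600.0       # 15.6 kBq source
n_decays = 60_000_000        # generated primary decays
live_time_s = 61_417.8       # experimental live time

net_area_sim = [112395.5, 9914.5, 15747.5, 1304.0, 87.0, 596.0]
net_area_exp = [2123433.8, 165890.4, 281430.4, 18271.1, 13543.1, 10503.2]

for a_sim, a_exp in zip(net_area_sim, net_area_exp):
    cps_sim = a_sim / n_decays * activity_bq   # Option A (yield factor = 1)
    cps_exp = a_exp / live_time_s
    print(f"sim = {cps_sim:9.5f} cps   exp = {cps_exp:9.5f} cps   "
          f"exp/sim = {cps_exp / cps_sim:.3f}")
```

The first row reproduces the 29.22283 cps value in the cps_Sim column, so the spreadsheet is indeed implementing Option A.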

Giulia

Hello Giulia,
I'm sorry, I don't understand your experiment table and the calculated experimental cps: your Live Time is a constant 61417.8 s in every row, so what is changing? Am I missing something?

David

“Emission Yield” is confusing here. Geant4 already contains the branching ratios of the radioisotopes. Any errors or inconsistencies are generally just reflections of missing experimental data. The “lines” you are interested in for 152Eu are not actually attributed solely to it, but also to daughter decays and energy levels. Every 152Eu you place in your simulation will decay all the way down to its stable daughters if you put no constraints on time. So it is normalized by proxy to “activity”.

The efficiency (and the physics) of gamma interactions with detectors is an interplay between the mean free path and the detector geometry constraints. This makes the efficiency (which is what relates counts to cps) a function of energy. So if you are interested in X energy lines, you should be making a separate comparison for each of those X lines. Technically your entire spectrum is counts attributed to decays; what you are worried about is fully capturing the energy of those gamma rays, which is a combination of physics and pulse-shaping electronics.

So Option A will work if you do not bias the width of the peak area. If you know your real electronic noise (and, even better, the charge loss as a function of energy), then you can convolve this blur with your spectrum and make an apples-to-apples comparison by using the same “peak window” for both.
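A minimal sketch of that convolution step, assuming a simple Gaussian resolution model (the sigma, bin width, and energy range below are placeholders; in practice you would substitute your fitted FWHM(E)):

```python
import numpy as np

def broaden(energies_kev, weights, sigma_kev, bin_kev=0.5, e_max=1500.0):
    """Histogram deposited energies and blur with a Gaussian detector response.

    sigma_kev is a hypothetical constant resolution -- replace with your
    measured, energy-dependent resolution model.
    """
    edges = np.arange(0.0, e_max + bin_kev, bin_kev)
    hist, _ = np.histogram(energies_kev, bins=edges, weights=weights)
    # Gaussian kernel sampled out to +/- 5 sigma, normalized to unit area
    k = np.arange(-5 * sigma_kev, 5 * sigma_kev + bin_kev, bin_kev)
    kernel = np.exp(-0.5 * (k / sigma_kev) ** 2)
    kernel /= kernel.sum()
    return edges[:-1], np.convolve(hist, kernel, mode="same")

def peak_counts(bin_lefts, counts, e_lo, e_hi):
    """Integrate the SAME energy window in both spectra for a fair comparison."""
    sel = (bin_lefts >= e_lo) & (bin_lefts < e_hi)
    return counts[sel].sum()
```

Once both the blurred simulated spectrum and the experimental spectrum are binned the same way, applying `peak_counts` with identical window limits to both gives the apples-to-apples net areas David describes.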