Hi, I was simulating the alpha spectrum of a silicon (PIPS) detector in Geant4. In the experiment the alpha source was placed at a distance of about 4 cm from the detector and the spectrum was recorded. It is an ORTEC TB-20-100-2000 detector, i.e. the manufacturer-quoted thickness is 2000 micrometer = 2 mm and the active area is 100 mm^2. I have simulated the same detector in Geant4 with the source-detector distance used in the experiment, but the alpha peak in the simulated spectrum has an FWHM far lower than the one obtained experimentally. I have varied several geometrical parameters: the source-detector separation, the effective area of the detector (since the bias voltage may affect it), and the boron dead-layer thickness. But these variations cannot account for the large difference between simulation and experiment. I am attaching the experimental spectrum along with the one I simulated using Geant4.
241Amrawdata.txt (7.6 KB)
2mm_24_12_241_107_4_5_normalised.txt (5.7 KB)
The file ‘241Amrawdata.txt’ contains the raw experimental data, which is not normalised, and the second file ‘2mm_24_12_241_107_4_5_normalised.txt’ contains the simulated data, normalised to the maximum peak height. The difference in the FWHM of the peaks is evident. I am attaching the run macro and the B4DetectorConstruction.cc file here for reference.
The experimental and simulated spectra look like this:
B4DetectorConstruction.cc (9.1 KB)
run1.txt (502 Bytes)
I have used the command /grdm/nucleusLimits 240 250 94 96 in the run macro to exclude spontaneous-fission events from the spectrum.
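For context, the relevant part of the run macro looks roughly like this; only the /grdm/nucleusLimits line is quoted from above, and the beamOn count is a placeholder, not the value actually used:

```
# Restrict radioactive decay to nuclides with 240 <= A <= 250 and
# 94 <= Z <= 96, so spontaneous-fission fragments are not generated.
/grdm/nucleusLimits 240 250 94 96

/run/beamOn 1000000   # placeholder event count
```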