To estimate by calculation the magnitude of the true-coincidence summing losses that may affect the observed gamma-ray spectrum of a given nuclide measured with a spectrometer, knowledge of the total detection efficiencies at the gamma-ray energies within the cascades is essential. The total efficiency can be determined from the full-energy-peak efficiency, provided the peak-to-total (P/T) ratio is known. For a given high-purity germanium (HPGe) detector, one can establish an intrinsic P/T efficiency curve using a set of measurements performed with “single” (ideally monoenergetic) gamma-emitting nuclides (e.g., 241Am, 109Cd, 57Co, 113Sn, 137Cs, 65Zn). Some of these nuclides are short-lived and so have to be replaced periodically. Moreover, the presence of low-energy gamma-rays and X-rays in most of the decay schemes complicates the empirical determination of the P/T ratios. This problem is especially severe for measurements made with HPGe detectors that have a very thin dead layer. The problems posed by low-energy gamma-rays and X-rays can be avoided by using absorbers, but then one has to be careful not to perturb the intrinsic value of the P/T ratio being sought. This paper addresses these problems. Measurement-related limitations are avoided if a computational technique can be used instead. In the work presented here, the feasibility of using a Monte Carlo based technique to determine the P/T ratios over a wide range of energies (60 keV to 2000 keV) is explored. The Monte Carlo code MCNP (version 4B) is used to simulate gamma-ray spectra from various nuclides. Measured P/T ratios are compared to calculated ratios for several HPGe detectors to demonstrate the generality of the approach. Reasons for observed disagreements between the two are discussed.
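The quantities discussed above are linked by two standard relations: the total efficiency is the full-energy-peak efficiency divided by the P/T ratio, and for a simple two-gamma cascade the fraction of full-energy events of one gamma-ray that escape summing-out is one minus the total efficiency at the other gamma-ray's energy. The sketch below illustrates these relations only; the function names and all numerical values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch (not from the paper): relating full-energy-peak (FEP)
# efficiency, peak-to-total (P/T) ratio, and true-coincidence summing-out
# losses for a simple two-gamma cascade (gamma1 promptly followed by gamma2).

def total_efficiency(fep_efficiency: float, peak_to_total: float) -> float:
    """Total detection efficiency = FEP efficiency / (P/T ratio)."""
    return fep_efficiency / peak_to_total

def surviving_fraction(total_eff_other: float) -> float:
    """Fraction of gamma1 full-energy events not lost to summing-out,
    i.e. events in which the coincident gamma2 deposits no energy."""
    return 1.0 - total_eff_other

# Hypothetical example values for the gamma2 energy:
eps_p2 = 0.02   # FEP efficiency (assumed)
pt2 = 0.25      # P/T ratio (assumed)

eps_t2 = total_efficiency(eps_p2, pt2)   # -> 0.08
frac = surviving_fraction(eps_t2)        # -> 0.92
print(f"total efficiency = {eps_t2:.3f}, surviving fraction = {frac:.3f}")
```

This makes concrete why the P/T ratio is needed: without it, the total efficiency, and hence the size of the summing loss, cannot be obtained from the routinely calibrated peak efficiency.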