1.
(123)I is a radionuclide frequently used in nuclear medicine imaging. The image formed by the 159 keV photopeak includes a considerable scatter component due to high-energy gamma-ray emission. In order to evaluate the fraction of scattered photons, a Monte Carlo simulation of a scintillation camera used for (123)I imaging was undertaken. The Monte Carlo code consists of two modules: the HEXAGON code modelled the collimator with its complex hexagonal geometry, and the NAI code modelled the NaI detector system including the back compartment. The simulation was carried out for various types of collimators under two separate conditions, with the source located in air and in water. Energy spectra of (123)I for every pixel (matrix size = 256 x 256) were obtained by separating the unscattered photons from the scattered and penetrated photons. The calculated energy spectra (cps MBq(-1) keV(-1)) agreed with the measured spectra to within approximately 20% for three different collimators. The difference in sensitivity (cps MBq(-1)) for the 143-175 keV window was less than 10% between simulation and experiment. Partial sensitivities for the scattered and unscattered components were obtained. The simulated fractions of unscattered photons to total photons were 0.46 for LEHR, 0.54 for LEGP and 0.90 for MEGP for the 'in air' set-up, and 0.35, 0.40 and 0.68, respectively, for the 'in water' set-up. The Monte Carlo simulation presented in this work enabled us to investigate the design of a new collimator optimized for (123)I scintigraphy.
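The unscattered fraction reported above is, at bottom, per-history bookkeeping: each detected photon is tallied as scattered or unscattered and the ratio is formed. A minimal sketch of that tally (the scatter probability here is an assumed stand-in, not output of the HEXAGON/NAI codes):

```python
# Toy tally of scattered vs. unscattered detected photons; the probability
# that a detected photon was scattered (p_scatter) is an assumed value.
import random

random.seed(0)
n_photons = 100_000
p_scatter = 0.54  # assumed; a real MC derives this from photon histories
unscattered = sum(1 for _ in range(n_photons) if random.random() >= p_scatter)
unscattered_fraction = unscattered / n_photons  # estimates 1 - p_scatter
```

With enough histories the tally converges on the underlying fraction, which is how per-collimator values such as 0.46 (LEHR) are obtained.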
2.
Fast Monte Carlo based joint iterative reconstruction for simultaneous 99mTc/123I SPECT imaging
Simultaneous 99mTc/123I SPECT allows the assessment of two physiological functions under identical conditions. The separation of these radionuclides is difficult, however, because their energies are close. Most energy-window-based scatter correction methods do not fully model either physical factors or patient-specific activity and attenuation distributions. We have developed a fast Monte Carlo (MC) simulation-based multiple-radionuclide and multiple-energy joint ordered-subset expectation-maximization (JOSEM) iterative reconstruction algorithm, MC-JOSEM. MC-JOSEM simultaneously corrects for scatter and cross talk as well as detector response within the reconstruction algorithm. We evaluated MC-JOSEM for simultaneous brain perfusion (99mTc-HMPAO) and neurotransmission (123I-altropane) SPECT. MC simulations of 99mTc and 123I studies were generated separately and then combined to mimic simultaneous 99mTc/123I SPECT. All the details of photon transport through the brain, the collimator, and the detector, including Compton and coherent scatter, septal penetration, and backscatter from components behind the crystal, were modeled. We reconstructed images from simultaneous dual-radionuclide projections in three ways. First, we reconstructed the photopeak-energy-window projections (with an asymmetric energy window for 123I) using the standard ordered-subsets expectation-maximization algorithm (NSC-OSEM). Second, we used standard OSEM to reconstruct 99mTc photopeak-energy-window projections while including an estimate of scatter from a Compton-scatter energy window (SC-OSEM). Third, we jointly reconstructed both 99mTc and 123I images using projection data associated with two photopeak energy windows and an intermediate-energy window using MC-JOSEM.
For 15 iterations of reconstruction, the bias and standard deviation of 99mTc activity estimates in several brain structures were calculated for NSC-OSEM, SC-OSEM, and MC-JOSEM, using images reconstructed from primary (unscattered) photons as a reference. Similar calculations were performed for 123I images for NSC-OSEM and MC-JOSEM. For 123I images, dopamine binding potential (BP) at equilibrium and its signal-to-noise ratio (SNR) were also calculated. Our results demonstrate that MC-JOSEM performs better than NSC- and SC-OSEM for quantitation tasks. After 15 iterations of reconstruction, the relative bias of 99mTc activity estimates in the thalamus, striata, white matter, and gray matter volumes from MC-JOSEM ranged from -2.4% to 1.2%, while the same estimates for NSC-OSEM (SC-OSEM) ranged from 20.8% to 103.6% (7.2% to 41.9%). Similarly, the relative bias of 123I activity estimates from 15 iterations of MC-JOSEM in the striata and background ranged from -1.4% to 2.9%, while the same estimates for NSC-OSEM ranged from 1.6% to 10.0%. The relative standard deviation of 99mTc activity estimates from MC-JOSEM ranged from 1.1% to 4.8% versus 1.2% to 6.7% (1.2% to 5.9%) for NSC-OSEM (SC-OSEM). The relative standard deviation of 123I activity estimates using MC-JOSEM ranged from 1.1% to 1.9% versus 1.5% to 2.7% for NSC-OSEM. Using the 123I dopamine BP obtained from the reconstruction produced by primary photons as a reference, the result for MC-JOSEM was 50.5% closer to the reference than that of NSC-OSEM after 15 iterations. The SNR for dopamine BP was 23.6 for MC-JOSEM as compared to 18.3 for NSC-OSEM.
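The ordered-subsets EM update underlying all three OSEM variants can be sketched on a toy system (the system matrix, subset split, and iteration count below are illustrative assumptions, not the paper's implementation):

```python
# Minimal OSEM sketch: multiplicative EM updates applied over ordered
# subsets of a toy 1D system (noiseless data for illustration).
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((8, 4))              # toy system matrix: 8 detector bins, 4 voxels
x_true = np.array([1.0, 4.0, 2.0, 3.0])
y = A @ x_true                      # noiseless projections

x = np.ones(4)                      # uniform initial estimate
subsets = [np.arange(0, 8, 2), np.arange(1, 8, 2)]  # two ordered subsets
for _ in range(100):
    for s in subsets:               # one multiplicative EM update per subset
        ratio = y[s] / (A[s] @ x + 1e-12)
        x *= (A[s].T @ ratio) / A[s].sum(axis=0)
```

MC-JOSEM extends this scheme by reconstructing both radionuclides jointly and folding the MC scatter/cross-talk estimate into the forward projection.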
3.
Interactions of incident photons with the collimator and detector, including septal penetration, scatter and x-ray fluorescence, are significant sources of image degradation in applications of SPECT including dual-isotope imaging and imaging using radioisotopes that emit high- or medium-energy photons. Modelling these interactions using full Monte Carlo (MC) simulations is computationally very demanding. We present a new method based on the use of angular response functions (ARFs). The ARF is a function of the incident photon's direction and energy and represents the probability that a photon will either interact with or pass through the collimator and be detected, at the intersection of the photon's direction vector and the detection plane, in an energy window of interest. The ARFs were pre-computed using full MC simulations of point sources, including propagation through the collimator-detector system. We have implemented the ARF method for use in conjunction with the SimSET/PHG MC code to provide fast modelling of interactions both in the patient and in the collimator-detector system. Validation results in the three cases studied show good agreement between projections generated using the ARF method and those from previously validated full MC simulations, with hundred- to thousand-fold reductions in simulation time.
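At simulation time the ARF approach replaces collimator-detector transport with a table lookup keyed on incidence angle and energy. A sketch of that idea (the table shape and values are assumptions for illustration; the real ARFs come from the point-source MC runs described above):

```python
# Toy ARF: a pre-computed (angle, energy) -> detection-probability table
# with nearest-neighbour lookup; response values are assumed, not measured.
import numpy as np

angles = np.linspace(0.0, 30.0, 31)       # degrees from the collimator normal
energies = np.linspace(140.0, 180.0, 41)  # keV
# assumed shape: response falls off with angle, peaks near the 159 keV photopeak
arf = (np.cos(np.radians(angles))[:, None] ** 20
       * np.exp(-((energies[None, :] - 159.0) / 10.0) ** 2))

def arf_lookup(angle_deg, energy_kev):
    """Nearest-neighbour lookup into the pre-computed ARF table."""
    i = np.argmin(np.abs(angles - angle_deg))
    j = np.argmin(np.abs(energies - energy_kev))
    return arf[i, j]

p = arf_lookup(5.0, 159.0)  # detection probability for a near-normal photon
```

Because the lookup replaces a full photon history through the collimator, each simulated photon costs a constant-time table access, which is the source of the quoted speed-up.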
4.
Images produced by commercial amorphous silicon electronic portal imaging devices (a-Si EPIDs) are subject to multiple blurring processes. Implementation of these devices for fluence measurement requires that the blur be removed from the images. A standard deconvolution operation can accomplish this, assuming the blur kernel is spatially invariant and accurately known. This study determines a comprehensive blur kernel for the Varian aS500 EPID. Monte Carlo techniques are used to derive a dose kernel and an optical kernel, which are then combined to yield an overall blur kernel for both 6 and 15 MV photon beams. Experimental measurement of the line spread function (LSF) is used to verify kernel shape. Kernel performance is gauged by comparing EPID image profiles with in-air dose profiles measured using a diamond detector (approximating fluence), both before and after the EPID images have been deconvolved. Quantitative comparisons are performed using the chi metric, an extension of the well-known gamma metric, with acceptance criteria of 0.0784 cm (1 pixel width) distance-to-agreement (delta-d) and 2% of the relative central axis fluence (delta-D). Without incorporating any free parameters, acceptance was increased from 49.0% of pixels in a cross-plane profile for a 6 MV 10 x 10 cm2 open field to 92.0%. For a 10 x 10 cm2 physically wedged field, acceptance increased from 40.3% to 73.9%. The effect of the optical kernel was found to be negligible for these chi acceptance parameters; however, for (delta-D = 1%, delta-d = 0.0784 cm) we observed an improvement from 66.1% (without) to 78.6% (with) of chi scores <1 (from 20.6% before deconvolution). It is demonstrated that an empirical kernel having a triple-exponential form, or a semi-empirical kernel based on a simplified model of the detector stack, can match the performance of the comprehensive kernel.
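The deconvolution step itself is standard once the kernel is known: divide in Fourier space with a small regularization term to keep noise from blowing up where the kernel response is weak. A sketch under assumed values (Gaussian stand-in kernel and regularization constant, not the Monte Carlo-derived aS500 kernel):

```python
# Regularized Fourier deconvolution of a toy open-field image with an
# assumed spatially invariant Gaussian blur kernel.
import numpy as np

n = 64
x = np.linspace(-1, 1, n)
fluence = ((np.abs(x[:, None]) < 0.5) & (np.abs(x[None, :]) < 0.5)).astype(float)

sigma = 0.05                        # assumed kernel width
kernel = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * sigma ** 2))
kernel /= kernel.sum()
kernel = np.fft.ifftshift(kernel)   # centre the kernel at index (0, 0)

blurred = np.real(np.fft.ifft2(np.fft.fft2(fluence) * np.fft.fft2(kernel)))

eps = 1e-3                          # assumed regularization constant
K = np.fft.fft2(kernel)
deconv = np.real(np.fft.ifft2(np.fft.fft2(blurred) * np.conj(K)
                              / (np.abs(K) ** 2 + eps)))
```

The division by `|K|^2 + eps` rather than `K` alone is what makes the operation stable at frequencies where the kernel transfers almost nothing.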
5.
In SPECT, simultaneous (99m)Tc/(123)I acquisitions allow comparison of the distributions of two radiotracers in the same physiological state, without any image misregistration, but images can be severely distorted due to cross-talk between the two isotopes. We propose a generalized spectral factor analysis (GSFA) method for solving the cross-talk issue in simultaneous (99m)Tc/(123)I SPECT. In GSFA, the energy spectrum of the photons in any pixel is expressed as a linear combination of five common spectra: the (99m)Tc and (123)I photopeaks and three scatter spectra. These basis spectra are estimated from a factor analysis of all spectra using physical priors (e.g. Klein-Nishina distributions). GSFA was evaluated on (99m)Tc/(123)I Monte Carlo simulated data and compared to images obtained using recommended spectral windows (WIN) and to gold standard (GS) images (scatter-free, cross-talk-free and noise-free). Using GSFA, activity concentration differed by less than 9% from GS values, against differences from -23% to 110% with WIN in the (123)I and (99m)Tc images respectively. Using GSFA, simultaneous (99m)Tc/(123)I imaging can yield images of quantitative accuracy similar to that of sequential, scatter-free (99m)Tc/(123)I imaging in brain SPECT.
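The core premise, expressing a per-pixel spectrum as a linear mix of a few basis spectra, reduces to a small least-squares fit once the bases are known. A toy sketch with assumed basis shapes and only three components (GSFA uses five, estimated by factor analysis rather than fixed in advance):

```python
# Per-pixel spectral unmixing: fit a pixel's energy spectrum as a linear
# combination of assumed basis spectra (two photopeaks plus a scatter tail).
import numpy as np

e = np.linspace(100.0, 200.0, 101)             # energy axis, keV
peak_tc = np.exp(-((e - 140.5) / 7.0) ** 2)    # assumed 99mTc photopeak shape
peak_i = np.exp(-((e - 159.0) / 8.0) ** 2)     # assumed 123I photopeak shape
scatter = np.exp(-(e - 100.0) / 40.0)          # assumed broad scatter tail
basis = np.column_stack([peak_tc, peak_i, scatter])

true_coeffs = np.array([3.0, 1.5, 2.0])
pixel_spectrum = basis @ true_coeffs           # noiseless toy spectrum

# with the bases fixed, per-pixel coefficients follow from least squares
coeffs = np.linalg.lstsq(basis, pixel_spectrum, rcond=None)[0]
```

In practice a nonnegativity constraint on the coefficients would be added, since negative activities are unphysical.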
6.
7.
Previously we have developed a model-based method that can accurately estimate downscatter contamination from high-energy photons in 123I imaging. In this work we combined the model-based method with iterative reconstruction-based compensations for other image-degrading factors such as attenuation, scatter, the collimator-detector response function (CDRF) and partial volume effects to form a comprehensive method for performing quantitative 123I SPECT image reconstruction. In the model-based downscatter estimation method, photon scatter inside the object was modelled using the effective source scatter estimation (ESSE) technique, including contributions from all the photon emissions. The CDRFs, including the penetration and scatter components due to the high-energy 123I photons, were estimated using Monte Carlo (MC) simulations of point sources in air at various distances from the face of the collimator. The downscatter contamination was then compensated for during the iterative reconstruction by adding the estimated results to the projection steps. The model-based downscatter compensation (MBDC) was evaluated using MC simulated and experimentally acquired projection data. From the MC simulation, we found about 39% of the total counts in the energy window of 123I were attributed to the downscatter contamination, which reduced image contrast and caused a 1.5% to 10% overestimation of activities in various brain structures. Model-based estimates of the downscatter contamination were in good agreement with the simulated data. Compensation using MBDC removed the contamination and improved the image contrast and quantitative accuracy to that of the images obtained from 159 keV photons. The errors in absolute quantitation were reduced to within +/-3.5%. The striatal specific binding potential calculated based on the activity ratio to the background was also improved after MBDC. 
The errors were reduced from between -10.93% and -4.5% without compensation to between -0.55% and 4.87% after compensation. The model-based method provided accurate downscatter estimation and, when combined with iterative reconstruction-based compensations, accurate quantitation was obtained with minimal loss of precision.
8.
Larsson A, Ljungberg M, Mo SJ, Riklund K, Johansson L. Physics in Medicine and Biology 2006, 51(22):5753-5767
Scatter and septal penetration deteriorate contrast and quantitative accuracy in single photon emission computed tomography (SPECT). In this study, four different correction techniques for scatter and septal penetration are evaluated for 123I brain SPECT. One of the methods is a form of model-based compensation which uses effective source scatter estimation (ESSE) for modelling scatter, and a collimator-detector response (CDR) including both geometric and penetration components. The other methods, which operate on the 2D projection images, are convolution scatter subtraction (CSS) and two versions of transmission-dependent convolution subtraction (TDCS), one of them proposed by us. This method uses CSS with a separate kernel to correct for septal penetration, and TDCS for scatter correction. The corrections are evaluated for a dopamine transporter (DAT) study and a study of regional cerebral blood flow (rCBF), both performed with 123I. The images are produced using a recently developed Monte Carlo collimator routine, added to the program SIMIND, which can include interactions in the collimator. The results show that the model-based method included in the iterative reconstruction is preferable to the other methods, and that the new TDCS version gives better results than the other 2D methods.
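The CSS family of corrections shares one mechanical step: convolve the measured projection with a scatter kernel, scale by a scatter fraction, and subtract. A minimal 1D sketch (the kernel width and scatter fraction are assumed values, not the paper's fitted parameters):

```python
# Convolution scatter subtraction on a toy 1D projection:
# estimated scatter = k * (measured (*) kernel), then subtracted.
import numpy as np

n = 128
proj = np.zeros(n)
proj[48:80] = 100.0                 # toy measured projection

x = np.arange(n) - n // 2
kernel = np.exp(-np.abs(x) / 10.0)  # assumed monoexponential scatter kernel
kernel /= kernel.sum()

k = 0.3                             # assumed scatter fraction
scatter_est = k * np.real(np.fft.ifft(
    np.fft.fft(proj) * np.fft.fft(np.fft.ifftshift(kernel))))
corrected = np.clip(proj - scatter_est, 0.0, None)
```

TDCS differs in making `k` depend on the local transmission through the patient rather than using a single global value.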
9.
M D Harpen. Medical Physics 1986, 13(6):954-958
Presented is a quantitative analysis of fat and water spectroscopic imaging. Nuclear magnetic resonance (NMR) signal decay curves obtained from conventional and spectroscopic imaging of a variety of fat and water samples in a phantom and of bone marrow, adipose tissue, and muscle from a clinical scan are analyzed to determine the transverse relaxation times and percent contribution to total signal of the fat and water fractions. The variance-covariance structure of parameter estimations obtained by the spectroscopic imaging technique is compared with those obtained by the conventional in-phase imaging technique by means of a Monte Carlo simulation.
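The decay-curve analysis amounts to fitting a two-component transverse decay. A sketch under assumed values (the T2 times and signal fractions below are illustrative, not the paper's measurements; the fit is linear once the T2 pair is fixed):

```python
# Two-component (fat/water) transverse decay, with fractions recovered by
# a linear least-squares fit at a fixed pair of assumed T2 values.
import numpy as np

te = np.linspace(10.0, 200.0, 20)   # echo times, ms
t2_water, t2_fat = 80.0, 50.0       # assumed relaxation times
f_water, f_fat = 0.6, 0.4           # assumed signal fractions

signal = f_water * np.exp(-te / t2_water) + f_fat * np.exp(-te / t2_fat)

# with the T2s fixed, the fractions follow from a linear fit
B = np.column_stack([np.exp(-te / t2_water), np.exp(-te / t2_fat)])
fracs = np.linalg.lstsq(B, signal, rcond=None)[0]
```

When the T2 values themselves are unknown, the problem becomes nonlinear, and the variance-covariance structure of the estimates, the subject of the paper's Monte Carlo comparison, becomes much less benign.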
10.
11.
12.
J C Barrett. Journal of Medical Virology 1988, 26(1):99-109
The spread of human immunodeficiency virus (HIV) by heterosexual intercourse, during the first 2 years following the introduction of the virus among a sexually active and unprotected group of men and women, is modelled by Monte Carlo simulation. A beta distribution of the infectee's risk of infection per infected partner-month is assumed, with the same coefficient of variation as used in previous studies for risk of conception (natural fecundability), but with a mean of 0.04 per infected partner-month after scaling the distribution. The number of sexual partners that one sex (here, the women) has is assumed to be more variable than for the other, with a mean of 2 partners each. The individual infection risks per infected partner-month are generated initially and do not change, but some random gains and losses of partners occur each month. Infections are updated each month. It is found in this simple model that women who become infected by 2 years had a mean risk of infection (not counting the original infector) only about 13% higher than the others. Some implications of this low selection are noted. Very great variability in the number of infections subsequently due to an index seropositive is found, which prevents easy discrimination of the characteristics of "at-risk" persons.
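The core loop of such a model can be sketched in a few lines. This is a drastically simplified re-creation: partner dynamics are omitted, the monthly exposure probability and beta shape parameters are assumptions, and only the stated mean risk of 0.04 per infected partner-month is taken from the abstract:

```python
# Simplified monthly-update epidemic loop with heterogeneous,
# beta-distributed per-partner-month infection risks.
import numpy as np

rng = np.random.default_rng(42)
n_women = 1000
# beta distribution scaled to mean 0.04 (shape parameters are assumptions)
a = 0.5
b = a * (1.0 / 0.04 - 1.0)
risk = rng.beta(a, b, n_women)      # fixed individual risks, as in the model

infected = np.zeros(n_women, dtype=bool)
infected[:10] = True                # assumed seed infections
for month in range(24):             # 2-year horizon
    exposed = rng.random(n_women) < 0.1  # assumed monthly exposure chance
    new_inf = exposed & ~infected & (rng.random(n_women) < risk)
    infected |= new_inf

prevalence = infected.mean()
```

Because individual risks are fixed at the start, one can compare the mean risk of those eventually infected against the rest, which is how the paper arrives at its "about 13% higher" selection result.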
13.
T. D. Doukoglou, I. W. Hunter, R. E. Kearney. Medical & Biological Engineering & Computing 1993, 31(3):277-283
The problem of identifying optical system point spread functions (PSFs) arises frequently in the area of image processing and restoration. The paper presents a method for determining two-dimensional PSFs from input/output image signals. The PSF of the system is determined from a set of linear equations involving elements of the input autocorrelation function and the input/output cross-correlation function. The resulting PSF is the one that minimises the sum-of-squares difference between the actual output image and the predicted one.
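A 1D sketch of this identification scheme (the paper works in 2D): stacking delayed copies of the input into a design matrix makes the normal equations exactly the autocorrelation/cross-correlation system described above, and solving them gives the least-squares PSF.

```python
# Least-squares PSF identification from input/output signals (1D toy).
import numpy as np

rng = np.random.default_rng(7)
n, m = 2000, 5
x = rng.standard_normal(n)                  # input signal
h_true = np.array([0.1, 0.25, 0.3, 0.25, 0.1])
y = np.convolve(x, h_true)[:n]              # output = input convolved with PSF

# design matrix whose columns are delayed copies of the input; X.T @ X then
# holds autocorrelation terms and X.T @ y cross-correlation terms
X = np.column_stack(
    [np.concatenate([np.zeros(k), x[: n - k]]) for k in range(m)])
h_est = np.linalg.solve(X.T @ X, X.T @ y)   # normal-equation PSF estimate
```

With noiseless data the estimate is exact; with noise it remains the minimiser of the sum-of-squares output error, which is the property the paper exploits.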
14.
The point spread function (PSF) of a gamma camera describes the photon count density distribution at the detector surface when a point source is imaged. Knowledge of the PSF is important for computer simulation and accurate image reconstruction of single photon emission computed tomography (SPECT) images. To reduce the number of measurements required for PSF characterization and the amount of computer memory needed to store PSF tables, and to enable generalization of the PSF to different collimator-to-source distances, the PSF may be modeled as the two-dimensional (2D) convolution of a depth-dependent component which is free of detector blurring (PSF(ideal)) and the distance-dependent detector response. Owing to limitations imposed by the radioactive strength of point sources, extended sources have to be used for measurements. Therefore, if PSF(ideal) is estimated from measured responses, corrections have to be made both for the detector blurring and for the extent of the source. In this paper, an approach based on maximum-likelihood expectation-maximization (ML-EM) is used to estimate PSF(ideal). In addition, a practical measurement procedure which avoids problems associated with commonly used line-source measurements is proposed. To decrease noise and to prevent nonphysical solutions, shape constraints are applied during the estimation of PSF(ideal). The estimates are generalized to depths other than those which have been measured and are incorporated in a SPECT simulator. The method is validated for Tc-99m and Tl-201 by means of measurements on physical phantoms. The corrected responses have the desired shapes and simulated responses closely resemble measured responses. The proposed methodology may, consequently, serve as a basis for accurate three-dimensional (3D) SPECT reconstruction.
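The ML-EM estimation of PSF(ideal) is, in essence, a Richardson-Lucy-style deblurring: the measured response is the ideal PSF convolved with a known blur, and a multiplicative EM update peels the blur off. A 1D sketch with assumed kernel shapes (the paper's version is 2D and adds shape constraints):

```python
# Richardson-Lucy / ML-EM recovery of an unblurred PSF from a measured
# response, given a known (assumed Gaussian) detector blur kernel.
import numpy as np

n = 101
x = np.arange(n) - n // 2
psf_ideal = np.exp(-np.abs(x) / 3.0)   # assumed sharp depth-dependent PSF
psf_ideal /= psf_ideal.sum()
blur = np.exp(-(x / 4.0) ** 2)         # assumed detector blur
blur /= blur.sum()

measured = np.convolve(psf_ideal, blur, mode="same")

est = np.full(n, 1.0 / n)              # flat, nonnegative initial estimate
for _ in range(200):                   # multiplicative ML-EM updates
    reblurred = np.convolve(est, blur, mode="same")
    ratio = measured / (reblurred + 1e-12)
    est *= np.convolve(ratio, blur[::-1], mode="same")
```

The multiplicative form keeps the estimate nonnegative automatically; the paper's additional shape constraints further suppress noise-driven artifacts.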
15.
Optimization of accelerator target and detector for portal imaging using Monte Carlo simulation and experiment
Flampouri S, Evans PM, Verhaegen F, Nahum AE, Spezi E, Partridge M. Physics in Medicine and Biology 2002, 47(18):3331-3349
Megavoltage portal images suffer from poor quality compared to those produced with kilovoltage x-rays. Several authors have shown that image quality can be improved by modifying the linear accelerator to generate more low-energy photons. This work addresses the problem of using Monte Carlo simulation and experiment to optimize the beam and detector combination to maximize image quality for a given patient thickness. A simple model of the whole imaging chain was developed for investigation of the effect of the target parameters on the quality of the image. The optimum targets (6 mm thick aluminium and 1.6 mm copper) were installed in an Elekta SL25 accelerator; the first beam will be referred to as Al6 and the second as Cu1.6. A tissue-equivalent contrast phantom was imaged with the 6 MV standard photon beam and with the experimental beams, using standard radiotherapy and mammography film/screen systems. The arrangement with a thin Al target and mammography system improved the contrast of 1.4 cm of bone in 5 cm of water to 19%, compared with 2% for the standard arrangement of a thick, high-Z target and radiotherapy verification system. The linac/phantom/detector system was simulated with the BEAM/EGS4 Monte Carlo code. Contrast calculated from the predicted images was in good agreement with experiment (to within 2.5%). The use of MC techniques to predict images accurately, taking into account the whole imaging system, is a powerful new method for portal imaging system design optimization.
16.
The development of a small-animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software matched to the size of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse, based on a CT image set, using a 360-deg photon arc is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals, with consideration of inhomogeneities, using small photon beam field sizes in the diameter range of 0.5-5 cm, with the conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable for treatment planning dose calculation for small-animal anatomy, with voxel sizes about one order of magnitude smaller than those used for humans.
17.
We have written Monte Carlo programs to simulate the formation of radiological images. Our code is used to propagate a simulated x-ray fluence through each component of an existing video-based portal imaging system. This simulated fluence consists of a 512 x 512 pixel image containing both contrast-detail patterns and checker patterns to assess the spatial resolution of the simulated portal imager. All of the components of the portal imaging system were modeled as a cascade of eight linear stages. Using this code, one can assess the visual impact of changing components in the imaging chain by changing the appropriate probability density function. Virtual experiments were performed to assess the visual impact of replacing the lens and TV camera with an amorphous silicon array, and the effect of scattered radiation on portal images.
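One consequence of modelling the chain as a cascade of linear stages is that the system's frequency response is simply the product of the stage responses, which is why swapping one stage (e.g. lens and TV camera for an a-Si array) is easy to evaluate. A sketch with three assumed stages (the paper uses eight, defined by probability density functions rather than closed-form MTFs):

```python
# Linear-cascade model: the system MTF is the product of stage MTFs.
# All stage parameters below are assumed values for illustration.
import numpy as np

f = np.linspace(0.0, 2.0, 50)          # spatial frequency, cycles/mm
mtf_screen = np.exp(-(f / 0.8) ** 2)   # assumed phosphor-screen blur
mtf_optics = np.exp(-(f / 1.5) ** 2)   # assumed lens blur
mtf_camera = np.sinc(f * 0.25)         # assumed pixel-aperture term

mtf_system = mtf_screen * mtf_optics * mtf_camera
```

Replacing a component then means replacing one factor in the product; the rest of the cascade is untouched.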
18.
Evaluation of penetration and scattering components in conventional pinhole SPECT: phantom studies using Monte Carlo simulation
In quantitative pinhole SPECT, photon penetration through the collimator edges (penetration) and photon scattering by the object (object scatter) and collimator (collimator scatter) have not been investigated rigorously. Monte Carlo simulation was used to evaluate these three physical processes for different tungsten knife-edge pinhole collimators using uniform, hotspot and donut phantoms filled with 201Tl, 99mTc, 123I and 131I solutions. For the hotspot phantom, the penetration levels with respect to total counts for a 1 mm pinhole aperture were 78%, 28% and 23% for 131I, 123I and 99mTc, respectively. For a 2 mm aperture, these values were 65% for 131I, 16% for 123I and 12% for 99mTc. For all pinholes, 201Tl penetration was less than 4%. The evaluated scatter (from object and collimator) with the hotspot phantom for the 1 mm pinhole was 24%, 16%, 18% and 13% for 201Tl, 99mTc, 123I and 131I, respectively. The summed object and collimator scatter for the uniform phantom was approximately 20% higher than that for the hotspot phantom. Significant counts due to penetration and object and collimator scatter were observed inside the core of the donut phantom in the reconstructed image. Collimator scatter can be neglected for all isotopes used in this study except 131I. Object scatter correction is necessary for all radionuclides used in this study, and correction for the penetration contribution is necessary for all radionuclides but 201Tl.
19.
The Monte Carlo method was used to investigate dose distributions around the 3M Company model 6701 and model 6702 125I brachytherapy seeds. The transverse-axis dose distributions of the two seed models were found to be nearly identical, but the longitudinal-axis dose distributions differed significantly. The influence of seed design upon dose distributions was also investigated.
20.
X-ray imaging dose from computed tomography (CT) or cone-beam CT (CBCT) scans has become a serious concern. Patient-specific imaging dose calculation has been proposed for the purpose of dose management. While Monte Carlo (MC) dose calculation can be quite accurate for this purpose, it suffers from low computational efficiency. In response to this problem, we have developed a MC dose calculation code, gCTD, on GPU architecture under the NVIDIA CUDA platform for fast and accurate estimation of the x-ray imaging dose received by a patient during a CT or CBCT scan. Techniques have been developed particularly for the GPU architecture to achieve high computational efficiency. Dose calculations using CBCT scanning geometry in a homogeneous water phantom and a heterogeneous Zubal head phantom have shown good agreement between gCTD and EGSnrc, indicating the accuracy of our code. In terms of improved efficiency, gCTD attains a speed-up of ~400 times in the homogeneous water phantom and ~76.6 times in the Zubal phantom compared to EGSnrc. As for absolute computation time, imaging dose calculation for the Zubal phantom can be accomplished in ~17 s with an average relative standard deviation of 0.4%. Though our gCTD code has been developed and tested in the context of CBCT scans, with simple modification of the geometry it can be used for assessing imaging dose in CT scans as well.