Similar Documents
 20 similar documents found
1.
Filter design is a crucial part of the filtered backprojection (FBP) image-reconstruction algorithm. To address two problems of the existing first-order exponential filter — visible oscillation in its spatial-domain waveform and no marked attenuation near the cutoff frequency in its frequency response — a Gaussian filter is introduced to reduce the Gibbs effect, and a new filter for CT reconstruction, the modified Gaussian filter, is designed. Simulation experiments illustrate how the choice of the weighting factor in the filter affects image quality, and comparisons with the RL filter, SL filter, first-order exponential filter, and Gaussian filter verify that the proposed filter has good noise resistance, effectively reduces grey-level fluctuation in the reconstructed image, and markedly improves reconstruction quality.
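The trade-off this entry describes can be sketched numerically: the ideal Ram-Lak (RL) ramp keeps full gain up to the cutoff (hence Gibbs ringing), while a Gaussian-apodized ramp rolls off earlier. A minimal sketch, assuming an illustrative width parameter `sigma` (the paper's weighting factor is not specified here):

```python
import math

def ramp(f, fc):
    """Ideal Ram-Lak (RL) ramp response |f|, zero above the cutoff fc."""
    return abs(f) if abs(f) <= fc else 0.0

def gaussian_windowed_ramp(f, fc, sigma):
    """Ramp apodized by a Gaussian window: suppresses the response near
    the cutoff to reduce Gibbs oscillation. sigma is an illustrative
    width/weighting parameter, not the paper's calibrated value."""
    return ramp(f, fc) * math.exp(-(f / sigma) ** 2 / 2.0)

# The windowed ramp is strictly below the ideal ramp at every nonzero
# frequency, with the gap widening toward the cutoff.
for f in (0.25, 0.5, 0.9):
    assert gaussian_windowed_ramp(f, fc=1.0, sigma=0.5) < ramp(f, fc=1.0)
```

Larger `sigma` approaches the plain RL ramp (sharper but noisier images); smaller `sigma` smooths more, which is the weighting-factor trade-off the simulations explore.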

2.
Design and implementation of a dynamic filter for an all-digital B-mode ultrasound system
To obtain ultrasound echo images with the best resolution over the full detection depth, this study designed a dynamic filter based on a field-programmable gate array (FPGA). The detection depth of the all-digital B-mode scanner is divided evenly into 64 segments, each matched to one of 64 band-pass filters of order 32, to dynamically filter the human-body ultrasound echo signal. The filter was deployed in an all-digital B-mode system with a 128-element, 3.5 MHz probe and compared against images produced with a fixed filter. Tests on a tissue-mimicking ultrasound phantom show clear improvements in detection depth, far-field resolution, and noise removal.
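The segment-selection logic described above can be sketched as follows. This is a hedged illustration: the mapping from depth segment to passband center is shown as a simple linear downshift (deeper echoes lose high frequencies to attenuation), whereas the paper's actual 64 sets of 32-tap coefficients are FPGA filter designs not given here:

```python
def depth_segment(sample_idx, n_samples, n_segments=64):
    """Map an echo sample index to one of 64 equal depth segments."""
    return min(sample_idx * n_segments // n_samples, n_segments - 1)

def passband_center(segment, f_shallow=3.5e6, f_deep=2.0e6, n_segments=64):
    """Hypothetical passband center for a segment: shifted down linearly
    with depth, from the probe frequency toward a lower deep-field value
    (f_deep is an assumed parameter, not from the paper)."""
    t = segment / (n_segments - 1)
    return f_shallow + t * (f_deep - f_shallow)

assert depth_segment(0, 4096) == 0         # shallowest sample -> segment 0
assert depth_segment(4095, 4096) == 63     # deepest sample -> segment 63
```

In hardware, each incoming sample would be routed to the band-pass filter whose segment `depth_segment` returns, which is what makes the filtering "dynamic" across depth.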

3.
Using a registered anatomical image as prior information to guide PET reconstruction has been widely studied. Based on non-local means filtering and the region information of the anatomical image, an anatomically adaptive non-local prior (AANLP) model is proposed. The information in the new model comes from weighted differences of grey values within a relatively large non-local neighbourhood, with weights obtained by computing the similarity of two pixels. The weight parameter is estimated adaptively and iteratively using the region information of the anatomical image. During PET reconstruction, the AANLP model is applied adaptively to each anatomical region. A two-step reconstruction strategy is constructed for image reconstruction and parameter estimation. Reconstruction results on simulated data show that AANLP preserves edges well and robustly produces the highest lesion contrast.

4.
A non-local means based method for PET image denoising
Denoising is an important research topic in medical image processing. This paper studies the recently proposed non-local means (NLM) algorithm and applies it to PET image denoising. Denoising results on test images and real PET images show that the method outperforms median filtering and Wiener filtering, effectively suppressing noise in PET images while preserving important diagnostic detail.
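The NLM idea used in this entry (and as the building block of the non-local priors in the neighbouring entries) is compact enough to sketch in one dimension: each sample is replaced by a weighted average over a search window, with weights decaying in the squared distance between small patches. Parameters here are illustrative, not the papers' tuned values:

```python
import math

def nlm_1d(x, patch=1, search=3, h=0.5):
    """Minimal 1-D non-local means sketch. patch: half-width of the
    compared patches; search: half-width of the search window; h:
    filtering strength (all illustrative)."""
    n, out = len(x), []
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - search), min(n, i + search + 1)):
            # Squared patch distance with clamped borders.
            d = sum((x[min(max(i + k, 0), n - 1)] - x[min(max(j + k, 0), n - 1)]) ** 2
                    for k in range(-patch, patch + 1))
            w = math.exp(-d / (h * h))
            num += w * x[j]
            den += w
        out.append(num / den)
    return out

# A constant signal is a fixed point: every patch matches exactly.
assert nlm_1d([2.0] * 8) == [2.0] * 8
```

An isolated noise spike is pulled down, because its patch matches nothing else in the search window and so its neighbours' (lower) values dominate the average.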

5.
Choice of reconstruction filter function and image quality in SPECT
To obtain high-quality SPECT reconstructed images, the choice of the filter function during reconstruction is extremely important. This paper presents a mathematical analysis and a clinical comparison of the digital filter functions commonly used in SPECT and concludes that the Butterworth filter is indeed a near-ideal filter function; many other filter functions can be derived by varying the Butterworth parameters. On this basis, reasonable Butterworth parameter settings are proposed for SPECT tomographic image reconstruction of several organs.
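The two Butterworth parameters the entry refers to (cutoff and order/power) can be made concrete with the form commonly used in nuclear-medicine reconstruction software; the numeric values below are illustrative, not the paper's organ-specific recommendations:

```python
def butterworth(f, fc, n):
    """Butterworth low-pass response in the form common in SPECT
    reconstruction: 1 / (1 + (f/fc)^(2n)). fc is the cutoff (cycles per
    pixel) and n the order; raising n steepens the roll-off, so varying
    (fc, n) spans a whole family of smoother or sharper filters."""
    return 1.0 / (1.0 + (f / fc) ** (2 * n))

assert butterworth(0.2, 0.2, 5) == 0.5   # half amplitude exactly at the cutoff
```

Tuning `fc` down (or `n` up) approximates heavier smoothing filters, which is the sense in which other filter functions can be reproduced by adjusting the Butterworth parameters.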

6.
Images are denoised with a wavelet adaptive-threshold method based on Bayesian estimation. Signals corrupted with Rician noise of different standard deviations σ are processed with Gaussian filtering and three wavelet methods (conventional hard thresholding, conventional soft thresholding, and Bayesian-estimation adaptive thresholding), comparing the merits of Gaussian filtering against conventional wavelet thresholding and demonstrating the superiority of the new Bayesian adaptive-threshold wavelet method for denoising magnetic resonance imaging (MRI) signals. After wavelet denoising, the signal-to-noise ratio is higher, and the root-mean-square error lower, than after Gaussian filtering. The Bayesian adaptive-threshold wavelet method retains more useful signal than Gaussian filtering, and the optimized oxygen extraction fraction (OEF) values increase somewhat, bringing the results closer to the positron emission tomography (PET) gold-standard measurement. Signal-noise separation was successfully optimized, and the new Bayesian-estimation adaptive wavelet threshold denoising method was applied to the noise-reduction analysis of functional MRI with good results.
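The core of a Bayesian adaptive threshold can be sketched with the widely used BayesShrink-style rule, t = σ_n² / σ_x, combined with soft thresholding. This is a generic sketch of that family of estimators; the paper's exact estimator for Rician noise may differ:

```python
import math

def soft_threshold(w, t):
    """Soft-threshold a wavelet coefficient: shrink its magnitude by t,
    clipping to zero, while keeping its sign."""
    return math.copysign(max(abs(w) - t, 0.0), w)

def bayes_threshold(coeffs, sigma_noise):
    """BayesShrink-style adaptive threshold t = sigma_n^2 / sigma_x, with
    the signal std sigma_x estimated from the observed subband variance
    (generic rule, not the paper's specific Rician-noise estimator)."""
    var_y = sum(c * c for c in coeffs) / len(coeffs)
    sigma_x = math.sqrt(max(var_y - sigma_noise ** 2, 1e-12))
    return sigma_noise ** 2 / sigma_x

assert soft_threshold(2.0, 0.5) == 1.5     # large coefficient survives, shrunk
assert soft_threshold(-0.3, 0.5) == 0.0    # small coefficient is zeroed
```

Because the threshold adapts to each subband's estimated signal variance, noisy subbands are thresholded harder than signal-rich ones, unlike a fixed hard or soft threshold.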

7.
Objective: To evaluate a new positron emission tomography (PET) image denoising method, non-local geometric nonlinear diffusion filtering. Methods: First, the geometric nonlinear diffusion coefficient of the PET image is computed; then a non-local neighbourhood weighted average of the diffusion coefficient is taken; finally, the PET image undergoes geometric nonlinear diffusion filtering using the non-locally averaged coefficient. Results: Compared with the original geometric nonlinear diffusion filter, the non-local means filter, and the PURE-LET filter, non-local geometric nonlinear diffusion filtering improves the peak signal-to-noise ratio and structural similarity of PET images and enhances their visual quality. Conclusion: Non-local geometric nonlinear diffusion filtering is an effective PET image denoising method.

8.
Objective: In the energy-conversion step of attenuation correction for positron emission computed tomography/X-ray computed tomography (PET/CT), the bilinear conversion method approximates a nonlinear relationship with piecewise-linear ones. To improve on this, a new energy-conversion method based on support vector regression (SVR) is proposed for attenuation correction, seeking the best mapping between CT numbers and linear attenuation coefficients at 511 keV. Methods: Eleven cylindrical phantoms of different materials were simulated with GATE (Geant4 Application for Tomographic Emission). The CT numbers and 511 keV linear attenuation coefficients of the simulated phantoms were computed and used to train an SVR model relating the two quantities. The SVR method was then compared with the bilinear energy-conversion method commonly used in PET/CT attenuation correction, applying both to GATE-simulated NCAT (NURBS Cardiac Torso) voxel-phantom images to assess the difference in accuracy. Results: The SVR method yielded smaller relative percentage errors of the 511 keV linear attenuation coefficients (3.1% for lung and 1.08% for liver), and PET images attenuation-corrected with SVR had the lowest MSE (176.9230) and the highest PSNR and AG (31.8621 and 7.9083). This indicates that PET images attenuation-corrected by SVR are closer to the static reference images than those corrected by bilinear conversion. Conclusion: The SVR-based PET/CT attenuation-correction method performs better, fitting the mapping between CT numbers and 511 keV linear attenuation coefficients more closely, thereby improving attenuation correction and the quantitative accuracy of PET/CT images and supporting more precise clinical diagnosis.
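The baseline that the SVR model replaces can be sketched as the conventional bilinear conversion: one linear segment below 0 HU and a second (bone) slope above. The attenuation coefficients and the bone slope below are illustrative round numbers (in cm⁻¹), not the paper's calibration:

```python
def bilinear_mu(hu, mu_water=0.096, mu_bone=0.172):
    """Conventional bilinear CT-to-511 keV conversion (the method SVR is
    compared against): below 0 HU, scale the water coefficient linearly
    down toward air; above 0 HU, interpolate toward a bone coefficient.
    mu_water, mu_bone and the 1000 HU bone anchor are illustrative."""
    if hu <= 0:
        return mu_water * (1.0 + hu / 1000.0)
    return mu_water + hu * (mu_bone - mu_water) / 1000.0

assert bilinear_mu(0) == 0.096       # water
assert bilinear_mu(-1000) == 0.0     # air
```

An SVR would instead be trained on (HU, μ@511 keV) pairs from the simulated phantom materials and can bend smoothly between these segments, which is where the reported error reduction for lung and liver comes from.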

9.
Non-invasive heart rate variability (HRV) measurement reflects an autonomic state that can be affected by physiological, pathological, and psychological factors. This study examines the distribution of short-term HRV indices within long-term recordings and explores possible age-related changes in normal subjects. Holter recordings of subjects older than 18 years (n=177) from the Normal sub-database of THEW were divided into five age groups (18≤y≤25, n=35; 25<y≤35, n=44; 35<y≤45, n=41; 45<y≤55, n=34; y>55, n=23). Using a 5 min sliding window with a 2.5 min step, the mean RR interval (MRRI), LF/HF, and the short-term fractal scaling exponent (α1) were computed for each window. Over the long-term series, Spearman correlation coefficients were then computed for the pairs MRRI vs LF/HF and MRRI vs α1, and the percentage of subjects with good correlation was tallied within each group. Next, using a normal sleep-wake schedule and sufficient data length as screening criteria, 93 subjects aged 25<y≤65 were selected from the 177 and divided into four age groups at 10-year intervals; for each subject, the window-mean indices (EM_MRRI, EM_LF/HF, and EM_α1) over the 5 min sliding windows (2.5 min step) were computed for each 2 h period. The results show that the percentage of subjects with good correlation remained high (94%–100%) for 18≤y≤55 but dropped sharply after age 55 (MRRI vs LF/HF: 78.26%; MRRI vs α1: 65.22%). For the early-morning period with the lowest EM_MRRI, there were no significant differences in EM_MRRI, EM_LF/HF, or EM_α1 among the age groups (P>0.05), whereas in other periods these parameters could differ significantly. As wearable technology makes long-term heart-rate (RR-interval) series far easier to obtain, these results offer new ideas for extending HRV analysis to long-term series.
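The windowing scheme this study uses (5 min windows advanced in 2.5 min steps over each segment) is easy to sketch; the MRRI computation per window is shown, while LF/HF and α1 would require spectral and detrended-fluctuation analysis beyond this sketch:

```python
def sliding_windows(duration_s, win_s=300, step_s=150):
    """Start times (s) of 5 min windows advanced in 2.5 min steps, as
    used to scan a long RR series for short-term HRV indices."""
    starts, t = [], 0
    while t + win_s <= duration_s:
        starts.append(t)
        t += step_s
    return starts

def mrri(rr_ms):
    """Mean RR interval (MRRI) of one window, in ms."""
    return sum(rr_ms) / len(rr_ms)

# A 2 h period yields (7200 - 300) / 150 + 1 = 47 overlapping windows.
assert len(sliding_windows(7200)) == 47
```

Each window's MRRI, LF/HF, and α1 values would then form the paired long-term sequences whose Spearman correlation the study evaluates per subject.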

10.
Local field potential (LFP) signals evoked in pigeon optic tectum neurons by visual image stimuli were used to reconstruct the stimulating digit-character images. LFP signals evoked by screen-sweep presentation of digit images were recorded with a microelectrode array; amplitude and phase features were extracted via the Fourier transform, a reconstruction model was built with an inverse-filter algorithm to reconstruct the digit images, and the results were evaluated with the cross-correlation coefficient. Under the optimal channel combination, single-factor reconstruction experiments determined a neuronal response latency of 0.01 s, a response duration of 0.55 s, and frequency bands of 1 Hz < f1 < 30 Hz and 140 Hz < f2 < 240 Hz. With each factor at its optimum, the model reconstructed the ten digit-character images (0–9) from eight datasets of four pigeons; the cross-correlation coefficient with the original images exceeded 0.90 in every case, with an overall value of 0.935±0.01. In summary, the neuronal responses evoked by screen-sweep visual presentation of digit images can reconstruct the stimulus images through information accumulation, and the amplitude and phase features of LFP signals represent the visual stimulus images well.

11.
New nonlinear image processing techniques, in particular smoothing based on an understanding of the image, may create computerized tomography (CT) images of good quality using less radiation. Such techniques may be applied before the reconstruction and particularly after it. Current CT scanners use strong linear low-pass filters applied to the CT projections, reducing noise but also deteriorating the resolution of the image. The method in this study was to apply a weak low-pass filter on the projections, to perform the reconstruction, and only then to apply a nonlinear filter on the image. Various kinds of nonlinear filters were investigated based on the fact that the image is approximately piecewise constant. The filters were applied with many values of several parameters, and the effects on spatial resolution and noise reduction were evaluated. The signal-to-noise ratio (SNR) of a high-contrast phantom image processed with the nonlinear filter was compared with the SNR of the phantom images obtained with the built-in CT linear filters in two scanning modes, the normal and the ultra-high-resolution (UHR) modes. It was found that the nonlinear filters improve the SNR of the image, compared to the built-in filters, by about three times for the normal mode and twice for the UHR scanning mode. The most successful filter was also applied to a low-contrast phantom image and likewise appears to yield promising results. These results suggest that applying nonlinear filters to CT images might lead to better image quality than using the current linear filters. (ABSTRACT TRUNCATED AT 250 WORDS)

12.
The basic mathematical problem behind PET is an inverse problem. Due to the inherent ill-posedness of this inverse problem, the reconstructed images will have noise and edge artifacts. A roughness penalty is often imposed on the solution to control noise and stabilize the solution, but the difficulty is to avoid the smoothing of edges. In this paper, we propose two new types of Bayesian one-step-late reconstruction approaches which utilize two different prior regularizations: the mean curvature (MC) diffusion function and the Gauss curvature (GC) diffusion function. As they have been studied in image processing for removing noise, these two prior regularizations encourage preserving the edge while the reconstructed images are smoothed. Moreover, the GC constraint can preserve smaller structures which cannot be preserved by MC. The simulation results show that the proposed algorithms outperform the quadratic function and total variation approaches in terms of preserving the edges during emission reconstruction.

13.
Tumor boundary delineation using positron emission tomography (PET) is a promising tool for radiation therapy applications. In this study we quantify the uncertainties in tumor boundary delineation as a function of reconstruction method, smoothing, and lesion size in head and neck cancer patients using FDG-PET images, and evaluate the dosimetric impact on radiotherapy plans. FDG-PET images were acquired for eight patients with a GE Advance PET scanner. In addition, a 20 cm diameter cylindrical phantom with six FDG-filled spheres with volumes of 1.2 to 26.5 cm3 was imaged. PET emission scans were reconstructed with the OSEM and FBP algorithms with different smoothing parameters. PET-based tumor regions were delineated using an automatic contouring function set at progressively higher threshold contour levels, and the resulting volumes were calculated. CT-based tumor volumes were also contoured by a physician on coregistered PET/CT patient images. The intensity value of the threshold contour level that returns 100% of the actual volume, I(V100), was measured. We generated intensity-modulated radiotherapy (IMRT) plans for an example head and neck patient, treating 66 Gy to CT-based gross disease and 54 Gy to nodal regions at risk, followed by a boost to the FDG-PET-based tumor. The volumes of PET-based tumors are a sensitive function of threshold contour level for all patient and phantom datasets: a 5% change in threshold contour level can translate into a 200% increase in volume. Phantom data indicate that I(V100) can be set as a fraction, f, of the maximum measured uptake. Fractional threshold values in the cylindrical water phantom range from 0.23 to 0.51. Both the fractional threshold and the threshold-volume curve depend on lesion size, with lesions smaller than approximately 5 cm3 displaying a more pronounced sensitivity and larger fractional threshold values. The threshold-volume curves and fractional threshold values also depend on the reconstruction algorithm and smoothing filter, with more smoothing requiring a higher fractional threshold contour level. The threshold contour level affects the tumor size, and therefore the ultimate boost dose that is achievable with IMRT. In an example head and neck IMRT plan, the D95 of the planning target volume decreased from 7770 to 7230 cGy for 42% vs. 55% contour threshold levels. PET-based tumor volumes are strongly affected by the choice of threshold level, which can have a significant dosimetric impact. The appropriate threshold level depends on lesion size and image reconstruction parameters. These effects should be carefully considered when using PET contour and/or volume information for radiotherapy applications.
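The fractional-threshold contouring this study analyzes can be sketched as counting voxels at or above a fraction of the maximum uptake; the toy lesion and voxel size below are invented for illustration, but they reproduce the qualitative finding that a small threshold change can change the delineated volume severalfold:

```python
def threshold_volume(uptake, frac, voxel_cm3=0.01):
    """Volume enclosed by a contour at a fractional threshold of the
    maximum uptake: count voxels at or above frac * max(uptake).
    voxel_cm3 is an illustrative voxel volume."""
    cut = frac * max(uptake)
    return sum(1 for v in uptake if v >= cut) * voxel_cm3

# Toy 'lesion': a small hot core, a warm rim, and a cold background.
img = [10.0] * 5 + [6.0] * 20 + [1.0] * 100
assert threshold_volume(img, 0.7) == 5 * 0.01    # contour catches only the core
assert threshold_volume(img, 0.5) == 25 * 0.01   # 20% lower threshold: 5x volume
```

On real PET data the same sensitivity appears near the lesion edge, which is why the appropriate fraction f must be calibrated against lesion size and reconstruction parameters.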

14.
Three-dimensional filtered backprojection uses filters generally specified in the Fourier domain. Implementing these filters by direct sampling in the Fourier domain produces an artifact in the reconstructed images consisting primarily of a DC shift. This artifact is caused by aliasing of the reconstruction filter. We have developed a filter construction technique using Fourier domain oversampling, which reduces the artifact. A method to construct the filter efficiently without the need to create and store the entire oversampled filter array is also presented. Quantitative accuracy in filtered backprojection is of particular importance in multiple-pass algorithms used to reconstruct data from cylindrical PET scanners. We are able to implement such algorithms without fitting the reprojected views to the scanner data.

15.
Voxel-based estimation of PET images, generally referred to as parametric imaging, can provide invaluable information about the heterogeneity of an imaging agent in a given tissue. Due to the high level of noise in dynamic images, however, the estimated parametric image is often noisy and unreliable. Several approaches have been developed to address this challenge, including spatial noise reduction techniques, cluster analysis, and spatially constrained weighted nonlinear least-squares (SCWNLS) methods. In this study, we develop and test several noise reduction techniques combined with SCWNLS using simulated dynamic PET images. Both spatial smoothing filters and wavelet-based noise reduction techniques are investigated. In addition, 12 different parametric imaging methods are compared using simulated data. With the combination of noise reduction techniques and SCWNLS methods, more accurate parameter estimation can be achieved than with either of the two techniques alone. A less than 10% relative root-mean-square error is achieved with the combined approach in the simulation study. The wavelet-denoising-based approach is less sensitive to noise and provides more accurate parameter estimation at higher noise levels. Further evaluation of the proposed methods is performed using actual small-animal PET datasets. We expect that the proposed method would be useful for cardiac, neurological and oncologic applications.

16.
Objective: To detect neovascularization in fundus images using Gabor filters, helping physicians accurately stage diabetic retinopathy. Methods: Fundus images were pre-processed, Gabor filters with different scale and orientation parameters were applied to the pre-processed images, and for each fixed scale the maximum output over all orientations was taken as the final Gabor filter output. Results: Comparing Gabor filter results across scales shows that small-scale Gabor filters produce strong responses in neovascular regions. Conclusion: The proposed Gabor filtering scheme distinguishes normal vessels from neovascular structures in fundus images well.
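The kernel and the max-over-orientations rule described above can be sketched directly; the scale, wavelength, and orientation-count values are illustrative, not the study's tuned parameters:

```python
import math

def gabor(x, y, sigma, theta, lam):
    """Real part of a 2-D Gabor kernel: a Gaussian envelope of scale
    sigma multiplying a cosine grating of wavelength lam oriented along
    theta. Small sigma (small scale) responds to thin, tortuous
    structures such as new vessels."""
    xr = x * math.cos(theta) + y * math.sin(theta)
    yr = -x * math.sin(theta) + y * math.cos(theta)
    envelope = math.exp(-(xr * xr + yr * yr) / (2.0 * sigma * sigma))
    return envelope * math.cos(2.0 * math.pi * xr / lam)

def max_over_orientations(x, y, sigma, lam, n_theta=8):
    """The selection rule from the study: at a fixed scale, keep the
    maximum filter value over a bank of orientations."""
    return max(gabor(x, y, sigma, k * math.pi / n_theta, lam)
               for k in range(n_theta))

assert gabor(0, 0, sigma=2.0, theta=0.0, lam=4.0) == 1.0  # kernel peak at origin
```

In the full method each oriented kernel is convolved with the image; taking the per-pixel maximum over orientations makes the response insensitive to local vessel direction.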

17.
Previously we have investigated a depth-independent compensation for collimator detector response (CDR) included in the OSEM reconstruction, intended for SPECT images that have been corrected for scatter and septal penetration using convolution-based methods. In this work, the aim was to study how different filtering strategies affect contrast as a function of noise when using Gaussian smoothing filters in combination with the above-described CDR compensation. The evaluation was performed for (123)I dopamine transporter (DAT) SPECT images. Prefiltering with 2D Gaussian filter kernels, where the deterioration in resolution is included in the depth-independent CDR compensation, was compared to conventional postfiltering with 3D Gaussian filter kernels. Images reconstructed without filtering are also included in the comparison. It was found that there is little benefit in noise reduction when using CDR compensation. However, this variant of prefiltering gives consistently higher contrasts as a function of noise than the postfiltering alternative, which could be of interest when using other types of filters with contrast-improving properties.

18.
Emission tomographic image reconstruction is an ill-posed problem: limited and noisy data, together with various image-degrading effects, lead to noisy reconstructions. Explicit regularization, through iterative reconstruction methods, is considered better suited to compensating for reconstruction-based noise. Local smoothing and edge-preserving regularization methods can reduce reconstruction-based noise. However, these methods produce overly smoothed images or blocky artefacts in the final image because they can only exploit local image properties. Recently, non-local regularization techniques have been introduced to overcome these problems by incorporating the geometrical global continuity and connectivity present in the objective image. These techniques can overcome the drawbacks of local regularization methods; however, they also have certain limitations, such as the choice of the regularization function, the neighbourhood size, or the calibration of the several empirical parameters involved. This work compares different local and non-local regularization techniques used in emission tomographic imaging in general, and in emission computed tomography specifically, for improved quality of the resultant images.

19.
Positron emitters are activated by proton beams in proton radiotherapy, and positron emission tomography (PET) images can thus be used for dose verification. Since a PET image is not directly proportional to the delivered radiation dose distribution, predicted PET images are compared to measured PET images, and agreement between the two indicates a successful irradiation. Such predictions are based on Monte Carlo calculations or on a filtering approach which uses a convolution of the planned dose with specific filter functions to estimate the PET activity. In this paper, we describe and evaluate a dose reconstruction method based on PET images which reverses this convolution approach using appropriate deconvolution methods. Deconvolution is an ill-posed inverse problem, and suitable regularization techniques are required in order to guarantee a stable solution. The basic convolution approach is developed for homogeneous media, and additional procedures are necessary to generalize the PET estimation to inhomogeneous media. This generalization formalism is used in our dose deconvolution approach as well. Various simulations demonstrate that the dose reconstruction method is able to reverse the PET estimation method in both homogeneous and inhomogeneous media. Measured PET images, however, are degraded by noise and artifacts, so the dose reconstruction becomes more difficult and the results suffer from artifacts as well. Recently introduced in-room PET scanners allow a decreased delay between irradiation and imaging, so the influence of short-lived positron emitters on the PET images increases considerably. We extended our dose reconstruction method to process PET images which contain several positron emitters, and simulated results are shown.

20.
This article discusses an adaptive filtering technique for reducing speckle in ultrasound medical images using second-order statistics of the speckle pattern. Several region-based adaptive filter techniques have been developed for speckle noise suppression, but there are no specific criteria for selecting the region-growing size in the post-processing of the filter. The size appropriate for one local region may not be appropriate for other regions. Selection of the correct region size involves a trade-off between speckle reduction and edge preservation. Generally, a large region size is used to smooth speckle and a small size to preserve the edges in an image. In this paper, a smoothing procedure combines the first-order statistics of speckle for the homogeneity test with second-order statistics for the selection of filters and the desired region growth. A grey-level co-occurrence matrix (GLCM) is calculated for every region during region contraction and region growing for the second-order statistics. These GLCM features then determine the appropriate filter for region smoothing. The performance of this approach is compared with the aggressive region-growing filter (ARGF) using edge-preservation and speckle-reduction tests. The processed image results show that the proposed method effectively reduces speckle noise and preserves edge details.
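The second-order statistic at the heart of this method, the GLCM, is simple to compute: for a chosen displacement (dx, dy), count how often each ordered pair of grey levels co-occurs in the region. A minimal sketch, with an invented 3x3 tile for demonstration:

```python
def glcm(img, dx, dy, levels):
    """Grey-level co-occurrence matrix: counts of grey-level pairs
    (img[r][c], img[r+dy][c+dx]) over the region. Texture features
    derived from this matrix drive the filter selection and the
    region growing/contraction described in the abstract."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(img), len(img[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r][c]][img[r2][c2]] += 1
    return m

tile = [[0, 0, 1],
        [0, 0, 1],
        [2, 2, 2]]
g = glcm(tile, dx=1, dy=0, levels=3)
assert g[0][0] == 2   # two horizontal (0,0) pairs
assert g[2][2] == 2   # two horizontal (2,2) pairs
```

Features such as contrast or homogeneity summed over `g` then characterize each candidate region's texture during region growing, without changing the filter itself.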
