Similar Literature
20 similar records found (search time: 109 ms)
1.
Bayesian rough sets handle noisy data well and classify lung-tumor CT images accurately, providing precise classification results for image denoising. On this basis, an anti-noise algorithm for lung-tumor CT images based on Bayesian rough sets is designed. Lung CT images are first classified with a Bayesian rough-set model: the rough-set attributes and decision rules are reduced, and the decision rules are used to predict the class of each CT image. For the images containing tumors, a Laplacian model is built for the noisy wavelet coefficients; Bayesian maximum a posteriori (MAP) estimation is used to estimate the probability density of the wavelet coefficients, and the noise variance and the standard deviation of the child wavelet coefficients are computed, which makes the denoising adaptive. The MAP estimate obtained from the coefficient probability density is then inverse wavelet transformed, yielding adaptive denoising of lung-tumor CT images. The results show that the algorithm removes noise from lung-tumor CT images effectively, has strong noise robustness, preserves image detail well, and gives good visual quality.
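The MAP shrinkage step lends itself to a short illustration. Below is a minimal Python sketch (assuming numpy and pywt are available) of a BayesShrink-style MAP soft-thresholding rule derived from a Laplacian coefficient prior; the function name, wavelet choice, and threshold form are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
import pywt

def bayes_map_denoise(image, wavelet="db4", levels=3):
    """BayesShrink-style MAP denoising: Laplacian prior -> soft threshold."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    # Robust noise estimate from the finest diagonal subband (median / 0.6745).
    sigma_n = np.median(np.abs(coeffs[-1][2])) / 0.6745
    denoised = [coeffs[0]]
    for detail in coeffs[1:]:
        shrunk = []
        for band in detail:
            # Signal std estimated as sqrt(max(var(band) - noise variance, 0)).
            sigma_x = np.sqrt(max(np.var(band) - sigma_n**2, 1e-12))
            threshold = sigma_n**2 / sigma_x      # MAP / BayesShrink threshold
            shrunk.append(pywt.threshold(band, threshold, mode="soft"))
        denoised.append(tuple(shrunk))
    return pywt.waverec2(denoised, wavelet)
```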

2.
A "lossy-lossless" compression scheme for medical image sequences (cited by 2; self-citations: 0; other citations: 2)
Tailored to the characteristics of medical image sequences, a difference-prediction method based on image registration is proposed to reduce the redundancy between consecutive images. The predicted difference images are then coded with EBCOT on a reversible integer wavelet transform, achieving "lossy-lossless" compression of "non-essential-essential" regions. Experiments on real medical (CT) images show that the method offers very good compression performance and can effectively meet the compression requirements of modern medical imaging.

3.
A block-based image compression coding method built on the integer lifting wavelet transform is proposed. The integer lifting wavelet transform is fast to compute, works for images of arbitrary size, performs the transform in place, and saves memory; moreover, the same lifting algorithm supports both lossy and lossless compression, which makes it well suited to telemedicine and medical image compression systems. The block-based coding method not only allows bit-rate control but also provides SNR (signal-to-noise ratio) scalability, supporting progressive image transmission.
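As a concrete illustration of a reversible integer lifting step, here is a minimal Python sketch (numpy assumed) of the CDF 5/3 integer lifting transform applied to a 1-D, even-length signal with periodic boundary handling; the paper's block partitioning, rate control, and 2-D extension are not shown.

```python
import numpy as np

def lift53_forward(x):
    """One level of the reversible integer 5/3 lifting transform (1-D)."""
    x = x.astype(np.int64)
    s, d = x[0::2].copy(), x[1::2].copy()      # even / odd samples
    d -= (s + np.roll(s, -1)) // 2             # predict step -> detail coeffs
    s += (d + np.roll(d, 1) + 2) // 4          # update step -> approximation
    return s, d

def lift53_inverse(s, d):
    """Exactly undo the lifting steps in reverse order (lossless)."""
    s = s - (d + np.roll(d, 1) + 2) // 4
    d = d + (s + np.roll(s, -1)) // 2
    x = np.empty(s.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = s, d
    return x
```

Because every step is an integer operation that is undone exactly, the inverse reconstructs the input bit for bit, which is what makes the same transform usable for both lossy and lossless coding.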

4.
Taking cranial CT images as the study object, a wavelet-based algorithm is proposed for automatically marking the corresponding feature points required by non-rigid registration. The algorithm fully exploits the texture of each pixel and its neighborhood: a multiresolution wavelet feature vector is built for every pixel via the wavelet transform, and the difference between wavelet feature vectors is used as the matching criterion to mark the corresponding feature points in the target image. A series of experiments shows that this wavelet-based algorithm can accurately mark the corresponding feature points needed for registration in the target image and can serve as one of the inputs for automatic landmark marking in feature-based non-rigid registration.
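One plausible way to realize the per-pixel multiresolution feature vector described above is with the undecimated (stationary) wavelet transform, so every pixel keeps a coefficient at each scale. The sketch below (numpy and pywt assumed; the function names and the Euclidean matching cost are my own illustrative choices, not the authors') is only a rough reading of the abstract.

```python
import numpy as np
import pywt

def wavelet_feature_map(image, wavelet="haar", level=2):
    # pywt.swt2 requires each image side to be divisible by 2**level.
    coeffs = pywt.swt2(image.astype(float), wavelet, level=level)
    # Stack the |LH|, |HL|, |HH| responses of every level per pixel.
    bands = [np.abs(band) for _, details in coeffs for band in details]
    return np.stack(bands, axis=-1)            # shape: (H, W, 3 * level)

def feature_distance(feat_a, point_a, feat_b, point_b):
    # Dissimilarity between two candidate landmarks = distance between
    # their multiresolution wavelet feature vectors.
    return np.linalg.norm(feat_a[point_a] - feat_b[point_b])
```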

5.
Methods for enhancing CT images of early lung cancer are explored. Wavelet-based detail enhancement is combined, according to image characteristics, with contrast-limited adaptive histogram equalization and/or adaptive filtering for noise reduction, and applied to 50 CT images from 10 patients with early peripheral lung cancer. After processing, the edges of pulmonary nodules are sharp, their internal density is clear, and surrounding signs (such as pleural indentation) are well depicted; organ boundaries are clear with distinct layering, and the lung markings are more visible. The improvement is especially obvious for low-contrast images. Wavelet-based enhancement combined with other preprocessing methods can therefore provide a useful reference for similar studies on early-lung-cancer CT images.
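A minimal Python sketch of this kind of pipeline (numpy, pywt, and scikit-image assumed; the gain, wavelet, and clip limit are illustrative values, not the study's parameters): amplify the wavelet detail subbands, reconstruct, then apply contrast-limited adaptive histogram equalization.

```python
import numpy as np
import pywt
from skimage import exposure

def enhance_ct_slice(slice_img, wavelet="db2", level=2, gain=1.5, clip=0.01):
    img = slice_img.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-12)        # scale to [0, 1]
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    # Amplify detail coefficients to sharpen nodule edges and lung texture.
    coeffs = [coeffs[0]] + [tuple(gain * b for b in d) for d in coeffs[1:]]
    enhanced = np.clip(pywt.waverec2(coeffs, wavelet), 0.0, 1.0)
    # Contrast-limited adaptive histogram equalization on the result.
    return exposure.equalize_adapthist(enhanced, clip_limit=clip)
```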

6.
Registration of planning CT images with cone-beam CT (CBCT) images is an important part of realizing adaptive radiotherapy in CBCT image-guided radiation systems. To improve the accuracy and speed of deformable registration in such systems, a method based on the orthogonal wavelet transform is proposed: the multiresolution property of the orthogonal wavelet transform is used to describe the global and local deformations between the planning CT and CBCT images, and an energy functional derived from the Navier partial differential equation is minimized to estimate the wavelet coefficients. Experiments show that, when used in a CBCT-based image-guided radiation system, the method registers daily CBCT images with the planning CT accurately and quickly, and it can also be used for automatic organ segmentation in the treatment planning system, effectively guiding adaptive therapy.

7.
An ultrasound image compression algorithm based on the lifting-scheme integer wavelet transform (cited by 1; self-citations: 0; other citations: 1)
This paper presents a medical ultrasound image compression algorithm based on the lifting-scheme integer wavelet transform and an improved SPIHT (set partitioning in hierarchical trees) coder. The choice of compression target and the wavelet transform fully account for the resolution characteristics of ultrasound scan-line images. Compared with the standard SPIHT coder based on the Mallat wavelet transform, the proposed algorithm achieves at least the same compression ratio and reconstructed peak signal-to-noise ratio while using less than 40% of the computation time and far less memory, making it better suited to real-time image compression.

8.
Optimal parameters for wavelet-based CT/PET image fusion (cited by 1; self-citations: 1; other citations: 0)
To improve the performance of wavelet-based image fusion, a method for determining the best wavelet basis and decomposition level is proposed for a fixed fusion rule. Starting from image entropy, the decomposition level best suited to each candidate wavelet basis is chosen by comparing how closely the entropy difference of the low-frequency subband images matches the entropy difference of the original images; with the decomposition level fixed, the best wavelet basis is then selected using fusion-quality evaluation metrics. Compared with forming a closed-loop system that determines the wavelet parameters by repeatedly evaluating the fusion result, this method greatly simplifies the selection procedure. Applied to CT/PET image fusion, it yields good fusion results. The experiments show that the method is simple and practical and offers useful guidance for choosing wavelet parameters in wavelet-based image fusion.
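The level-selection idea can be sketched as follows; this is only my reading of the abstract (numpy and pywt assumed, all names illustrative): for each candidate wavelet, pick the decomposition level whose low-frequency subbands best preserve the entropy difference between the two source images, then compare wavelets with a fusion-quality metric.

```python
import numpy as np
import pywt

def image_entropy(img, bins=256):
    hist, _ = np.histogram(img, bins=bins, density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    return -np.sum(p * np.log2(p))

def best_level(ct, pet, wavelet, max_level=5):
    # Entropy gap of the original CT and PET images is the reference value.
    target = abs(image_entropy(ct) - image_entropy(pet))
    best, best_gap = 1, np.inf
    for level in range(1, max_level + 1):
        ll_ct = pywt.wavedec2(ct, wavelet, level=level)[0]
        ll_pet = pywt.wavedec2(pet, wavelet, level=level)[0]
        gap = abs(abs(image_entropy(ll_ct) - image_entropy(ll_pet)) - target)
        if gap < best_gap:
            best, best_gap = level, gap
    return best
```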

9.
BACKGROUND: Digital medical images must be of high quality and high resolution, so the data volume is very large, which hinders the operation of picture archiving and communication systems (PACS) and the realization of digital hospitals and telemedicine. Image compression is therefore a key problem for PACS. OBJECTIVE: To analyze the principle of the embedded zerotree wavelet (EZW) coding algorithm and implement it for compressing digital medical images so that the compressed images still meet transmission and diagnostic requirements. METHODS: The embedded zerotree wavelet coding algorithm was implemented, and the choice of wavelet basis and number of decomposition levels was investigated. RESULTS AND CONCLUSION: Compressing medical images with a biorthogonal wavelet basis and a 4-level wavelet transform achieved a high peak signal-to-noise ratio and good compression results.
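For reference, the transform stage reported as best in this study (a biorthogonal basis with a 4-level decomposition) can be set up in a few lines with pywt; the zerotree coding itself is not reproduced here, and bior4.4 is only one possible biorthogonal basis, not necessarily the one used in the study.

```python
import pywt

def decompose_for_ezw(image, wavelet="bior4.4", level=4):
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # Pack the subbands into one array whose layout preserves the
    # parent-child relations that zerotree coders such as EZW exploit.
    coeff_arr, coeff_slices = pywt.coeffs_to_array(coeffs)
    return coeff_arr, coeff_slices
```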

10.
A new MR image compression algorithm based on ISA-DWT (cited by 2; self-citations: 0; other citations: 2)
This study proposes a new magnetic resonance (MR) image compression algorithm based on ISA-DWT (the integer-to-integer shape-adaptive discrete wavelet transform). The transformed coefficients are coded with a modified SPIHT algorithm adapted to the shape-adaptive discrete wavelet transform, and context-adaptive arithmetic coding is added to improve compression performance. The algorithm completely separates the compressed bitstreams generated for the foreground and background regions, and the number of coefficients in the wavelet domain equals the number of pixels in the image domain. Experimental results show that, for MR images, the compression performance of the proposed algorithm is clearly better than that of the maximum-shift method in JPEG2000.

11.
A lossless compression method for medical images based on stochastic regression theory is described in detail. Its performance was compared with that of the DPCM method on ten chest X-ray images, and the experimental results show that the method is very effective for lossless compression of medical images.

12.
Advances in lossless and lossy compression techniques for medical images (cited by 5; self-citations: 0; other citations: 5)
This paper analyzes and compares the concepts and applications of lossless and lossy compression of medical images, briefly reviews several lossless compression methods developed in recent years, and focuses on wavelet-based and fractal-based lossy image compression techniques. In medical image compression, by effectively combining lossless and lossy techniques, a relatively high compression ratio can be achieved while preserving the image fidelity required for medical use.

13.
In teleradiology, image contents may be altered by noisy communication channels or hacker manipulation. Medical image data are very sensitive and cannot tolerate any illegal change; analysis based on an illegally altered image could lead to a wrong medical decision. Digital watermarking can be used to authenticate teleradiology images and to detect, as well as recover from, illegal changes made to them. Watermarking medical images with heavy-payload watermarks causes perceptual degradation, which directly affects medical diagnosis. To keep the perceptual and diagnostic quality of the image within acceptable standards during watermarking, the watermark should be losslessly compressed; lossless compression reduces the watermark payload without data loss. This paper focuses on watermarking ultrasound medical images with Lempel-Ziv-Welch (LZW) lossless-compressed watermarks. In this work, the watermark is the combination of a defined region of interest (ROI) and the image-watermarking secret key. The performance of LZW compression was compared with that of other conventional compression methods in terms of compression ratio; LZW performed better and was therefore used for lossless watermark compression in ultrasound medical image watermarking. Tabulated results show the reduction in watermark bits and image watermarking with effective tamper detection and lossless recovery.
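Since the paper's key step is LZW lossless compression of the watermark payload, here is a minimal, self-contained LZW compressor sketch in Python; the ROI/key construction and the embedding itself are not shown, and the function name is illustrative.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Classic LZW: emit dictionary codes, growing the dictionary as we go."""
    table = {bytes([i]): i for i in range(256)}   # initial single-byte entries
    next_code, phrase, out = 256, b"", []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in table:
            phrase = candidate
        else:
            out.append(table[phrase])
            table[candidate] = next_code          # add new phrase to dictionary
            next_code += 1
            phrase = bytes([byte])
    if phrase:
        out.append(table[phrase])
    return out
```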

14.
To address the low compression efficiency of lossless compression and the low image quality of general near-lossless compression, this paper proposes a novel near-lossless compression algorithm based on adaptive spatial prediction for medical sequence images intended for diagnostic use. The method employs adaptive, block-size-based spatial prediction to predict blocks directly in the spatial domain and applies a lossless Hadamard transform before quantization to improve the quality of the reconstructed images. Block-based prediction breaks the pixel-neighborhood constraint and takes full advantage of the local spatial correlations found in medical images, while the adaptive block size guarantees a more rational division of the image and better use of its local structure. The results indicate that the proposed algorithm compresses medical images efficiently and produces a better peak signal-to-noise ratio (PSNR) than other near-lossless methods under the same pre-defined distortion.
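A rough sketch of the idea of block-wise spatial prediction followed by a Hadamard transform of the residual (numpy assumed). The adaptive block-size selection, the quantizer, and the paper's reversible ("lossless") Hadamard variant are not reproduced; the DC predictor and the 4x4 block size are illustrative assumptions.

```python
import numpy as np

# Unnormalized 4x4 Hadamard matrix (integer-valued, separable transform).
H4 = np.array([[1,  1,  1,  1],
               [1,  1, -1, -1],
               [1, -1, -1,  1],
               [1, -1,  1, -1]], dtype=np.int64)

def predict_and_transform(block, left_col, top_row):
    # Simple DC prediction from causal neighbours (mean of the column to the
    # left and the row above), then a 4x4 Hadamard transform of the residual.
    pred = int(np.round((left_col.mean() + top_row.mean()) / 2))
    residual = block.astype(np.int64) - pred
    return H4 @ residual @ H4.T, pred
```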

15.
Image compression plays a crucial role in medical imaging, allowing efficient manipulation, storage, and transmission of binary, grey-scale, or colour images. Nevertheless, in medical applications the need to preserve the diagnostic validity of the image requires lossless compression methods, which yield low compression factors. In this paper, a novel near-lossless compression algorithm from projections is proposed that almost eliminates both redundant information and noise from a greyscale image while retaining all relevant structures and producing high compression factors. The algorithm is tested on experimental medical greyscale images from different modalities and different body regions, and the results are reported.

16.
The use of lossy compression in medical imaging is controversial, although it is unavoidable for reducing the large data volumes involved. In contrast with lossy compression, lossless compression does not impair image quality. Extending our previous studies, we evaluated virtual 3-dimensional microscopy using JPEG2000 whole-slide images of gastric biopsy specimens with or without Helicobacter pylori gastritis, using lossless compression (1:1) or lossy compression at different levels: 5:1, 10:1, and 20:1. The virtual slides were diagnosed in a blinded manner by 3 pathologists using the updated Sydney classification. The results showed no significant differences in the diagnosis of H pylori between the different compression levels in virtual microscopy. We therefore assume that lossless compression is not required for diagnostic virtual microscopy. The limits of lossy compression in virtual microscopy without loss of diagnostic quality still need to be determined; analogous to the processes in radiology, recommendations for the use of lossy compression in diagnostic virtual microscopy will have to be worked out by pathology societies.

17.
Lossless image coding is important for medical image compression because any information loss or error introduced by the compression process could affect clinical diagnostic decisions. This paper proposes a lossless compression algorithm for medical images with high spatial correlation. The algorithm uses a multilevel decomposition scheme in conjunction with prediction and classification. An image is divided into four subimages by subsampling; one subimage is used as a reference to predict the other three. The prediction errors of the three subimages are classified into two or three groups according to the characteristics of the reference subimage, and the classified prediction errors are encoded by entropy coding with corresponding code words. These subsampling and classified entropy-coding steps are repeated on the reference subimage at each level, and the reference subimage of the last level is encoded by conventional differential pulse code modulation and entropy coding. To verify the proposed algorithm, it was applied to several chest radiographs and computed tomography and magnetic resonance images, and the results were compared with those of well-known lossless compression algorithms.
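The subsample-and-predict decomposition can be illustrated directly (numpy assumed; the classification and entropy-coding stages are omitted, and the names are illustrative, not the paper's).

```python
import numpy as np

def subsample_predict(img):
    """Split into four subimages; predict three of them from the reference."""
    img = img.astype(np.int32)
    ref = img[0::2, 0::2]                  # reference subimage
    others = (img[0::2, 1::2], img[1::2, 0::2], img[1::2, 1::2])
    # Predict each co-located pixel of the other subimages from the reference
    # and keep only the integer residuals, which the paper entropy-codes.
    residuals = [o - ref[:o.shape[0], :o.shape[1]] for o in others]
    return ref, residuals
```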

18.
Taking medical X-ray greyscale images as an example, this paper describes the application of a lossless compression algorithm based on pixel R, G, B values in an X-ray image analysis system. By designing and describing each stage, including the characteristics of X-ray greyscale images, the pixel R, G, B values, and the compression itself, the complete implementation process is presented.

19.
Medical image compression is one of the growing research fields in biomedical applications. Most medical images need lossless compression, because every pixel carries valuable information. With the wide pervasiveness of medical imaging applications in health-care settings and the increased interest in telemedicine technologies, it has become essential to reduce both the storage and the transmission bandwidth required for archiving and communicating the related data, preferably with lossless compression methods. Furthermore, providing random access as well as resolution and quality scalability to the compressed data has become of great utility; random access refers to the ability to decode any section of the compressed image without decoding the entire data set. The proposed system implements a lossless codec using an entropy coder: 3D medical images are decomposed into 2D slices and subjected to a 2D stationary wavelet transform (SWT), and the resulting coefficients are compressed in parallel using embedded block coding with optimized truncation (EBCOT) of the embedded bit stream. The bit streams are decoded and the images reconstructed with the inverse SWT. Finally, the compression ratio (CR) is evaluated to demonstrate the efficiency of the proposal. As an enhancement, the system reduces computation time by parallelizing the arithmetic-coding stage, since it operates on multiple sub-slices.
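A minimal sketch (numpy and pywt assumed) of decomposing the 2D slices of a volume with the stationary wavelet transform in parallel worker processes; the EBCOT coder and the paper's parallelism at the arithmetic-coding stage are not reproduced, so this only mirrors the slice-parallel structure described above.

```python
import numpy as np
import pywt
from concurrent.futures import ProcessPoolExecutor

def swt_slice(slice_img, wavelet="haar", level=2):
    # pywt.swt2 requires each slice side to be divisible by 2**level.
    return pywt.swt2(slice_img.astype(float), wavelet, level=level)

def decompose_volume(volume, workers=4):
    # One task per 2D slice; coefficients would then go to the entropy coder.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(swt_slice, list(volume)))
```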

20.
This article presents lossless compression of volumetric medical images with an improved three-dimensional (3-D) set partitioning in hierarchical trees (SPIHT) algorithm that searches on asymmetric trees. The tree structure links wavelet coefficients produced by 3-D reversible integer wavelet transforms. Experiments show that lossless compression with the improved 3-D SPIHT gives an average improvement of about 42% over two-dimensional techniques and is superior to previously reported 3-D techniques. In addition, different numbers of decomposition levels can easily be applied in the transaxial and axial dimensions, a desirable feature when the size of the coding unit (a group of slices) is limited.
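The option of using different numbers of decomposition levels in the transaxial plane and along the axial direction can be sketched as follows (numpy and pywt assumed); this is a plain floating-point wavelet decomposition for illustration only, not the paper's reversible integer transform or its asymmetric-tree coder.

```python
import numpy as np
import pywt

def hybrid_decompose(volume, wavelet="haar", level_xy=3, level_z=1):
    # 2-D transform within each transaxial slice, packed into a flat array...
    slices = [pywt.coeffs_to_array(pywt.wavedec2(s, wavelet, level=level_xy))[0]
              for s in volume]
    stacked = np.stack(slices)                     # shape: (Z, H', W')
    # ...then a 1-D transform along the axial (slice) direction with its own,
    # typically smaller, number of levels.
    return pywt.wavedec(stacked, wavelet, level=level_z, axis=0)
```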

