Similar Articles
20 similar articles found (search time: 15 ms)
1.
With the development of communication technology, the applications and services of health telematics are growing. Given the increasingly important role played by digital medical imaging in modern health care, large amounts of image data must be economically stored and/or transmitted, so there is a need for image compression systems that combine a high compression ratio with preservation of critical information. During the past decade, wavelets have been a significant development in the field of image compression. In this paper, a hybrid scheme using both the discrete wavelet transform (DWT) and the discrete cosine transform (DCT) for medical image compression is presented. The DCT is applied to the DWT details, which generally have zero mean and small variance, thereby achieving better compression than either technique achieves alone. The results of the hybrid scheme are compared with the JPEG and set partitioning in hierarchical trees (SPIHT) coders, and the proposed scheme is found to perform better.
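The core idea of this hybrid scheme — take a wavelet transform, then apply the DCT to the near-zero-mean detail subbands — can be sketched as follows. This is an illustrative one-level Haar/DCT sketch on synthetic data, not the authors' filter bank or entropy coder:

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar transform: returns LL, LH, HL, HH subbands."""
    a = (img[0::2, :] + img[1::2, :]) / np.sqrt(2)   # row lowpass
    d = (img[0::2, :] - img[1::2, :]) / np.sqrt(2)   # row highpass
    LL = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    LH = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    HL = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    HH = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return LL, LH, HL, HH

def dct_matrix(n):
    """Orthonormal DCT-II matrix (2-D DCT is C @ X @ C.T)."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0, :] /= np.sqrt(2)
    return C

rng = np.random.default_rng(0)
img = rng.normal(size=(8, 8))                # stand-in for a medical image tile
LL, LH, HL, HH = haar2d(img)
C = dct_matrix(LH.shape[0])
LH_dct = C @ LH @ C.T                        # DCT applied to a detail subband
# Both transforms are orthonormal, so energy is preserved (Parseval)
# and all loss comes only from the subsequent quantization step:
assert np.isclose(np.sum(LH_dct**2), np.sum(LH**2))
```

In the paper's scheme, quantization and coding of the DCT-transformed details would follow this step.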

3.
N. C. Phelan, J. T. Ennis. Medical Physics, 1999, 26(8): 1607-1611
Image compression is fundamental to the efficient and cost-effective use of digital medical imaging technology and applications. Wavelet transform techniques currently provide the most promising approach to high-quality image compression, which is essential for diagnostic medical applications. A novel approach to image compression based on the wavelet decomposition has been developed which utilizes the shape or morphology of wavelet transform coefficients in the wavelet domain to isolate and retain significant coefficients corresponding to image structure and features. The remaining coefficients are further compressed using a combination of run-length and Huffman coding. The technique has been implemented and applied to full 16-bit medical image data for a range of compression ratios. The objective peak signal-to-noise ratio performance of the compression technique was analyzed. Results indicate that good reconstructed image quality can be achieved at compression ratios of up to 15:1 for the image types studied. This technique represents an effective approach to the compression of diagnostic medical images and is worthy of further, more thorough evaluation of diagnostic quality and accuracy in a clinical setting.

4.
A Wavelet-Network-Based Compression Algorithm for Ambulatory ECG Data (cited by 9)
This paper studies the nonstationary characteristics of ambulatory ECG signals and the diagnostic information carried by the ambulatory electrocardiogram (ECG), approaching data representation and compression from a time-series modeling perspective. A wavelet network (WN) is adopted as the data representation model, and a wavelet-network compression algorithm for ambulatory ECG data is proposed. The algorithm partitions the raw ECG data into frames in real time and maps each frame onto the network parameters of a wavelet network, which serve as the reconstruction information for the original data. The paper describes the data representation principle of the wavelet network and the frame-based compression algorithm in detail, and presents and discusses experimental results on compression and reconstruction of ambulatory ECG data.

5.
Compression of the electrocardiogram (ECG) is necessary for efficient storage and transmission of digitized ECG signals. The discrete wavelet transform (DWT) has recently emerged as a powerful technique for ECG signal compression due to its multi-resolution signal decomposition and locality properties. This paper presents an ECG compressor based on the selection of optimum threshold levels of DWT coefficients in different subbands that achieve maximum data volume reduction while preserving the significant signal morphology features upon reconstruction. First, the ECG is wavelet transformed into m subbands, and the wavelet coefficients of each subband are thresholded using an optimal threshold level. Thresholding removes excessively small features and replaces them with zeroes. The threshold levels are defined for each signal so that the bit rate is minimized for a target distortion or, alternatively, the distortion is minimized for a target compression ratio. After thresholding, the resulting significant wavelet coefficients are coded using the modified embedded zero-tree wavelet (MEZW) coding technique. To assess the performance of the proposed compressor, records from the MIT-BIH Arrhythmia Database were compressed at different distortion levels, measured by the percentage root-mean-square difference (PRD), and compression ratios (CR). The method achieves good CR values with excellent reconstruction quality that compares favourably with various classical and state-of-the-art ECG compressors. Finally, the proposed method is flexible in controlling the quality of the reconstructed signals and the volume of the compressed signals by establishing a target PRD or a target CR a priori.
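The subband-thresholding step described above can be illustrated with a minimal sketch: a one-level Haar transform of a synthetic signal, hard thresholding of the detail band, and the PRD distortion measure. The actual compressor optimizes per-subband thresholds and applies MEZW coding, neither of which is reproduced here:

```python
import numpy as np

def haar_fwd(x):
    """One-level orthonormal Haar DWT of an even-length signal."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_inv(a, d):
    """Exact inverse of haar_fwd."""
    x = np.empty(2 * a.size)
    x[0::2], x[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return x

t = np.linspace(0, 1, 512, endpoint=False)
ecg = np.sin(2 * np.pi * 5 * t) + 0.2 * np.sin(2 * np.pi * 25 * t)  # toy signal

a, d = haar_fwd(ecg)
thr = 0.05
d_thr = np.where(np.abs(d) > thr, d, 0.0)   # zero excessively small details
rec = haar_inv(a, d_thr)

# Percentage root-mean-square difference between original and reconstruction
prd = 100 * np.sqrt(np.sum((ecg - rec)**2) / np.sum(ecg**2))
frac_zeroed = np.mean(d_thr == 0)           # zeroed coefficients compress well
```

Raising `thr` trades a higher fraction of zeroed (cheaply codable) coefficients against a higher PRD, which is exactly the bit-rate/distortion trade-off the paper's threshold selection automates.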

6.
An image-block compression coding method based on the integer lifting wavelet transform is proposed. The integer lifting scheme is computationally fast, supports wavelet transforms of arbitrary image size, performs the transform in place, and saves memory; moreover, the same lifting algorithm supports both lossy and lossless compression, making it well suited to telemedicine systems and medical image compression systems. The block-based coding method not only enables bit-rate control but also provides SNR (signal-to-noise ratio) scalability, supporting progressive image transmission.
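The key property claimed above — an integer lifting transform that is exactly invertible, enabling truly lossless compression — can be demonstrated with the simplest case, the integer Haar (S-) transform; the paper's actual lifting filters and block coder are not reproduced here:

```python
import numpy as np

def int_haar_fwd(x):
    """Integer Haar via lifting: predict step d = a - b, update step
    s = b + floor(d/2), so s equals floor((a + b)/2)."""
    a, b = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    d = a - b                 # detail (prediction)
    s = b + (d >> 1)          # approximation (update); >> is floor division by 2
    return s, d

def int_haar_inv(s, d):
    """Exact integer inverse: undo the update, then the prediction."""
    b = s - (d >> 1)
    a = d + b
    x = np.empty(2 * s.size, dtype=np.int64)
    x[0::2], x[1::2] = a, b
    return x

x = np.array([1023, 1020, 7, -3, 500, 501, 0, 4095], dtype=np.int64)
s, d = int_haar_fwd(x)
assert np.array_equal(int_haar_inv(s, d), x)   # perfect (lossless) reconstruction
```

Because every lifting step is an integer operation that is undone exactly, no rounding error accumulates; lossy operation is obtained only by quantizing `s` and `d` before coding.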

7.
Lossy image compression is thought to be a necessity as radiology moves toward a filmless environment. Compression algorithms based on the discrete cosine transform (DCT) are limited by the infinite support of the cosine basis function. Wavelets, basis functions that have compact or nearly compact support, are mathematically better suited for decorrelating medical image data. A lossy compression algorithm based on semiorthogonal cubic spline wavelets has been implemented and tested on six different image modalities (magnetic resonance, x-ray computed tomography, single photon emission tomography, digital fluoroscopy, computed radiography, and ultrasound). The fidelity of the reconstructed wavelet images was compared to images compressed with a DCT algorithm for compression ratios of up to 40:1. The wavelet algorithm was found to have generally lower average error metrics and higher peak signal-to-noise ratios than the DCT algorithm.

8.
Medical images involve large volumes of data, so achieving high compression while preserving high fidelity is the primary concern in medical image compression. Replacing the conventional wavelet transform with a second-generation, integer-implemented lifting wavelet transform preserves the invertibility and wavelet properties of the image and enables truly lossless compression. Experimental results show that the set partitioning in hierarchical trees (SPIHT) algorithm built on this transform compresses medical images more smoothly, with good visual quality, high compression performance, and an improved PSNR of the reconstructed images.

9.
10.
The Monte Carlo dose calculation method works by simulating individual energetic photons or electrons as they traverse a digital representation of the patient anatomy. However, Monte Carlo results fluctuate until a large number of particles are simulated. We propose wavelet threshold de-noising as a postprocessing step to accelerate convergence of Monte Carlo dose calculations. A sampled rough function (such as Monte Carlo noise) gives wavelet transform coefficients which are more nearly equal in amplitude than those of a sampled smooth function. Wavelet hard-threshold de-noising sets to zero those wavelet coefficients which fall below a threshold; the image is then reconstructed. We implemented the computationally efficient 9,7-biorthogonal filters in the C language. Transform results were averaged over transform origin selections to reduce artifacts. A method for selecting best threshold values is described. The algorithm requires about 336 floating point arithmetic operations per dose grid point. We applied wavelet threshold de-noising to two two-dimensional dose distributions: a dose distribution generated by 10 MeV electrons incident on a water phantom with a step heterogeneity, and a slice from a lung heterogeneity phantom. Dose distributions were simulated using the Integrated Tiger Series Monte Carlo code. We studied threshold selection, resulting dose image smoothness, and resulting dose image accuracy as a function of the number of source particles. For both phantoms, with a suitable value of the threshold parameter, voxel-to-voxel noise was suppressed with little introduction of bias. The roughness of wavelet de-noised dose distributions (according to a Laplacian metric) was nearly independent of the number of source electrons, though the accuracy of the de-noised dose image improved with increasing numbers of source electrons. We conclude that wavelet shrinkage de-noising is a promising method for effectively accelerating Monte Carlo dose calculations by factors of 2 or more.
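The hard-threshold de-noising idea can be sketched minimally: add noise to a smooth profile, zero the small detail coefficients, and compare the error before and after. This uses a one-level Haar transform on synthetic data rather than the paper's origin-averaged 9,7-biorthogonal filters:

```python
import numpy as np

def haar_fwd(x):
    """One-level orthonormal Haar DWT of an even-length signal."""
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def haar_inv(a, d):
    """Exact inverse of haar_fwd."""
    x = np.empty(2 * a.size)
    x[0::2], x[1::2] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
    return x

rng = np.random.default_rng(42)
t = np.linspace(-3, 3, 1024)
dose = np.exp(-t**2)                             # smooth "true" dose profile
noisy = dose + 0.05 * rng.normal(size=t.size)    # Monte Carlo-like voxel noise

a, d = haar_fwd(noisy)
thr = 0.1                                        # ~2 sigma of detail-band noise
denoised = haar_inv(a, np.where(np.abs(d) > thr, d, 0.0))

rms_before = np.sqrt(np.mean((noisy - dose)**2))
rms_after = np.sqrt(np.mean((denoised - dose)**2))
assert rms_after < rms_before    # thresholding suppressed voxel-to-voxel noise
```

The smooth dose signal contributes almost nothing to the detail band here, so zeroing sub-threshold details removes mostly noise and introduces little bias, mirroring the paper's observation.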

11.
This presentation focuses on the quantitative comparison of three lossy compression methods applied to a variety of 12-bit medical images. One Joint Photographic Experts Group (JPEG) algorithm and two wavelet algorithms were used on a population of 60 images. The medical images were obtained in Digital Imaging and Communications in Medicine (DICOM) file format and ranged in matrix size from 256 × 256 (magnetic resonance [MR]) to 2,560 × 2,048 (computed radiography [CR], digital radiography [DR], etc). The algorithms were applied to each image at multiple levels of compression such that comparable compressed file sizes were obtained at each level. Each compressed image was then decompressed, and quantitative analysis was performed to compare each compressed-then-decompressed image with its corresponding original image. The statistical measures computed were sum of absolute differences, sum of squared differences, and peak signal-to-noise ratio (PSNR). Our results corroborate other research studies showing that wavelet compression yields better compression quality than JPEG at constant compressed file sizes. The DICOM standard does not yet include wavelet as a recognized lossy compression standard. For implementers and users to adopt wavelet technology as part of their image management and communication installations, there have to be significant differences in quality and compressibility compared with JPEG to justify expensive software licenses and the introduction of proprietary elements in the standard. Our study shows that different wavelet implementations vary in their capacity to differentiate themselves from the old, established lossy JPEG.
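The three fidelity measures named in this study are straightforward to compute. A sketch for hypothetical 12-bit image arrays (the peak value 4095 follows from the stated bit depth; the function name is illustrative, not from the paper):

```python
import numpy as np

def fidelity_metrics(orig, recon, bit_depth=12):
    """Sum of absolute differences, sum of squared differences, and PSNR
    for integer images of the given bit depth (requires orig != recon)."""
    diff = orig.astype(np.float64) - recon.astype(np.float64)
    sad = np.sum(np.abs(diff))
    ssd = np.sum(diff**2)
    peak = 2**bit_depth - 1                      # 4095 for 12-bit data
    mse = ssd / orig.size
    psnr = 10 * np.log10(peak**2 / mse)          # in decibels
    return sad, ssd, psnr

orig = np.array([[100, 4095], [0, 2048]], dtype=np.uint16)
recon = np.array([[101, 4090], [0, 2050]], dtype=np.uint16)
sad, ssd, psnr = fidelity_metrics(orig, recon)
# sad = 1 + 5 + 0 + 2 = 8; ssd = 1 + 25 + 0 + 4 = 30
```

Because PSNR is normalized by the bit-depth peak rather than the image content, it lets the study compare algorithms across MR, CR, and DR images at matched file sizes.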

12.
This paper introduces an effective technique for the compression of one-dimensional signals using wavelet transforms. It is based on generating a binary stream of 1s and 0s that encodes the wavelet coefficient structure (i.e., encodes the locations of zero and nonzero coefficients). A new coding algorithm, similar to run-length encoding, has been developed for the compression of the binary stream. The compression performance of the technique is measured using compression ratio (CR) and percent root-mean-square difference (PRD) measures. To assess the technique properly, we have evaluated the effect of signal length, threshold level selection, and wavelet filters on the quality of the reconstructed signal. The effect of finite word length representation on the compression ratio and PRD is also discussed. The technique is tested for the compression of normal and abnormal electrocardiogram (ECG) signals. The performance parameters of the proposed coding algorithm are measured, and compression ratios of 19:1 and 45:1 with PRDs of 1% and 2.8% are achieved, respectively. At the receiver end, the received signal is decoded and inverse transformed before being processed. Finally, the merits and demerits of the technique are discussed.
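The significance-map idea described above can be sketched directly: derive the binary stream marking nonzero coefficient locations, run-length code it, and keep the significant values separately. This is a generic illustration, not the paper's specific coding algorithm:

```python
import numpy as np

def significance_rle(coeffs, thr):
    """Encode the locations of significant (above-threshold) coefficients as
    run-length pairs over a binary stream, plus the significant values."""
    bits = (np.abs(coeffs) > thr).astype(np.uint8)   # 1 = significant
    runs = []                                        # (bit value, run length)
    i = 0
    while i < bits.size:
        j = i
        while j < bits.size and bits[j] == bits[i]:
            j += 1
        runs.append((int(bits[i]), j - i))
        i = j
    values = coeffs[bits == 1]                       # values sent separately
    return runs, values

coeffs = np.array([0.0, 0.01, 2.5, 3.1, 0.02, 0.0, 0.0, 1.7])
runs, values = significance_rle(coeffs, thr=0.1)
# runs: [(0, 2), (1, 2), (0, 3), (1, 1)]; values: [2.5, 3.1, 1.7]
```

Since thresholded wavelet coefficients contain long runs of zeros, the run-length pairs are far more compact than storing one position per coefficient.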

13.
The increasing use of computerized ECG processing systems requires effective electrocardiogram (ECG) data compression techniques, which increase effective storage capacity and improve data transmission over telephone and Internet lines. This paper presents a compression technique for ECG signals using the singular value decomposition (SVD) combined with the discrete wavelet transform (DWT). The central idea is to transform the ECG signal into a rectangular matrix, compute the SVD, and then discard small singular values of the matrix. The resulting compressed matrix is wavelet transformed, thresholded, and coded to increase the compression ratio. The number of singular values and the threshold level adopted are based on the percentage root-mean-square difference (PRD) and the compression ratio required. The technique has been tested on ECG signals obtained from the MIT-BIH Arrhythmia Database. The results show that data reduction with high signal fidelity can be achieved, with an average data compression ratio of 25.2:1 and an average PRD of 3.14. Comparison with recently published results shows that the proposed technique gives better performance.
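The SVD truncation step can be sketched as follows, with a synthetic beat matrix standing in for the paper's ECG-to-matrix mapping; the subsequent wavelet, threshold, and coding stages are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
beat = np.sin(np.linspace(0, 2 * np.pi, 64))        # toy beat template
# Stack 32 nearly identical beats as matrix rows; the SVD concentrates the
# shared morphology into the first few singular values.
X = np.outer(np.ones(32), beat) + 0.01 * rng.normal(size=(32, 64))

U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 1                                               # keep k largest singular values
Xk = U[:, :k] * s[:k] @ Vt[:k, :]                   # rank-k approximation

# Distortion of the truncation alone, as percentage RMS difference
prd = 100 * np.sqrt(np.sum((X - Xk)**2) / np.sum(X**2))
```

Storing only `U[:, :k]`, `s[:k]`, and `Vt[:k, :]` replaces 32 × 64 values with k × (32 + 64 + 1), which is where the first stage of compression comes from; `k` is chosen against the PRD target, as in the paper.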

15.
Wavelet-Packet-Based Data Compression of ECG Signals (cited by 1)
The wavelet packet family generalizes wavelet functions and offers better time-frequency localization than ordinary wavelets. Building on wavelet packet theory, this paper reports data compression experiments on the electrocardiogram (ECG) signals commonly used in medicine. The experimental results show that, without compromising diagnostic analysis, wavelet packet compression achieves satisfactory performance.

16.
This paper concerns the robustness of discrete wavelet transform (DWT) compression in terahertz pulsed imaging (TPI). TPI datasets consist of terahertz time-domain series which are sampled at each 'pixel' of the image, leading to file sizes which are typically of the order of several megabytes (MB) per image. This makes efficient compression highly desirable for both transmission and storage. However, since the data may be required for diagnostic purposes it is essential that no relevant information is lost or artefacts introduced. We show that for a nylon step wedge the estimates of refractive index and absorption coefficients are not significantly altered when the terahertz data are reconstructed from only 20% of DWT coefficients.

17.
A Wavelet-Transform-Based Quantization and Coding Algorithm for Medical Images (cited by 7)
Medical image compression is an important research topic in telemedicine and PACS. This work studies the statistical distribution of wavelet subband image coefficients and finds that it closely resembles a Laplacian distribution. An image quantization and coding algorithm based on these statistical characteristics is then proposed, which uses the sample standard deviation of each wavelet subband as the key criterion for selecting quantization thresholds. Experiments show that the algorithm is computationally simple, makes the number of coefficients to be coded predictable for different threshold ranges, and readily achieves high compression efficiency, giving it significant potential for medical image compression in telemedicine, PACS, and related applications.

18.
The modified embedded zero-tree wavelet (MEZW) compression algorithm for one-dimensional signals was derived from Shapiro's EZW algorithm, originally developed for image compression. The proposed codec is shown to be significantly more efficient in compression and in computation than previously proposed ECG compression schemes. The coder also attains exact bit-rate control and generates a bit stream progressive in quality or rate. The EZW and MEZW algorithms apply chosen threshold values or threshold expressions to determine which transformed coefficients are significant. Thus, two different threshold definitions, namely percentage and dyadic thresholds, are used, and they are applied to different wavelet types in the biorthogonal and orthogonal classes. The MEZW and EZW results are quantitatively compared in terms of compression ratio (CR) and percentage root-mean-square difference (PRD). Experiments are carried out on selected records from the MIT-BIH arrhythmia database and an original ECG signal. It is observed that the MEZW algorithm shows a clear advantage over the traditional EZW in the CR achieved for a given PRD, and that it gives better results for biorthogonal wavelets than for orthogonal wavelets.

19.
A Wavelet-Based Progressive Coding Algorithm for Medical Images (cited by 1)
Progressive coding of medical images is of great importance both for browsing medical image archives and for remote transmission of medical images. Building on a study of two-dimensional wavelet decomposition, the authors propose a new progressive image coding algorithm based on the wavelet transform. Experiments show that the algorithm achieves a high compression ratio and fast computation.

20.

Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号