Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
This paper proposes a new wavelet-based ECG compression technique. It is based on optimized thresholds to determine significant wavelet coefficients and an efficient coding scheme for their positions. Huffman encoding is used to enhance the compression ratio. The proposed technique is tested on several records taken from the MIT-BIH arrhythmia database. Simulation results show that the proposed technique outperforms previously published schemes.
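The pipeline this abstract describes, thresholding wavelet coefficients for significance and then entropy-coding the result, can be illustrated with a minimal sketch. This is not the authors' algorithm: the one-level Haar transform, the threshold value, and the crude ×10 quantizer below are placeholder assumptions chosen for brevity.

```python
import heapq
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar DWT; x must have even length."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def hard_threshold(coeffs, thr):
    """Zero out coefficients below the threshold (the 'significance' step)."""
    return np.where(np.abs(coeffs) >= thr, coeffs, 0.0)

def huffman_code(symbols):
    """Build a Huffman code table {symbol: bitstring} from a symbol stream."""
    freq = {}
    for s in symbols:
        freq[s] = freq.get(s, 0) + 1
    heap = [[w, i, {s: ""}] for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        table = {s: "0" + c for s, c in lo[2].items()}
        table.update({s: "1" + c for s, c in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], next_id, table])
        next_id += 1
    return heap[0][2]

# Toy "ECG" segment: threshold the detail band, then Huffman-code
# the crudely quantized coefficients.
x = np.sin(np.linspace(0, 4 * np.pi, 64)) + 0.05 * np.cos(np.linspace(0, 40 * np.pi, 64))
approx, detail = haar_dwt(x)
sig = hard_threshold(detail, thr=0.1)
symbols = [int(round(v * 10)) for v in sig]   # illustrative uniform quantizer
codes = huffman_code(symbols)
```

Because most thresholded coefficients are zero, the zero symbol dominates and receives the shortest Huffman codeword, which is where the compression gain comes from.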

2.
A new wavelet-based method for the compression of electrocardiogram (ECG) data is presented. A discrete wavelet transform (DWT) is applied to the digitized ECG signal. The DWT coefficients are first quantized with a uniform scalar dead-zone quantizer, and then the quantized coefficients are decomposed into four symbol streams, representing a binary significance stream, the signs, the positions of the most significant bits, and the residual bits. An adaptive arithmetic coder with several different context models is employed for the entropy coding of these symbol streams. Simulation results on several records from the MIT-BIH arrhythmia database show that the proposed coding algorithm outperforms some recently developed ECG compression algorithms.
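The first stage above, a uniform scalar dead-zone quantizer, is simple to state: magnitudes inside the dead zone collapse to zero, which is what makes the subsequent binary significance stream so compressible. A minimal sketch; the step size and the mid-point reconstruction rule are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def deadzone_quantize(x, step):
    """Uniform scalar dead-zone quantizer: values with |x| < step map to 0,
    otherwise to sign(x) * floor(|x| / step)."""
    x = np.asarray(x, dtype=float)
    return (np.sign(x) * np.floor(np.abs(x) / step)).astype(int)

def deadzone_dequantize(q, step):
    """Mid-point reconstruction of each nonzero quantization bin."""
    q = np.asarray(q, dtype=float)
    return np.sign(q) * (np.abs(q) + 0.5) * step * (q != 0)

coeffs = np.array([0.3, -0.7, 1.6, -2.4, 0.05])
q = deadzone_quantize(coeffs, step=1.0)    # small coefficients collapse to 0
rec = deadzone_dequantize(q, step=1.0)
```

The indices in `q` would then feed the significance/sign/magnitude symbol streams described in the abstract.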

3.
A new adaptive thresholding mechanism to determine the significant wavelet coefficients of an electrocardiogram (ECG) signal is proposed. It is based on estimating thresholds for different sub-bands using the concept of energy packing efficiency (EPE). The thresholds are then optimized using the particle swarm optimization (PSO) algorithm to achieve a target compression ratio with minimum distortion. Simulation results on several records taken from the MIT-BIH arrhythmia database show that the PSO converges exactly to the target compression ratio after four iterations, while the cost function achieves its minimum value after six iterations. Compared with previously published schemes, lower distortion is achieved at the same compression ratios.
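The EPE idea can be shown in isolation (the PSO refinement stage is omitted here): pick the smallest magnitude threshold whose retained coefficients still pack a target fraction of the sub-band energy. The band values and the 95% target below are illustrative assumptions.

```python
import numpy as np

def epe_threshold(coeffs, target_epe):
    """Smallest magnitude threshold whose retained coefficients still pack
    at least `target_epe` (a fraction in 0..1) of the sub-band energy."""
    c = np.sort(np.abs(np.asarray(coeffs, dtype=float)))[::-1]
    energy = np.cumsum(c ** 2) / np.sum(c ** 2)
    k = int(np.searchsorted(energy, target_epe)) + 1  # keep the k largest
    return c[k - 1]

band = np.array([5.0, -3.0, 1.0, 0.5, -0.2, 0.1])
thr = epe_threshold(band, target_epe=0.95)
kept = band[np.abs(band) >= thr]
```

Here two of six coefficients already carry over 95% of the energy, so the threshold lands at their smaller magnitude and the rest are discarded.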

4.
Optimal wavelets for biomedical signal compression
Signal compression is gaining importance in biomedical engineering due to potential applications in telemedicine. In this work, we propose a novel scheme of signal compression based on signal-dependent wavelets. To adapt the mother wavelet to the signal for the purpose of compression, it is necessary to define (1) a family of wavelets that depend on a set of parameters and (2) a quality criterion for wavelet selection (i.e., wavelet parameter optimization). We propose the use of an unconstrained parameterization of the wavelet for wavelet optimization. A natural performance criterion for compression is the minimization of the signal distortion rate given the desired compression rate. For coding the wavelet coefficients, we adopted the embedded zerotree wavelet coding algorithm, although any coding scheme may be used with the proposed wavelet optimization. As a representative application, the coding/decoding scheme was applied to surface electromyographic signals recorded from ten subjects. The distortion rate strongly depended on the mother wavelet (for example, at a 50% compression rate: optimal wavelet, mean±SD, 5.46±1.01%; worst wavelet, 12.76±2.73%). Thus, optimization significantly improved performance with respect to previous approaches based on classic wavelets. The algorithm can be applied to any signal type since the optimal wavelet is selected on a signal-by-signal basis. Examples of application to ECG and EEG signals are also reported.
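Distortion figures like the 5.46±1.01% quoted above are percentage root-mean-square difference (PRD) values, the standard distortion measure in this literature. A sketch of the mean-removed variant; a non-mean-removed form that divides by the raw signal energy is also common, and which one a given paper uses varies, so treat this as illustrative:

```python
import numpy as np

def prd(original, reconstructed):
    """Percentage RMS difference between a signal and its reconstruction
    (mean-removed variant)."""
    x = np.asarray(original, dtype=float)
    y = np.asarray(reconstructed, dtype=float)
    return 100.0 * np.sqrt(np.sum((x - y) ** 2) / np.sum((x - x.mean()) ** 2))

x = np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
y = x + 0.01  # a toy "reconstruction" with a constant offset error
err = prd(x, y)
```

A perfect reconstruction gives PRD = 0; larger values mean more distortion relative to the signal's own energy.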

5.
Objective: To reduce the data volume required for ECG storage and transmission, and to overcome the drawback of traditional ECG compression methods that exploit only intra-lead correlation, this paper proposes a compression method based on wavelet-domain principal component analysis and layered coding (wPCA_LC). Methods: Twelve-channel ECG data were acquired through ECG electrodes, and a wavelet transform was applied to the signal of every channel. The wavelet coefficients at each scale were assembled into a coefficient matrix, and principal component analysis (PCA) was performed on each matrix. Principal components with small wavelet coefficients were encoded as [position increment, data] pairs, while the remaining principal components were Huffman coded. Finally, the algorithm was used to compress the St. Petersburg arrhythmia database. Results: Experiments show that at a root-mean-square error of 5.2%, the compression ratio of the proposed algorithm is 71, far higher than that of methods based on sparse decomposition or on wavelet-transform threshold selection. Conclusion: The wavelet-domain PCA compression algorithm offers good compression performance for multi-lead ECG signals.
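The core step, PCA on a per-scale matrix of wavelet coefficients across leads, exploits the strong inter-lead correlation of multi-lead ECGs. The wavelet transform itself is omitted below; the matrix is a synthetic stand-in for one scale, built so the 12 "leads" are nearly scaled copies of each other, which is the property the method relies on:

```python
import numpy as np

# Hypothetical stand-in for one wavelet scale: rows = 12 leads,
# columns = wavelet coefficients at that scale.
rng = np.random.default_rng(0)
base = rng.standard_normal(64)
coeff_matrix = np.outer(np.linspace(1.0, 2.1, 12), base)  # correlated leads
coeff_matrix += 0.01 * rng.standard_normal((12, 64))       # small per-lead noise

# PCA via SVD on the mean-centred matrix.
centred = coeff_matrix - coeff_matrix.mean(axis=0, keepdims=True)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)

# Keep only the leading principal component and reconstruct.
k = 1
approx = coeff_matrix.mean(axis=0, keepdims=True) + (U[:, :k] * S[:k]) @ Vt[:k]
rel_err = np.linalg.norm(approx - coeff_matrix) / np.linalg.norm(coeff_matrix)
```

With highly correlated leads, one component carries almost all the variance, so only one score vector per lead plus one loading vector per scale needs to be coded.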

6.
The most characteristic wave set in ECG signals is the QRS complex. Automatic procedures to classify the QRS are very useful in the diagnosis of cardiac dysfunctions, and early detection and classification of QRS changes are important in real-time monitoring. ECG data compression is also important for storage and data transmission. An Adaptive Hermite Model Estimation System (AHMES) is presented for on-line, beat-to-beat estimation of the features that describe the QRS complex with the Hermite model. The AHMES is based on the multiple-input adaptive linear combiner, using as inputs the succession of QRS complexes and the Hermite functions, with an added procedure to adaptively estimate a width-related parameter b. The system allows efficient real-time parameter extraction for classification and data compression. The performance of the AHMES is compared with that of direct feature estimation, studying the improvement in signal-to-noise ratio. In addition, the effect of misalignment at the QRS mark is shown to reduce to a negligible low-pass effect. The results establish the conditions under which the AHMES improves on the direct estimate. As an application for subsequent classification, the AHMES is used to extract the QRS features of an ECG signal exhibiting the bigeminy phenomenon. A further application, using the width parameter b, aids the detection of wide ectopic beats.
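The Hermite model represents each QRS complex as a weighted sum of orthonormal Hermite functions, so a beat is summarized by a handful of coefficients plus the width parameter b. A sketch of the basis and a projection onto it; the Gaussian "QRS" and the grid are illustrative assumptions, and the adaptive (LMS-style) coefficient tracking of the AHMES is not shown:

```python
import math
import numpy as np

def hermite_function(n, t):
    """Orthonormal Hermite function phi_n(t) = H_n(t) exp(-t^2/2) / norm,
    the basis used for compact QRS-complex models."""
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0
    Hn = np.polynomial.hermite.hermval(t, coeffs)  # physicists' H_n
    norm = math.sqrt(2.0 ** n * math.factorial(n) * math.sqrt(math.pi))
    return Hn * np.exp(-t ** 2 / 2.0) / norm

# A width parameter b would stretch the basis as phi_n(t / b) / sqrt(b).
t = np.linspace(-10.0, 10.0, 2001)
dt = t[1] - t[0]
phi0 = hermite_function(0, t)
phi1 = hermite_function(1, t)

# Project a toy "QRS" (a Gaussian bump) onto the first two basis functions.
qrs = np.exp(-t ** 2 / 2.0)
c0 = np.sum(qrs * phi0) * dt
c1 = np.sum(qrs * phi1) * dt
```

The even bump projects entirely onto the even basis functions (c1 vanishes), illustrating how few coefficients a well-aligned beat needs.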

7.
A discrete semi-periodic signal can be described as x(n) = x(n+T+ΔT) + Δx, ∀n, where T is the fundamental period, ΔT represents a random period variation, and Δx is an amplitude variation. Discrete ECG signals are treated as semi-periodic, where T and Δx are associated with the heart rate and the baseline drift, respectively. These two factors cause coding inefficiency in ECG signal compression using vector quantisation (VQ). First, the periodic character of ECG signals creates data redundancy among codevectors in a traditional two-dimensional codebook. Secondly, the fixed codevectors in traditional VQ adapt poorly to signal variations. To solve these two problems simultaneously, an adaptive VQ (AVQ) scheme is proposed, based on a one-dimensional (1D) codebook structure in which codevectors overlap and are linearly shifted. To further enhance coding performance, the Δx term is extracted and encoded separately before 1D-AVQ is applied. The data in the first 3 min of all 48 ECG records from the MIT-BIH arrhythmia database are used as the test signals, and no codebook training is carried out in advance. The compressed data rate is 265.2±92.3 bits/s at 10.0±4.1% PRD. No codebook storage or transmission is required; only a very small codebook storage space is needed temporarily during the coding process. In addition, the linearly shifted nature of the codevectors makes this scheme easier to implement in hardware than existing AVQ methods.

8.
An important factor to consider when using electrocardiogram findings for clinical decision making is that the waveforms are influenced by normal physiological and technical factors as well as by pathophysiological factors. In this paper, we propose a method for feature extraction and heart disease diagnosis using the wavelet transform (WT) and LabVIEW (Laboratory Virtual Instrument Engineering Workbench). LabVIEW signal processing tools are used to denoise the signal before applying the developed feature-extraction algorithm. First, we developed an algorithm for R-peak detection using the Haar wavelet: after 4th-level decomposition of the ECG signal, the detail coefficients are squared, and the standard deviation of the squared detail coefficients is used as the threshold for R-peak detection. Second, we used the Daubechies (db6) wavelet for the low-resolution signals: after cross-checking the R-peak locations against the 4th-level low-resolution Daubechies signal, the P waves and T waves are detected. Other features of diagnostic importance, mainly heart rate, R-wave width, Q-wave width, T-wave amplitude and duration, ST segment and frontal plane axis, are also extracted, and a scoring pattern is applied for the purpose of heart disease diagnosis. In this study, detection of tachycardia, bradycardia, left ventricular hypertrophy, right ventricular hypertrophy and myocardial infarction has been considered. The CSE ECG database, which contains 5000 samples recorded at a sampling frequency of 500 Hz, and the ECG database created by the S.G.G.S. Institute of Engineering and Technology, Nanded (Maharashtra), have been used.
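The R-peak rule described above (square the 4th-level detail coefficients, threshold at their standard deviation) can be sketched end-to-end on a toy impulse train. The hand-rolled Haar step, the synthetic "ECG", and the coefficient-to-sample index mapping are all illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def haar_level(x):
    """One Haar analysis step: returns (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = x[:-1]
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def detect_r_peaks(ecg, levels=4):
    """Square the level-`levels` detail coefficients and keep those above
    the standard deviation of the squared sequence."""
    approx = np.asarray(ecg, dtype=float)
    for _ in range(levels):
        approx, detail = haar_level(approx)
    d2 = detail ** 2
    idx = np.flatnonzero(d2 > np.std(d2))
    # Each level-4 coefficient covers 2**levels samples; map back roughly.
    return idx * 2 ** levels

# Synthetic "ECG": flat baseline with sharp unit spikes as R peaks.
fs = 512
ecg = np.zeros(4 * fs)
peaks_true = np.arange(100, len(ecg), 256)
ecg[peaks_true] = 1.0
found = detect_r_peaks(ecg, levels=4)
```

Each detected coefficient localizes a peak only to within a 16-sample window at level 4, which is why the paper cross-checks the locations on a low-resolution db6 signal afterwards.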

9.
Maintaining reconstruction quality is essential in ECG data compression because the data are intended for diagnostic use. Quantization schemes with non-linear distortion characteristics usually require time-consuming quality control that blocks real-time application. In this paper, a new wavelet coefficient quantization scheme based on an evolution program (EP) is proposed for wavelet-based ECG data compression. The EP search can create a stationary relationship among the quantization scales of the multi-resolution levels. This stationary property implies that the multi-level quantization scales can be controlled with a single variable, which leads to a simple design of linear distortion control using 3-D curve fitting. In addition, a competitive strategy is applied to alleviate the data-dependency effect. Using ECG signals from the MIT and PTB databases, extensive experiments were undertaken to evaluate compression performance, quality-control efficiency, and the influence of data dependency. The experimental results show that the new EP-based quantization scheme achieves high compression performance while maintaining linear distortion behavior. This characteristic guarantees fast quality control even when the prediction model mismatches the practical distortion curve.

10.
Dynamic time warping techniques have been used to characterize the timing variation of the constituent components of the human electrocardiogram (ECG). Lead II ECG recordings were obtained in 21 subjects, 10 male and 11 female, aged between 13 and 65 years. The fiducial points in each cardiac cycle were identified in the recordings across a heart-rate range of 46–184 beats/min. A set of second-order equations in the square root of the cardiac cycle time was obtained to describe the duration of each of the constituent components of the ECG signal. The accuracy of the dynamic time warping technique was verified against professionally annotated clinical recordings in the on-line PhysioNet database. The equations obtained allow a Lead II ECG signal to be synthesized in which the variation with heart rate of the profile of each component in the signal mirrors the true in-vivo behaviour.
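Dynamic time warping aligns two sequences by allowing samples to be repeated or skipped, so a time-stretched copy of a beat matches its template almost perfectly where a rigid point-by-point comparison would not. A minimal sketch of the classic dynamic-programming recurrence (absolute-difference local cost, unit steps; the toy sequences are illustrative):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = best cost of aligning a[:i] with b[:j].
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A time-stretched copy of a "beat" aligns with zero cost.
ref = [0, 0, 1, 2, 1, 0, 0]
stretched = [0, 0, 1, 1, 2, 2, 1, 0, 0]
d = dtw_distance(ref, stretched)
```

The zero distance for the stretched copy is exactly the property that lets DTW measure how component durations vary with heart rate, independently of the overall cycle length.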

11.
The increasing use of computerized ECG processing systems requires effective electrocardiogram (ECG) data compression techniques, which aim to reduce storage requirements and improve data transmission over telephone and internet lines. This paper presents a compression technique for ECG signals using the singular value decomposition (SVD) combined with the discrete wavelet transform (DWT). The central idea is to transform the ECG signal into a rectangular matrix, compute the SVD, and then discard the small singular values of the matrix. The resulting compressed matrix is wavelet transformed, thresholded and coded to increase the compression ratio. The number of singular values and the threshold level adopted are based on the percentage root-mean-square difference (PRD) and the required compression ratio. The technique has been tested on ECG signals obtained from the MIT-BIH arrhythmia database. The results show that data reduction with high signal fidelity can be achieved, with an average compression ratio of 25.2:1 and an average PRD of 3.14. Comparison with recently published results shows that the proposed technique gives better performance.
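The SVD stage exploits the beat-to-beat similarity of ECG: reshaping the signal into a beat-per-row matrix makes it nearly low-rank, so a few singular triplets reproduce it accurately. A sketch under stated assumptions (the synthetic periodic signal, the fixed period, and k = 2 are all placeholders; the paper's wavelet/threshold stage is omitted):

```python
import numpy as np

# Reshape a synthetic, strongly periodic signal into a beat-per-row matrix.
period = 32
t = np.arange(period * 16)
signal = np.sin(2 * np.pi * t / period) + 0.02 * np.sin(2 * np.pi * t / 7.3)
X = signal.reshape(16, period)

# Truncated SVD: keep only the k largest singular values.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
Xk = (U[:, :k] * S[:k]) @ Vt[:k]
rel_err = np.linalg.norm(X - Xk) / np.linalg.norm(X)

# Stored values drop from 16*32 to k*(16 + 32 + 1).
stored = k * (U.shape[0] + Vt.shape[1] + 1)
```

The stored count versus `X.size` is the raw SVD compression gain; the paper then compresses the retained factors further with the DWT stage.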

12.
During the last few years, medical research areas of critical importance, such as epilepsy monitoring and study, have increasingly utilized wireless sensor network (WSN) technologies in order to achieve better understanding and significant breakthroughs. However, the limited memory and communication bandwidth of WSN platforms are a significant shortcoming for such demanding application scenarios. Although data compression can mitigate these deficiencies, there is a lack of objective and comprehensive evaluation of the relevant approaches, and even more so of specialized approaches targeting specific demanding applications. The research work presented in this paper focuses on implementing and offering an in-depth experimental study of prominent existing as well as newly proposed compression algorithms. All algorithms have been implemented in a common Matlab framework. A major contribution of this paper, which differentiates it from similar research efforts, is the use of real-world electroencephalography (EEG) and electrocardiography (ECG) datasets comprising the two most demanding epilepsy modalities. Emphasis is placed on WSN applications, so the metrics focus on compression rate and execution latency for the selected datasets. The evaluation results reveal significant performance and behavioral characteristics of the algorithms related to their complexity, and the negative effect of increased compression rate on compression latency. The proposed schemes offer a considerable advantage in achieving the optimum trade-off between compression rate and latency; in particular, the proposed algorithm combines a highly competitive level of compression with minimum latency, thus exhibiting real-time capability. Additionally, one of the proposed schemes is compared against state-of-the-art general-purpose compression algorithms, also exhibiting considerable advantages as far as compression rate is concerned.

13.
The proposed ECG compression method combines three major approaches based on time division multiplexing (TDM) and multilevel wavelet decomposition followed by parametric modelling. Before applying these techniques, a pre-processing step is required, which consists of detecting and aligning the different beats. Even though this compression method is a lossy one, we show how a high compression ratio (CR) can be achieved while preserving the major medical information within the ECG. Several normal and abnormal signals from various databases are used to evaluate the performance of the proposed technique.

14.
This paper proposes an efficient compression scheme for time-varying medical volumetric data. The scheme uses 3-D motion estimation to create homogeneous preprocessed data, which are then compressed by a 3-D image compression algorithm using hierarchical vector quantization. A new block distortion measure, called variance of residual (VOR), and three 3-D fast block-matching algorithms are used to improve the motion estimation process in terms of speed and data fidelity. The 3-D image compression process applies two different encoding techniques depending on the homogeneity of the input data. Our method achieves higher fidelity and faster decompression than other lossy compression methods at similar compression ratios. The combination of 3-D motion estimation using VOR and hierarchical vector quantization contributes to this performance.
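The variance-of-residual (VOR) measure can be contrasted with plain mean-squared error in a few lines: because VOR takes the variance of the block difference, a constant intensity offset between frames does not penalize an otherwise perfect motion match. A sketch with synthetic 3-D blocks (the block size and test data are illustrative assumptions):

```python
import numpy as np

def vor(block, candidate):
    """Variance of residual: variance of the block difference, so a constant
    intensity offset between frames does not penalise a good motion match."""
    residual = np.asarray(block, dtype=float) - np.asarray(candidate, dtype=float)
    return float(np.var(residual))

def mse(block, candidate):
    """Plain mean-squared error, for comparison."""
    r = np.asarray(block, dtype=float) - np.asarray(candidate, dtype=float)
    return float(np.mean(r ** 2))

rng = np.random.default_rng(1)
block = rng.standard_normal((8, 8, 8))      # a 3-D (volumetric) block
brighter = block + 5.0                      # same content, offset intensity
other = rng.standard_normal((8, 8, 8))      # unrelated content
```

MSE rates the brightness-shifted copy as a terrible match, while VOR correctly rates it as perfect, which is what makes VOR attractive for matching across frames with global intensity drift.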

15.
16.
To improve the speech recognition rate of cochlear implants in noisy environments, microphone-array speech enhancement methods are increasingly applied at the front-end signal acquisition stage of cochlear implants. Because of the size constraints of cochlear implants, dual-microphone acquisition is a common configuration, and the beam formed by such a system can be used to suppress interfering signals from specific directions. Parameters such as delay, weights, frequency and microphone spacing shape the polar pattern of the dual-microphone system. This paper studies dual-microphone signal acquisition for cochlear implants and the beamforming characteristics under different parameter settings, providing support for research on dual-microphone speech-enhancement algorithms.
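The dependence of the polar pattern on frequency, spacing and delay can be sketched with a two-microphone delay-and-sum model. The 1 cm spacing and 1 kHz tone below are hypothetical values chosen only to illustrate the geometry, not taken from the paper:

```python
import numpy as np

def two_mic_response(theta, freq, spacing, delay, c=343.0):
    """Magnitude response of a two-microphone delay-and-sum beamformer.
    The second microphone's signal is delayed by `delay` seconds before
    summing; theta is the arrival angle in radians (0 = endfire)."""
    tau = spacing * np.cos(theta) / c          # inter-mic propagation delay
    w = 2.0 * np.pi * freq
    return np.abs(1.0 + np.exp(-1j * w * (tau + delay))) / 2.0

theta = np.linspace(0.0, np.pi, 181)
# Hypothetical cochlear-implant geometry: 1 cm spacing, 1 kHz tone, no delay.
pattern = two_mic_response(theta, freq=1000.0, spacing=0.01, delay=0.0)
```

With zero steering delay the response peaks broadside (theta = 90°) and falls off only slightly toward endfire at this small spacing and low frequency, illustrating why the delay, frequency and spacing parameters all matter for achieving useful directivity.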

17.
The stimulation of programmed cell death can either enhance or inhibit antigen presentation by classic major histocompatibility complex molecules. In the current study, we report that the induction of apoptosis by topoisomerase I inhibition or by elevation of intracellular ceramide levels substantially impairs CD1d-mediated antigen presentation. In the former case, the reduction occurred via regulation of both the p38 mitogen-activated protein kinase and protein kinase C delta signal transduction pathways as well as the caspase cascade, whereas the latter was p38- (but not caspase-) dependent. Confocal microscopic analysis showed an altered intracellular distribution of CD1d following the inhibition of topoisomerase I or an increase in intracellular ceramide levels, which was prevented by p38 and caspase inhibitors. Thus, the induction of apoptosis in antigen presenting cells severely compromises CD1d-mediated antigen presentation by multiple mechanisms.

18.
Summary: By completing and correcting the sequence of a 1.8 kb DNA segment downstream of the oxi2 gene of Saccharomyces cerevisiae, a long, potentially coding sequence (RF2) has been identified. The sequence is rather closely related to the RF1 open reading frame, downstream of the oxi1 gene, and, further, to the major family of intronic open reading frames. The RF2 open reading frame is not continuous, however, for it is interrupted by two GC clusters, both of which ultimately result in a –1 frameshift. Comparison with RF1 reveals a third insertion. This is centered on an oligonucleotide, AATAATATTCTTA, which is found (sometimes in a slightly modified form) downstream of ten proven or suspected protein coding genes, including RF1 and RF2, and is known to terminate the apocytochrome b messenger RNA. It is suggested from the known distribution of this putative end-of-messenger signal that it could play an essential part in controlling the expression of several minor proteins, both intronic and non-intronic. The possibility that the RF2 sequence is functional in spite of its interruptions is also discussed.

19.
A novel in situ hybridization (ISH) method for detecting human immunodeficiency virus-1 (HIV-1) was developed by applying a peptide nucleic acid (PNA) probe and a catalysed signal amplification (CSA) method. The PNA probe used in the present study possessed 15 base sequences of the HIV-1 protease gene, and the 5' end of the probe was labelled with the fluorescein isothiocyanate (FITC) molecule. The hybridized probe was detected by sequential reactions of the following antibodies and reagents: horseradish peroxidase (HRP)-conjugated anti-FITC antibody, biotinylated tyramide (first amplification), HRP-labelled streptavidin, biotinylated tyramide (second amplification), and streptavidin-conjugated Alexa 488. The signal of Alexa 488 was finally detected by fluorescence microscopy. HIV-1-related dotted signals were clearly obtained in HIV-1 persistently infected cell lines, MOLT4-III(B) and ACH-2, and CD4-positive T lymphocytes from AIDS patients. For light microscopy, HRP-labelled streptavidin was reacted instead of streptavidin-conjugated Alexa 488 at the final treatment, followed by diaminobenzidine as chromogen. This method can detect HIV-1 in either blood smear samples or paraffin-embedded autopsy tissue and is useful as a sensitive non-radioactive method for in situ hybridization.

20.
DNA-bending proteins are known to facilitate the in vitro V(D)J joining of antigen receptor genes. Here we report that the high-mobility group protein, HMG1, is necessary for the correct nicking of the 23 bp recombination signal sequence (23-RSS) by the recombination activating gene (RAG) proteins, RAG1 and RAG2. Without HMG1, the mouse Jkappa1 23-RSS was recognized as if it were the 12-RSS and nicked at a site 12 + 7 nucleotides away from the 9mer signal, even though no 7mer-like sequence was evident at the cryptic nicking site. When increased amounts of HMG1 were added, the 23-RSS substrate was nicked correctly at a site 23 + 7 nucleotides from the 9mer, and nicking at the cryptic site disappeared. Unlike the 23-RSS, the 12-RSS did not require HMG1 for correct nicking, although HMG1 was found to increase the interaction between RSS and RAG proteins. Modification-interference assays demonstrated that HMG1 caused changes in the interaction between the 23-RSS and RAG proteins specifically at the 7mer and the cryptic nicking site.
