Similar Articles
20 similar articles found.
1.
In the decomposition of multi-channel EEG signals, principal component analysis (PCA) and independent component analysis (ICA) have been widely used. However, as both methods handle only two-way data, i.e., two-dimensional matrices, multi-way methods might improve the interpretation of frequency-transformed multi-channel EEG organized as channel × frequency × time data. The multi-way decomposition method Parallel Factor (PARAFAC), also named Canonical Decomposition (CANDECOMP), was recently used to decompose the wavelet-transformed ongoing EEG of channel × frequency × time (Miwakeichi, F., Martinez-Montes, E., Valdes-Sosa, P.A., Nishiyama, N., Mizuhara, H., Yamaguchi, Y., 2004. Decomposing EEG data into space-time-frequency components using parallel factor analysis. NeuroImage 22, 1035-1045). In this article, PARAFAC is used for the first time to decompose wavelet-transformed event-related EEG given by the inter-trial phase coherence (ITPC), encompassing an analysis of variance (ANOVA) of differences between conditions and a 5-way analysis of channel × frequency × time × subject × condition. A flow chart is presented on how to perform data exploration using the PARAFAC decomposition on multi-way arrays. This includes (A) channel × frequency × time 3-way arrays of F test values from a repeated measures ANOVA between two stimulus conditions; (B) subject-specific 3-way analyses; and (C) an overall 5-way analysis of channel × frequency × time × subject × condition. The PARAFAC decompositions were able to extract the expected features of a previously reported ERP paradigm: namely, a quantitative difference in coherent occipital gamma activity between conditions of a visual paradigm. Furthermore, the method revealed a qualitative difference that has not previously been reported. The PARAFAC decomposition of the 3-way array of ANOVA F test values clearly showed the difference in regions of interest across modalities, while the 5-way analysis enabled visualization of both quantitative and qualitative differences. Consequently, PARAFAC is a promising exploratory tool in the analysis of wavelet-transformed event-related EEG.
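
As a concrete illustration of such a 3-way decomposition (not the authors' code), the sketch below fits a PARAFAC model to a channel × frequency × time array using the open-source tensorly package; the array contents, shape, and rank are assumptions made for illustration.

```python
# Minimal sketch: PARAFAC decomposition of a channel x frequency x time
# EEG array, assuming the open-source `tensorly` package. Array contents,
# shape, and rank are illustrative, not taken from the paper.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Hypothetical ITPC array: 64 channels x 30 frequencies x 200 time points
itpc = np.random.rand(64, 30, 200)

rank = 3  # number of space-time-frequency components (atoms)
weights, factors = parafac(tl.tensor(itpc), rank=rank, normalize_factors=True)

spatial, spectral, temporal = factors  # shapes: (64, 3), (30, 3), (200, 3)
# Each column triplet (spatial[:, k], spectral[:, k], temporal[:, k])
# describes one rank-1 component: a topography, a spectrum, and a time course.
```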

2.
Brain activity during reaction time tasks has been reported to consist of stimulus- and response-locked components. The aim of this study is to apply a method for temporally extracting these components from human scalp electroencephalography (EEG) during an auditory simple reaction time task (SR-task). The stimulus- and response-locked components are extracted from each channel of the EEG epochs and the reaction times (RTs) of all trials by using a discrete Fourier transform; the performance of the method is verified using simulated data with known components. The extracted stimulus- and response-locked components are compared with the stimulus- and response-triggered average EEG during the SR-task, the auditory-evoked potential (AEP) during passive hearing of an auditory stimulus, and the movement-related potential (MRP) during self-paced voluntary movement. For EEG filtered with a bandpass of 1-40 Hz, the scalp distributions of negative peaks around 400 ms (N400) in the extracted stimulus-locked components are significantly different from those in the stimulus-triggered average EEG during the SR-task, suggesting that the late parts of the stimulus-triggered average EEG suffer considerably from temporal smearing with the response-locked components. Furthermore, we show that the effect of temporal smearing is large when slow waves remain in the EEG. In conclusion, these results confirm the feasibility and necessity of the proposed decomposition method.
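
The abstract does not give implementation details, but the core idea — modeling each trial as a stimulus-locked waveform plus a response-locked waveform delayed by that trial's RT, and solving per frequency bin — might be sketched as follows; the per-frequency least-squares formulation and all variable names are assumptions, not the authors' exact algorithm.

```python
# Minimal sketch of the idea behind separating stimulus- and response-locked
# components in the frequency domain: each trial is modeled as a fixed
# stimulus-locked waveform s(t) plus a response-locked waveform r(t) delayed
# by that trial's reaction time. Details (windowing, regularization) are
# simplified assumptions, not the authors' exact implementation.
import numpy as np

def separate_components(epochs, rts, fs):
    """epochs: (n_trials, n_samples) single-channel EEG; rts: reaction
    times in seconds; fs: sampling rate in Hz. Returns estimated
    stimulus-locked and response-locked waveforms."""
    n_trials, n_samples = epochs.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    X = np.fft.rfft(epochs, axis=1)          # (n_trials, n_freqs)
    S = np.zeros(len(freqs), dtype=complex)
    R = np.zeros(len(freqs), dtype=complex)
    for k, f in enumerate(freqs):
        # Per-frequency model: X_i = S + R * exp(-2j*pi*f*RT_i)
        A = np.column_stack([np.ones(n_trials),
                             np.exp(-2j * np.pi * f * rts)])
        coef, *_ = np.linalg.lstsq(A, X[:, k], rcond=None)
        S[k], R[k] = coef
    return np.fft.irfft(S, n_samples), np.fft.irfft(R, n_samples)
```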

3.
The combination of functional magnetic resonance imaging (FMRI) and electroencephalography (EEG) has received much recent attention, since it potentially offers neuroscientists a new tool that makes simultaneous use of the strengths of the two modalities. However, EEG data collected in such experiments suffer from two kinds of artifact. First, gradient artifacts are caused by the switching of magnetic gradients during FMRI. Second, ballistocardiographic (BCG) artifacts related to cardiac activity further contaminate the EEG data. Here we present new methods to remove both kinds of artifact. The methods are based primarily on the idea that temporal variations in the artifacts can be captured by temporal principal component analysis (PCA), which yields a set of basis functions describing those variations. These basis functions are then fitted to, and subtracted from, the EEG data to produce artifact-free results. In addition, we describe a robust algorithm for the accurate detection of heart beat peaks in poor-quality electrocardiographic (ECG) data collected for the purpose of BCG artifact removal. The methods are tested and shown to give superior results to existing methods. They also demonstrate the feasibility of simultaneous EEG/FMRI experiments at the relatively low EEG sampling frequency of 2048 Hz.
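
A minimal sketch of the basis-function idea, assuming artifact onsets are already known (e.g., from slice triggers or detected heart beats): derive temporal principal components from stacked artifact occurrences, fit them to each occurrence, and subtract the fit. Epoching details and component counts are illustrative, not the paper's settings.

```python
# Minimal sketch of temporal-PCA artifact removal: stack artifact
# occurrences, derive temporal basis functions, fit and subtract per
# occurrence. Component counts and epoching are illustrative assumptions.
import numpy as np

def remove_artifact(channel, onsets, length, n_basis=4):
    """channel: 1-D EEG; onsets: sample indices of artifact occurrences
    (e.g., slice triggers or heart beats); length: samples per occurrence."""
    segs = np.array([channel[o:o + length] for o in onsets])  # (n_occ, length)
    segs_c = segs - segs.mean(axis=0)
    # Temporal PCA via SVD: rows of Vt are temporal basis functions.
    _, _, Vt = np.linalg.svd(segs_c, full_matrices=False)
    basis = np.vstack([segs.mean(axis=0), Vt[:n_basis]])      # template + PCs
    clean = channel.astype(float).copy()
    for o, seg in zip(onsets, segs):
        coef, *_ = np.linalg.lstsq(basis.T, seg, rcond=None)
        clean[o:o + length] = seg - basis.T @ coef            # subtract fit
    return clean
```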

4.
Ferree TC, Brier MR, Hart J, Kraut MA. NeuroImage 2009, 45(1): 109-121
A new method is developed for analyzing the time-varying spectral content of EEG data collected in cognitive tasks. The goal is to extract and summarize the most salient features of numerical results that span space, time, frequency, task conditions, and multiple subjects. Direct generalization of an established approach for analyzing event-related potentials, which uses sequential PCA followed by ANOVA to test for differences between conditions across subjects, gave unacceptable results. The new method, termed STAT-PCA, instead performs statistical testing for differences between conditions within single subjects, followed by sequential PCA across subjects. In contrast to PCA-ANOVA, STAT-PCA is demonstrated to give results that: 1) isolate task-related spectral changes, 2) are insensitive to the precise definition of baseline power, 3) are stable under deletion of a random subject, and 4) are interpretable in terms of the group-averaged power. Furthermore, STAT-PCA permits the detection of activity that is not only different between conditions but also common to both conditions, providing a complete yet parsimonious view of the data. It is concluded that STAT-PCA is well suited to analyzing the time-varying spectral content of EEG during cognitive tasks.
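
A minimal sketch of the STAT-PCA ordering described above — within-subject statistics first, then PCA across subjects — under assumed data shapes; the two-sample t-test over trials merely stands in for whatever within-subject test is appropriate.

```python
# Minimal sketch of the STAT-PCA ordering: test conditions within each
# subject first, then run PCA across subjects on the resulting statistic
# maps. Data shapes and the choice of test are illustrative assumptions.
import numpy as np
from scipy import stats

# Hypothetical spectral power: (n_subjects, n_trials, n_channels*n_times)
n_subj, n_trials, n_feat = 12, 40, 64 * 100
cond_a = np.random.rand(n_subj, n_trials, n_feat)
cond_b = np.random.rand(n_subj, n_trials, n_feat)

# Step 1: within-subject statistic map (one per subject).
tmaps = np.array([stats.ttest_ind(cond_a[s], cond_b[s], axis=0).statistic
                  for s in range(n_subj)])          # (n_subj, n_feat)

# Step 2: PCA across subjects on the statistic maps.
tmaps_c = tmaps - tmaps.mean(axis=0)
_, svals, Vt = np.linalg.svd(tmaps_c, full_matrices=False)
components = Vt            # spatio-temporal-spectral patterns
scores = tmaps_c @ Vt.T    # each subject's loading on each pattern
```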

5.
Yao J, Dewald JP. NeuroImage 2005, 25(2): 369-382
Different cortical source localization methods have been developed to link scalp potentials directly with cortical activity. To date, these methods are the only way to noninvasively investigate cortical activity with both high spatial and high temporal resolution. However, their application is hindered by the fact that they have been neither rigorously evaluated nor compared. In this paper, the performance of several source localization methods (moving dipoles, minimum Lp norm, and low resolution tomography (LRT) with Lp norm, for p equal to 1, 1.5, and 2) was evaluated using simulated scalp EEG data, scalp somatosensory evoked potentials (SEPs), and upper limb motor-related potentials (MRPs) obtained from human subjects (all with 163 scalp electrodes). Using simulated EEG data, we first evaluated the source localization ability of the above methods quantitatively. Subsequently, the performance of the various methods was evaluated qualitatively using experimental SEPs and MRPs. Our results show that, overall, the LRT Lp norm method with p equal to 1 localizes sources better than any of the other investigated methods and provides physiologically meaningful reconstructions. These evaluation results provide useful information for choosing cortical source localization approaches in future EEG/MEG studies.
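
For readers unfamiliar with minimum Lp-norm estimates, the sketch below approximates a p = 1 solution by iteratively reweighted least squares (IRLS), one common implementation strategy; the leadfield, regularization, and iteration settings are placeholders, not the evaluation setup of this paper.

```python
# Minimal sketch of a minimum Lp-norm source estimate via iteratively
# reweighted least squares (IRLS). Leadfield and data are random
# placeholders; regularization and iteration counts are assumptions.
import numpy as np

def min_lp_norm(G, x, p=1.0, lam=1e-2, n_iter=20, eps=1e-6):
    """G: (n_sensors, n_sources) leadfield; x: (n_sensors,) scalp data."""
    n_src = G.shape[1]
    w = np.ones(n_src)                       # IRLS weights; first pass = L2
    s = np.zeros(n_src)
    for _ in range(n_iter):
        W = np.diag(w)                       # large weight = weak penalty
        # Weighted minimum norm: s = W G^T (G W G^T + lam I)^-1 x
        s = W @ G.T @ np.linalg.solve(G @ W @ G.T + lam * np.eye(len(x)), x)
        w = (np.abs(s) + eps) ** (2 - p)     # reweight toward the Lp norm
    return s

G = np.random.randn(163, 500)                # 163 electrodes, 500 sources
x = G[:, 42] + 0.05 * np.random.randn(163)   # one active source + noise
s_hat = min_lp_norm(G, x, p=1.0)
```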

6.
Delorme A, Sejnowski T, Makeig S. NeuroImage 2007, 34(4): 1443-1449
Detecting artifacts produced in EEG data by muscle activity, eye blinks, and electrical noise is a common and important problem in EEG research. It is now widely accepted that independent component analysis (ICA) may be a useful tool for isolating artifacts and/or cortical processes in electroencephalographic (EEG) data. We present results of simulations demonstrating that ICA decomposition, here tested using three popular ICA algorithms (Infomax, SOBI, and FastICA), can allow more sensitive automated detection of small non-brain artifacts than applying the same detection methods directly to the scalp channel data. We tested the upper-bound performance of five methods for detecting various types of artifacts by separately optimizing and then applying them to artifact-free EEG data into which we had added simulated artifacts of several types, ranging in size from thirty times smaller than the EEG data (-50 dB) to the size of the EEG data themselves (0 dB). Of the methods tested, those involving spectral thresholding were the most sensitive. Except for muscle artifact detection, where we found no gain from using ICA, all methods proved more sensitive when applied to the ICA-decomposed data than to the raw scalp data: the mean performance for ICA was higher, lying about two standard deviations above the performance distribution obtained on raw data. We note that ICA decomposition also allows simple subtraction of artifacts accounted for by single independent components, and/or separate and direct examination of the decomposed non-artifact processes themselves.
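
A minimal sketch in the spirit of this comparison, pairing an ICA decomposition (scikit-learn's FastICA standing in for the algorithms tested) with a simple spectral-threshold detector; the frequency band and z-score cutoff are illustrative assumptions.

```python
# Minimal sketch: decompose EEG with FastICA, then flag components with
# unusually high power in an assumed band via a z-score threshold.
import numpy as np
from scipy import signal
from sklearn.decomposition import FastICA

def flag_artifact_components(eeg, fs, band=(20.0, 40.0), z_thresh=3.0):
    """eeg: (n_channels, n_samples). Returns sources and flagged indices."""
    ica = FastICA(n_components=eeg.shape[0], max_iter=1000, random_state=0)
    sources = ica.fit_transform(eeg.T).T            # (n_components, n_samples)
    freqs, psd = signal.welch(sources, fs=fs, axis=1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    band_power = np.log(psd[:, mask].mean(axis=1))
    z = (band_power - band_power.mean()) / band_power.std()
    flagged = np.where(z > z_thresh)[0]             # unusually strong in-band power
    return sources, flagged
```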

7.
8.
We present two related probabilistic methods for neural source reconstruction from MEG/EEG data that reduce the effects of interference, noise, and correlated sources. Both methods localize source activity using a linear mixture of temporal basis functions (TBFs) learned from the data. In contrast to existing methods that use predetermined TBFs, we compute TBFs from the data using a graphical factor-analysis-based model [Nagarajan, S.S., Attias, H.T., Hild, K.E., Sekihara, K., 2007a. A probabilistic algorithm for robust interference suppression in bioelectromagnetic sensor data. Stat Med 26, 3886-3910], which separates evoked or event-related source activity from ongoing spontaneous background brain activity. Both algorithms compute an optimal weighting of these TBFs at each voxel to provide a spatiotemporal map of activity across the brain, and a source image map from the likelihood of a dipole source at each voxel. We explicitly model, with two different robust parameterizations, the contribution from signals outside a voxel of interest. The two models differ in a trade-off between computational speed and accuracy in learning the unknown interference contributions. Performance on simulations and real data, both with large noise and interference and/or correlated sources, demonstrates significant improvement over existing source localization methods.
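
A rough sketch of the TBF idea under strong simplifications: ordinary factor analysis (scikit-learn) stands in for the graphical model cited above, and a plain least-squares fit stands in for the probabilistic voxel weighting; shapes and names are invented for illustration.

```python
# Minimal sketch: learn temporal basis functions (TBFs) from sensor data
# with factor analysis, then fit an optimal weighting of them to a
# hypothetical voxel time course. All shapes are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

n_sensors, n_times, n_tbf = 64, 300, 3
data = np.random.randn(n_sensors, n_times)          # placeholder MEG/EEG

# Treat time points as features so the learned factors are temporal patterns.
fa = FactorAnalysis(n_components=n_tbf, random_state=0)
fa.fit(data)                                        # (n_sensors, n_times)
tbfs = fa.components_                               # (n_tbf, n_times)

# Fit an optimal weighting of the TBFs to a hypothetical voxel time course.
voxel_tc = np.random.randn(n_times)
weights, *_ = np.linalg.lstsq(tbfs.T, voxel_tc, rcond=None)
reconstruction = tbfs.T @ weights
```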

9.
Nurses and other health researchers are often concerned with infrequently occurring, repeatable, health-related events such as the number of hospitalizations, pregnancies, or visits to a health care provider. Reports on the occurrence of such discrete events take the form of non-negative integer or count data. Because the counts of infrequently occurring events tend to be non-normally distributed and highly positively skewed, the use of ordinary least squares (OLS) regression with untransformed data has several shortcomings. Techniques such as Poisson regression and negative binomial regression may provide more appropriate alternatives for analyzing these data. The purpose of this article is to compare and contrast the use of these three methods for the analysis of infrequently occurring count data. The strengths, limitations, and special considerations of each approach are discussed. Data from the National Longitudinal Survey of Adolescent Health (AddHealth) are used for illustrative purposes.
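
A minimal sketch of the three-way comparison on simulated overdispersed counts, using standard statsmodels calls; the data-generating process and covariate are invented, not the AddHealth data.

```python
# Minimal sketch: fit OLS, Poisson, and negative binomial models to
# simulated overdispersed count data and check for overdispersion.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)
# Overdispersed counts with log-linear mean exp(0.5 * x)
y = rng.negative_binomial(n=2, p=2 / (2 + np.exp(0.5 * x)))

ols = sm.OLS(y, X).fit()                                   # often inappropriate
poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()

# Overdispersion check: Pearson chi2 / df >> 1 favors negative binomial.
print(poisson.pearson_chi2 / poisson.df_resid)
print(negbin.summary())
```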

10.
High-throughput cDNA microarray technology allows for the simultaneous analysis of gene expression levels for thousands of genes; as such, rapid, relatively simple methods are needed to store, analyze, and cross-compare basic microarray data. The application of a classical method of data normalization, Z score transformation, provides a way of standardizing data across a wide range of experiments and allows the comparison of microarray data independent of the original hybridization intensities. Data normalized by Z score transformation can be used directly in the calculation of significant changes in gene expression between different samples and conditions. We used Z scores to compare several different methods for predicting significant changes in gene expression, including fold changes, Z ratios, and Z and t statistical tests. We conclude that the Z score transformation normalization method, accompanied by either Z ratios or Z tests for significance estimates, offers a useful method for the basic analysis of microarray data. The results provided by these methods can be as rigorous as, and are no more arbitrary than, other test methods; in addition, they have the advantage that they can be easily adapted to standard spreadsheet programs.
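
A minimal sketch of Z score transformation and Z ratios as described above, applied to invented intensities for two samples; the 1.96 cutoff is an illustrative choice.

```python
# Minimal sketch: Z score normalization of log intensities within each
# array, then Z ratios between two samples. Data are invented.
import numpy as np

rng = np.random.default_rng(1)
raw_a = rng.lognormal(mean=8, sigma=1, size=10_000)   # sample A intensities
raw_b = rng.lognormal(mean=8, sigma=1, size=10_000)   # sample B intensities

def z_transform(raw):
    """Z score transformation: standardize log intensities within an array."""
    logged = np.log10(raw)
    return (logged - logged.mean()) / logged.std(ddof=1)

z_a, z_b = z_transform(raw_a), z_transform(raw_b)

# Z ratio: difference in Z scores scaled by the SD of all such differences.
diff = z_a - z_b
z_ratio = diff / diff.std(ddof=1)
significant = np.abs(z_ratio) > 1.96           # illustrative cutoff
print(significant.sum(), "genes flagged")
```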

11.
12.
Successful placement of disabled persons in gainful employment depends closely on both their training and the working environment available. Disabled persons who cannot find jobs in the open labor market can work in sheltered workshops. Founded on the initiative of parents of spastic children, the Saarbrücken Reha GmbH, a limited liability company for the sheltered employment of disabled persons, has for several years employed disabled people in the field of text and data processing. This paper outlines practical examples of suitable systems and describes the types of tasks in which disabled employees can achieve good results.

13.
A total of 84 volatile aroma components were determined in the nine samples spanning the processing of sugarcane to non-centrifugal sugar (NCS), including 15 alcohols, 12 aldehydes, 10 ketones, 17 carboxylic acids, 11 pyrazines, 7 phenols, 3 esters, 3 hydrocarbons, and 2 sulfur compounds. Of these compounds, 10 had high flavor dilution (FD) factors based on aroma extract dilution analysis (AEDA). 4-Hydroxy-2,5-dimethyl-3(2H)-furanone exhibited the highest FD factor of 2187, followed by (E)-2-nonenal, 2-hydroxy-3-methyl-2-cyclopentene-1-one, and 4-allyl-2,6-dimethoxyphenol, each with an FD factor of 729. The odor compounds showed no significant change and remained similar to those of sugarcane during the first four steps in the production of non-centrifugal cane sugar. In the middle three stages, heating slightly affected the aroma composition. Additionally, a prolonged period of high-temperature heating led to the formation of Maillard reaction products, such as pyrazines, pyrroles, and furans, distinguishing this step from the previous seven stages. However, the content of the NCS odorants was significantly reduced due to the loss of odor compounds during the drying process.

14.
Hauk O. NeuroImage 2004, 21(4): 1612-1621
The present study aims to find the optimal inverse solution for the bioelectromagnetic inverse problem in the absence of reliable a priori information about the generating sources. Three approaches to this problem are compared theoretically: the maximum-likelihood approach, the minimum norm approach, and the resolution optimization approach. It is shown that all three frameworks can make use of the same kind of a priori information if it is available, and that they yield the same solutions when the same a priori information is implemented. In particular, they all yield the minimum norm pseudoinverse (MNP) in the complete absence of such information. This indicates that the properties of the MNP, and in particular its limitations, such as the inability to localize sources in depth, are not specific to this method but are fundamental limitations of the recording modalities. The minimum norm solution provides the amount of information that is actually present in the data themselves, and it is therefore optimally suited to investigating the general resolution and accuracy limits of EEG and MEG measurement configurations. This strongly suggests that the classical minimum norm solution is a valuable method whenever no reliable a priori information about the source generators is available, that is, when complex cognitive tasks are employed or when very noisy data (e.g., single-trial data) are analyzed. For that purpose, an efficient and practical implementation of this method is suggested and illustrated with simulations using a realistic head geometry.
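
For reference, the (Tikhonov-regularized) minimum norm pseudoinverse itself is essentially a one-liner; the leadfield, data, and regularization value below are placeholders.

```python
# Minimal sketch of the classical regularized minimum norm pseudoinverse:
# the smallest-L2-norm current estimate consistent with the data.
import numpy as np

def minimum_norm(G, x, lam=1e-2):
    """G: (n_sensors, n_sources) leadfield; x: (n_sensors,) measurements.
    Returns s minimizing ||s||_2 subject to (regularized) G s ~= x."""
    n_sensors = G.shape[0]
    return G.T @ np.linalg.solve(G @ G.T + lam * np.eye(n_sensors), x)

G = np.random.randn(128, 2000)               # placeholder leadfield
x = G[:, 100] + 0.1 * np.random.randn(128)   # one active source + noise
s_hat = minimum_norm(G, x)
```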

15.
A cross-sectional survey was conducted to examine the construct validity and reliability of the Brisbane Practice Environment Measure in an Australian sample of registered nurses. Nurses were randomly selected from the database of an Australian nursing organization. The psychometric properties of the original 33 items of the Brisbane Practice Environment Measure were examined using confirmatory factor analysis. Cronbach's alpha was 0.938 for the total scale and ranged from 0.657 to 0.887 for the subscales. A five-factor structure of the measure was confirmed: χ² = 944.622 (P < 0.01), χ²/d.f. = 2.845, Tucker-Lewis Index = 0.929, root mean square error of approximation = 0.061, and Comparative Fit Index = 0.906. The 28 items selected proved reliable and valid in measuring effects of the practice environment on Australian nurses. The implication is that regular measurement of the practice environment using these 28 items might assist in developing strategies to improve job satisfaction and retention of registered nurses in Australia.
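
For readers who want to reproduce the reliability figure on their own data, Cronbach's alpha follows directly from its standard formula; the simulated responses below are purely illustrative, not the survey data.

```python
# Minimal sketch: Cronbach's alpha from a respondents-by-items matrix,
# alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) numeric responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
latent = rng.normal(size=(300, 1))                       # shared trait
responses = latent + 0.8 * rng.normal(size=(300, 28))    # 28 correlated items
print(round(cronbach_alpha(responses), 3))
```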

16.
In this study, we propose an extended Kalman filter approach for estimating human head tissue conductivities in vivo from electroencephalogram (EEG) data. Since the relationship between the surface potentials and the conductivity distribution is nonlinear, the proposed algorithm first linearizes the system and then applies extended Kalman filtering. Using a three-compartment realistic head model obtained from the magnetic resonance images of a real subject, a known dipole assumption, and 32 electrode positions, the performance of the proposed method is tested in simulation studies; the algorithm is shown to estimate the tissue conductivities with less than 1% error for noiseless measurements and less than 5% error when the signal-to-noise ratio is 40 dB or higher. We conclude that the proposed extended Kalman filter approach successfully estimates tissue conductivities in vivo.
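
A generic sketch of the extended Kalman filter update described above, with the conductivities as the state vector; the forward model and its Jacobian are abstract placeholders, not a real three-compartment head model.

```python
# Minimal sketch of one extended Kalman filter step for parameter
# estimation: the state holds the unknown conductivities, and the
# nonlinear forward model (conductivities -> scalp potentials) is
# linearized at the current estimate. `forward` and `jacobian` are
# placeholders, not an actual head model.
import numpy as np

def ekf_step(theta, P, y, forward, jacobian, Q, R):
    """theta: conductivity estimate; P: its covariance; y: measured EEG.
    forward(theta) -> predicted potentials; jacobian(theta) -> d forward/d theta."""
    # Predict: conductivities are (nearly) constant, i.e., a random walk.
    P = P + Q
    # Linearize the forward model and update.
    H = jacobian(theta)                       # (n_electrodes, n_params)
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    theta = theta + K @ (y - forward(theta))
    P = (np.eye(len(theta)) - K @ H) @ P
    return theta, P
```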

17.
18.
19.
20.