Similar Articles
 20 similar articles found (search time: 15 ms)
1.
The present state of the art in the solution of the inverse problem by neuromagnetic measurements, as demonstrated by recent important findings, is encouraging. Nevertheless, significant instrumental and analytical problems still need to be solved, in order to significantly improve the capability of the technique. This paper is aimed at providing a perspective for these problems and their possible solutions.

2.
The techniques of magnetoencephalography (MEG) have advanced considerably in recent years, with several major installations now being built or planned. In this paper, the present status of MEG within clinical medicine is evaluated and compared with that of other large diagnostic instruments. Deficiencies in present instruments and procedures are discussed. It is argued that in the future methods must be found for investigating sub-cortical structures, which are relevant in the majority of clinically significant conditions. Some important possible lines of enquiry are indicated.

3.
Using the boundary element method in conjunction with Tikhonov zero-order regularization, we have computed epicardial potentials from body surface potential data in a realistic geometry heart-torso system. The inverse-reconstructed epicardial potentials were compared to the actual measured potentials throughout a normal cardiac cycle. Potential features (maxima, minima) were recovered with an accuracy better than 1 cm in their location. In this chapter, we use these data to illustrate and discuss computational issues related to the inverse-reconstruction procedure. These include the boundary element method, the choice of a regularization scheme to stabilize the inversion, and the effects of incorporating a priori information on the accuracy of the solution. In particular, emphasis is on the use of temporal information in the regularization procedure. The sensitivity of the solution to geometrical errors and to the spatial and temporal resolution of the data is discussed.
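Tikhonov zero-order regularization, as referred to above, stabilizes the inversion by minimizing ||Ax - b||^2 + lambda^2 ||x||^2. A minimal sketch in Python, where the ill-conditioned transfer matrix, noise level, and regularization parameter are all invented for illustration rather than taken from the chapter:

```python
import numpy as np

def tikhonov_zero_order(A, b, lam):
    """Closed-form solution of min ||A x - b||^2 + lam^2 ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Invented ill-conditioned forward operator standing in for the torso
# transfer matrix (condition number ~1e8).
rng = np.random.default_rng(0)
U, _, Vt = np.linalg.svd(rng.standard_normal((40, 30)), full_matrices=False)
A = U @ np.diag(np.logspace(0, -8, 30)) @ Vt
x_true = rng.standard_normal(30)
b = A @ x_true + 1e-6 * rng.standard_normal(40)   # noisy "body-surface" data

x_naive = np.linalg.lstsq(A, b, rcond=None)[0]    # unregularized: noise blows up
x_reg = tikhonov_zero_order(A, b, lam=1e-5)       # stabilized inversion
print(np.linalg.norm(x_naive - x_true), np.linalg.norm(x_reg - x_true))
```

The regularized solution trades a small bias for a large reduction in noise amplification, which is the essential point of the scheme.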

4.
Inverse inference of the sources of brain electrical activity from scalp potential measurements is of great scientific significance and clinical value for studying the patterns of neural electrical activity and exploring the cognitive functions of the brain. This paper reviews the state of research on the EEG inverse problem in detail, covering head models, dipole models, algorithms for the forward problem, and the solution of the inverse problem, with a more in-depth discussion of global optimization algorithms.
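For context, the simplest forward model underlying dipole-based source analysis is the potential of a current dipole in an unbounded homogeneous conductor; realistic head models (multi-sphere, BEM) add corrections to this. A minimal sketch, with all numerical values invented for illustration:

```python
import numpy as np

def dipole_potential(r, r0, p, sigma=0.33):
    """Potential of a current dipole p (A*m) at position r0 in an unbounded
    homogeneous conductor of conductivity sigma (S/m)."""
    d = r - r0
    return p @ d / (4 * np.pi * sigma * np.linalg.norm(d) ** 3)

# Hypothetical example: a 10 nA*m tangential dipole 7 cm from the origin,
# observed at a "scalp" point 9 cm out.
p  = np.array([10e-9, 0.0, 0.0])    # dipole moment (A*m)
r0 = np.array([0.0, 0.0, 0.07])     # source position (m)
r  = np.array([0.03, 0.0, 0.09])    # measurement point (m)
v = dipole_potential(r, r0, p)
print(f"{v * 1e6:.3f} microvolts")
```

The forward problem evaluates this (or its head-model refinement) at every electrode; the inverse problem searches over r0 and p to match measured potentials, which is where the global optimization algorithms discussed in the paper come in.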

5.
6.
This paper reviews those inverse electrocardiographic solutions that compute the electrical activity of the heart in terms of equivalent sources such as multipoles or multiple dipoles, as opposed to more realistic source formulations such as epicardial potentials. It treats, in succession, inverse solutions in terms of a single fixed-location dipole, a multipole series, moving dipoles, and, finally, multiple fixed-location dipoles. For each category of solution, simulation studies, animal experiments, and work involving human subjects are reviewed. Finally, more recent work that seeks to compute the cardiac activation isochrones, from the time integrals of the torso potentials during the QRS complex of the electrocardiogram, is described. The paper concludes with a discussion on the future of inverse electrocardiographic solutions in terms of equivalent sources.

7.
8.
The inverse problem of electrocardiography aims at noninvasively reconstructing electrical activity of the heart from recorded body-surface electrocardiograms. A crucial step is regularization, which deals with ill-posedness of the problem by imposing constraints on the possible solutions. We developed a regularization method that includes electrophysiological input. Body-surface potentials are recorded and a computed tomography scan is performed to obtain the torso–heart geometry. Propagating waveforms originating from several positions at the heart are simulated and used to generate a set of basis vectors representing spatial distributions of potentials on the heart surface. The real heart-surface potentials are then reconstructed from the recorded body-surface potentials by finding a sparse representation in terms of this basis. This method, which we named ‘physiology-based regularization’ (PBR), was compared to traditional Tikhonov regularization and validated using in vivo recordings in dogs. PBR recovered details of heart-surface electrograms that were lost with traditional regularization, attained higher correlation coefficients and led to improved estimation of recovery times. The best results were obtained by including approximate knowledge about the beat origin in the PBR basis.

9.
This is a review of the role of model and computational experiments in studies of the part of the biomagnetic inverse problem that deals with the determination of electrical sources in the body using magnetic measurements around the body. Results from modelling studies of the forward problem that are important for the inverse problem are also reviewed. An evaluation is made of the adequacy of various models of the body for use in the biomagnetic inverse problem. This evaluation indicates that simple torso models, e.g. a semi-infinite volume or sphere, are probably inadequate. The review of the modelling studies of the inverse problem includes the effects of noise, source modelling errors, body modelling errors and measurement errors on the accuracy of source localisation methods using magnetic measurements. Source modelling errors are caused by differences between an actual complex source in the body and the simple model of it used in most source localisation methods; body modelling errors are caused by differences between the actual body and a simple model of it. The review indicates that typical experimental noise will only cause significant source localisation errors for inverse solutions calculated using fewer than approximately ten measurement points; it also indicates that source modelling errors must be rather large to be detectable when typical experimental noise is present. In addition, the review indicates that many experimental measurement errors will not cause significant localisation errors. The effects of body modelling errors are largely unknown. Suggestions for further biomagnetic inverse problem research are given. These include the development of more realistic models of the body, the experimental verification of such models and source localisation methods, and the development of methods for detecting and localising distributed or multiple discrete sources.
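The source-localisation studies surveyed above can be mimicked with a toy scanning procedure: for each candidate source position, fit the source amplitude in closed form and keep the position with the smallest residual. The 1-D lead field below is invented purely for illustration (it is not a magnetic forward model), as are the sensor count and noise level:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D "lead field": sensors at x_i see a source at s with gain
# proportional to 1/((x_i - s)^2 + depth^2).
sensors = np.linspace(-0.1, 0.1, 16)   # 16 measurement points (m)
depth = 0.03

def forward(s):
    return 1.0 / ((sensors - s) ** 2 + depth ** 2)

s_true = 0.021
data = forward(s_true) + rng.normal(0, 5.0, sensors.size)  # noisy measurements

# Scan candidate positions; for each, the best-fit amplitude is a
# closed-form projection, so only position needs a search.
grid = np.linspace(-0.1, 0.1, 2001)
def misfit(s):
    g = forward(s)
    a = g @ data / (g @ g)             # least-squares source amplitude
    return np.sum((data - a * g) ** 2)

s_hat = grid[np.argmin([misfit(s) for s in grid])]
print(abs(s_hat - s_true))
```

With realistic noise and enough measurement points the position estimate is stable, which is consistent with the review's observation that localisation degrades mainly when very few measurement points are used.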

10.
This study evaluated the effectiveness of the Problem Solving For Life program as a universal approach to the prevention of adolescent depression. Short-term results indicated that participants with initially elevated depression scores (high risk) who received the intervention showed a significantly greater decrease in depressive symptoms and increase in life problem-solving scores from pre- to postintervention compared with a high-risk control group. Low-risk participants who received the intervention reported a small but significant decrease in depression scores over the intervention period, whereas the low-risk controls reported an increase in depression scores. The low-risk group reported a significantly greater increase in problem-solving scores over the intervention period compared with low-risk controls. These results were not maintained, however, at 12-month follow-up.

11.
It has long been speculated that incorporation of available time constraints into the inverse electrocardiography problem could improve the accuracy of maps of epicardial potential or activation reconstructed from body surface potential measurements. However, all prior formulations of this problem have remained ill-posed, and the best way to utilize these constraints has been unclear. By making proper use of the timing information, we show that the inverse electrocardiography problem (for calculation of ventricular surface activation isochrones) is formally well-posed under anisotropic bidomain conditions and the assumption that ventricular muscle action potential phase 0 is a step discontinuity. In practical terms, this implies that non-regularized stable activation map solutions are possible if correlates of derived body surface potential derivative discontinuity times can be identified from the noisy analog signals, and only a small number of ventricular surface activation function extrema occur during a unit of time resolution defined by phase zero duration over the spatial extent of a bidomain point. We include a quasi-realistic numerical example illustrating the ease with which the extrema of the endocardial and epicardial activation maps are computed via Jump Maps derived from body surface potentials (this being the crucial step in rendering images of ventricular surface activation in this approach). The efficient signal processing algorithm used to accomplish this task is well suited to the setting of multiple extrema occurring during overlapping phase zero time intervals.

12.
Phantom tests are performed for pre-clinical evaluation of a commercial inverse planning system (HELAX TMS, V 6.0) for segmented multileaf collimator (MLC) intensity modulated radiotherapy (IMRT) delivery. The optimization module offers two optimization algorithms, the target primary feasibility algorithm and the weighted feasibility algorithm; only the latter allows the user to specify weights for structures. In the first series, single beam tests are performed to evaluate the outcome of inverse planning in terms of plausibility for the following situations: oblique incidence, presence of inhomogeneities, multiple targets at different depths and multiple targets with different desired doses. Additionally, for these tests a manual plan is made for comparison. In the absence of organs at risk, both optimization algorithms are found to assign the highest priority to low dose constraints for targets. In the second series, tests resembling clinically relevant configurations (simultaneous boost and concave target with critical organ) are performed with multiple beam arrangements in order to determine the impact of the system's configuration on inverse planning. It is found that the definition of certain segment number and segment size limitations does not largely compromise treatment plans when using multiple beams. On the other hand, these limitations are important for delivery efficiency and dosimetry. For the number of iterations and voxels per volume of interest, standard values in the system's configuration are considered to be sufficient. Additionally, it is demonstrated that precautions must be taken to precisely define treatment goals when using computerized treatment optimization. Similar phantom tests could be used for a direct dosimetric verification of all steps from inverse treatment planning to IMRT delivery.

13.
In previously published studies, blood flow velocity from x-ray biplane angiography was measured by solving an inverse advection problem, relating velocity to bolus densities summed across sections. Both spatial and temporal velocity variations were recovered through a computationally expensive parameter estimation algorithm. Here we prove the existence and uniqueness of the solution on three sub-domains of the plane defined by the axial position along the vessel and the time of the angiographic sequence. A fast direct scheme was designed in conjunction with a regularization step stemming from the volume flow conservation law applied on consecutive segments. Its accuracy and immunity to noise were tested on both simulated and real densitometric data. The relative error between the estimated and expected velocities was less than 5% for more than 90% of the points of the spatiotemporal plane with simulated densities normalized to 1.0 and a Gaussian additive noise of standard deviation 0.01. For densities reconstructed from a biplane angiographic sequence, increase in velocity is used as a functional index for the stenosis ratio and to characterize the sharing of flow at bifurcation.
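The core relation behind such inverse advection schemes is the transport equation rho_t + v rho_x = 0, from which v = -rho_t / rho_x wherever the spatial gradient is nonzero. A toy sketch on synthetic densities (the bolus shape, velocity, and grid below are invented stand-ins for the summed angiographic densities):

```python
import numpy as np

# Synthetic bolus density rho(x, t) advected at constant velocity v_true.
v_true = 0.12                               # m/s
x = np.linspace(0.0, 0.1, 200)              # axial position (m)
t = np.linspace(0.0, 1.0, 400)              # time (s)
X, T = np.meshgrid(x, t, indexing="ij")
rho = np.exp(-((X - v_true * T - 0.02) ** 2) / (2 * 0.005 ** 2))

# Advection equation rho_t + v rho_x = 0  =>  v = -rho_t / rho_x
rho_x, rho_t = np.gradient(rho, x, t)
mask = np.abs(rho_x) > 1.0                  # avoid dividing by near-zero gradients
v_est = np.median(-rho_t[mask] / rho_x[mask])
print(v_est)
```

Real densitometric data additionally require the regularization step the paper describes (volume flow conservation across consecutive segments) to tame noise, which this constant-velocity toy does not need.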

14.
Elastography is emerging as an imaging modality that can distinguish normal versus diseased tissues via their biomechanical properties. This paper reviews current approaches to elastography in three areas: quasi-static, harmonic and transient, and describes inversion schemes for each elastographic imaging approach. Approaches include first-order approximation methods; direct and iterative inversion schemes for linear elastic, isotropic materials; and advanced reconstruction methods for recovering parameters that characterize complex mechanical behavior. The paper's objective is to document efforts to develop elastography within the framework of solving an inverse problem, so that elastography may provide reliable estimates of shear modulus and other mechanical parameters. We discuss issues that must be addressed if model-based elastography is to become the prevailing approach to quasi-static, harmonic and transient elastography: (1) developing practical techniques to transform the ill-posed problem into a well-posed one; (2) devising better forward models to capture the complex mechanical behavior of soft tissues and (3) developing better test procedures to evaluate the performance of modulus elastograms.
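The first-order approximation mentioned above is simplest in a 1-D quasi-static setting: elements in series carry the same stress, so relative modulus is the reciprocal of measured strain. A deliberately minimal sketch with invented values:

```python
import numpy as np

# 1-D quasi-static toy: springs in series share the same axial stress,
# so relative modulus = 1 / measured strain (first-order method).
E_true = np.array([10.0, 10.0, 40.0, 10.0])  # kPa; element 2 is a stiff "lesion"
stress = 1.0                                  # uniform axial stress (kPa)
strain = stress / E_true                      # what the imaging system measures

E_rel = (1.0 / strain) / (1.0 / strain)[0]    # modulus relative to background
print(E_rel)                                  # -> [1. 1. 4. 1.]
```

The direct, iterative, and advanced schemes the review covers exist precisely because this first-order picture breaks down in 2-D/3-D, for non-uniform stress fields, and for viscoelastic or anisotropic tissue.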

15.
16.
Regularization is an effective method for the solution of ill-posed ECG inverse problems, such as computing epicardial potentials from body surface potentials. The aim of this work was to explore more robust regularization-based solutions through the application of subspace preconditioned LSQR (SP-LSQR) to the study of model-based ECG inverse problems. Here, we applied three different subspace-splitting schemes (SVD, wavelet transform and cosine transform) to the design of preconditioners for ill-posed problems, and evaluated the performance of the algorithms using a realistic heart-torso model simulation protocol. The results demonstrated that, when compared with the LSQR, LSQR-Tik and Tik-LSQR methods, SP-LSQR was more efficient and reconstructed more accurate epicardial potential distributions. Amongst the three subspace-splitting schemes, the SVD-based preconditioner yielded the best convergence rate and outperformed the other two in seeking the inverse solutions. Moreover, when optimized by a genetic algorithm (GA), the performance of the SP-LSQR method was further enhanced. The results from this investigation suggest that SP-LSQR is a useful regularization technique for cardiac inverse problems.
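SP-LSQR augments LSQR with a subspace preconditioner, which is beyond a short sketch; but plain damped LSQR, the baseline it is compared against, is available directly in SciPy. A toy comparison on an invented ill-conditioned matrix (not the paper's heart-torso model):

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(2)

# Invented ill-conditioned matrix standing in for the heart-torso
# transfer operator.
U, _, Vt = np.linalg.svd(rng.standard_normal((60, 40)), full_matrices=False)
A = U @ np.diag(np.logspace(0, -6, 40)) @ Vt
x_true = rng.standard_normal(40)
b = A @ x_true + 1e-4 * rng.standard_normal(60)

# Damped LSQR solves min ||Ax - b||^2 + damp^2 ||x||^2 iteratively;
# limiting the iteration count itself acts as regularization.
x_lsqr = lsqr(A, b, damp=1e-3, iter_lim=50)[0]
x_naive = np.linalg.lstsq(A, b, rcond=None)[0]   # unregularized comparison
print(np.linalg.norm(x_lsqr - x_true), np.linalg.norm(x_naive - x_true))
```

The subspace-splitting idea in the paper amounts to solving exactly on a small, well-conditioned subspace (e.g. the leading SVD directions) and iterating only on the complement, which accelerates convergence relative to this plain damped iteration.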

17.

Objective

It has been recognised in a review of the developments of lower-limb prosthetic socket fitting processes that the future demands new tools to aid in socket fitting. This paper presents the results of research to design and clinically test an artificial intelligence approach, specifically inverse problem analysis, for the determination of the pressures at the limb/prosthetic socket interface during stance and ambulation.

Methods

Inverse problem analysis is based on accurately calculating the external loads or boundary conditions that can generate a known amount of strain, stresses or displacements at pre-determined locations on a structure. In this study a backpropagation artificial neural network (ANN) is designed and validated to predict the interfacial pressures at the residual limb/socket interface from strain data collected from the socket surface. The subject of this investigation was a 45-year-old male unilateral trans-tibial (below-knee) traumatic amputee who had been using a prosthesis for 22 years.

Results

When comparing the ANN-predicted interfacial pressures on 16 patches within the socket with the actual pressures applied to the socket, a difference of 8.7% was found, validating the methodology. Investigation of varying axial load through the subject's prosthesis, alignment of the subject's prosthesis, and pressure at the limb/socket interface during walking demonstrates that the validated ANN is able to give an accurate full-field study of the static and dynamic interfacial pressure distribution.

Conclusions

To conclude, a methodology has been developed that enables a prosthetist to quantitatively analyse the distribution of pressures within the prosthetic socket in a clinical environment. This will facilitate the “right first time” approach to socket fitting, which will benefit both the patient, in terms of comfort, and the prosthetist, by reducing the time and associated costs of providing a high level of socket fit.
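The backpropagation network used in the study is architecturally standard. A self-contained sketch trained on synthetic strain-to-pressure data (the 8-gauge input dimension, the 16-patch output, and the random generating map are invented stand-ins, not the study's measured or finite-element data):

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented data: 8 hypothetical socket-surface strain gauges mapped to
# 16 interface pressures through a fixed random nonlinear function.
W_map = rng.standard_normal((8, 16))
strains = rng.standard_normal((200, 8))
pressures = np.tanh(strains @ W_map)

# One-hidden-layer network trained with plain backpropagation.
W1 = 0.1 * rng.standard_normal((8, 32)); b1 = np.zeros(32)
W2 = 0.1 * rng.standard_normal((32, 16)); b2 = np.zeros(16)
lr, losses = 0.05, []
for _ in range(500):
    h = np.tanh(strains @ W1 + b1)          # forward pass
    out = h @ W2 + b2
    err = out - pressures
    losses.append(np.mean(err ** 2))
    g_out = 2 * err / err.size              # backward pass (gradient of MSE)
    gW2 = h.T @ g_out; gb2 = g_out.sum(0)
    g_h = g_out @ W2.T * (1 - h ** 2)       # tanh derivative
    gW1 = strains.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
print(losses[0], losses[-1])
```

In the study the equivalent network was trained on measured strain/pressure pairs for one subject, then used to predict full-field interfacial pressures from strain alone.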

18.
Despite over 30 years of widespread research, there does not exist today a valid method of mathematically transforming ECG, VCG, etc. data into a form that provides approximate diagnoses such as “left ventricular infarct of moderate size”. This paper reviews the major efforts on this problem including a recent stochastic identification approach suggested and tested by the author(1,2) and outlines and assesses difficulties inherent in these techniques. In the development of some approaches such as Fourier analysis of ECG's it appears that at best one can only succeed in transforming the difficult problem of relating ECG's to physiological condition of the myocardium into the more difficult and possibly unsolvable problem of relating Fourier coefficients derived from ECG's to the physiological condition. The advantages of using constrained dipole models were recently described. (3) The author feels that the most promising current technique under study is a modified version of the above-mentioned stochastic identification approach which employs dipole models in which the parameters have a range of several discrete values. A computer program that employs this approach will be described in detail.

19.
Patients with advanced gynaecological cancer are often treated with a temporary interstitial implant using the Syed template and Ir-192 ribbons at the Memorial Sloan-Kettering Cancer Center. Urgency in planning is great. We created a computerized inverse planning system for the Syed temporary gynaecological implant, which optimized the ribbon strengths a few seconds after catheter digitization. Inverse planning was achieved with simulated annealing. We discovered that hand-drawn target volumes had drawbacks; hence instead of producing a grid of points based on target volume, the optimization points were generated directly from the catheter positions without requiring an explicit target volume. Since all seeds in the same ribbon had the same strength, the minimum doses were located at both ends of the implant. Optimization points generated at both ends ensured coverage of the whole implant. Inverse planning took only a few seconds, and generated plans that provide a good starting point for manual improvement.
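Simulated annealing as used above proposes small random changes to the ribbon strengths and accepts worsening moves with a temperature-dependent probability, escaping local minima as the temperature cools. A toy sketch in which the geometry/dose kernel G, the prescription, and the cooling schedule are all invented, not the planning system's:

```python
import math
import random

random.seed(4)

# Invented dose kernel: dose at point i from ribbon j falls off with |i - j|.
G = [[1.0 / (1 + abs(i - j)) for j in range(5)] for i in range(8)]
target = [2.0] * 8                      # uniform prescription at 8 points

def cost(w):
    return sum((sum(G[i][j] * w[j] for j in range(5)) - target[i]) ** 2
               for i in range(8))

w = [1.0] * 5                           # initial ribbon strengths
c = cost(w)
best, best_cost, T = w[:], c, 1.0
for _ in range(5000):
    j = random.randrange(5)
    cand = w[:]
    cand[j] = max(0.0, cand[j] + random.gauss(0, 0.1))  # keep strengths >= 0
    c_new = cost(cand)
    # Accept improvements always, worsenings with probability exp(-dc/T).
    if c_new < c or random.random() < math.exp(-(c_new - c) / T):
        w, c = cand, c_new
    if c < best_cost:
        best, best_cost = w[:], c
    T *= 0.999                          # geometric cooling
print(best_cost)
```

Because the objective is cheap to evaluate, thousands of annealing steps complete in well under a second, consistent with the paper's few-second planning times.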

20.
A mathematical method of extracting salient features from electroencephalographic data, called eigenfunction analysis, is presented. It allows the reduction of 21 channels of EEG data to a few components, which can be separated into those likely to originate relatively close to the surface and others of deeper origin. It was demonstrated that the original tracings can be reconstituted from these few components. The eigenvectors give an indication of the location of sources, and the degree to which an eigenfunction appears on source-derivation and average-reference recordings allows an estimation of relative depth. The method has been successfully applied to EEG tracings from 10 patients and is illustrated in the case of a young woman suffering from complex partial seizures associated with a deep left temporal lesion. The implications for marked data reduction and the development of objective assessment of clinical neurophysiologic data are discussed.
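Eigenfunction analysis of multichannel EEG is, in modern terms, an SVD/principal-component decomposition of the channel-by-time matrix: keep the few dominant components, then reconstitute the tracings from them. A sketch on synthetic data (the three sinusoidal sources, mixing matrix, and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic 21-channel "EEG": 3 underlying sources mixed into 21 channels.
n_ch, n_t = 21, 1000
t = np.linspace(0, 4, n_t)
sources = np.vstack([np.sin(2 * np.pi * 10 * t),
                     np.sin(2 * np.pi * 6 * t + 1.0),
                     np.sin(2 * np.pi * 3 * t + 2.0)])
mixing = rng.standard_normal((n_ch, 3))
eeg = mixing @ sources + 0.05 * rng.standard_normal((n_ch, n_t))

# Decompose, keep k dominant components, reconstitute the tracings.
U, s, Vt = np.linalg.svd(eeg, full_matrices=False)
k = 3
eeg_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]
resid = np.linalg.norm(eeg - eeg_k) / np.linalg.norm(eeg)
print(resid)
```

The columns of U play the role of the paper's eigenvectors: their spatial patterns across electrodes indicate source location, and the reconstruction shows that a handful of components suffices to reconstitute the original 21 channels.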

