Similar Articles
20 similar articles found (search time: 31 ms)
1.
A three-dimensional computer modelling system has been developed for use in biology, and is currently running on a Sun3 computer. The data originate as a series of two-dimensional micrographs which are digitised via a TV camera. The two-dimensional images are used to select features of interest and to construct a three-dimensional model. This model can be viewed in vector or solid format; it can be rotated about three orthogonal axes and viewed in three dimensions as a stereo pair or an anaglyph. The system has been used in a large number of projects over the past 10–15 years, for example, to examine physiological and nerve structures. The time-consuming part of the process is the selection of features, which demands a high level of biological expertise. Present developments aim to reduce the time spent in feature recognition and involve the introduction of expert systems, together with human-computer interaction, to deal with problems of identification.

2.
A 32-channel data acquisition system for the survey of intercostal muscle activity in the cat has been developed. Electromyographic signals, picked up by electrodes implanted in the intercostal muscles, are conditioned to be digitised and recorded on magnetic tape through an on-line digital computer. In this signal conditioning, particular importance is given to motor unit selection and weighting, according to their size and distance from the electrode pick-up point. A low-pass filter, with a response time course similar to that observed in the twitch of the intercostal muscles of the cat, is also used. Since the data obtained in the process are time-locked, synchronised multichannel time-domain analysis can be achieved.

3.
A fast method of measuring the distribution of skin furrows is obtained by analysing TV camera images of skin replicas. These replicas are either negative (direct) or positive; reflected or transmitted light is used, depending on the type of replica. Each point of the matrix (256×256 pixels) is digitised over 6 bits (64 levels) and processed by a personal computer (IBM AT or PC). With such an image, significant results cannot be obtained without eliminating noise and enhancing the furrows. Specific software is written in 8086 assembly language to obtain a real-time system. The distribution of skin furrows is quantified for several cases of interest, such as the ageing of the skin or the effect of stress.
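A minimal sketch of the noise-elimination and furrow-enhancement step described above (a 3×3 median filter followed by thresholding on a toy 6-bit image; the paper's 8086 software is not described in detail, so the filter choice and threshold are assumptions):

```python
import statistics

def median_filter_3x3(img):
    """3x3 median filter to suppress impulse noise in a 6-bit (0-63) image."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = statistics.median(window)
    return out

def enhance_furrows(img, threshold=20):
    """Binarise: dark pixels (below threshold) are treated as furrows."""
    return [[1 if v < threshold else 0 for v in row] for row in img]

# Toy 5x5 "replica image": a dark two-pixel-wide furrow plus one bright noise spike.
img = [[40, 40, 10, 10, 40],
       [40, 40, 10, 10, 40],
       [40, 63, 10, 10, 40],   # 63 is a noise spike
       [40, 40, 10, 10, 40],
       [40, 40, 10, 10, 40]]
binary = enhance_furrows(median_filter_3x3(img))
```

After filtering, the noise spike is removed while the furrow survives the threshold.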

4.
The purpose of the present study is to describe the setup of an image analysis workstation designed for multiple users, and to show the application of digital imaging techniques to the analysis of electron microscopic images. The image analysis system consists of a conventional light microscope mounted on a table-top, vibration-free platform, a light box for viewing negatives, two separate video cameras, a switch box, a video monitor, a digitizing tablet, a computer, and morphometric software packages. The system can quantitate the amount that each of the 256 gray levels contributes to the image, perform morphometric analysis (e.g., shape and size) on individual gray level-defined subimages, and perform statistical analysis. Each operator has access to his or her own data and program setups through the use of 21.4-Mb removable Bernoulli cartridges. This setup for multiple users prevents the cluttering of the computer's hard drive and avoids the possibility of accidentally removing another user's stored data. The quantitative capabilities of the digital imaging system are demonstrated using an image of a normal lymphocyte and an apoptotic cell (i.e., a cell which has undergone programmed cell death), both captured on the same electron microscopic negative. A comparison of the histograms of nuclear densities determined for these two cells reveals subtleties in gray level distribution not appreciated by the naked eye.
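The gray-level quantitation described above amounts to building a 256-bin histogram of pixel values; a sketch with hypothetical pixel values (the darker apoptotic nucleus shifts the histogram mass toward low gray levels):

```python
def gray_histogram(pixels):
    """Count how much each of the 256 gray levels contributes to the image."""
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    return hist

def mean(pixels):
    return sum(pixels) / len(pixels)

# Hypothetical nuclear-region pixel values: condensed chromatin in the
# apoptotic nucleus images darker than the normal nucleus.
normal_nucleus = [120, 130, 125, 128, 122, 131]
apoptotic_nucleus = [30, 25, 40, 35, 28, 33]
h_norm = gray_histogram(normal_nucleus)
h_apop = gray_histogram(apoptotic_nucleus)
```

Comparing `h_norm` and `h_apop` bin by bin is the histogram comparison the abstract refers to.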

5.
Quantitative analysis of optical coherence tomography data can be strongly hampered by speckle. Here, we introduce a new method to reduce speckle, which takes advantage of Fourier-domain configurations and operates on individual axial scans. By subdividing the digitized spectrum into a number of distinct narrower windows, each with a different center frequency, several independent speckle patterns result. These can be averaged to yield a lower-resolution image with strongly reduced speckle. The full-resolution image remains available for human interpretation; the low-resolution version can be used for parametric imaging or quantitative analysis. We demonstrate this technique using intravascular optical frequency domain imaging data acquired in vivo.
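The split-spectrum idea can be sketched as follows (a naive inverse DFT on a toy spectrum with contiguous, non-overlapping sub-bands; window shape and overlap in the actual method may differ):

```python
import cmath

def idft(spectrum):
    """Naive inverse DFT: k-space samples -> complex depth profile."""
    n = len(spectrum)
    return [sum(spectrum[k] * cmath.exp(2j * cmath.pi * k * z / n)
                for k in range(n)) / n
            for z in range(n)]

def split_spectrum_average(spectrum, n_windows):
    """Split the digitised spectrum into sub-bands (each with a different
    centre frequency), reconstruct each at reduced axial resolution, and
    average the magnitudes of the independent speckle realisations."""
    w = len(spectrum) // n_windows
    profiles = [[abs(v) for v in idft(spectrum[i * w:(i + 1) * w])]
                for i in range(n_windows)]
    return [sum(p[z] for p in profiles) / n_windows for z in range(w)]

# Flat toy spectrum: reconstructs to a single reflector at depth 0.
avg = split_spectrum_average([1 + 0j] * 16, n_windows=4)
```

Each sub-band yields a shorter (lower axial resolution) profile; averaging their magnitudes is what suppresses the speckle.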

6.
The design of a suitable transducer to measure the pressure under the foot is presented. The transducer is attached to the sole of the foot and can, if required, be worn inside shoes. A system is described using up to 16 transducers, the analogue signals from which are digitised on-line and stored on magnetic tape for subsequent computer analysis. Examples of the data obtained from patients with arthritic involvement of the feet and from normal subjects, both walking with and without shoes, are given. The importance of the clinical application of this type of measurement is discussed. Proofs and reprints are available from R. W. Soames, Department of Anatomy, King's College London, Strand, London WC2R 2LS, England.

7.
A new method is demonstrated to visualise the microcirculation map of the human retina using a dynamic laser speckle effect. The retina is illuminated with a diode laser spot through a retinal camera, and its speckle image is detected by an area sensor. The output signal from the sensor is digitised, and the data for more than a hundred scans of the speckle image are stored in a mass image memory. The difference between each pair of successive images is calculated and integrated for each pixel. The results are displayed in colour graphics showing the spatial distribution of blood flow in the retina.
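The per-pixel integration of successive-frame differences might look like this sketch (toy data; frame sizes and the flow calibration are assumptions, not the instrument's actual processing):

```python
def speckle_flow_map(frames):
    """Integrate absolute differences between successive speckle frames,
    pixel by pixel. Larger accumulated differences indicate faster
    intensity fluctuation, i.e. higher blood flow."""
    h, w = len(frames[0]), len(frames[0][0])
    flow = [[0] * w for _ in range(h)]
    for prev, cur in zip(frames, frames[1:]):
        for y in range(h):
            for x in range(w):
                flow[y][x] += abs(cur[y][x] - prev[y][x])
    return flow

# Toy 1x2 image sequence: pixel (0,0) fluctuates (over a vessel),
# pixel (0,1) is static (avascular region).
frames = [[[10, 50]], [[30, 50]], [[5, 50]], [[40, 50]]]
flow = speckle_flow_map(frames)
```

The resulting `flow` map would then be colour-coded for display.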

8.
The vast majority of microscopic data in biology of the cell nucleus is currently collected using fluorescence microscopy, and most of these data are subsequently subjected to quantitative analysis. The analysis process unites a number of steps, from image acquisition to statistics, and at each of these steps decisions must be made that may crucially affect the conclusions of the whole study. This presents a serious problem, because the researcher is typically a biologist, while the decisions to be taken require expertise in physics, computer image analysis, and statistics. The researcher has to choose between multiple options for data collection, numerous programs for preprocessing and processing of images, and a number of statistical approaches. Written for biologists, this article discusses some of the typical problems and errors that should be avoided. The article was prepared by a team uniting expertise in biology, microscopy, image analysis, and statistics. It considers the options a researcher has at the stages of data acquisition (choice of the microscope and acquisition settings), preprocessing (filtering, intensity normalization, deconvolution), image processing (radial distribution, clustering, co-localization, shape and orientation of objects), and statistical analysis. Electronic supplementary material: The online version of this article (doi:) contains supplementary material, which is available to authorized users. O. Ronneberger and D. Baddeley contributed equally.

9.
Keratinocytes in skin epidermis, which have bright cytoplasmic contrast and dark nuclear contrast in reflectance confocal microscopy (RCM), were modeled with a simple error function reflectance profile: erf(·). Forty-two example keratinocytes were identified as a training set, which characterized the nuclear size a = 8.6±2.8 μm and reflectance gradient b = 3.6±2.1 μm at the nuclear/cytoplasmic boundary. These mean a and b parameters were used to create a rotationally symmetric erf(·) mask that approximated the mean keratinocyte image. A computer vision algorithm used the erf(·) mask to scan RCM images, identifying the coordinates of keratinocytes. Applying the mask to the confocal data identified the positions of keratinocytes in the epidermis. This simple model may be used to noninvasively evaluate keratinocyte populations as a quantitative morphometric diagnostic in skin cancer detection and evaluation of dermatological cosmetics.
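The rotationally symmetric erf mask could be built as follows (the a and b values come from the abstract; the mask size, pixel scale, and exact profile normalisation are assumptions):

```python
import math

def erf_keratinocyte_mask(size, a=8.6, b=3.6, pixel_um=1.0):
    """Rotationally symmetric erf model of a keratinocyte in RCM:
    dark nucleus (radius a/2) rising to bright cytoplasm across a
    boundary gradient of width b. Values scaled to [0, 1]."""
    c = size // 2
    mask = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            r = math.hypot(x - c, y - c) * pixel_um
            # erf step centred at the nuclear radius a/2, width b
            mask[y][x] = 0.5 * (1.0 + math.erf((r - a / 2) / b))
    return mask

mask = erf_keratinocyte_mask(21)
```

Sliding this mask over an RCM image and scoring the match at each position is the scanning step the abstract describes.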

10.
The paper describes an approach to solving the problem of providing a large-capacity image archive for diagnostic imaging departments at reasonable cost. Optical disk stores, when fitted retrospectively to scanners, are very expensive and may not be compatible with existing computer hardware. We describe the use of an industry standard personal computer (PC) linked to a standard 5 1/4-in. optical disk drive as a 'stand-alone' image store. Image data are transferred from the scanner using 8-in. floppy disks, and these are read into the PC using an attached 8-in. floppy disk drive and then transferred to the optical disk. The patient details (patient name, ID, date, etc.) are entered into a database program held on the PC, and these are used to generate a reference pointer to the optical disk file through which the data can be retrieved. Data retrieval involves entering the patient details into the database and inserting a blank 8-in. floppy disk into the drive attached to the PC. A sector copy is then made from the optical disk to the 8-in. floppy disk, which can then be used at the viewing station at the scanner. The system is flexible since it can accept data from a variety of sources in any format; it is also low-cost and operates independently of the scanner. The hardware is industry standard, ensuring low maintenance costs.

11.
Reducing the patient dose while keeping the image noise at the same level is desirable for X-ray CT examinations. To achieve this, we propose a new weighting scheme that takes the validity of the data and redundant data samples into account. The method is evaluated with a new generalized version of the Feldkamp helical reconstruction algorithm. It allows us to enlarge the projection angular range used in reconstruction and thus to reduce the image noise by increasing the detector utilization rate to 100% without sacrificing image quality or z-resolution. This concept can be adapted to other exact or approximate algorithms as long as they use redundant data samples.

12.
13.
The paper presents a microcomputer-aided pyroelectric thermal imaging system developed for medical use. The processing procedures of this system can be divided into two stages: data acquisition and noise reduction. In the first stage, the video signal produced by the pyroelectric vidicon camera is digitised in synchronism with a chopper. In the second stage, averaging, image-difference processing and local smoothing are applied to the resulting digital image. As a result, a temperature resolution of less than 0.5 °C was achieved on a four-bit digital image. The system was applied to pain management and demonstrated good performance in evaluating the effects of nerve-block operations.
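The second-stage pipeline (averaging, image-difference, local smoothing) can be sketched on toy frames; the background-frame subtraction and 3×3 mean kernel here are plausible stand-ins, not the paper's exact operations:

```python
def average_frames(frames):
    """Average repeated digitised frames to suppress random noise."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)] for y in range(h)]

def image_difference(img, background):
    """Subtract a reference (e.g. chopper-closed) frame to remove fixed pattern."""
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(img, background)]

def smooth_3x3(img):
    """Local 3x3 mean smoothing of interior pixels."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9
    return out

frames = [[[2, 2, 2], [2, 10, 2], [2, 2, 2]],
          [[4, 4, 4], [4, 10, 4], [4, 4, 4]]]
bg = [[3, 3, 3], [3, 3, 3], [3, 3, 3]]
avg = average_frames(frames)
diff = image_difference(avg, bg)
sm = smooth_3x3(diff)
```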

14.
M. K. Woo, A. Fung, P. O'Brien. Medical Physics 1992;19(5):1273–1275
In this work, the accuracy of the asymmetric jaws planning feature in a commercial treatment planning (TP) system is assessed. In the latest version of this software, the off-axis beam quality variation is handled by a function g(d,r), which is derived from measured horizontal beam profiles at four different depths. The calculated and measured isodoses for a 6-MV linear accelerator with asymmetric jaws agree to +/- 0.5% along the central axis and to within 2 mm at the beam edge. Formulas for treatment time calculations using the output data reported by the computer program are described, as well as formulas for manual calculations based on pregenerated data tables. Doses calculated from these formulas are compared to measurement; the accuracy is +/- 1% and +/- 2% for the computer and manual calculations, respectively. It is concluded that this version of the treatment planning system, as well as the treatment time calculation formulas, is adequate for asymmetric jaw computerized and manual treatment planning.

15.
A new approach to the processing of circular medical charts based on "odd symmetry digital subtraction" is presented. The technique allows automatic digitisation of these charts using an image processing system based on a flat-bed scanner. This approach is simple, fast, and reliable, and provides a system in which the digitised image is processed by a microcomputer to extract and quantify the plotted information. The main features and the potential applications of this approach in medical and nonmedical fields are discussed.

16.
A new method for the quantitative determination of capillary counts by computer image analysis
Objective: To determine capillary counts automatically by computer image analysis. Methods: Software was written in VB6.0; image binarisation and a scan-labelling method were used to count capillaries. Results: For tumour sections (vWF staining) from a mouse SRS solid-tumour model group and a group treated with oral shark cartilage powder, capillary counts per unit area obtained by automatic computer analysis were compared with manual counts; the correlation coefficients were 0.93 (P<0.001) and 0.99 (P<0.001) respectively, and a paired t-test showed no significant difference (P>0.05). Conclusion: This is a new method for the quantitative determination of capillary counts by computer image analysis, and it is very useful for studying angiogenesis in disease.
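The binarisation plus scan-labelling counting described in the Methods might be sketched as follows (a flood-fill connected-component count stands in for the labelling step; this is not the authors' VB6.0 code, and the threshold is hypothetical):

```python
def count_capillaries(img, threshold):
    """Binarise a stained-section image and count connected components
    (4-connectivity) as capillary cross-sections."""
    h, w = len(img), len(img[0])
    binary = [[1 if img[y][x] >= threshold else 0 for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                count += 1
                stack = [(y, x)]          # flood-fill one labelled region
                while stack:
                    cy, cx = stack.pop()
                    if not (0 <= cy < h and 0 <= cx < w):
                        continue
                    if seen[cy][cx] or not binary[cy][cx]:
                        continue
                    seen[cy][cx] = True
                    stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count

# Toy vWF-stained field: two separate stained regions -> two capillaries.
img = [[9, 9, 0, 0, 0],
       [9, 9, 0, 8, 8],
       [0, 0, 0, 8, 8]]
n = count_capillaries(img, threshold=5)
```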

17.
A system is presented which tracks the three-dimensional trajectory of points on the human body by means of the digitised co-ordinates of corresponding points obtained from multiple perspective views. Recursive algorithms for calibration and localisation based on vector algebra are described, and computer graphics are then used to display the localised movements.

18.
A method is described for analytically locating the precise position in object space of an image point, such as a foreign body, using simple stereo X-ray pictures and the digitised co-ordinates of common image points identified in the stereo pictures.
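The abstract gives no formulas, but for an idealised parallel-shift stereo geometry (tube translated by a fixed baseline between exposures) the localisation reduces to the sketch below; all numbers and the geometry itself are illustrative assumptions, not the paper's method:

```python
def locate_point(xl, xr, y, focal_mm, baseline_mm):
    """Locate a point in object space from a parallel-axis stereo pair.
    xl, xr: horizontal image co-ordinates (mm) of the same point in the
    first and second pictures; y: vertical image co-ordinate (mm)."""
    disparity = xl - xr
    if disparity == 0:
        raise ValueError("point at infinity: zero disparity")
    Z = focal_mm * baseline_mm / disparity   # depth from the source plane
    X = xl * Z / focal_mm
    Y = y * Z / focal_mm
    return X, Y, Z

# Hypothetical numbers: 1000 mm focus-film distance, 80 mm tube shift.
X, Y, Z = locate_point(xl=200.0, xr=40.0, y=5.0, focal_mm=1000.0, baseline_mm=80.0)
```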

19.
Myocardial signal intensity curves for myocardial perfusion studies may be made quantitative by the use of T1 measurements made after the first pass of contrast agent. A short data acquisition method for T1 mapping is presented, in which all data for each T1 map are acquired in a short breath hold, and the slice geometry and timing in the cardiac cycle exactly match those of the dynamic first-pass perfusion sequence. This allows accurate image registration of the T1 map with the first-pass series of images. The T1 method is based on varying the preparation-pulse delay time of a saturation recovery sequence, and in this implementation employs an ECG-triggered, single-shot, spoiled gradient echo technique with SENSE reconstruction. The method allows T1 estimates of three slices to be made in fifteen heartbeats. For a range of samples with T1 values equivalent to those found in the myocardium during the first pass of contrast agent, T1 estimates were accurate to within 6%, and the variation between slices was 2% or less.
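One way to sketch T1 estimation from varying preparation-pulse delays is a log-linear fit on the ideal saturation-recovery model S(TD) = S0·(1 − exp(−TD/T1)); the actual fitting procedure in the paper may differ, and the delays and S0 below are synthetic:

```python
import math

def fit_t1(delays_ms, signals, s0):
    """Estimate T1 from saturation-recovery signals by linear regression of
    ln(1 - S/S0) against the preparation-pulse delay TD (zero intercept)."""
    xs, ys = [], []
    for td, s in zip(delays_ms, signals):
        ratio = 1.0 - s / s0
        if ratio > 0:                      # skip fully recovered samples
            xs.append(td)
            ys.append(math.log(ratio))
    # least squares through the origin: slope = sum(x*y)/sum(x*x) = -1/T1
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return -1.0 / slope

# Synthetic recovery curve with T1 = 300 ms
t1_true, s0 = 300.0, 100.0
delays = [50, 100, 200, 400, 800]
signals = [s0 * (1 - math.exp(-td / t1_true)) for td in delays]
t1_est = fit_t1(delays, signals, s0)
```

On noiseless synthetic data the fit recovers the true T1 exactly; with real data, S0 itself must be measured (e.g. from a long-delay acquisition).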

20.
In the past, telepathology has typically been used as a qualitative procedure rather than for quantitative measurement. DNA ploidy analysis by image analysis has compared favorably with DNA ploidy analysis by flow cytometry in numerous publications. A step from DNA ploidy analysis using conventional image analysis to DNA ploidy analysis using stored images allows DNA ploidy analysis by image cytometry to become a powerful tool in telepathology. Remote DNA ploidy analysis using stored images has an impact on the field of pathology, as not every hospital or laboratory can afford to perform this type of specialized testing. However, images produce large data files and require lengthy transmission times over communication systems to other computers. Joint Photographic Experts Group (JPEG) compression is a computer algorithm that reduces the file size of an image in order to decrease transmission times to another computer. A study was initiated to investigate the effects of JPEG compression on images of Feulgen-stained breast tumor touch preps and the resulting DNA ploidy histograms. Diagn Cytopathol 1996;15:231–236. © 1996 Wiley-Liss, Inc.
