Similar documents
A total of 20 similar documents were retrieved.
1.
Quantitative analysis of contrast-enhanced dynamic MR images has potential for diagnosing prostate cancer. Contemporary fast acquisition techniques can give sufficiently high temporal resolution to sample the fast dynamics observed in the prostate. Data reduction for parametric visualization requires automatic curve fitting to a pharmacokinetic model, which to date has been performed using least-squares error minimization methods. We observed that these methods often produce unexpectedly noisy estimates, especially for the intermediate parameters time-to-peak and start-of-enhancement, which are typically fast, resulting in inaccurate pharmacokinetic parameter estimates. We developed a new curve-fit method that focuses on the most probable slope. A set of 10 patients annotated using histopathology was used to compare the conventional and new methods. The results show that our new method is significantly more accurate, especially in the relatively less-enhancing peripheral zone. We conclude that estimation accuracy depends on the curve-fit method, which is especially important when evaluating the peripheral zone of the prostate.
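The abstract does not specify the enhancement model or fitting routine. As a minimal, hedged sketch of the kind of least-squares curve fit it describes, the snippet below fits a simple delayed-exponential enhancement model to a synthetic signal-time curve; the model form, parameter names and use of scipy are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: ordinary least-squares fit of a simple delayed-exponential
# enhancement model S(t) = S0 + A*(1 - exp(-k*(t - t0))) for t >= t0.
# Model and numbers are illustrative only, not the paper's method or data.
import numpy as np
from scipy.optimize import curve_fit

def enhancement(t, s0, a, k, t0):
    dt = np.clip(t - t0, 0.0, None)          # no enhancement before onset t0
    return s0 + a * (1.0 - np.exp(-k * dt))

rng = np.random.default_rng(0)
t = np.arange(0.0, 120.0, 2.0)                   # seconds, ~2 s temporal resolution
truth = enhancement(t, 100.0, 80.0, 0.15, 20.0)  # synthetic "tissue" curve
signal = truth + rng.normal(0.0, 5.0, t.size)    # add acquisition noise

p0 = [signal[:5].mean(), np.ptp(signal), 0.1, 10.0]   # rough initial guesses
popt, _ = curve_fit(enhancement, t, signal, p0=p0, maxfev=10000)
s0, a, k, t0 = popt
print(f"start-of-enhancement ~ {t0:.1f} s, uptake rate k ~ {k:.3f} 1/s")
```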

2.
The purpose of this study was to extract the intercompartmental rate constants analytically from the coefficients and exponents of the exponential sum fitted to tracer data points for a unique four-compartment mixed mammillary/catenary (extended mammillary, radial) model, with losses solely from the central (blood) compartment. This model is useful in estimating myocardial and skeletal blood flow. Using Laplace transforms, the transfer functions of the exponential sum and the differential equations describing the model were obtained and the corresponding coefficients equated sequentially, which yielded the intercompartmental rate constants. Three solution sets of rate constants were found for the model, each of which satisfies both transfer functions. A program written in BASIC is provided, which requires only the coefficients and exponents of the exponential terms as input; the output is the three sets of rate constants. An example is given as an aid in debugging. Only one solution set of the three was physiologically realizable in the example, but the bounds defining these limits are not known. The advantages and limitations of the method are discussed. The method is suitable for initializing a non-linear least-squares fitting program for the rate constants. The appropriate physiological solution set for this model yields fractional and total myocardial or skeletal blood flow in a subject in the steady state.
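The analytic inversion itself depends on the paper's specific model, but the input it requires — the coefficients A_i and exponents lambda_i of the exponential sum fitted to tracer data — can be illustrated generically. The sketch below (synthetic data, an assumed four-term sum, scipy-based fit) shows only that fitting step, not the Laplace-transform solution or the BASIC program.

```python
# Hedged sketch: obtaining the coefficients A_i and exponents lambda_i of
# C(t) = sum_i A_i * exp(-lambda_i * t) from tracer data. These fitted values
# are the kind of input the described analytic procedure takes; the inversion
# to rate constants is not shown. All numbers are invented.
import numpy as np
from scipy.optimize import curve_fit

def exp_sum(t, a1, a2, a3, a4, l1, l2, l3, l4):
    amps = np.array([a1, a2, a3, a4])
    lams = np.array([l1, l2, l3, l4])
    return np.sum(amps[:, None] * np.exp(-lams[:, None] * t[None, :]), axis=0)

rng = np.random.default_rng(1)
t = np.linspace(0.5, 60.0, 40)                       # minutes
true_p = [5.0, 3.0, 1.5, 0.5, 2.0, 0.5, 0.1, 0.02]   # illustrative values
data = exp_sum(t, *true_p) * (1 + rng.normal(0, 0.02, t.size))

popt, _ = curve_fit(exp_sum, t, data, p0=true_p, maxfev=20000)  # p0: true values for simplicity
print("coefficients:", popt[:4])
print("exponents   :", popt[4:])
```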

3.
Selected profiles typed at the Promega PowerPlex 21 (PP21) loci were examined to determine whether a linear or exponential model best described the relationship between peak height and molecular weight. There were fewer large departures between observed and expected peak heights using the exponential model. The larger differences that were observed were exclusively at the high molecular weight loci. We conclude that the data support the use of an exponential curve to model peak heights versus molecular weight in PP21 profiles. We believe this observation will improve our ability to model expected peak heights for use in DNA interpretation software.
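As an illustration of the modelling choice being compared (not the authors' software or data), the sketch below fits an exponential peak-height decay, H(mw) ~ H0*exp(-c*mw), via a straight-line fit on log-transformed heights; all peak heights are invented.

```python
# Hedged sketch: exponential model of STR peak height vs molecular weight,
# fitted as a straight line on log(H). The data below are invented.
import numpy as np

mw = np.array([110, 150, 200, 250, 300, 350, 400, 450], float)            # base pairs
height = np.array([2400, 2100, 1700, 1450, 1150, 950, 800, 640], float)   # RFU

slope, intercept = np.polyfit(mw, np.log(height), 1)
h0, c = np.exp(intercept), -slope
expected = h0 * np.exp(-c * mw)
print(f"H0 ~ {h0:.0f} RFU, decay constant c ~ {c:.4f} per base")
print("largest |observed - expected|:", np.max(np.abs(height - expected)))
```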

4.
Glomerular filtration rate (GFR) can be calculated from the plasma clearance of any of several radiopharmaceuticals that are excreted by glomerular filtration. Simplified methods have been proposed that require only one or two plasma samples in lieu of a more complete clearance curve. We examined the error introduced by this simplification. Forty patients were studied using a dual-isotope technique employing [99mTc]DTPA and [169Yb]DTPA, obtaining eight plasma samples for each clearance curve at intervals from 10 to 240 min after injection. Data were fit to several empirical or semiempirical formulae and also to a two-compartment computer model that permitted GFR estimation from only one or two data points. The computer model gave a good fit, but so did several simpler methods. The error that results from replacing the complete clearance curve by a single 3-hr sample was about 8 ml/min (residual s.d.). By using two samples (at 1 and 3 hr), the error could be reduced to 4 ml/min. Recommended one- and two-sample methods are presented.
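To make the two-sample idea concrete, here is a hedged sketch assuming a mono-exponential plasma disappearance between the 1 h and 3 h samples (clearance = dose/AUC). The paper's two-compartment model and empirical corrections are not reproduced, and all numbers are invented.

```python
# Hedged sketch: a two-sample clearance estimate assuming C(t) = C0 * exp(-k*t)
# between the 1 h and 3 h samples, so that clearance = dose * k / C0.
# The mono-exponential simplification ignores the early distribution phase,
# which is one source of the few-ml/min errors quoted in the abstract.
import math

dose = 1.2e8             # injected activity (arbitrary calibrated counts)
t1, t2 = 60.0, 180.0     # sampling times, minutes
c1, c2 = 5200.0, 2300.0  # plasma concentrations, counts/min per mL (invented)

k = math.log(c1 / c2) / (t2 - t1)   # elimination rate constant, 1/min
c0 = c1 * math.exp(k * t1)          # back-extrapolated intercept
clearance = dose * k / c0           # mL/min
print(f"k = {k:.4f} 1/min, GFR estimate ~ {clearance:.0f} mL/min")
```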

5.
There is a variety of methods for assessing sampling uncertainty in likelihood ratio calculations in DNA casework. Sampling uncertainty arises because all DNA statistical methods rely on a database of collected profiles. Such databases can be regarded as a sample from the population of interest. The act of taking a sample incurs sampling uncertainty. In some circumstances it may be desirable to provide some estimate of this uncertainty. We have addressed this topic in two previous publications [1] and [2]. In this paper we reconsider the performance of the methods using 15-locus Identifiler™ profiles, rather than the 6-locus data used in [1]. We also examine the differences in performance observed when using a uniform prior versus a 1/k prior in the Bayesian highest posterior density (HPD) method of Curran et al. [1].
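The paper evaluates published methods such as the Bayesian HPD approach of Curran et al.; the sketch below shows only the generic bootstrap intuition behind sampling uncertainty — resample the allele database and watch a single-locus match probability vary. It is an illustration with an invented database, not the HPD method or the 15-locus analysis.

```python
# Hedged sketch of the generic bootstrap view of sampling uncertainty:
# resample the allele database, recompute a single-locus random-match
# probability for a heterozygous a/b genotype, and take percentile bounds.
# This is NOT the Bayesian HPD method; the database below is invented.
import numpy as np

rng = np.random.default_rng(2)
database = rng.choice([12, 13, 14, 15, 16, 17], size=200,
                      p=[0.05, 0.20, 0.30, 0.25, 0.15, 0.05])

def match_prob(db, a=14, b=15):
    pa = np.mean(db == a)
    pb = np.mean(db == b)
    return 2.0 * pa * pb          # heterozygote frequency under HWE

boot = [match_prob(rng.choice(database, size=database.size, replace=True))
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"point estimate {match_prob(database):.4f}, "
      f"95% bootstrap interval ({lo:.4f}, {hi:.4f})")
```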

6.
Determining the appropriate amount of accommodation time is an important component of research protocol design in the field of limb prosthetics. Accommodation periods that are too short may limit the external validity of findings, while excessively long periods may compromise the economic efficiency and ethical soundness of a study. However, issuing general recommendations is difficult, as individual accommodation periods are affected by subject characteristics, the nature of the intervention, and possibly a number of environmental factors. We discuss an approach to determining individual accommodation times based on the assumption that the process of becoming accustomed to a prosthetic intervention follows the same kind of exponential "learning curve" as many other previously investigated learning processes. Initial data collected from a small subject sample give some indication that gait-cycle symmetry changes along the hypothesized curve trajectory. If these preliminary results can be confirmed, it may be possible to extrapolate a subject's eventual level of accommodation from a small data set that can easily be collected during the first twenty minutes after introducing a prosthetic intervention.
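A hedged sketch of the extrapolation idea: fit an exponential learning curve y(t) = y_inf + (y0 - y_inf)*exp(-t/tau) to early gait-symmetry measurements and read off the plateau y_inf. The data points, symmetry metric and model form are illustrative assumptions, not the study's protocol.

```python
# Hedged sketch: fitting an exponential "learning curve" to early measurements
# and extrapolating the eventual accommodation level y_inf. Data are invented.
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(t, y_inf, y0, tau):
    return y_inf + (y0 - y_inf) * np.exp(-t / tau)

t_min = np.array([1, 3, 5, 8, 11, 14, 17, 20], float)                   # minutes of walking
symmetry = np.array([0.72, 0.78, 0.82, 0.86, 0.88, 0.90, 0.91, 0.915])  # invented metric

popt, _ = curve_fit(learning_curve, t_min, symmetry, p0=[0.95, 0.7, 5.0])
y_inf, y0, tau = popt
print(f"extrapolated accommodation level ~ {y_inf:.3f}, time constant ~ {tau:.1f} min")
print(f"rough time to ~95% accommodation ~ {3 * tau:.0f} min")   # about 3 time constants
```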

7.
We evaluated the forensic usefulness of D15S233 (wg1d1), a tetrameric short tandem repeat (STR) locus, in the Japanese and Chinese populations. Typing was performed by denaturing polyacrylamide gel electrophoresis followed by silver staining. Nine different alleles were found in 472 Japanese chromosomes and seven in 186 Chinese chromosomes. The 102 alleles sequenced were composed of two kinds of repeats (AGGA and GGGA). All alleles differed in size by one tetranucleotide repeat unit, and no insertion or deletion was found. The expected unbiased heterozygosities in Japanese and Chinese were 0.766 and 0.785, respectively. No significant deviations from Hardy-Weinberg equilibrium were found in either population. We retyped all samples using an alternative pair of flanking primers in order to detect any spurious appearance of homozygotes due to sequence variation at the primer annealing site. One heterozygous sample showed unbalanced band intensities when the original primer set was used, but equal intensities when our newly designed primer set was used. Sequencing analysis revealed that the fainter allele had a single nucleotide substitution near the 5' end of the annealing site of the original primer region. Thus, all apparently homozygous or heterozygous samples were considered to be truly homozygous or heterozygous. We also applied the D15S233 locus to paternity testing and forensic identification. Our results suggest that this locus should be a very useful STR locus for forensic practice in the Japanese and Chinese populations.
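The abstract quotes expected unbiased heterozygosities without giving the formula; a common choice is Nei's unbiased estimator, shown below as a hedged sketch with invented allele counts (472 chromosomes in total, matching the Japanese sample size but not the published distribution).

```python
# Hedged sketch: expected (unbiased) heterozygosity from allele counts,
# h = n/(n-1) * (1 - sum p_i^2), n = number of sampled chromosomes
# (Nei's unbiased estimator, assumed here; the abstract does not state the
# exact formula). The counts are invented, not the published data.
def unbiased_heterozygosity(allele_counts):
    n = sum(allele_counts)
    freqs = [c / n for c in allele_counts]
    return n / (n - 1) * (1.0 - sum(p * p for p in freqs))

example_counts = [10, 45, 120, 150, 95, 35, 12, 3, 2]   # nine alleles, 472 chromosomes
print(f"expected unbiased heterozygosity ~ {unbiased_heterozygosity(example_counts):.3f}")
```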

8.
Glycophorin A mutation analysis in persons exposed in radiation accidents
Objective: To investigate glycophorin A (GPA) mutation analysis as a new biological dosimeter for radiation exposure. Methods: Variant erythrocytes were detected by flow cytometry combined with monoclonal antibody labelling of erythrocyte glycophorin. Results: Ten persons exposed in radiation-source accidents across China were followed up; four were MN heterozygotes, with estimated biological doses of 1.46 to 2.9 Gy, and one was weakly MN-positive. The linear regression of the GPA N0 variant frequency on dose was significant, with a slope of 41.53×10-6/Gy, whereas the NN variant frequency showed no significant dose dependence. Conclusion: A linear-quadratic fit of the GPA variant-frequency dose-response curve was better than a purely linear fit. Doses reconstructed long after exposure with the linear-quadratic model were higher than those from the linear model, and retrospective doses obtained using parameters from the atomic-bomb, Chernobyl and Goiânia accidents tended to be overestimated.
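The conclusion compares linear and linear-quadratic dose-response fits; the sketch below shows that comparison generically with numpy polynomial fits on invented variant-frequency data. It is not the study's GPA data or fitting procedure.

```python
# Hedged sketch: comparing a linear and a linear-quadratic fit of variant
# frequency versus dose. The frequencies below are invented for illustration.
import numpy as np

dose = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])           # Gy
vf = np.array([12, 30, 55, 85, 130, 180, 245], float) * 1e-6   # variants per cell

lin = np.polyfit(dose, vf, 1)    # vf ~ a + b*D
quad = np.polyfit(dose, vf, 2)   # vf ~ a + b*D + c*D^2
rss_lin = np.sum((vf - np.polyval(lin, dose)) ** 2)
rss_quad = np.sum((vf - np.polyval(quad, dose)) ** 2)
print(f"residual SS: linear {rss_lin:.2e}, linear-quadratic {rss_quad:.2e}")
```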

9.
RATIONALE AND OBJECTIVES: Receiver operating characteristic (ROC) data with false-positive fractions of 0 are often difficult to fit with standard ROC methods and are sometimes discarded. Some extreme examples of such data were analyzed to evaluate the nature of these difficulties. MATERIALS AND METHODS: Rating reports of fracture for single-view ankle radiographs were analyzed with the binormal ROC model and with two ROC models that keep the ROC curve from crossing the chance line. Because fractures that were not present were almost never reported, some views and locations yielded only ROC points with false-positive fractions of 0, while others yielded at least one ROC point with a non-0 false-positive fraction. RESULTS: The models tended to yield ROC areas close to or equal to 1. ROC areas of 1 imply a true-positive fraction close to 1; yet the data contained no such fractions. When all false-positive fractions were 0, the true-positive fraction could be much higher for one view than another for all observers. ROC areas gave little or no hint of these unmistakable differences in performance. CONCLUSION: These data challenge the validity and robustness of current ROC models. A key aspect of ankle fractures is that some may be visible on one view but not at all visible on another.
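For reference, in the conventional binormal model TPF = Phi(a + b*Phi^-1(FPF)) and the area is AUC = Phi(a / sqrt(1 + b^2)). The short sketch below evaluates this for a few invented parameter values to show how fits to data with no false positives drift toward areas of 1; it is a generic illustration, not the paper's analysis.

```python
# Hedged sketch: areas under conventional binormal ROC curves,
# AUC = Phi(a / sqrt(1 + b^2)). Parameter values are invented.
import numpy as np
from scipy.stats import norm

def binormal_auc(a, b):
    return norm.cdf(a / np.sqrt(1.0 + b * b))

for a, b in [(1.5, 1.0), (3.0, 1.0), (6.0, 0.5)]:   # increasingly "degenerate" fits
    print(f"a = {a}, b = {b}: AUC = {binormal_auc(a, b):.4f}")
```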

10.

Objective:

In multireader, multicase (MRMC) receiver operating characteristic (ROC) studies for evaluating medical imaging systems, the area under the ROC curve (AUC) is often used as a summary metric. Owing to the limitations of AUC, plotting the average ROC curve to accompany the rigorous statistical inference on AUC is recommended. The objective of this article is to investigate methods for generating the average ROC curve from ROC curves of individual readers.

Methods:

We present both a non-parametric method and a parametric method for averaging ROC curves that produce a ROC curve, the area under which is equal to the average AUC of individual readers (a property we call area preserving). We use hypothetical examples, simulated data and a real-world imaging data set to illustrate these methods and their properties.

Results:

We show that our proposed methods are area preserving. We also show that the method of averaging the ROC parameters, either the conventional bi-normal parameters (a, b) or the proper bi-normal parameters (c, d_a), is generally not area preserving and may produce a ROC curve that is intuitively not an average of multiple curves.

Conclusion:

Our proposed methods are useful for making plots of average ROC curves in MRMC studies as a companion to the rigorous statistical inference on the AUC end point. The software implementing these methods is freely available from the authors.

Advances in knowledge:

Methods for generating the average ROC curve in MRMC ROC studies are formally investigated. The area-preserving criterion we defined is useful to evaluate such methods.

Multireader, multicase (MRMC) receiver operating characteristic (ROC) studies have been widely used to evaluate medical imaging devices and computer-assisted detection/diagnosis devices in medical imaging [1]. Typically, multiple readers (i.e. radiologists) read images of multiple cases (i.e. patients), and, for each case, each reader rates his/her level of suspicion that a lesion is present based on the image. The diagnostic performance of each reader using an imaging device is thus characterized by a ROC curve constructed using his/her rating data. A ROC curve plots the true-positive fraction (TPF or sensitivity) as a function of the false-positive fraction (FPF or 1 − specificity) as the decision threshold varies and thus illustrates the trade-off between the sensitivity and the specificity of a reader across all possible thresholds [2].

The area under the ROC curve (AUC) is widely used to summarize the diagnostic performance of imaging systems [3]. By analysing the sample AUCs that are obtained from a sample of readers reading a sample of cases, one makes statistical inference on the "population AUC", which represents the performance of the device expected (or averaged) over the population of readers and the population of cases [4]. As such, it characterizes the device performance itself, independent of the particular readers and cases used in the study. Methods are well established for the design and analysis of such studies, and tutorials [5] and consensus papers [1,6] are available in the literature.

The AUC is a meaningful metric, has many favourable properties and makes rigorous statistical inference possible [3]. However, as a summary figure of merit, it loses some of the wealth of information that a ROC curve conveys and thus has limitations [7]. For example, when ranking imaging systems using ROC curves, one system (A) is surely superior to the other (B) if its ROC curve is higher everywhere because, for each non-trivial sensitivity (or specificity), system A has higher specificity (or sensitivity) than system B. However, if two ROC curves cross between (0, 0) and (1, 1), then some portion of curve A is higher and some portion of curve A is lower than curve B, making the comparison dependent on the location of the operating point. In this case, use of AUC to rank systems becomes problematic and other metrics, such as the partial area under the curve [8] or sensitivity at a fixed specificity, may be used. A question that naturally arises is why not compare the ROC curves directly instead of a summary figure of merit? One of the reasons is that it is technically much more difficult to make rigorous statistical inference on curves than on scalar variables.

In practice, when a clinical study is designed to use AUC as a study end point for comparing two imaging systems, it is assumed that the "population ROC curves" of the two systems do not cross and thus AUC would provide an unambiguous comparison. At the data analysis stage, it is desirable to plot the average ROC curve as an estimate of the population ROC curve to examine that assumption. Moreover, a summary ROC curve provides a visual representation of the operating characteristics of the readers on an imaging modality and thus provides more information than the summary metric AUC. While a summary ROC curve is not a substitute for the information in all the individual-reader curves, a summary ROC curve is often required and provided in scientific publications as a companion to the rigorous statistical inference results on the AUC in MRMC reader studies [9,10].

However, although some methods for generating average ROC curves have been suggested [7,11], the properties of these methods have not been formally investigated. Section 2.3 in Swets and Pickett [11] briefly mentioned that an average ROC curve can be obtained by averaging ROC parameters. Metz [7] suggested averaging (over readers) parameters of the bi-normal ROC model to obtain the average ROC curve, which was later used in scientific publications [9,10]. A caveat of this method, as we will show in this article, is that, while the average curve is a companion to the reader-averaged mean AUC, the area under the average curve often does not equal the mean AUC. In this article, we propose curve-averaging methods to overcome this shortcoming. We use hypothetical examples, simulation data and a real-world data set to illustrate the properties of several methods that we considered for generating average ROC curves in MRMC imaging studies.
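The article's specific area-preserving constructions are not reproduced here, but one simple construction with the area-preserving property is pointwise ("vertical") averaging of the readers' TPF values on a common FPF grid, since integration is linear. The sketch below demonstrates this with invented binormal reader curves; it illustrates the criterion, not necessarily the authors' method.

```python
# Hedged sketch: vertical (pointwise) averaging of reader ROC curves on a
# common FPF grid. By linearity of the integral, the area under the averaged
# curve equals the mean of the individual AUCs (the "area-preserving" property).
# The binormal reader parameters are invented.
import numpy as np
from scipy.stats import norm

fpf_grid = np.linspace(0.0, 1.0, 201)

def binormal_tpf(fpf, a, b):
    eps = 1e-12
    return norm.cdf(a + b * norm.ppf(np.clip(fpf, eps, 1.0 - eps)))

readers = [(1.2, 0.9), (1.8, 1.1), (0.9, 0.8)]        # three hypothetical readers
curves = np.array([binormal_tpf(fpf_grid, a, b) for a, b in readers])

avg_curve = curves.mean(axis=0)                        # vertical average
individual_aucs = [np.trapz(c, fpf_grid) for c in curves]
print("mean of individual AUCs   :", np.mean(individual_aucs))
print("area under averaged curve :", np.trapz(avg_curve, fpf_grid))  # equal, up to rounding
```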

11.
Numerous simplified methods for the estimation of ERPF have been described, including the so-called slope/intercept (SI) methods, based on the analysis of the slope of certain segments of the 131I-OIH plasma disappearance curve and its y-axis intercept, and the single sample (SS) clearance method, based on theoretical volumes of OIH distribution at some fixed time after injection. Using ERPFs estimated from compartment analysis of the entire 60-min plasma disappearance curve, we have compared the errors of eight SI methods, applied at various times along the disappearance curve, with those of the optimum SS method. The errors obtained from the SS method were approximately 50% less than those obtained from the SI methods. The errors of the SI methods are greater at both ends of the 60-min plasma curve than when samples are drawn near the mid-time. The SS method appears to be the method of choice for the estimation of ERPF using single-injection techniques.

12.
Two methods were used to apply self-absorption corrections for the determination of beta radioactivity in water samples (either for an identified radionuclide or for monitoring trends). One method estimated the absorption coefficient using external absorbers, assuming an exponential behaviour of absorption, while the other prepared empirical curves using standards of different mass. In the first method, a relationship between the absorption coefficient and the maximum beta energy was also derived. In the second method, self-absorption curves for a 90Sr/90Y standard were prepared with several sodium salts (carbonate, nitrate and sulphate), and for a 40K standard using a potassium salt. Both beta emitters are usually necessary to calibrate detectors for beta radioactivity measurements. This study showed that, for 90Sr/90Y, results using standards of different mass were more accurate than those using external absorbers. Furthermore, it is highly recommended to melt sodium nitrate salts when preparing a self-absorption curve for 90Sr/90Y, because those standards were stable over time and homogeneously distributed. For 40K, a self-absorption curve may be easily derived using paper absorbers of different thickness, instead of performing a time-consuming self-absorption curve using a set of 40K standards of varying thickness. In order to test the two methods, the gross beta activity of several environmental water samples was analysed.
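As a hedged illustration of the second approach, the sketch below derives a self-absorption curve from standards of different mass, assuming the exponential behaviour mentioned in the abstract, relative efficiency(m) = exp(-mu*m). The count-rate values and units are invented.

```python
# Hedged sketch: fitting an exponential self-absorption curve from standards
# of different mass and using it to correct an unknown sample. Data invented.
import numpy as np
from scipy.optimize import curve_fit

mass = np.array([10, 50, 100, 200, 300, 400], float)      # mg of dried salt
rel_eff = np.array([0.97, 0.88, 0.78, 0.60, 0.47, 0.37])  # observed / thin-source efficiency

popt, _ = curve_fit(lambda m, mu: np.exp(-mu * m), mass, rel_eff, p0=[0.002])
mu = popt[0]
print(f"self-absorption coefficient mu ~ {mu:.4f} per mg")
print(f"correction factor for a 250 mg sample ~ {np.exp(mu * 250):.2f}")
```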

13.
A population study of the HUMF13B locus was carried out in a sample of 426 unrelated individuals to investigate the allele distribution in northern Poland. HUMF13B alleles were identified using a precisely optimised PCR method. Amplification products were separated on horizontal non-denaturing polyacrylamide gels followed by silver staining. The most frequent alleles were 10 (42.6%), 9 (23.9%) and 8 (21.4%), and one rare variant (allele 12) was observed with a frequency of 0.23%. Relatively high indices of discrimination power (0.866), polymorphic information content (0.661) and heterozygosity (0.704) confirm the usefulness of the HUMF13B locus for forensic identification purposes.
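For reference, the discrimination power and polymorphic information content quoted here can be computed from allele frequencies under Hardy-Weinberg assumptions. The sketch below uses the three published frequencies plus invented alleles to absorb the remaining frequency, so its output only roughly approximates the published indices.

```python
# Hedged sketch: power of discrimination (PD) and Botstein's PIC from allele
# frequencies, assuming Hardy-Weinberg proportions. The last three frequencies
# are invented to make the set sum to 1; this is not the published distribution.
def pic(freqs):
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs)) for j in range(i + 1, len(freqs)))
    return 1.0 - s2 - cross

def power_of_discrimination(freqs):
    geno = [p * p for p in freqs] + [2 * freqs[i] * freqs[j]
            for i in range(len(freqs)) for j in range(i + 1, len(freqs))]
    return 1.0 - sum(g * g for g in geno)

freqs = [0.426, 0.239, 0.214, 0.0023, 0.07, 0.0487]   # first four from the abstract
print(f"PIC ~ {pic(freqs):.3f}, PD ~ {power_of_discrimination(freqs):.3f}")
```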

14.
Numerous simplified methods for the estimation of ERPF have been described, including the so-called slope/intercept (SI) methods, based on the analysis of the slope of certain segments of the 131I-OIH plasma disappearance curve and its y-axis intercept, and the single sample (SS) clearance method, based on theoretical volumes of OIH distribution at some fixed time after injection. Using ERPFs estimated from compartment analysis of the entire 60-min plasma disappearance curve, we have compared the errors of eight SI methods, applied at various times along the disappearance curve, with those of the optimum SS method. The errors obtained from the SS method were approximately 50% less than those obtained from the SI methods. The errors of the SI methods are greater at both ends of the 60-min plasma curve than when samples are drawn near the mid-time. The SS method appears to be the method of choice for the estimation of ERPF using single-injection techniques.

15.
RATIONALE AND OBJECTIVES: In studies with small samples, the authors often encounter data sets in which the estimated area under the receiver operating characteristic (ROC) curve is 1.0. In such cases, neither asymptotic nor resampling methods provide a means of estimating the standard error or constructing a lower confidence bound. The purpose of this study was to develop tables for determining the approximate 95% lower confidence bound when the estimated ROC area is 1.0. MATERIALS AND METHODS: Using Monte Carlo simulation, the authors generated 10,000 data sets for each specification of sample sizes, ROC curve shape, and data format (continuous and ordinal scale). For each of these combinations the authors determined the 95% lower confidence bound. RESULTS: When the estimated ROC area is 1.0, the 95% lower confidence bounds differ dramatically depending on the shape of the ROC curve and on whether the test results are ordinal or continuous. Four tables of 95% lower confidence bounds are provided, along with guidelines for their use. CONCLUSION: Given the different shapes of ROC curves and the different formats in which ROC data are collected, it is not feasible to offer one simple method of constructing confidence bounds that works for all ROC curves. The tables provided in this article are useful for interpreting studies with estimated ROC areas of 1.0.
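A hedged sketch of the kind of Monte Carlo reasoning behind such tables: for candidate true AUC values (assuming a binormal curve with b = 1 and continuous data), estimate how often a small study would yield an empirical AUC of exactly 1.0; the approximate 95% lower bound is then the smallest true AUC for which that probability reaches 0.05. The sample sizes, curve shape and grid are assumptions, and the published tables are not reproduced.

```python
# Hedged sketch: Monte Carlo estimate of P(empirical AUC == 1.0) for candidate
# true AUC values, assuming a binormal ROC curve with b = 1 and continuous data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_neg, n_pos, n_sim = 20, 20, 2000

def prob_auc_is_one(true_auc):
    a = norm.ppf(true_auc) * np.sqrt(2.0)      # binormal separation for b = 1
    hits = 0
    for _ in range(n_sim):
        neg = rng.normal(0.0, 1.0, n_neg)
        pos = rng.normal(a, 1.0, n_pos)
        if neg.max() < pos.min():              # empirical AUC is exactly 1.0
            hits += 1
    return hits / n_sim

for auc in [0.90, 0.95, 0.97, 0.99]:
    print(f"true AUC {auc:.2f}: P(estimated AUC = 1.0) ~ {prob_auc_is_one(auc):.3f}")
```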

16.
A series of experiments has been performed to evaluate amplification and typing of the D1S80 VNTR locus. The validation study that was carried out showed that correct D1S80 typing results can be obtained when a defined amplification protocol and a high-resolution polyacrylamide gel electrophoresis method are used. The use of the Chelex extraction protocol has substantially reduced the processing time: DNA extraction, amplification and subsequent typing can be performed in one day. The discrimination power of this locus is 0.94 in a Dutch Caucasian population sample. The system is extremely sensitive: 0.1 ng of genomic DNA gave a correct typing result. The test could also detect the correct genotypes in mixed samples containing DNA from different individuals. Even when the major type was in a 20-fold excess, the minority type could still be amplified and typed correctly. We found no deviation from Hardy-Weinberg equilibrium in a Dutch Caucasian population sample. Evidence for the somatic stability of this locus was obtained from a set of experiments in which we compared DNA profiles from corresponding blood, semen and saliva samples. The results of this study suggest that in the near future analysis of the D1S80 locus by DNA amplification can be applied in actual forensic casework.

17.
18.
In order to apply a set of nine STR loci in parentage testing, we performed a population genetic study on a sample of the Flemish population. Genotypes for HUMHPRTB, HUMFABP, HUMCD4, HUMCSF1PO, HUMTH01, HUMPLA2A, HUMPLA2A1, HUMF13A01, HUMCYAR04 and HUMLIPOL were determined using three triplex PCR reactions and silver staining. Allele frequencies showed no deviation from Hardy-Weinberg equilibrium. The frequency distribution agreed well with other Caucasian populations, but three intermediate fragments not previously found in Caucasians were observed. We then resolved a series of 151 parentage disputes, of which 103 were exclusions. In six cases, evidence for exclusion was obtained from only one informative STR locus out of eight for male children or out of nine for female children. These exclusions were confirmed with additional polymorphic markers. In one case of inclusion, a paternal HUMHPRTB allele had expanded by one repeat unit. This observation illustrates that STRs do not differ from other genetic systems in the requirement that more than one excluding locus be demonstrated before an exclusion is declared.

19.
We present two cases in which a single-locus mismatch was found at the locus D19S433 using the AmpFℓSTR® Identifiler™ PCR Amplification Kit (Applied Biosystems) (Identifiler Kit) during paternity and maternity tests. This mismatch differed from the usual mutation pattern, in which there is typically a one-repeat difference. We designed forward and reverse primers positioned further away from the primer set contained in the Identifiler Kit. The results showed the existence of a silent allele 13 in both families, due to a point mutation that changed guanine to adenine 32 nucleotides downstream from the 3′ end of the AAGG repeat sequences in all four members. A single-locus mismatch due to a silent allele may occur at any locus with any kit. Accordingly, attention should be paid to such silent alleles when carrying out human identification and parentage analysis.

20.
The weighted least squares (WLS) analysis was compared with the commonly used least squares analysis in the fitting of radioactivity data to an exponential equation. The results show that the two methods provide the same accuracy in estimating the parameters of exponential functions, while the WLS analysis offers much higher reliability, permitting a considerable shortening of the sample counting time.
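A hedged sketch of the kind of comparison described: fit decay data N(t) = N0*exp(-lambda*t) by ordinary and by weighted least squares, weighting each point by its Poisson standard deviation sqrt(N). The counts are simulated; this illustrates the technique only, not the paper's data or code.

```python
# Hedged sketch: ordinary vs Poisson-weighted least squares for an exponential
# decay fit. Simulated counts; illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, n0, lam):
    return n0 * np.exp(-lam * t)

rng = np.random.default_rng(4)
t = np.linspace(0, 30, 16)                      # minutes
true_n0, true_lam = 4000.0, 0.12
counts = rng.poisson(decay(t, true_n0, true_lam)).astype(float)

p0 = [counts[0], 0.1]
ols, _ = curve_fit(decay, t, counts, p0=p0)
wls, _ = curve_fit(decay, t, counts, p0=p0,
                   sigma=np.sqrt(counts), absolute_sigma=True)   # Poisson weights
print("OLS  n0, lambda:", ols)
print("WLS  n0, lambda:", wls)
```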
