170 results found (search time: 15 ms)
41.
Objective To obtain an accurate assessment of the percentage and depth of extra-capsular soft tissue removed with the prostate by various surgical techniques, in order to help surgeons determine the appropriateness of different surgical approaches. This assessment can be enhanced by an accurate, automated means of identifying the prostate gland contour. Materials and Methods To facilitate 3D reconstruction and, ultimately, more accurate analyses, it is essential to identify the capsule boundary that separates the prostate gland tissue from its extra-capsular tissue. However, the capsule is sometimes unrecognizable because muscle and connective tissue naturally intrude into the prostate gland. In regions where the capsule disappears, its contour can be approximated by continuing the contour line according to the natural shape of the prostate. We use an algorithm based on least-squares curve fitting that merges previously detected capsule parts with a prostate shape equation to produce an approximated curve representing the prostate capsule. Results We tested our algorithm using three different shapes on 13 histologic prostate slices cut at different distances from the apex. The best result shows a 90% average contour match when compared to pathologist-drawn contours. Conclusion We believe that automatically identifying histologic prostate contours will enable more objective analyses of surgical margins and extracapsular spread of cancer. Our results show that this is achievable.
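The abstract does not give the shape equation it fits, so the following is a minimal sketch of the underlying idea with a circle standing in for the prostate shape model: an algebraic least-squares fit (Kåsa method) merges the "detected capsule parts" (points on an arc) into a closed analytic contour.

```python
# Sketch: least-squares fit of a circular shape model to partial contour
# points. The circle is a hypothetical stand-in for the paper's prostate
# shape equation. Kasa method: x^2 + y^2 = a*x + b*y + c (linear in a,b,c).
import math

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    M = [row[:] + [v] for row, v in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][k] * x[k] for k in range(r + 1, 3))) / M[r][r]
    return x

def fit_circle(points):
    # Normal equations for the linear model [x y 1].[a b c]^T = x^2 + y^2.
    sxx = sxy = sx = syy = sy = n = 0.0
    bx = by = bc = 0.0
    for x, y in points:
        z = x * x + y * y
        sxx += x * x; sxy += x * y; sx += x
        syy += y * y; sy += y; n += 1.0
        bx += x * z; by += y * z; bc += z
    a, b, c = solve3([[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]],
                     [bx, by, bc])
    cx, cy = a / 2.0, b / 2.0
    return cx, cy, math.sqrt(c + cx * cx + cy * cy)

# Points on an arc of a circle centred at (2, 1), radius 3 -- the
# "detected capsule parts"; the fit recovers the full closed contour.
pts = [(2 + 3 * math.cos(t), 1 + 3 * math.sin(t))
       for t in [i * 0.1 for i in range(20)]]
cx, cy, r = fit_circle(pts)
```

Because the arc points satisfy the circle equation exactly, the least-squares solution reproduces the centre and radius; with noisy capsule points the same normal equations give the best algebraic fit.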
42.
Introduction Short-term precision is often quoted and used as the most important performance parameter of a dual-energy X-ray absorptiometry (DXA) scanner; however, long-term precision has a more profound impact on patient monitoring. Long-term precision reflects the combination of in-vivo precision errors and long-term equipment stability. Methods To monitor long-term equipment stability, a phantom was designed with four inserts ranging in bone mineral density (BMD) from 0.5 to 3.3 g/cm². This phantom was used to monitor the equipment stability of four modern fan-beam densitometers, two each from Hologic and GE/Lunar, over a 4-year period. Manufacturer-recommended quality assurance (QA) procedures were performed, and the scanners stayed within manufacturer-specified tolerances throughout the study. Results and conclusion During the 4-year period, the Hologic scanners showed clinically insignificant BMD shifts (maximum 0.34%), whereas the GE/Lunar scanners showed clinically significant BMD shifts (1.5% and 2.1%). Consequently, least-significant-change (LSC) calculations based only on short-term in-vivo precision studies are not valid for monitoring patients on the two GE/Lunar densitometers, given their poorer long-term stability.
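The LSC logic behind this conclusion can be sketched numerically. The formula LSC = 1.96 × √2 × precision ≈ 2.77 × precision is the standard 95%-confidence expression; the illustrative precision value below is an assumption, while the drift percentages are the ones reported in the abstract.

```python
# Sketch: least-significant-change (LSC) at 95% confidence from a
# short-term precision study, compared against long-term scanner drift.
# LSC = 1.96 * sqrt(2) * precision ~= 2.77 * precision.
def lsc_95(precision_cv_percent):
    return 2.77 * precision_cv_percent

short_term_cv = 1.0          # % in-vivo precision (illustrative value)
lsc = lsc_95(short_term_cv)  # a BMD change must exceed this to be "real"

hologic_drift = 0.34   # % maximum long-term shift (Hologic, from the text)
lunar_drift = 2.1      # % long-term shift (one GE/Lunar, from the text)

# When equipment drift is comparable to the LSC itself, a "significant"
# change may be machine drift rather than true patient change.
drift_dominates = lunar_drift > 0.5 * lsc
```

With a typical 1% in-vivo CV the LSC is 2.77%; a 2.1% machine drift consumes most of that margin, which is why short-term-only LSC monitoring fails on the drifting scanners.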
43.
Summary The class of generalized autoregressive conditional heteroscedastic (GARCH) models has proved particularly valuable in modelling time series with time-varying volatility. These include financial data, which can be particularly heavy tailed. It is now well understood that the tail heaviness of the innovation distribution plays an important role in determining the relative performance of the two competing estimation methods: the maximum quasi-likelihood estimator based on a Gaussian likelihood (GMLE) and the log-transform-based least absolute deviations estimator (LADE) (see Peng and Yao, 2003, Biometrika, 90, 967-75). A practically relevant question is which estimator to use for a given data set. We provide in this paper a solution to this question. By interpreting the LADE as a version of the maximum quasi-likelihood estimator under the likelihood derived from hypothetically assuming that the log-squared innovations obey a Laplace distribution, we outline a selection procedure based on goodness-of-fit-type statistics. The methods are illustrated with both simulated and real data sets. Although we deal only with estimation for GARCH models, the basic idea may be applied to the estimator-selection problem in a general regression setting.
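A toy scale-estimation example (not a full GARCH fit, and not the paper's procedure) illustrates why tail heaviness matters: the Gaussian-QMLE analogue matches the mean of the squared innovations, which heavy tails inflate, while the LADE analogue matches the median of the log-squared innovations, which is far less tail-sensitive.

```python
# Toy contrast of the two estimation ideas on heavy-tailed innovations.
import math, random, statistics

random.seed(0)

def heavy_tailed(n, df=3):
    # Student-t(df) draws built from a normal over sqrt(chi2/df).
    out = []
    for _ in range(n):
        z = random.gauss(0, 1)
        chi2 = sum(random.gauss(0, 1) ** 2 for _ in range(df))
        out.append(z / math.sqrt(chi2 / df))
    return out

eps = heavy_tailed(5000)   # true scale parameter is 1

# Gaussian-QMLE-style scale: sqrt(mean of squares) -- inflated by
# heavy tails (for t(3), E[eps^2] = 3, so this targets sqrt(3)).
sigma_qmle = math.sqrt(statistics.fmean(e * e for e in eps))

# LADE-style scale: based on median(log eps^2), corrected by the known
# median of chi^2_1 (~0.4549) as if innovations were Gaussian.
med = statistics.median(math.log(e * e) for e in eps)
sigma_lade = math.exp((med - math.log(0.4549)) / 2.0)
```

The median-based estimate stays much closer to the nominal scale than the moment-based one, which is the practical motivation for preferring the LADE when the goodness-of-fit statistics point to heavy-tailed innovations.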
44.
45.
Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology for compact, low-power implementations of these computationally intensive tasks in portable embedded devices. However, device mismatch limits the resolution of circuits fabricated with this technology. Traditional layout techniques for reducing mismatch aim to increase resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects at the application level would allow designers to apply focused mismatch-compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For feature extraction, we propose analog adaptive linear combiners with on-chip learning for both the Least Mean Square (LMS) algorithm and the Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For classification, we derive analog models of the circuits needed to implement the Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers on standard databases of faces and handwritten digits. Formal analysis and experiments show how adaptive structures and properties of the input space can be exploited to compensate for device mismatch at the application level, reducing the design overhead of traditional layout techniques. The results extend directly to multiple application domains that use linear subspace methods.
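The adaptive linear combiner at the heart of the LMS proposal can be sketched in a few lines (a software model, not the analog circuit): the weights are driven by the output error, so any fixed linear gain error in the path is simply absorbed into the converged weights, which is the sense in which learning "naturally compensates" some mismatch parameters.

```python
# Minimal LMS adaptive linear combiner sketch (pure Python software model).
import random

random.seed(1)
true_w = [0.5, -1.5, 2.0]   # weights the combiner should learn

def lms_train(n_steps=2000, mu=0.05):
    w = [0.0, 0.0, 0.0]
    for _ in range(n_steps):
        x = [random.uniform(-1, 1) for _ in range(3)]
        d = sum(wi * xi for wi, xi in zip(true_w, x))   # desired output
        y = sum(wi * xi for wi, xi in zip(w, x))        # combiner output
        e = d - y                                       # output error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS update
    return w

w = lms_train()
```

With a step size small enough for stability (mu times the input power well below 2), the weight error decays toward zero; in the noise-free, perfectly realizable case above the learned weights match the targets closely.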
46.
Objective To determine the genotoxic impurity 9-propenyladenine in the tenofovir disoproxil prodrug by high-performance liquid chromatography-triple quadrupole mass spectrometry (LC-MS/MS), and to analyze how different least-squares linear-fitting methods affect the accuracy of the results. Methods A Waters XBridge C18 column (4.6 mm × 150 mm, 3.5 μm) was used with methanol-water (55:45) as the mobile phase at a flow rate of 1.0 mL·min⁻¹, with isocratic elution for 8 min (eluate diverted to the mass spectrometer from 2.0 to 2.6 min). The injection volume was 2 μL and the column temperature was 40 °C. Ionization was performed in positive electrospray (ESI⁺) mode with multiple reaction monitoring (MRM), using the transition m/z 176.0→136.0 for detection. Results Method validation showed good specificity; the RSD of the system precision test was 1.1% (n = 6). Linear regression by least squares gave a linear range of 0.051-25.250 μg·mL⁻¹; weighted linear regression was markedly more accurate than unweighted regression in the low-concentration region. The limit of quantification was 0.051 ng·mL⁻¹ and the limit of detection was 0.017 ng·mL⁻¹. Average recoveries from drug-substance sample solutions were 99.08%-99.97% (n = 9) with repeatability RSDs of 0.4%-1.3%; average recoveries from tablet sample solutions were 98.94%-101.96% (n = 9) with repeatability RSDs of 0.2%-0.3%. Robustness was good. Conclusion The established method is simple, sensitive, and fast; the matrix does not interfere with detection, the results are accurate and reliable, and the method meets the requirements for determining trace genotoxic impurities.
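The weighted-versus-unweighted finding can be illustrated with a small sketch (illustrative calibration data, not the paper's standards): when the measurement error is roughly proportional to concentration, 1/x² weighting stops the high standards from dominating the fit at the low end.

```python
# Sketch: unweighted vs 1/x^2-weighted least-squares calibration when
# the response error is proportional to concentration (~3% RSD).
import random

random.seed(2)
conc = [0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 25.0]                 # levels
resp = [c * 100 * (1 + random.gauss(0, 0.03)) for c in conc]  # responses

def wls_fit(x, y, w):
    # Weighted least squares: minimize sum w_i (y_i - a - b x_i)^2.
    sw = sum(w)
    sx = sum(wi * xi for wi, xi in zip(w, x))
    sy = sum(wi * yi for wi, yi in zip(w, y))
    sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    a = (sy - b * sx) / sw
    return a, b

a_u, b_u = wls_fit(conc, resp, [1.0] * len(conc))             # unweighted
a_w, b_w = wls_fit(conc, resp, [1.0 / c ** 2 for c in conc])  # 1/x^2
```

Both fits recover a slope near the true sensitivity of 100, but the weighted fit keeps its residuals small relative to each standard's own concentration, which is why back-calculated low-level concentrations come out more accurate, as the validation data here report.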
47.
Objective To improve the correction quality of a 2048-channel linear-array detector. Methods Step-wedge and line-pair test images exposed under common imaging conditions were measured and analyzed, and the correction factors for each channel were computed by the least-squares method. Results (1) The background noise was effectively suppressed, bringing the background close to zero; (2) good consistency was achieved across channels whose sensitivities differ both with and without X-ray exposure. Conclusion The method substantially improves correction quality under common imaging conditions and is generally applicable.
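The per-channel correction idea can be sketched as follows (an illustrative linear model with three channels, not the actual 2048-channel calibration data): each channel's raw readings at known step-wedge levels are fitted by least squares to a gain and offset, and applying those factors both zeroes the background and equalizes the channels.

```python
# Sketch: per-channel gain/offset correction factors from a
# least-squares fit of raw channel response against known levels.
def fit_gain_offset(raw, ref):
    # Least-squares line ref = gain * raw + offset for one channel.
    n = float(len(raw))
    sx, sy = sum(raw), sum(ref)
    sxx = sum(r * r for r in raw)
    sxy = sum(r * f for r, f in zip(raw, ref))
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset

ref_levels = [0.0, 100.0, 200.0, 400.0]   # known step-wedge values
channels_raw = [                          # channels with different
    [12.0, 112.0, 212.0, 412.0],          #   offset 12, gain 1.0
    [5.0, 85.0, 165.0, 325.0],            #   offset 5,  gain 0.8
    [20.0, 140.0, 260.0, 500.0],          #   offset 20, gain 1.2
]
corrected = []
for raw in channels_raw:
    g, o = fit_gain_offset(raw, ref_levels)
    corrected.append([g * r + o for r in raw])
```

After correction every channel reproduces the reference levels, so the zero-exposure reading (the background) is driven to zero and all channels agree, matching the two results reported above.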
48.
This study uses statistical techniques to evaluate reports on suicide scenes; it draws on 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective on the problem. We evaluated variables describing the characteristics of the crime scene, such as the traces detected (blood, instruments, and clothes), and we examined the methodology employed by the experts. A qualitative approach using basic statistics revealed wide variation in how the issue was addressed in the documents. We then examined a quantitative approach involving an empirical equation, and used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing the evidence.
49.

AIMS

Accurate ventricular fibrillation (VF) waveform analysis usually requires rescuers to discontinue cardiopulmonary resuscitation (CPR). However, prolonged "hands-off" time has a deleterious impact on outcome. We developed a new filter technique that removes CPR artifacts and helps preserve the shockability index of VF.

METHODS

We analyzed corrupted ECGs constructed by randomly adding differently scaled CPR artifacts to VF waveforms. A newly developed two-step algorithm was used to identify the CPR fluctuations. First, the raw data were decomposed by empirical mode decomposition (EMD) into several intrinsic mode functions (IMFs), and the dominant IMFs were combined to reconstruct a new signal. Second, each CPR cycle frequency was calculated from the new signal, which was then fitted to the original corrupted ECG by the least-squares method (LSM) to derive the CPR artifacts. The estimated VF waveform was obtained by subtracting the CPR artifacts from the corrupted ECG. We then computed the amplitude spectrum area (AMSA) for the original VF, the corrupted ECG, and the estimated VF.

RESULTS

A total of 150 out-of-hospital cardiac arrest (OHCA) subjects with an initial VF rhythm were included in the analysis. Ten CPR artifact signals were used to construct the corrupted ECGs. Although the correlations of AMSA between the corrupted ECG and the original VF, and between the estimated VF and the original VF, were all high (all p < 0.001), the AMSA values of the corrupted ECG were clearly biased, with wide limits of agreement in the Bland-Altman mean-difference plot. ROC analysis of AMSA for predicting defibrillation success showed that the new algorithm preserved the AMSA cut-off value for CPR artifacts with power ratios to VF from 0 to 6 dB.

CONCLUSION

The new algorithm efficiently filters CPR-related artifacts from the VF ECG and preserves the shockability index of the original VF waveform.
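The shockability index used here, AMSA, is commonly defined as the sum of spectral amplitude times frequency over roughly the 2-48 Hz band. A minimal sketch of that computation on a synthetic waveform (a naive DFT in pure Python; real systems use an FFT, and this is not the paper's implementation):

```python
# Sketch: amplitude spectrum area (AMSA) of a short ECG segment,
# AMSA = sum(A_i * f_i) over ~2-48 Hz, via a naive DFT.
import math

def amsa(signal, fs, f_lo=2.0, f_hi=48.0):
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * t / n)
                     for t, s in enumerate(signal))
            amp = 2.0 * math.sqrt(re * re + im * im) / n  # bin amplitude
            total += amp * f
    return total

fs = 250.0                        # Hz sampling rate (illustrative)
t = [i / fs for i in range(500)]  # 2-second analysis window
# Synthetic "VF-like" pure 5 Hz oscillation with 1 mV amplitude:
vf = [math.sin(2 * math.pi * 5.0 * ti) for ti in t]
score = amsa(vf, fs)
```

For this pure 5 Hz tone the spectrum has a single in-band component of amplitude 1, so the AMSA evaluates to 1 × 5 = 5; a CPR artifact added to the signal would inject extra low-frequency spectral content and bias this score, which is exactly the distortion the filtering algorithm removes.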
50.
In this article, we propose a new index to measure maternal and child health in the least developed countries (LDCs) of Asia. The index is applied to a group of countries particularly affected by poverty which, in the terminology of the United Nations Conference on Trade and Development, are the poorest of the poor. Our index is built from the variables defined in the Goals of the Millennium Declaration. For this purpose, we used the P2 distance method with data for 2008, the last year for which data were available. The index integrates maternal and child health variables that allow a territorial ordering of the LDCs in terms of these partial indicators. This analysis is particularly useful in a setting such as the LDCs of Asia, which are beset by profound social and economic inequalities.
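The P2 (Pena) distance behind the index can be sketched as follows. This is a simplified illustration with hypothetical numbers: the correction factors 1 - R² are taken as given, whereas the full method estimates them through a sequence of regressions over ordered indicators.

```python
# Sketch of the Pena P2 distance: each indicator's deviation from a
# reference is standardized and down-weighted by 1 - R^2, the share of
# its information not already carried by the preceding indicators.
def p2_distance(x, reference, sigmas, corrections):
    # DP2 = sum_i |x_i - ref_i| / sigma_i * (1 - R^2_{i.i-1,...,1})
    return sum(abs(xi - ri) / si * ci
               for xi, ri, si, ci in zip(x, reference, sigmas, corrections))

# Two hypothetical countries, three maternal/child-health indicators:
reference = [0.0, 0.0, 0.0]     # best observed value per indicator
sigmas = [1.0, 2.0, 0.5]        # std. dev. of each indicator
corrections = [1.0, 0.6, 0.3]   # 1 - R^2 factors (first is always 1)
country_a = [1.0, 4.0, 1.0]
country_b = [0.5, 1.0, 0.25]
da = p2_distance(country_a, reference, sigmas, corrections)
db = p2_distance(country_b, reference, sigmas, corrections)
```

A larger DP2 means a country sits further from the best-observed health profile, so the scalar values provide the territorial ordering of countries the abstract describes.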