171.
AIM: The healthy range for serum alanine aminotransferase (ALT) levels has not been well studied. The aim of this study was to define the upper limit of normal (ULN) for serum ALT levels and to assess factors associated with serum ALT activity in apparently healthy blood donors. METHODS: A total of 1,939 blood donors were included. ALT measurements were performed for all cases using the same laboratory method. Healthy ranges for ALT levels were computed from the population at the lowest risk for liver disease. Univariate and multivariate analyses were performed to evaluate associations between clinical factors and ALT levels. RESULTS: Serum ALT activity was independently associated with body mass index (BMI) and male gender, but not with age. The association of ALT with BMI was more prominent in males than in females. The upper limit of normal was 34 U/L for non-overweight women (BMI less than 25) and 40 U/L for non-overweight men. CONCLUSION: Serum ALT is strongly associated with sex and BMI. The normal range of ALT should be defined for males and females separately.
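The ULN computation described above reduces to two steps: restrict to the stratum at lowest risk for liver disease, then take an upper percentile of the ALT distribution. The sketch below assumes a 95th-percentile convention and a `(sex, bmi, alt)` record layout; both are illustrative choices, not the paper's exact protocol.

```python
def uln_95th(values):
    """Upper limit of normal as the 95th percentile (one common
    convention; the paper's exact percentile is an assumption here).
    Uses linear interpolation between order statistics."""
    xs = sorted(values)
    k = 0.95 * (len(xs) - 1)
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (k - lo) * (xs[hi] - xs[lo])

def uln_by_stratum(donors):
    """donors: iterable of (sex, bmi, alt) tuples.  Restrict to the
    non-overweight stratum (BMI < 25) before computing the ULN,
    separately for each sex, as the study's conclusion recommends."""
    strata = {}
    for sex, bmi, alt in donors:
        if bmi < 25:                       # non-overweight donors only
            strata.setdefault(sex, []).append(alt)
    return {sex: round(uln_95th(alts), 1) for sex, alts in strata.items()}
```

Stratifying before taking the percentile is the key point: pooling sexes (or including overweight donors) inflates the ULN for women and dilutes the BMI signal the study reports.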
174.
Understanding the underlying mechanisms of COVID-19 progression and the impact of various pharmaceutical interventions is crucial for the clinical management of the disease. We developed a comprehensive mathematical framework based on the known mechanisms of the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, incorporating the renin−angiotensin system and ACE2, which the virus exploits for cellular entry, key elements of the innate and adaptive immune responses, the role of inflammatory cytokines, and the coagulation cascade for thrombus formation. The model predicts the evolution of viral load, immune cells, cytokines, thrombosis, and oxygen saturation based on patient baseline condition and the presence of comorbidities. Model predictions were validated with clinical data from healthy people and COVID-19 patients, and the results were used to gain insight into identified risk factors of disease progression including older age; comorbidities such as obesity, diabetes, and hypertension; and dysregulated immune response. We then simulated treatment with various drug classes to identify optimal therapeutic protocols. We found that the outcome of any treatment depends on the sustained response rate of activated CD8+ T cells and sufficient control of the innate immune response. Furthermore, the best treatment—or combination of treatments—depends on the preinfection health status of the patient. Our mathematical framework provides important insight into SARS-CoV-2 pathogenesis and could be used as the basis for personalized, optimal management of COVID-19.

COVID-19 has created unprecedented challenges for the health care system, and, until an effective vaccine is developed and made widely available, treatment options are limited. A challenge to the development of optimal treatment strategies is the extreme heterogeneity of presentation. Infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) results in a syndrome that ranges in severity from asymptomatic to multiorgan failure and death. In addition to local complications in the lung, the virus can cause systemic inflammation and disseminated microthrombosis, which can cause stroke, myocardial infarction, or pulmonary emboli (1–4). Risk factors for poor COVID-19 outcome include advanced age, obesity, diabetes, and hypertension (5–13).

Computational analyses can provide insights into the transmission, control, progression, and underlying mechanisms of infectious diseases. Indeed, epidemiological and statistical modeling has been used for COVID-19, providing powerful insights into comorbidities, transmission dynamics, and control of the disease (14–17). However, to date, these analyses have been population dynamics models of SARS-CoV-2 infection and transmission or correlative analyses of COVID-19 comorbidities and treatment response. Simple viral dynamics models have also been developed and used to predict the SARS-CoV-2 response to antiviral drugs (18, 19). These models, however, do not explicitly consider the biological or physiological mechanisms underlying disease progression or the time course of response to various therapeutic interventions, and only a few more-sophisticated models have been developed in this direction (20, 21).

Several therapies targeting various aspects of COVID-19 pathogenesis have been proposed and have either completed—or are currently being tested in—clinical trials (22). Despite strong biologic rationale, these treatments have generally produced conflicting results in the clinic. For example, trials of antiviral therapies (e.g., remdesivir) have been mixed: the original trial from China failed (23), a subsequent trial in the United States led to approval of remdesivir in the United States and other countries (24), and the recent results of the World Health Organization Solidarity trial again show no benefit (25). Other antiviral drugs, alone or in combination, are also showing promise (26).

Other potential treatments include antiinflammatory drugs and antithrombotic agents. Because of the systemic inflammation seen in many patients, antiinflammatory drugs have been tested, including anti-IL6/IL6R therapy (e.g., tocilizumab, siltuximab) and anti-JAK1/2 drugs (e.g., baricitinib). It is not clear whether these drugs will be effective as stand-alone treatments, particularly after the recent failure of tocilizumab in a phase III trial (1, 27–29). In addition, given that a common complication of COVID-19 is the development of coagulopathies with microvascular thrombi potentially leading to the dysfunction of multiple organ systems (2, 3), antithrombotic drugs (e.g., low molecular weight heparin) are being tested. Recognizing the interactions of COVID-19 with the immune system (30), the corticosteroid dexamethasone has been tested, showing some promising results. Given the large range of patient comorbidities, disease severities, and variety of complications such as thrombosis, patients are likely to have heterogeneous responses to any given therapy, and such heterogeneity will continue to be a challenge for clinical trials of unselected COVID-19 patients (31).

Here, we developed a systems biology-based mathematical model to address this urgent need. Our model incorporates the known mechanisms of SARS-CoV-2 pathogenesis and the potential mechanisms of action of various therapeutic interventions that have been tested in COVID-19 patients. In previous work, we exploited angiotensin receptor blockers (ARBs) and angiotensin-converting enzyme inhibitors (ACEis) for the improvement of cancer therapies and developed mathematical models of the renin−angiotensin system in the context of cancer desmoplasia (32–35). Using a similar approach, we developed a detailed model that includes lung infection by the SARS-CoV-2 virus and a pharmacokinetic/pharmacodynamic (PK/PD) model of infection and thrombosis to simulate events that take place throughout the body during COVID-19 progression (Fig. 1 and SI Appendix, Fig. S1). The model is first validated against clinical data from healthy people and COVID-19 patients and then used to simulate disease progression in patients with specific comorbidities. Subsequently, we present model predictions for various therapies currently employed for treatment of COVID-19, alone or in combination, and we identify protocols for optimal clinical management for each of the clinically observed COVID-19 phenotypes.

Fig. 1. Schematic of the detailed lung model. The model incorporates virus infection of epithelial and endothelial cells, the RAS, T cell activation and immune checkpoints, the known IL6 pathways, neutrophils and macrophages, the formation of NETs, and the coagulation cascade. The lung model is coupled with a PK/PD model for virus and thrombus dissemination through the body.
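As a concrete illustration of the "simple viral dynamics models" the text contrasts itself with (refs. 18, 19), a target-cell-limited model can be integrated in a few lines. This is a drastic simplification of the paper's framework (no immune compartments, cytokines, or coagulation), and every parameter value below is illustrative rather than fitted to data.

```python
def simulate_viral_load(days=30.0, dt=0.01,
                        beta=3e-7, delta=0.8, p=25.0, c=5.0,
                        T0=1e7, V0=1.0, eps=0.0):
    """Target-cell-limited viral dynamics: susceptible cells T,
    infected cells I, free virus V.  `eps` in [0, 1) models an
    antiviral that blocks that fraction of virion production.
    Forward-Euler integration; parameters are illustrative only."""
    T, I, V = T0, 0.0, V0
    t, peak = 0.0, V0
    while t < days:
        dT = -beta * T * V                 # infection of target cells
        dI = beta * T * V - delta * I      # infected-cell turnover
        dV = (1.0 - eps) * p * I - c * V   # production minus clearance
        T += dT * dt
        I += dI * dt
        V = max(V + dV * dt, 0.0)
        peak = max(peak, V)
        t += dt
    return peak, V
```

Raising `eps` lowers the predicted peak viral load, which mirrors, in miniature, the kind of antiviral treatment simulation the full model performs across its many coupled compartments.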
175.
Increases in burned area and large fire occurrence are widely documented over the western United States over the past half century. Here, we focus on the elevational distribution of forest fires in mountainous ecoregions of the western United States and show the largest increase rates in burned area above 2,500 m during 1984 to 2017. Furthermore, we show that high-elevation fires advanced upslope with a median cumulative change of 252 m (−107 to 656 m; 95% CI) in 34 y across studied ecoregions. We also document a strong interannual relationship between high-elevation fires and warm season vapor pressure deficit (VPD). The upslope advance of fires is consistent with observed warming reflected by a median upslope drift of VPD isolines of 295 m (59 to 704 m; 95% CI) during 1984 to 2017. These findings allow us to estimate that recent climate trends reduced the high-elevation flammability barrier and enabled fires in an additional 11% of western forests. Limited influences of fire management practices and longer fire-return intervals in these montane mesic systems suggest these changes are largely a byproduct of climate warming. Further weakening in the high-elevation flammability barrier with continued warming has the potential to transform montane fire regimes with numerous implications for ecosystems and watersheds.

Fire is an integral component of most forested lands and provides significant ecological services (1). However, burned area, fire size, the number of large fires, and the length of the fire season have increased in the western United States in recent decades (2, 3). Increasing fire activity and the expansion of the wildland–urban interface (4) have collectively amplified direct and indirect fire-related loss of life and property (5, 6) and contributed to escalating fire suppression costs (7). While increased biomass due to a century of fire exclusion efforts is hypothesized to have partially contributed to this trend (8), climate change is also implicated in the rise of fire activity in the western United States (9–11).

Although increases in forest fire activity are evident in all major forested lands in the western United States (2, 12, 13), an abundance of moisture—due to snowpack persistence, cooler temperatures, and delayed summer soil and fuel drying—provides a strong buffer of fire activity (13) and longer fire-return intervals (14) at high elevations. Recent studies, however, point to changing fire characteristics across many ecoregions of the western United States (15), including high-elevation areas of the Sierra Nevada (16), Pacific Northwest, and Northern Rockies (12, 17). These studies complement documented changes in montane environments including amplified warming with elevation (18), a widespread upward elevational shift in species (19), and increased productivity in energy-limited high-elevation regions that enhances fuel growth and connectivity (20). These changes have been accompanied by longer snow-free periods (21), increased evaporative demand (9), and regional declines in fire season precipitation frequency (11) across the western United States, promoting increased fuel ignitability and flammability that have well-founded links to forest burned area. A warmer climate is also conducive to a higher number of convective storms and more frequent lightning strikes (22).

In this study, we explore changes in the elevational distribution of burned forest across the western United States and how changes in climate have affected the mesic barrier to high-elevation fire activity. We focus on changes in high-elevation forests that have endured fewer direct anthropogenic modifications compared to drier low-elevation forests, which had frequent low-severity fires prior to European colonization and have been more subject to changes in settlement patterns as well as fire suppression and harvest (23, 24). We pose the following questions: 1) Has the elevational distribution of fire in western US forests systematically changed? and 2) What changes in biophysical factors have enabled such changes in high-elevation fire activity? We explore these questions across 15 mountainous ecoregions of the western United States using records from large fires (>405 ha) between 1984 and 2017 [Monitoring Trends in Burn Severity (MTBS) (25)], a 10-m-resolution digital elevation model, and daily high-spatial-resolution surface meteorological data [gridMET (26)].

We focus on trends in Z90, defined as the 90th percentile of the normalized annual elevational distribution of burned forest in each ecoregion. Here, the term "normalized" refers to the fraction of forest area burned by elevation. We complement this analysis by examining trends in burned area by elevational band and using quantile regression of normalized annual forest fire elevation. We then assess the interannual relationship between Z90 and vapor pressure deficit (VPD) and compare the upslope advance in montane fire to the elevational climate velocity of VPD during 1984 to 2017. Specifically, we use VPD trends and the VPD–high-elevation fire regression to estimate VPD-driven changes in Z90 and BA90, defined as the annual burned area above the 90th percentile of forest elevational distribution in each ecoregion, during 1984 to 2017.  相似文献
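The Z90 metric and its trend reduce to an area-weighted percentile followed by a regression slope. The sketch below is a simplified reading of those definitions: an area-weighted 90th percentile per year, and an ordinary least-squares slope standing in for the paper's quantile-regression analysis. Function names and the input layout are illustrative.

```python
def z90(elevations, areas):
    """90th percentile of the area-weighted elevational distribution
    of burned forest in one year.  elevations: band elevations (m);
    areas: burned area in each band.  A simplified reading of Z90."""
    pairs = sorted(zip(elevations, areas))
    total = sum(a for _, a in pairs)
    cum = 0.0
    for elev, area in pairs:
        cum += area
        if cum >= 0.9 * total:     # first band crossing 90% of area
            return elev
    return pairs[-1][0]

def trend_slope(years, values):
    """OLS slope (m per year) of an annual series such as Z90;
    stands in for the paper's quantile regression."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den
```

Multiplying the fitted slope by the 34-year record length gives a cumulative upslope shift directly comparable to the reported median change of 252 m.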
177.
Objective: To identify clinicopathologic factors associated with a reduced intercarotid distance (ICD) and subgroups at risk for internal carotid artery injury during transsphenoidal surgery. Design: A retrospective case-control study. Setting: This study was conducted at the McGill University Health Centre, a university-affiliated tertiary care center. Participants: Patients with a sellar or parasellar tumor and nontumor controls were included in the study. Main Outcome Measures: The smallest distance between the internal carotid arteries at the clival, cavernous, and paraclinoid segments on coronal magnetic resonance imaging was measured. Demographic profiles, cephalometric measurements, tumor dimensions, and sphenoid configuration were assessed as potential determinants of the ICD. Results: A total of 212 cases and 34 controls were analyzed. Widening of the ICD at the three segments of the internal carotid arteries was found in patients with pituitary macroadenomas (p < 0.01). Patients with a growth hormone–secreting adenoma had a markedly reduced ICD at the clivus compared with controls (1.59 cm versus 1.77 cm; p = 0.02; 95% confidence interval [CI], 0.03–0.32). The paraclinoid ICD was reduced in patients with an anterior fossa meningioma (1.24 cm versus 1.33 cm; p = 0.04; 95% CI, 0.01–0.45). Conclusion: Identifying clinicopathologic factors affecting the ICD can help surgeons recognize constraints to endoscopic access of the skull base and avoid inadvertent arterial injury.
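The reported group comparisons (e.g., clival ICD of 1.59 cm in growth hormone–secreting adenomas versus 1.77 cm in controls; 95% CI 0.03–0.32) can be approximated with a normal-theory confidence interval on the difference in means. The abstract does not state which test the authors used, so this is only an illustrative sketch, not the study's method.

```python
import math
import statistics

def mean_diff_ci(a, b, z=1.96):
    """Approximate 95% CI for the difference in means (b minus a),
    e.g. control-group ICD minus patient-group ICD, using a normal
    approximation with unpooled variances (Welch-style standard error)."""
    diff = statistics.mean(b) - statistics.mean(a)
    se = math.sqrt(statistics.variance(a) / len(a)
                   + statistics.variance(b) / len(b))
    return diff - z * se, diff + z * se
```

A CI that excludes zero, as in the reported 0.03–0.32, is what flags the clival narrowing in acromegalic patients as statistically meaningful.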
179.
BACKGROUND: Hepatitis C virus (HCV) infection is a major health problem in end-stage renal disease (ESRD) patients. The effect of pretransplant HCV infection on survival among ESRD patients who have undergone renal transplantation is controversial. We report the results of a large single-center study that evaluated the effect of hepatitis C on patient and graft survival in renal transplant recipients of living-donor allografts. METHODS: In a historical cohort study, we investigated all 1006 patients who received a living-donor kidney transplant at Baghiatollah Medical Center in Tehran, Iran, between March 1995 and October 2001 (up to 85 months of follow-up). Patients' sera had been routinely assayed for anti-HCV antibodies and hepatitis B surface antigen (HBsAg) at the time of transplantation. HBsAg-positive patients were excluded from the survival analysis. Survival was estimated using Kaplan-Meier analysis and compared using the log-rank test. Multivariate analysis was performed using Cox's model. RESULTS: Forty-five patients (4.5%) were anti-HCV-antibody positive. Anti-HCV-antibody-positive patients had spent a longer time on dialysis and had a higher rate of retransplantation. There were no differences between the two groups in recipients' sex and age or in donors' age. The 7-year patient survival rate was 89.9% in the anti-HCV-antibody-positive group and 95.5% in the HCV-negative group (P = 0.74). Seven-year graft survival was 82.0% and 75.0% in the anti-HCV-antibody-positive and HCV-negative groups, respectively (P = 0.39). In the multivariate analysis, age was the only significant parameter correlated with patient survival (P = 0.02). CONCLUSIONS: HCV infection does not seem to influence patient or graft survival over medium-term follow-up in living-donor allograft recipients, and anti-HCV-antibody-positive status alone is not a contraindication for renal transplantation. However, further studies are needed to better define the role of HCV infection in long-term prognosis.
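The survival comparison described above rests on Kaplan-Meier estimates. A minimal pure-Python estimator is sketched below; the study additionally applied the log-rank test and a Cox model, which are not reproduced here, and the input layout is an illustrative assumption.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  times: follow-up duration
    (e.g. months); events: 1 = death observed, 0 = censored.
    Returns [(time, S(time)), ...] at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        # ties at time t are contiguous after sorting
        deaths = sum(1 for tt, e in data[i:] if tt == t and e == 1)
        ties = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            surv *= 1.0 - deaths / at_risk   # product-limit update
            curve.append((t, surv))
        at_risk -= ties
        i += ties
    return curve
```

Censored patients (those still alive at last follow-up) leave the risk set without forcing the curve down, which is why the estimator, rather than a raw death fraction, is needed for a cohort with up to 85 months of uneven follow-up.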