31.
Secondary ice production (SIP) can significantly enhance ice particle number concentrations in mixed-phase clouds, resulting in a substantial impact on the ice mass flux and evolution of cold cloud systems. SIP is especially important at temperatures warmer than −10 °C, for which primary ice nucleation lacks a significant number of efficient ice-nucleating particles. However, determining the climatological significance of SIP has proved difficult using existing observational methods. Here we quantify the long-term occurrence of secondary ice events and their multiplication factors in slightly supercooled clouds using a multisensor, remote-sensing technique applied to 6 y of ground-based radar measurements in the Arctic. Further, we assess the potential contribution of the underlying mechanisms of rime splintering and freezing fragmentation. Our results show that the occurrence frequency of secondary ice events averages to <10% over the entire period. Although infrequent, the events can have a significant local impact when they do occur, with up to a 1,000-fold enhancement in ice number concentration. We show that freezing fragmentation, which appears to be enhanced by updrafts, is more efficient for SIP than the better-known rime-splintering process. Our field observations are consistent with laboratory findings while shedding light on the phenomenon and its contributing factors in a natural environment. This study provides critical insights needed to advance the parameterization of SIP in numerical simulations and to design future laboratory experiments.

Mixed-phase clouds, where supercooled cloud droplets and ice particles coexist, are frequently observed in the Arctic (1). These clouds play a critical role in the hydrological cycle and radiative energy balance, and they have non-negligible impacts on sea ice loss and warming in the Arctic (2, 3). Recent theoretical and modeling investigations suggest that the number concentration of ice particles in mixed-phase clouds has a significant influence on the evolution of cloud microphysical properties (4). Improper representation of ice formation compromises the simulation of Arctic mixed-phase clouds in climate and regional models, which can cause considerable errors in the simulated radiative budget (5). Extensive modeling and laboratory studies have been conducted in recent years to investigate ice formation by ice nucleation, especially heterogeneous ice nucleation, for which nucleation is catalyzed by ice-nucleating particles (6–9). The fundamental mechanisms underlying heterogeneous ice nucleation are still not fully understood, and the parameterizations widely used in atmospheric models are generated by fitting results from laboratory experiments for various types of ice-nucleating particles. However, observed ice number concentrations can be several orders of magnitude greater than in simulations, especially in supercooled clouds with temperatures warmer than −10 °C (hereafter, "slightly supercooled clouds"). In this temperature range, some biological aerosols originating from soil, plants, and the ocean are found to be efficient ice-nucleating particles that can trigger ice nucleation above −10 °C (10–13). However, these efficient ice-nucleating particles are rare, suggesting that secondary ice production (SIP) is important (14). The best-known mechanism of SIP in slightly supercooled clouds is the rime-splintering process, also known as the Hallett–Mossop (HM) process.
The HM process occurs preferentially in the temperature range from −3 °C to −8 °C, in which small ice splinters are generated during riming. The HM process has been demonstrated in the laboratory using a riming rod rotating in a small chamber filled with supercooled liquid droplets (15). SIP can also be caused by other mechanisms, such as collision fragmentation (16), freezing fragmentation (17, 18), and sublimation fragmentation (19). Details regarding the current understanding of these mechanisms can be found in recent review articles by Field et al. (20) and Korolev and Leisner (21). Among these mechanisms, the HM process is argued to be the most important for SIP in slightly supercooled clouds (20, 22). However, recent in situ measurements show that substantial numbers of needles and columns (signs of splintering) are observed in mixed-phase clouds without the presence of rimers (i.e., fast-falling ice particles). Instead, the presence of large cloud droplets suggests that those observed SIP events are likely due to freezing fragmentation rather than the HM process (23). Pitter and Pruppacher (24) also found in a laboratory wind tunnel study that a noticeable fraction of freezing drizzle drops developed pronounced knobs or spikes, with the spikes breaking off in many cases. The theory of freezing fragmentation is further supported by recent laboratory experiments in which SIP was observed during the freezing of a levitated droplet (17, 18). However, the conditions for the occurrence of SIP are still poorly known, and which SIP mechanism is dominant in mixed-phase clouds is far from clear. Although laboratory experiments can demonstrate the existence of SIP under certain controlled conditions, the idealized setups used in those studies (e.g., a rotating rod or a levitated droplet in a calm environment) are not directly translatable to characterizing SIP processes in atmospheric clouds.
Therefore, parameterizations of SIP in models based on laboratory data are of debatable accuracy (25), because we still do not understand SIP mechanisms at a fundamental level. Aircraft in situ measurements of ice particles and ice-nucleating particles can help to identify the occurrence of SIP in atmospheric clouds; however, statistical studies using such measurements are severely restricted by the small sampling volumes and limited coverage of aircraft flights (23, 26). Remote-sensing techniques provide an alternative way to observe atmospheric clouds, offering larger sampling volumes and longer observation periods than in situ measurements. These features are beneficial for observing processes that are transient and/or infrequent, as may be true for SIP. The occurrence of a SIP event in mixed-phase clouds is indicated by the presence of a large concentration of small ice particles, especially at warmer temperatures where such concentrations are unlikely to be due solely to primary ice nucleation. A common foundation of existing radar-based remote-sensing techniques for identifying SIP events is the detection of small, nonspherical ice particles using polarimetric variables, such as differential reflectivity (ZDR; the ratio of the power returned from horizontally versus vertically transmitted and received pulses) and linear depolarization ratio (LDR; the ratio of cross-polarized versus copolarized returned power with respect to the polarization of the transmitted pulses) (27, 28). Close to the time of SIP initiation, radar methods and in situ measurements are challenged alike, as distinguishing small spherical ice particles from cloud droplets is extremely difficult (4). As newly formed small ice particles preferentially grow into needle-like ice crystals within the HM temperature zone (between −3 °C and −8 °C), they then alter the values of ZDR and LDR relative to spherical hydrometeors, which makes detection of SIP events possible using remote-sensing techniques.
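The two polarimetric quantities defined above are simple power ratios expressed in decibels. A minimal sketch, assuming linear-power inputs; the function names and sample values are illustrative, not from the study:

```python
import math

def zdr_db(p_h, p_v):
    """Differential reflectivity (ZDR): horizontally vs. vertically
    polarized returned power, expressed in decibels."""
    return 10.0 * math.log10(p_h / p_v)

def ldr_db(p_cross, p_co):
    """Linear depolarization ratio (LDR): cross-polarized vs. copolarized
    returned power, in decibels (typically negative)."""
    return 10.0 * math.log10(p_cross / p_co)

# Spherical hydrometeors return nearly equal power in both channels
# (ZDR near 0 dB) and depolarize weakly; horizontally oriented needle-like
# crystals raise ZDR above 0 dB and produce a larger (less negative) LDR.
print(zdr_db(2.0, 1.0))    # positive: oblate or needle-like target
print(ldr_db(0.001, 1.0))  # -30.0 dB: weak depolarization
```

This is why needle growth in the HM temperature zone makes SIP detectable: it pushes both ratios away from the near-spherical baseline of cloud droplets.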
Most previous remote-sensing studies of SIP focus on specific cases, for which the thermodynamic properties of the subject mixed-phase clouds are carefully chosen such that the detection of nonspherical ice particles is a readily apparent signal of a SIP event in a small dataset (29, 30). In this study, we obtain a statistical understanding of SIP events. A remote-sensing technique is used to identify SIP events occurring within 6 y (March 2013 to May 2019) of ground-based observations of slightly supercooled liquid clouds. As detailed later, the technique determines the presence of SIP events using joint thresholds of radar LDR and spectral reflectivity and, moreover, quantifies the enhancement of needle-like particle concentrations (i.e., multiplication) based on the spectral reflectivity with respect to a base threshold. We link the occurrence of SIP to the presence of rimers and drizzle, and we estimate the enhancement in ice number concentration with respect to rimer velocity and drizzle size. We show that SIP events can significantly impact ice number concentrations locally when they occur, and we are able to assess the relative importance of two SIP mechanisms, finding that freezing fragmentation is more productive than rime splintering, which is normally regarded as the leading SIP process.
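The reported multiplication factors can be related to radar quantities under a simplifying assumption: for ice particles of similar size, reflectivity in linear units scales roughly linearly with number concentration, so an enhancement factor can be sketched as the linear-units ratio of the observed spectral reflectivity to the base threshold. This is an illustrative reading, not the paper's exact retrieval:

```python
def enhancement_factor(z_obs_dbz, z_base_dbz):
    """Estimate the ice-number multiplication factor from the excess of
    observed spectral reflectivity over a base threshold (both in dBZ),
    assuming reflectivity scales linearly with number concentration for
    particles of similar size (illustrative assumption)."""
    return 10.0 ** ((z_obs_dbz - z_base_dbz) / 10.0)

# Under this assumption, a 30 dB excess over the base threshold corresponds
# to a 1,000-fold enhancement, the upper end quoted in the abstract.
print(enhancement_factor(0.0, -30.0))  # 1000.0
```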
32.
The global impacts of river floods are substantial and rising. Effective adaptation to the increasing risks requires an in-depth understanding of the physical and socioeconomic drivers of risk. Whereas the modeling of flood hazard and exposure has improved greatly, compelling evidence on spatiotemporal patterns in the vulnerability of societies around the world is still lacking. Due to this knowledge gap, the effects of vulnerability on global flood risk are not fully understood, and the future projections of fatalities and losses available today are based on simplistic assumptions or do not include vulnerability. We show for the first time (to our knowledge) that trends and fluctuations in vulnerability to river floods around the world can be estimated by dynamic high-resolution modeling of flood hazard and exposure. We find that rising per-capita income coincided with a global decline in vulnerability between 1980 and 2010, which is reflected in decreasing mortality and losses as a share of the people and gross domestic product exposed to inundation. The results also demonstrate that vulnerability levels in low- and high-income countries have been converging, due to a relatively strong trend of vulnerability reduction in developing countries. Finally, we present projections of flood losses and fatalities under 100 individual scenario and model combinations, and three possible global vulnerability scenarios. The projections emphasize that materialized flood risk largely results from human behavior and that future risk increases can be largely contained using effective disaster risk reduction strategies. Flooding is one of the most frequent and damaging natural hazards affecting societies across the globe, with average annual reported losses and fatalities between 1980 and 2012 exceeding $23 billion (in 2010 prices) and 5,900 people, respectively (1). These risks have been shown to negatively affect economic growth at the country level (2).
Global trends and regional differences in flood risk result from the dynamics of hazard (i.e., the natural frequency and intensity of floods, without human interference), exposure (i.e., the population and economic assets located in flood-prone areas), and vulnerability (i.e., the susceptibility of the exposed elements to the hazard) (3, 4). Each of these contributing factors can be expected to change over time. Global flood losses have been increasing over the past decades, a trend attributed mainly to increasing exposure due to high population growth and economic development in flood-prone areas (4–9). At the same time, rainfall patterns and intensities may shift under climate change (10, 11), which could influence the flood hazard (12–15). In addition, interannual variations in peak discharge, caused by climatic oscillations such as the El Niño–Southern Oscillation, may lead to strong spatiotemporal fluctuations in the occurrence of floods (16, 17). These hazard and exposure elements can only partly explain spatiotemporal patterns in flood risk, because of the importance of vulnerability (8, 18). There are many different and competing definitions of vulnerability in the literature (see ref. 4, chap. 2, for a discussion). Vulnerability is considered in this study to include all man-made efforts to reduce the impact of the natural flood hazard on the exposed elements, including structural flood defenses, building quality, early-warning systems, and available health care and communication facilities (19–21). Vulnerability is dynamic, varies across temporal and spatial scales, and may depend on economic, social, geographic, demographic, cultural, institutional, governance, and environmental factors (ref. 4, chap. 2). The level of vulnerability is therefore affected by socioeconomic development (ref. 4, chap. 2; see also ref. 22) and can specifically be influenced by deliberate disaster risk reduction efforts (19, 23).
The reduction of vulnerability over time makes countries less prone to the adverse effects of current and future flood hazard and can therefore be considered a display of adaptation. For example, two similar tropical cyclones made landfall in eastern India, one in 2013 (Phailin) and one in 1999 (Cyclone 05B). The exposed population was greater in 2013 due to population growth and development in cyclone-prone areas. However, vulnerability in the region had drastically decreased with the implementation of a disaster management authority; cyclone shelters and early-warning systems ensured that only a small fraction of the population was vulnerable to this event. Because of this, the total reported impacts of the similar event in 2013 were much lower: fewer than 50 lives were lost in 2013, whereas the cyclone in 1999 was responsible for more than 10,000 lives lost (24). A similar study examined the effect of mangrove forests in the 1999 event; controlling for distance from the coast and storm surge (exposure and hazard, respectively), the authors found a differential vulnerability in the number of deaths as a percentage of the potentially exposed population (25). The level of vulnerability of a community is therefore reflected in the actual losses and fatalities as a share of the people and assets exposed to the flood hazard (26). In vulnerable communities, these mortality and loss rates can be expected to be higher than in less vulnerable communities for the same hazard event. Understanding the complexity of the risk chain and the resulting past and future trends in flood risk is increasingly important for international decisions on risk financing (13, 27), for the allocation of disaster risk reduction (DRR)-related development aid (28, 29), and for designing effective climate change adaptation policies (4, 30, 31).
Recently, disentangling the contributions of hazard, exposure, and vulnerability has once again been emphasized explicitly in the climate change debate, as the acceptance of the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts in December 2013 (32) raises questions of causality, responsibility, and equity (33). The understanding of climatic and socioeconomic drivers of risk has improved considerably over the past years, and a range of studies have demonstrated that feasible estimates can be made of current and future flood hazard (34, 35) and exposure (19, 36) at a global scale. These scientific advances have recently been combined in the first global-scale risk assessments under current and future climate conditions (37–39). However, the understanding of vulnerability remains one of the biggest hurdles in existing continental- to global-scale flood risk assessments (33, 37). To disentangle the risk chain and identify the contribution of vulnerability to historical flood losses, it is critical to have consistent information on global flood hazard, exposure, and reported impacts. Several recent studies (6–9) have tried to unravel reported flood loss patterns from various disaster databases by normalizing the trends using data on gross domestic product (GDP) and population growth (exposure) and, at best, simplistic climate proxies (hazard). Whereas increases in the proxies for hazard and exposure have been statistically linked to rising long-term global trends in losses, they do not represent interannual variability in flood occurrence and offer limited explanatory power for year-to-year global loss patterns. Finding evidence for changes in vulnerability in these long-term trends has therefore been very difficult (8).
No previous study has yet been able to quantify the contributions of the individual risk drivers and convincingly disentangle the dynamics of hazard, exposure, and vulnerability on a global scale. Consequently, little quantitative evidence is yet available about regional differences in human and economic vulnerability to flooding; about changes in this vulnerability over time under socioeconomic growth, adaptation, and DRR efforts; and about how this relates to observed trends in global disaster risk (8). As a result, past trends in losses and fatalities are not fully understood; the potential effect of climate change-induced increases in flood hazard remains unclear; the global effects of adaptation measures are unquantified; and the future projections of fatalities and losses available today are based on simplistic assumptions or do not include vulnerability (8, 13, 40). Here, we focus on analyzing this missing link in the risk chain. We show that variations and trends in vulnerability can be derived by modeling flood hazard and exposure at a high level of detail and comparing these with reported impacts over the past decades. We investigate the relationship between GDP per capita and vulnerability, whereby we consider vulnerability to be represented by mortality rates (reported fatalities as a percentage of the modeled exposed population) and loss rates (reported losses as a percentage of modeled exposed GDP). We also show how the vulnerability of different world regions has changed over time. Finally, we demonstrate how reducing vulnerability through improved adaptation efforts may strongly lower the magnitude of human and economic flood losses in the future.
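The vulnerability proxies described here reduce to simple ratios of reported impacts to modeled exposure. A minimal sketch with hypothetical numbers; the function names and sample values are illustrative, not the study's data:

```python
def mortality_rate(reported_fatalities, modeled_exposed_population):
    """Vulnerability proxy: reported fatalities as a fraction of the
    population modeled as exposed to inundation."""
    return reported_fatalities / modeled_exposed_population

def loss_rate(reported_losses, modeled_exposed_gdp):
    """Vulnerability proxy: reported losses as a fraction of the GDP
    modeled as exposed (same currency units for both)."""
    return reported_losses / modeled_exposed_gdp

# Hypothetical event: 500 fatalities among 2 million exposed people.
print(mortality_rate(500, 2_000_000))  # 0.00025, i.e., 0.025%
```

A declining mortality or loss rate at comparable hazard and exposure is then read as a decline in vulnerability, i.e., as evidence of adaptation.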
33.
A bentonite-supported amorphous aluminum (B–Al) nanocomposite was synthesized by the NaBH4 reduction method in an ethanol–water interfacial solution and characterized with SEM, TEM, XRD, FT-IR, and XRF. Surface morphology and line scans obtained from TEM imaging suggest the successful synthesis of the nanocomposite, while XRF data show a drastic change in Al concentration in the synthesized nanocomposite with respect to raw bentonite. The synthesized nanocomposite was further utilized for the removal of hexavalent chromium (Cr(vi)) from aqueous solutions. The Langmuir sorption isotherm revealed a very high removal capacity of the composite for Cr(vi) (49.5 mg g−1). Removal of more than 90% of the Cr(vi) within just 5 minutes of interaction indicates very fast removal kinetics. Inner-sphere complexation and coprecipitation of Cr(vi) can be identified as the major removal mechanisms; the absence of an ionic-strength effect suggests that inner-sphere complexation dominated Cr(vi) uptake. Solution pH did not influence the sorption much, although removal was comparatively higher under alkaline conditions (99.4%) than under acidic conditions (93.7%). The presence of humic acid and bicarbonate ions reduced the sorption significantly. The final product, Cr–Al(OH)3, precipitates by forming alum, which indicates that clay-supported amorphous aluminum nanocomposites can be considered potential sorbents for toxic metal ions in the environment.

Synthesis and application of a bentonite-supported amorphous aluminum nanocomposite as a promising material for the removal of Cr(vi) from aqueous solutions.
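The capacity quoted above (49.5 mg g−1 for Cr(vi)) is the saturation plateau of the Langmuir isotherm, q = q_max·K_L·C / (1 + K_L·C). A minimal sketch; the Langmuir constant K_L below is an assumed illustrative value, not a fitted parameter from the paper:

```python
def langmuir_q(c_eq, q_max, k_l):
    """Langmuir isotherm: sorbed amount q (mg/g) at equilibrium solute
    concentration c_eq (mg/L), given capacity q_max (mg/g) and Langmuir
    constant k_l (L/mg)."""
    return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

Q_MAX = 49.5  # fitted capacity reported in the abstract (mg/g)
K_L = 0.2     # hypothetical Langmuir constant (L/mg), for illustration only

# q rises steeply at low concentration and saturates toward q_max.
for c in (1.0, 10.0, 100.0):
    print(c, langmuir_q(c, Q_MAX, K_L))
```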
36.
In order to determine whether an EEG obtained early in the course of asphyxia neonatorum is of any more value than the neurological examination in predicting outcome, we reviewed the case histories of 38 infants with asphyxia neonatorum. The EEG background activity was valuable in predicting outcome. Normal and maturationally delayed EEGs were associated with normal outcomes, while low-voltage, electrocerebral-inactivity, and burst-suppression EEGs were highly correlated with severe neurological sequelae. Epileptiform activity was not as predictive of outcome as background activity. Although initially normal neurological examinations were associated with normal developmental and neurological outcomes, moderately and severely abnormal infants had more variable courses. A single EEG done early in the course of asphyxia neonatorum is a more sensitive predictor of outcome than the neurological examination.