Search results: 18 articles matched (query time: 46 ms)
1.
Cerebral small vessel disease (CSVD) is the most important cause of vascular cognitive impairment (VCI). Most CSVD cases are sporadic, but familial monogenic forms of the disorder have also been described. Despite the variants identified to date, many CSVD cases remain genetically unexplained. We used whole-exome sequencing in an attempt to identify novel gene variants underlying CSVD. A cohort of 35 Finnish patients with suspected CSVD was analyzed. Patients had screened negative for the function-affecting NOTCH3 variants most common in Finland (p.Arg133Cys and p.Arg182Cys). Whole-exome sequencing was performed to search for a genetic cause of CSVD. Our study detected possibly pathogenic variants, or variants of unknown significance, in genes known to be associated with CSVD in six patients, accounting for 17% of cases. Those genes included NOTCH3, HTRA1, COL4A1, and COL4A2. We also identified variants with a predicted pathogenic effect in genes associated with other neurological or stroke-related conditions in seven patients, accounting for 20% of cases. This study supports pathogenic roles of variants in COL4A1, COL4A2, and HTRA1 in CSVD and VCI. Our results also suggest that vascular pathogenic mechanisms are linked to neurodegenerative conditions and provide novel insights into the molecular basis of VCI.
Subject terms: Stroke, Sequencing, Genetics research, Dementia
2.

Background

No randomized studies exist comparing pneumonectomy (PN) and sleeve lobectomy (SL). We evaluated surgical results and long-term quality of life in patients operated on for central non-small cell lung cancer (NSCLC) using either SL or PN.

Methods

A total of 641 NSCLC patients underwent surgery between 2000 and 2010. SL was performed in 40 (6.2%) and PN in 67 (10.5%). In 2011, all surviving patients were sent the 15D quality-of-life questionnaire, to which 83% replied. Propensity-score matching was used to compare the groups.

Results

Thirty-two bronchial (18 right/14 left), seven vasculobronchial (3 right/4 left), and one right wedge SL were performed, along with 18 right and 22 left PN. Preoperatively, the Charlson Comorbidity Index (CCI) score, forced expiratory volume in 1 s (FEV1), and diffusion capacity did not differ between groups. The perioperative complication rate and pattern were similar, but the SL group had fewer major complications (P<0.027). One perioperative death (2.5%) occurred in the SL group and four (6%) in the PN group. The 90-day mortality rate was 5% (n=2) for SL and 7.5% (n=5) for PN. During follow-up, total cancer recurrence did not differ (P=0.187). Quality of life measured by the 15D showed no significant difference in individual dimensions or total score, apart from a tendency favoring SL in the dimensions of moving and breathing. Five-year survival did not differ between groups (P=0.458), but no deaths were observed in the SL group after 5 years.

Conclusions

Given the lower rate of major operative complications and the better long-term survival, we advocate SL when feasible; in patients who can tolerate PN, however, PN should be considered when SL does not appear oncologically radical enough.
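The propensity-score matching mentioned in the Methods pairs each SL patient with the PN patient whose estimated propensity score is closest, within a caliper. A minimal sketch of greedy 1:1 nearest-neighbour matching; the scores, caliper, and group sizes below are synthetic illustrations, not the study's actual model:

```python
# Hedged sketch of 1:1 nearest-neighbour propensity-score matching, of the
# general kind used to compare the SL and PN groups. All scores below are
# synthetic; the study's actual covariate model and caliper are not given.
import numpy as np

def match_nearest(ps_treated, ps_control, caliper=0.1):
    """Greedy 1:1 matching on propensity score within a caliper.
    Returns a list of (treated_idx, control_idx) pairs; controls are
    matched without replacement."""
    used = set()
    pairs = []
    for i, p in enumerate(ps_treated):
        best, best_d = None, caliper
        for j, q in enumerate(ps_control):
            if j in used:
                continue
            d = abs(p - q)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            used.add(best)
            pairs.append((i, best))
    return pairs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ps_sl = rng.uniform(0.2, 0.8, size=10)   # synthetic scores, "SL" group
    ps_pn = rng.uniform(0.1, 0.9, size=20)   # synthetic scores, "PN" group
    pairs = match_nearest(ps_sl, ps_pn)
    print(f"matched {len(pairs)} of {len(ps_sl)} treated patients")
```

Greedy matching is order-dependent; production analyses typically use optimal matching or dedicated packages, but the caliper logic is the same.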
3.
Propranolol is a nonselective beta-adrenergic blocker used as a racemic mixture in the treatment of hypertension, cardiac arrhythmias, and angina pectoris. For study of the stereoselective glucuronidation of this drug, the two propranolol glucuronide diastereomers were biosynthesized, purified, and characterized. A screen of 15 recombinant human UDP-glucuronosyltransferases (UGTs) indicated that only a few isoforms catalyze propranolol glucuronidation. Analysis of UGT2B4 and UGT2B7 revealed no significant stereoselectivity, but these two enzymes differed in glucuronidation kinetics. The glucuronidation kinetics of R-propranolol by UGT2B4 exhibited a sigmoid curve, whereas the glucuronidation of the same substrate by UGT2B7 was inhibited by substrate concentrations above 1 mM. Among the UGTs of subfamily 1A, UGT1A9 and UGT1A10 displayed high and, surprisingly, opposite stereoselectivity in the glucuronidation of the propranolol enantiomers. UGT1A9 glucuronidated S-propranolol much faster than R-propranolol, whereas UGT1A10 exhibited the opposite enantiomer preference. Nonetheless, the Km values for the two enantiomers, both for UGT1A9 and for UGT1A10, were in the same range, suggesting similar affinities for the two enantiomers. Unlike UGT1A9, the expression of UGT1A10 is extrahepatic. Hence, the reverse stereoselectivity of these two UGTs may signify specific differences in the glucuronidation of the propranolol enantiomers between intestinal and hepatic microsomes. Subsequent experiments confirmed this hypothesis: human liver microsomes glucuronidated S-propranolol faster than R-propranolol, whereas human intestine microsomes glucuronidated R-propranolol faster. These findings suggest a contribution of intestinal UGTs to drug metabolism, at least for UGT1A10 substrates.
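The two kinetic profiles described above, sigmoidal for UGT2B4 and substrate-inhibited for UGT2B7, correspond to standard enzyme-kinetic rate equations. A minimal sketch with purely illustrative parameter values (the study's fitted constants are not given here):

```python
# Hedged sketch: standard rate equations matching the two kinetic profiles
# described above. All parameter values are hypothetical illustrations.

def hill_rate(s, vmax, s50, h):
    """Sigmoidal (Hill) kinetics, as reported for UGT2B4 with R-propranolol.
    At s = s50 the velocity is vmax/2."""
    return vmax * s**h / (s50**h + s**h)

def substrate_inhibition_rate(s, vmax, km, ki):
    """Michaelis-Menten with substrate inhibition, as seen for UGT2B7 above
    ~1 mM substrate: velocity peaks near sqrt(km*ki) and then declines."""
    return vmax * s / (km + s + s**2 / ki)

if __name__ == "__main__":
    # Illustrative parameters (hypothetical); s in mM.
    for s in (0.1, 0.5, 1.0, 2.0, 5.0):
        v_hill = hill_rate(s, vmax=100.0, s50=0.5, h=1.8)
        v_si = substrate_inhibition_rate(s, vmax=100.0, km=0.3, ki=1.0)
        print(f"[S]={s:4.1f} mM  Hill: {v_hill:6.1f}  SubstInhib: {v_si:6.1f}")
```

The substrate-inhibition term s**2/ki is what makes the UGT2B7-like curve turn downward at high substrate concentrations, unlike the monotonically saturating Hill curve.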
4.
Atrial natriuretic factor (ANF) is a potent natriuretic, diuretic, and vasoactive hormone produced and released by atrial cardiomyocytes. We investigated whether adenovirus-mediated ANF gene delivery to dogs leads to a sustained increase in circulating ANF levels resulting in long-lasting biological effects. An adenoviral vector containing the canine ANF cDNA under the control of the Rous sarcoma virus 3' long terminal repeat (AdRSV-ANF) was injected via the intrahepatic route into nonvaccinated 2-month-old dogs. In the first group of four dogs injected with AdRSV-ANF (10^10.2 TCID50), a short-lived increase in plasma ANF concentrations not associated with biological effects occurred 8-10 days after the injection, as compared with four control dogs injected with an adenovirus encoding a luciferase reporter gene (AdRSV-luc). In a second series of experiments, six dogs received AdRSV-ANF at a dose of 10^10 TCID50 and a replication-defective type 5 adenovirus harboring a modified VAI gene (Ad-VAr) at the same dose. Sustained increases in plasma ANF concentrations and urinary cGMP excretion starting on day 2 and persisting until day 20 were seen, as well as concomitant elevations in natriuresis and diuresis, a transient increase in cardiac output, and a delay in body weight gain, as compared with control dogs injected with AdRSV-luc/Ad-VAr. These results show that adenovirus-mediated ANF gene expression can lead to systemic biological effects in dogs, a finding of potential relevance for the treatment of cardiovascular diseases and sodium-retaining disorders.
5.
Background
As a result of routine low-dose computed tomographic screening, lung cancer is more frequently diagnosed at earlier, operable stages of disease. In treating local non–small-cell lung cancer, video-assisted thoracoscopic surgery (VATS), a minimally invasive surgical approach, has replaced thoracotomy as the standard of care. While short-term quality-of-life outcomes favor the use of VATS, the impact of VATS on long-term health-related quality of life (HRQoL) is unknown.
Patients and Methods
We studied patients who underwent lobectomy for the treatment of non–small-cell lung cancer from January 2006 to January 2013 at a single institution (n = 456). Patients who underwent segmentectomy (n = 27), who received neoadjuvant therapy (n = 13), or who were found to have clinical stage > T2 or > N0 disease (n = 45) were excluded from analysis. At the time of HRQoL assessment, 199 patients were eligible for study and were mailed the generic HRQoL instrument 15D.
Results
A total of 180 patients (90.5%) replied; 92 respondents had undergone VATS and 88 open thoracotomy. The VATS group more often had adenocarcinoma (P = .006), and lymph node stations were sampled to a lesser extent (P = .004); additionally, hospital length of stay was shorter among patients undergoing VATS (P = .001). No other clinical or pathologic differences were observed between the 2 groups. Surprisingly, patients who underwent VATS scored significantly lower on HRQoL in the dimensions of breathing, speaking, usual activities, mental function, and vitality, and they reported a lower total 15D score, which reflects overall quality of life (P < .05).
Conclusion
In contrast to earlier short-term reports, long-term quality-of-life measures are worse among patients who underwent VATS compared with thoracotomy.
6.
In type 1 diabetes it is important to prevent diabetes-related complications, and postural instability may be one clinically observable early manifestation. This study set out to investigate differences between type 1 diabetics and healthy controls in variables from instrumented posturography, to assess the method's potential for early detection of diabetes-related complications. Eighteen type 1 diabetics with no apparent complications (HbA1c = 58 ± 9 mmol/mol, diabetes duration = 15 ± 7 years) and 35 healthy controls underwent six 1-min bipedal standing postural stability tests on a force plate. The study groups were comparable in age and anthropometrics, and performed the tests with eyes open, eyes closed (EC), and EC with the head up, each with and without an unstable pad. Type 1 diabetics exhibited greater sway (path length, p = 0.044, and standard deviation of velocity, p = 0.039) during the EC test on the unstable pad. Power spectral density also indicated greater relative power (p = 0.043) in the high-frequency band during the EC head-up test on the unstable pad, and somatosensory activity increased more (p = 0.038) when the unstable pad was added to the EC test. Type 1 diabetes may thus induce subtle changes in postural control, requiring more active balancing when stability is challenged. Postural assessment using a portable, easy-to-use force plate shows promise for detecting a diabetes-related decline in postural control, which may serve as a sensitive biomarker of early-phase diabetes-related complications.
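The posturographic measures referred to above (sway path length, standard deviation of sway velocity, and relative spectral power in a high-frequency band of the centre-of-pressure signal) can be computed directly from a force-plate trace. A minimal sketch; the sampling rate and band edges are illustrative choices, not the study's settings:

```python
# Hedged sketch of common posturographic measures: sway path length, SD of
# sway velocity, and relative power in a high-frequency band of the
# centre-of-pressure (CoP) signal. Band edges and fs are assumptions.
import numpy as np

def sway_metrics(cop, fs, hf_band=(1.0, 3.0)):
    """cop: 1-D centre-of-pressure trace (m); fs: sampling rate (Hz).
    Returns (path_length, velocity_sd, relative_hf_power)."""
    vel = np.diff(cop) * fs                      # instantaneous sway velocity
    path_length = np.sum(np.abs(np.diff(cop)))   # total excursion
    vel_sd = np.std(vel)
    # Periodogram-based relative power in the high-frequency band.
    freqs = np.fft.rfftfreq(cop.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(cop - cop.mean())) ** 2
    band = (freqs >= hf_band[0]) & (freqs <= hf_band[1])
    rel_power = psd[band].sum() / psd[1:].sum()  # exclude the DC bin
    return path_length, vel_sd, rel_power

if __name__ == "__main__":
    fs = 100.0
    t = np.arange(0, 10, 1.0 / fs)
    # Synthetic sway: slow 0.3 Hz drift plus a 2 Hz high-frequency component.
    cop = 0.01 * np.sin(2 * np.pi * 0.3 * t) + 0.002 * np.sin(2 * np.pi * 2.0 * t)
    print(sway_metrics(cop, fs))
```

A Welch estimate would give a smoother spectrum for real, noisy recordings; the plain periodogram keeps the sketch dependency-free beyond NumPy.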
7.
As human activities impact virtually every animal habitat on the planet, identifying species at risk from disturbance is a priority. Cetaceans are an example taxon in which responsiveness to anthropogenic noise can be severe but highly species- and context-specific, with source–receiver characteristics such as hearing sensitivity only partially explaining this variability. Here, we predicted that ecoevolutionary factors that increase species responsiveness to predation risk also increase responsiveness to anthropogenic noise. We found that reductions in intense-foraging time during exposure to 1- to 4-kHz naval sonar and predatory killer whale sounds were highly correlated (r = 0.92) across four cetacean species. Northern bottlenose whales ceased foraging completely during killer whale and sonar exposures, followed by humpback, long-finned pilot, and sperm whales, which reduced intense foraging by 48 to 97%. Individual responses to sonar were partly predicted by species-level responses to killer whale playbacks, implying a similar level of perceived risk. The correlation cannot be explained by hearing sensitivity alone, indicating that species- and context-specific antipredator adaptations also shape cetacean responses to human-made noise. Species that are more responsive to predator presence are predicted to be more disturbance-sensitive, implying a looming double whammy for Arctic cetaceans facing increased anthropogenic and predator activity with reduced ice cover.

Why are some species more averse to anthropogenic noise disturbances than others? Comparative frameworks for species sensitivity are urgently needed as human activities impact the marine environment on a global scale (1), with underwater noise from shipping, seismic exploration, and military sonar (2) and increased activities in the Arctic Ocean being of particular concern (3). Auditory sensitivity and acoustic masking have been the dominant explanatory factors when comparing the sensitivity of marine organisms that rely on sound for critical life functions (4, 5). However, both theoretical and empirical work have shown that evolutionary and ecological context variables (hereafter, ecoevolutionary factors) other than hearing sensitivity, such as antipredator adaptations and habitat quality, can also be expected to play a significant role (4, 6–8). This can be illustrated in cetaceans, which use underwater sound as a primary sensory and communication modality and for which research efforts have characterized and quantified a diverse array of noise-induced behavioral effects (5, 9–11). Experimental sound exposures show that free-ranging cetaceans respond to noise by ceasing fitness-enhancing activities, such as feeding (12), leading to concern over population-level impacts (13). Responsiveness varies across species (14), with some taxa like beaked whales (14–16) and harbor porpoises (17, 18) considered to be particularly sensitive. Nevertheless, it remains unclear to what extent antipredator adaptations versus other ecoevolutionary factors, like auditory sensitivity (5), might drive this variation (4, 10).
Crucially, to our knowledge, predictions linking antipredator adaptations and noise disturbance have not been quantitatively tested in a unified analysis across different species sharing the same underwater soundscape.

The risk–disturbance hypothesis posits that responses to a disturbance source are the outcome of each animal's internal trade-off between the perceived risk posed and the fitness and missed-opportunity costs of a response (6). Given that antipredator responses are costly, prey are expected to adjust their response thresholds according to the phenotypic and evolutionary contexts that have shaped their responses to predation risk (19–22). We thus predicted that species and/or populations in ecoevolutionary contexts more vulnerable to predation should respond more strongly to both predation risk and anthropogenic disturbances. Conversely, contexts that promote tolerance of predation risk, such as higher-risk/higher-reward foraging, are expected to also translate into tolerance of anthropogenic disturbance.

We empirically tested this prediction by comparing changes in the foraging time budgets of four cetacean species (northern bottlenose, humpback, sperm, and long-finned pilot whales) in their feeding grounds during experimental exposure to 1- to 4-kHz naval sonar and to playbacks of predatory, mammal-eating killer whale sounds (hereafter, KW-mammal). Playbacks of killer whale sounds elicit antipredator behavior in seals (23) and cetaceans (24–26), providing a yardstick for the costly, aversive reactions that have evolved to reduce predation risk (27). We chose to quantify reductions in foraging time budgets because this is a well-defined and quantifiable behavioral change that reflects the trade-off between food and safety, which is shared across animal taxa (28). Most mesopredator cetaceans are known prey of killer whales (29), although the precise extent to which each species is subject to predation remains poorly understood.
The diverse antipredator strategies of the four species in our study imply a priori that variation in the strength of antipredator responses is to be expected. The large adult male sperm whales in our study and adult humpback whales with long flippers have strong fight capabilities (30), while large groups of long-finned pilot whales may use social mobbing responses against predation threats (25). In contrast, the northern bottlenose whale, with no physical defense and smaller group sizes, likely relies upon crypsis and flight to avoid predation, as do other beaked whales (31). The echolocation sounds of toothed whales while foraging (32) and the body movements of lunge-feeding baleen whales (33) are conspicuous, exposing foragers to increased predation risk (e.g., reference 34); consequently, cryptic antipredator responses imply a cessation of feeding that carries a clear cost to energetic balance. Furthermore, predator detection by prey may be less effective during foraging (28). Foraging time therefore represents a quantifiable and sensitive indicator of responsiveness to a threat, one that can be applied to the diverse species in our study.

Sound and movement data from suction-cup-attached data loggers were used to classify the dives of 43 whales of the 4 species (SI Appendix, Table S1) into functional states, including intense foraging, when animals were maximally engaged in foraging-related echolocation or movement behaviors (Fig. 1 and SI Appendix, Table S2). We quantified how the time spent in intense foraging during baseline periods changed during exposures to 1- to 4-kHz sonar and predator sounds (KW-mammal).
We expected a priori that 1) both stimuli would elicit a reduction in intense-foraging time, 2) responses to predator sounds would be stronger than responses to 1- to 4-kHz sonar, and 3) higher species-average responses to predator sounds would correspond with higher responsiveness to 1- to 4-kHz sonar, because the ecoevolutionary drivers that increase responsiveness to predation risk are also expected to increase responsiveness to anthropogenic threats in each species' study population and environmental context.

Fig. 1. Representative time-series behavioral data recorded by sound-and-movement recording Dtags, with exposure periods marked as boxes. For each species, the top panels show dive depth versus time, with feeding indicators shown in color (navy blue, echolocation click production; red line, buzz clicks; red crosses, lunges). The bottom panels show the absolute value of vertical speed, with color indicating the behavioral state. Note that the dark-green intense-foraging state was associated with feeding indicators and higher vertical speeds, and note the reduction in intense foraging during 1- to 4-kHz sonar treatments (solid boxes) but the small effect of the no-sonar control treatment (dashed boxes).
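The headline result above, a correlation of r = 0.92 between species-level responses to sonar and to killer-whale playback, is an ordinary Pearson correlation over four species-average values. A minimal sketch; the per-species reduction percentages below are hypothetical placeholders within the 48 to 100% range reported, not the study's estimates:

```python
# Hedged sketch: Pearson correlation between species-average reductions in
# intense-foraging time under sonar vs. killer-whale playback. The numbers
# are illustrative placeholders, not the study's data.
import math

def pearson_r(xs, ys):
    """Plain Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    # Hypothetical species-level reductions (%): sonar vs KW-mammal playback.
    sonar = [100, 97, 70, 48]
    kw = [100, 95, 75, 55]
    print(f"r = {pearson_r(sonar, kw):.2f}")
```

With only four species, such a correlation is sensitive to any single point, which is one reason the authors support it with individual-level response data.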
8.
9.
Objectives: This study aimed to examine the contribution of shift work, work time control (WTC) and informal caregiving, separately and in combination, to sleep disturbances in ageing employees.
Methods: Survey data were obtained from two prospective cohort studies with repeated measurements of working conditions, informal caregiving, and sleep disturbances. We used fixed-effect conditional logistic regression analysis to examine whether within-individual changes in shift work, WTC and informal caregiving were associated with changes in sleep. Secondary analyses included between-individuals comparisons using standard logistic regression models. Results from the two cohorts were pooled using meta-analysis.
Results: Low WTC and informal caregiving were associated with sleep disturbances in within-individual analyses [odds ratios (OR) ranging between 1.13 (95% confidence interval [CI] 1.01–1.27) and 1.48 (95% CI 1.29–1.68)] and in between-individuals analyses [OR 1.14 (95% CI 1.03–1.26) to 1.33 (95% CI 1.19–1.49)]. Shift work alone was not associated with sleep disturbances, but accumulated exposure to shift work, low WTC and informal caregiving was associated with a higher risk of sleep disturbances (OR range 1.21–1.76). For some sleep outcomes, informal caregiving was related to a higher risk of sleep disturbances when WTC was low and a lower risk when WTC was high.
Conclusions: Informal caregiving and low WTC are associated with a risk of sleep disturbances among ageing employees. The findings also suggest that low WTC in combination with informal caregiving may increase the risk of sleep disturbances, whereas high WTC may alleviate the adverse impact of informal caregiving on sleep.
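The Methods above pool odds ratios from the two cohorts via meta-analysis. A minimal sketch of standard inverse-variance (fixed-effect) pooling on the log-OR scale, recovering each standard error from the reported 95% CI width; the ORs below are made up for illustration, not the study's estimates:

```python
# Hedged sketch of inverse-variance (fixed-effect) pooling of odds ratios
# from two cohorts. Input ORs/CIs are illustrative, not the study's values.
import math

def pool_fixed_effect(ors_with_ci):
    """ors_with_ci: list of (OR, ci_low, ci_high) tuples, one per cohort.
    Returns (pooled_OR, ci_low, ci_high) via inverse-variance weighting
    on the log-OR scale."""
    logs, weights = [], []
    for or_, lo, hi in ors_with_ci:
        # SE of log-OR recovered from the 95% CI width: (ln hi - ln lo) / (2 * 1.96)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        logs.append(math.log(or_))
        weights.append(1.0 / se**2)
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

if __name__ == "__main__":
    # Two hypothetical cohort estimates for the same exposure.
    print(pool_fixed_effect([(1.20, 1.02, 1.41), (1.30, 1.05, 1.61)]))
```

Pooling identical estimates returns the same OR with a narrower CI, which is the expected behaviour of inverse-variance weighting; a random-effects model would widen the interval when the cohorts disagree.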
10.