Similar Articles
 20 similar articles retrieved.
1.

Background and aims

This study compared the accuracy of the FreeStyle Libre (Abbott, Alameda, CA) and Dexcom G4 Platinum (DG4P, Dexcom, San Diego, CA) CGM sensors.

Methods and results

Twenty-two adults with type 1 diabetes wore the two sensors simultaneously for 2 weeks. Libre was used for its manufacturer-specified lifetime (MSL); DG4P was used 7 days beyond its MSL. At a clinical research center (CRC), subjects were randomized to receive the same breakfast with a standard insulin bolus (standard) or a delayed and increased bolus (delayed & increased) to induce large glucose swings during weeks 1 and 2; venous glucose was checked every 5–15 min for 6 h. Subjects performed ≥4 reference fingersticks/day at home. Accuracy was assessed by differences in the mean absolute relative difference (%MARD) in glucose levels compared with fingerstick tests (home use) and the YSI reference (CRC). During home use, the Libre MARD was 13.7 ± 3.6% and the DG4P MARD was 12.9 ± 2.5% (difference not significant [NS]). With both systems, MARD increased during hypoglycaemia and decreased during hyperglycaemia, without significant difference between sensors. In the euglycaemic range, MARD was smaller with DG4P (12.0 ± 2.4% vs 14.0 ± 3.6%, p = 0.026). MARD increased with both sensors following the delayed & increased vs. standard bolus (Libre: 14.9 ± 5.5% vs. 10.9 ± 4.1%, p = 0.008; DG4P: 18.1 ± 8.1% vs. 13.1 ± 4.6%, p = 0.026); between-sensor differences were not significant (p = 0.062). Libre was more accurate during moderate and rapid glucose changes.
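For readers unfamiliar with the metric, the sketch below shows how %MARD is typically computed from paired sensor and reference glucose values. It is a minimal illustration, not the study's analysis code; the mard helper and the example readings are invented for demonstration.

import numpy as np

def mard(sensor, reference):
    """Mean absolute relative difference (%) between paired sensor and reference glucose values."""
    sensor = np.asarray(sensor, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * np.mean(np.abs(sensor - reference) / reference)

# Illustrative paired CGM and fingerstick readings in mg/dL (not study data)
cgm = [110, 142, 95, 180, 60]
fingerstick = [118, 150, 101, 170, 68]
print(f"MARD = {mard(cgm, fingerstick):.1f}%")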

Conclusions

DG4P and Libre performed similarly up to 7 days beyond DG4P MSL. Both sensors performed less well during hypoglycaemia but Libre was more accurate during glucose swings.

Trial registration

The study was registered at ClinicalTrials.gov (NCT02734745) on April 12, 2016.

2.
Purpose
COVID-19 has brought many challenges for providing quality healthcare for type 1 diabetes (T1DM). We evaluated the impact of the COVID-19 pandemic on the medical care, glycemic control, and selected outcomes in T1DM patients.

Methods
We retrospectively analyzed medical records from 357 T1DM adults enrolled in the Program of Comprehensive Outpatient Specialist Care at the University Hospital in Krakow, and assessed differences in patient data from before the COVID-19 pandemic (March 2019–February 2020) and after it started (March 2020–February 2021).

Results
The median HbA1c levels and the percentage of patients within the HbA1c target of <7% (53 mmol/mol) were similar in both periods, before and after the beginning of the pandemic (6.86% [51.5 mmol/mol], IQR 6.23–7.58% [44.6–59.3 mmol/mol] vs. 6.9% [51.9 mmol/mol], IQR 6.2–7.61% [44.3–59.7 mmol/mol]; p = 0.50 and 56.3% vs. 57.1%, p = 0.42, respectively). However, we observed a rise in BMI and body weight (median BMI 24.25, IQR 21.97–27.05 vs. 24.82, IQR 22.17–27.87, and median weight 71.0, IQR 61–82 vs. 72.55, IQR 55–85; p < 0.001 for both comparisons). There was no reduction in the number of total diabetes-related visits (median 4, IQR 4–5 vs. 5, IQR 4–5; p = 0.065), but the frequency of other specialist consultations decreased (2, IQR 0–2 vs. 1, IQR 0–2). During the pandemic, telehealth visits accounted for 1191 of 1609 (71.6%) total visits.

Conclusions
In this single-center observation, the COVID-19 pandemic did not have a negative impact on glycemic control in T1DM patients, but the patients' weight did increase. Telemedicine proved to be a valuable tool for T1DM care.
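The abstract reports HbA1c in both NGSP percent and IFCC mmol/mol. As a reference point, the snippet below applies what I understand to be the standard NGSP–IFCC master equation to reproduce the quoted pairs; it is an illustrative helper, not part of the study.

def hba1c_pct_to_mmol_mol(ngsp_percent):
    """Convert HbA1c from NGSP % to IFCC mmol/mol using the master equation."""
    return 10.929 * (ngsp_percent - 2.15)

# The <7% target quoted above corresponds to ~53 mmol/mol,
# and 6.86% corresponds to ~51.5 mmol/mol.
for pct in (7.0, 6.86, 6.9):
    print(f"{pct}% -> {hba1c_pct_to_mmol_mol(pct):.1f} mmol/mol")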

3.
4.
We assessed the performance of the factory-calibrated, sixth-generation continuous glucose monitoring (CGM) system Dexcom G6® (DexCom Inc., San Diego, California) during elective abdominal surgery. Twenty adults with (pre)diabetes undergoing abdominal surgery (>2 hours; 15 men, age 69 ± 13 years, glycated haemoglobin 53 ± 14 mmol/mol) wore the sensor from 1 week prior to surgery until hospital discharge. From induction of anaesthesia until 2 hours post-surgery, reference capillary glucose values were obtained every 20 minutes using the Accu-Chek® Inform II meter (Roche Diabetes Care, Mannheim, Germany). The primary endpoint was the mean absolute relative difference (ARD) between sensor and reference method during this period. In total, 1207 CGM/reference pairs were obtained. In the peri-operative period (523 pairs), mean ± SD and median (interquartile range [IQR]) ARD were 12.7% ± 8.7% and 9.9 (6.3;15.9)%, respectively, and 67.4% of sensor readings were within International Organization for Standardization 15197:2013 limits. CGM overestimated reference glucose by 1.1 ± 0.8 mmol/L (95% limits of agreement −0.5;2.7 mmol/L). Clarke error grid zones A or B contained 99.2% of pairs (A: 78.8%; B: 20.4%). The median (IQR) peri-operative sensor availability was 98.6 (95.9;100.0)%. No clinically significant adverse events occurred. In conclusion, the Dexcom G6 device showed consistent and acceptable accuracy during elective abdominal surgery, opening new avenues for peri-operative glucose management.

5.
Objective
There is limited real-life data demonstrating that hypo-/hyperglycemic alarms added to continuous glucose monitoring (CGM) improve metabolic control in adults with type 1 diabetes (T1D). We evaluated the usefulness of switching from a flash or intermittently scanned continuous glucose monitoring (is-CGM) device without low or high glucose alarms to an is-CGM device with alarms to prevent hypoglycemia in adults with T1D.

Methods
Individuals with T1D who were fearful of hypoglycemia, prone to hypoglycemia unawareness, and/or experiencing severe hypoglycemia while using the is-CGM FreeStyle Libre 1 (FSL1) were switched to FSL2 with individually programmable low glucose alarms. The primary endpoint was the change in % time below range (TBR%) <70 mg/dl [3.9 mmol/l] and <54 mg/dl [3.0 mmol/l] after 12 weeks on FSL2 compared with FSL1. Secondary endpoints were changes in % time in range (TIR% 70–180 mg/dl [3.9–10.0 mmol/l]), % time above range (TAR%) >180 mg/dl [10.0 mmol/l], mean interstitial glucose, glucose management indicator (GMI), interstitial glucose coefficient of variation (CV%), hemoglobin A1c, and sensor scans/day.

Results
We included 108 individuals (57.4% men), aged 58.2 ± 17.3 [95% CI: 55.0 to 61.5] years, with a mean diabetes duration of 25 ± 14.6 [95% CI: 22.1 to 27.7] years. Among them, 40 (37.0%) had hypoglycemia unawareness (Clarke's score ≥4) and 19 (17.5%) had a history of severe hypoglycemia. The median low glucose alarm threshold was 70 [IQR: 65–70] mg/dl (3.9 [IQR: 3.6–3.9] mmol/l). Comparing the first 12 weeks on FSL2 with the last 12 weeks on FSL1, TBR% <70 mg/dl decreased from 4.5 ± 4.4 to 2.3 ± 2.8% (p < 0.001) and TBR% <54 mg/dl decreased from 1.4 ± 2.2 to 0.3 ± 0.9% (p < 0.001). TIR% was not significantly different (51.5 ± 14.9 vs. 52.9 ± 16%; p = 0.13), nor was TAR% (43.8 ± 16.2 vs. 44.7 ± 16.5%; p = 0.5). CV% decreased from 39.4 ± 6.9 to 37.9 ± 6.1% (p < 0.001). Those at risk for hypoglycemia (TBR >4% and >1%, respectively, at baseline) showed a significant decrease in the incidence of hypoglycemia <70 and <54 mg/dl (p < 0.0001). Patients' satisfaction with the hypoglycemia alarms was high, and all individuals opted to continue using the individualized alarms beyond the study period.

Conclusion
Switching from FSL1 to FSL2 with low glucose alarms reduced the frequency of hypoglycemia in middle-aged adults with T1D, particularly in those prone to hypoglycemia unawareness or severe hypoglycemia.
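TBR%, TIR%, and TAR% are simple proportions of CGM readings falling within fixed glucose bands. The sketch below illustrates that calculation using the consensus cut-offs quoted in the abstract (54, 70, and 180 mg/dl); the helper name and the example trace are invented for illustration and are not study data.

import numpy as np

def time_in_ranges(glucose_mg_dl):
    """Percentage of CGM readings below, within, and above the 70-180 mg/dl target range."""
    g = np.asarray(glucose_mg_dl, dtype=float)
    n = len(g)
    return {
        # note: TBR% <70 includes the readings <54, as in consensus reporting
        "TBR% <54 mg/dl": 100.0 * np.sum(g < 54) / n,
        "TBR% <70 mg/dl": 100.0 * np.sum(g < 70) / n,
        "TIR% 70-180 mg/dl": 100.0 * np.sum((g >= 70) & (g <= 180)) / n,
        "TAR% >180 mg/dl": 100.0 * np.sum(g > 180) / n,
    }

# Illustrative interstitial glucose trace in mg/dl (not study data)
trace = [65, 90, 130, 150, 210, 250, 175, 110, 58, 48]
for metric, value in time_in_ranges(trace).items():
    print(f"{metric}: {value:.1f}%")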

6.
Background
Published data suggest worse outcomes in acute coronary syndrome (ACS) patients with concurrent coronavirus disease 2019 (COVID-19) infection. The mechanisms remain unclear.

Objectives
The purpose of this study was to report the demographics, angiographic findings, and in-hospital outcomes of COVID-19 ACS patients and compare these with pre–COVID-19 cohorts.

Methods
From March 1, 2020 to July 31, 2020, data from 55 international centers were entered into a prospective COVID-ACS Registry. Patients were COVID-19 positive (or had a high index of clinical suspicion) and underwent invasive coronary angiography for suspected ACS. Outcomes were in-hospital major cardiovascular events (all-cause mortality, re–myocardial infarction, heart failure, stroke, unplanned revascularization, or stent thrombosis). Results were compared with national pre–COVID-19 databases (MINAP [Myocardial Ischaemia National Audit Project] 2019 and BCIS [British Cardiovascular Intervention Society] 2018 to 2019).

Results
In 144 ST-segment elevation myocardial infarction (STEMI) and 121 non–ST-segment elevation acute coronary syndrome (NSTE-ACS) patients, symptom-to-admission times were significantly prolonged (COVID-STEMI vs. BCIS: median 339.0 min vs. 173.0 min; p < 0.001; COVID NSTE-ACS vs. MINAP: 417.0 min vs. 295.0 min; p = 0.012). Mortality in COVID-ACS patients was significantly higher than in BCIS/MINAP control subjects in both subgroups (COVID-STEMI: 22.9% vs. 5.7%; p < 0.001; COVID NSTE-ACS: 6.6% vs. 1.2%; p < 0.001), a difference that persisted after multivariate propensity analysis adjusting for comorbidities (STEMI subgroup odds ratio: 3.33 [95% confidence interval: 2.04 to 5.42]). Cardiogenic shock occurred in 20.1% of COVID-STEMI patients versus 8.7% of BCIS patients (p < 0.001).

Conclusions
In this multicenter international registry, COVID-19–positive ACS patients presented later and had increased in-hospital mortality compared with a pre–COVID-19 ACS population. Excess rates of cardiogenic shock and the associated mortality were major contributors to the worse outcomes in COVID-19–positive STEMI patients.

7.
Objectives
This study sought to assess the utility of ultrasound (US) guidance for transradial arterial access.

Background
US guidance has been demonstrated to facilitate vascular access, but has not been tested in a multicenter randomized fashion for transradial cardiac catheterization.

Methods
We conducted a prospective multicenter randomized controlled trial of 698 patients undergoing transradial cardiac catheterization. Patients were randomized to needle insertion with either palpation or real-time US guidance (351 palpation, 347 US). Primary endpoints were the number of forward attempts required for access, first-pass success rate, and time to access.

Results
The number of attempts was reduced with US guidance [mean: 1.65 ± 1.2 vs. 3.05 ± 3.4, p < 0.0001; median: 1 (interquartile range [IQR]: 1 to 2) vs. 2 (1 to 3), p < 0.0001] and the first-pass success rate improved (64.8% vs. 43.9%, p < 0.0001). The time to access was reduced (88 ± 78 s vs. 108 ± 112 s, p = 0.006; median: 64 [IQR: 45 to 94] s vs. 74 [IQR: 49 to 120] s, p = 0.01). Ten patients in the control group required crossover to US guidance after 5 min of failed palpation attempts, with 8 of 10 (80%) having successful sheath insertion with US. The number of difficult access procedures was decreased with US guidance (2.4% vs. 18.6% for ≥5 attempts, p < 0.001; 3.7% vs. 6.8% for ≥5 min, p = 0.07). No significant differences were observed in the rate of operator-reported spasm, patient pain scores following the procedure, or bleeding complications.

Conclusions
Ultrasound guidance improves the success and efficiency of radial artery cannulation in patients presenting for transradial catheterization. (Radial Artery Access With Ultrasound Trial [RAUST]; NCT01605292)

8.
Objectives
The purpose of this study was to evaluate the safety and efficacy of valve-in-valve (ViV) transcatheter aortic valve replacement (TAVR) for stentless bioprosthetic aortic valves (SBAVs) and to identify predictors of adverse events.

Background
ViV TAVR in SBAVs is associated with unique technical challenges and risks.

Methods
Clinical records and computed tomographic scans were retrospectively reviewed for procedural complications, predictors of coronary obstruction, mortality, and echocardiographic results.

Results
Among 66 SBAV patients undergoing ViV TAVR, mortality was 2 of 66 patients (3.0%) at 30 days and 5 of 52 patients (9.6%) at 1 year. At 1 year, left ventricular end-systolic dimension was decreased versus baseline (median [interquartile range (IQR)]: 3.0 [2.6 to 3.6] cm vs. 3.7 [3.2 to 4.4] cm; p < 0.001). Coronary occlusion occurred in 6 of 66 procedures (9.1%) and resulted in myocardial infarction in 2 of 66 procedures (3.0%). Predictors of coronary occlusion included a subcoronary implant technique compared with full root replacement (6 of 31, 19.4% vs. 0 of 28, 0%; p = 0.01), a short simulated radial valve-to-coronary distance (median [IQR]: 3.4 [0.0 to 4.6] mm vs. 4.6 [3.2 to 6.2] mm; p = 0.016), and low coronary height (7.8 [5.8 to 10.0] mm vs. 11.6 [8.7 to 13.9] mm; p = 0.003). Coronary arteries originated <10 mm above the valve leaflets in 34 of 97 unobstructed coronary arteries (35.1%).

Conclusions
TAVR in SBAVs is frequently associated with high-risk coronary anatomy but can be performed with a low risk of death and myocardial infarction, resulting in favorable ventricular remodeling. A subcoronary surgical approach is associated with an increased risk of coronary obstruction.

9.
Objectives
The aim of this study was to compare a delayed and a very early invasive strategy in patients with non–ST-segment elevation acute coronary syndromes (NSTE-ACS) without pre-treatment.

Background
The optimal delay of the invasive strategy in patients with NSTE-ACS remains debated and has never been investigated in patients not pre-treated with P2Y12–adenosine diphosphate receptor antagonists.

Methods
A prospective, open-label, randomized controlled trial was conducted. Altogether, 741 patients presenting with intermediate- or high-risk NSTE-ACS intended for an invasive strategy were included. The modified intention-to-treat analysis comprised 709 patients after 32 withdrew consent. Patients were randomized 1:1 to the delayed invasive group (DG) (n = 363), with coronary angiography (CA) performed 12 to 72 h after randomization, or the very early invasive group (EG) (n = 346), with CA within 2 h. No pre-treatment with a loading dose of a P2Y12–adenosine diphosphate receptor antagonist was allowed before CA. The primary endpoint was the composite of cardiovascular death and recurrent ischemic events at 1 month, as determined by a blinded adjudication committee.

Results
Most patients had high-risk NSTE-ACS in both groups (93% in the EG vs. 92.5% in the DG). The median time between randomization and CA was 0 h (interquartile range [IQR]: 0 to 1 h) in the EG and 18 h (IQR: 11 to 23 h) in the DG. The primary endpoint rate was significantly lower in the EG (4.4% vs. 21.3% in the DG; hazard ratio: 0.20; 95% confidence interval: 0.11 to 0.34; p < 0.001), driven by a reduction in recurrent ischemic events (2.9% in the EG vs. 19.8% in the DG; p < 0.001). No difference was observed for cardiovascular death.

Conclusions
Without pre-treatment, a very early invasive strategy was associated with a significant reduction in ischemic events at the time of percutaneous coronary intervention in patients with intermediate- and high-risk NSTE-ACS. (Early or Delayed Revascularization for Intermediate and High-Risk Non ST-Elevation Acute Coronary Syndromes; NCT02750579)

10.
Background
Apart from saving the lives of coronavirus disease (COVID-19) patients on mechanical ventilation (MV), recovery from the sequelae of prolonged MV (PMV) is an emerging issue.

Methods
We conducted a retrospective study among consecutive adult COVID-19 patients who were admitted to an intensive care unit (ICU) in Kobe, Japan, between March 3, 2020, and January 31, 2021, and received invasive MV. Clinical outcomes included in-hospital mortality and recovery from COVID-19 in survivors with regard to organ dysfunction, respiratory symptoms, and functional status at discharge. We compared survivors' outcomes with MV durations of >14 days and ≤14 days.

Results
We included 85 patients with a median age of 69 years (interquartile range, 64–75 years); 76 (89%) patients had at least 1 comorbidity, 72 (85%) were non-frail, and 79 (93%) were functionally independent before COVID-19 infection. Eighteen patients (21%) died during hospitalization. At discharge, 59 of 67 survivors (88%) no longer required respiratory support, 50 (75%) complained of dyspnea, and 40 (60%) were functionally independent. The 23 survivors who received MV for >14 days had a worse recovery from COVID-19 at discharge than those on MV for ≤14 days, as observed using the Barthel index (median: 35 [5–65] vs. 100 [85–100]), ICU mobility scale (8 [5–9] vs. 10 [10–10]), and functional oral intake scale (3 [1–7] vs. 7 [7–7]) (P < 0.0001).

Conclusion
Although four-fifths of the patients survived and >50% of survivors demonstrated clinically important recovery in organ function and functional status during hospitalization, PMV was related to poor recovery from COVID-19 at discharge.

11.
《Diabetes & metabolism》2014,40(3):211-214
Aim
We compared post-breakfast closed-loop glucose control accompanied by either a full carbohydrate-matching bolus or a partial weight-dependent bolus.

Methods
Twelve adults with type 1 diabetes consumed a 75 g CHO breakfast on two occasions. In random order, the breakfast was accompanied by a full carbohydrate-matching insulin bolus (8.30 U [7.50 U–11.50 U]) or a partial weight-dependent insulin bolus (0.047 U/kg; 3.45 U [2.95 U–3.75 U]). Postprandial glucose was regulated by sensor-responsive insulin and glucagon delivery.

Results
Glucose control after the weight-dependent bolus was safe and feasible (glucose values returned to pre-prandial levels after 5 h). However, the 5-hr incremental area under the curve (IAUC) and the percentage of time above 10 mmol/L were lower after the full bolus than after the partial bolus (IAUC: 2.1 [0.8–4.2] mmol/L/hr vs 8.3 [6.5–11.4] mmol/L/hr; time in hyperglycaemia: 24% [6%–29%] vs 50% [25%–63%]; P < 0.001).

Conclusions
Post-breakfast closed-loop glucose control without carbohydrate counting, based on a weight-dependent bolus, is feasible, but a carbohydrate-matching bolus provides better glucose control.

Clinical trial registry
NCT01519102

12.
Background: Currently, two different types of continuous glucose monitoring (CGM) systems are available: real-time (rt) CGM systems that continuously provide glucose values and intermittent-scanning (is) CGM systems. This study compared the accuracy of an rtCGM and an isCGM system when worn in parallel.
Methods: Dexcom G5 Mobile (DG5) and FreeStyle Libre (FL) were worn in parallel by 27 subjects for 14 days, including two clinic sessions with induced glucose excursions. The percentage of CGM values within ±20% or ±20 mg/dL of the laboratory comparison method results (YSI 2300 STAT Plus, YSI Inc., Yellow Springs, OH, United States; glucose oxidase based) or of blood glucose meter values, and the mean absolute relative difference (MARD), were calculated. Consensus error grid and continuous glucose error grid analyses were performed to assess clinical accuracy.
Results: Both systems displayed clinically accurate readings. Compared to laboratory comparison method results during clinic sessions, DG5 had 91.5% of values within ±20%/20 mg/dL and a MARD of 9.5%; FL had 82.5% of scanned values within ±20%/20 mg/dL and a MARD of 13.6%. Both systems showed a lower level of performance during the home phase and when using the blood glucose meter as reference.
Conclusion: The two systems tested in this study represent two different principles of CGM. DG5 generally provided higher accordance with laboratory comparison method results than FL.
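The "±20%/20 mg/dL" agreement rate counts a CGM value as accurate if it falls within ±20 mg/dL of the reference at low glucose or within ±20% at higher glucose. The sketch below illustrates that calculation; the 100 mg/dL cut-off between the absolute and relative criteria and the example values are my assumptions for illustration, not taken from the study.

import numpy as np

def pct_within_20_20(sensor, reference, cutoff_mg_dl=100.0):
    """Percentage of sensor values within +/-20 mg/dL (reference below cutoff)
    or within +/-20% (reference at/above cutoff) of the reference value.
    The 100 mg/dL cutoff is an assumption for illustration."""
    s = np.asarray(sensor, dtype=float)
    r = np.asarray(reference, dtype=float)
    within = np.where(r < cutoff_mg_dl,
                      np.abs(s - r) <= 20.0,
                      np.abs(s - r) / r <= 0.20)
    return 100.0 * np.mean(within)

# Illustrative paired CGM and YSI values in mg/dL (not study data)
cgm = [85, 130, 200, 60, 155]
ysi = [78, 120, 230, 85, 160]
print(f"{pct_within_20_20(cgm, ysi):.1f}% of readings within the 20/20 limits")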

13.
Aims/hypothesis
The incretin hormones glucagon-like peptide-1 (GLP-1) and glucose-dependent insulinotropic peptide (GIP) are released from intestinal endocrine cells in response to luminal glucose. Glucokinase is present in these cells and has been proposed as a glucose sensor. The physiological role of glucokinase can be tested using individuals with heterozygous glucokinase gene (GCK) mutations. If glucokinase is the gut glucose sensor, GLP-1 and GIP secretion during a 75 g OGTT would be lower in GCK mutation carriers than in controls.

Methods
We compared GLP-1 and GIP concentrations measured at five time-points during a 75 g OGTT in 49 participants with GCK mutations and 28 familial controls. Mathematical modelling of glucose, insulin and C-peptide was used to estimate the basal insulin secretion rate (BSR), total insulin secretion (TIS), beta cell glucose sensitivity, potentiation factor and insulin secretion rate (ISR).

Results
GIP and GLP-1 profiles during the OGTT were similar in GCK mutation carriers and controls (p = 0.52 and p = 0.44, respectively). Modelled variables of beta cell function showed a reduction in beta cell glucose sensitivity (87 pmol min−1 m−2 [mmol/l]−1 [95% CI 66–108] vs 183 pmol min−1 m−2 [mmol/l]−1 [95% CI 155–211], p < 0.001) and potentiation factor (1.5 min [95% CI 1.2–1.8] vs 2.2 min [95% CI 1.8–2.7], p = 0.007) but no change in BSR or TIS. The glucose/ISR curve was right-shifted in GCK mutation carriers.

Conclusions/interpretation
Glucokinase, the major pancreatic glucose sensor, is not the main gut glucose sensor. By modelling OGTT data in GCK mutation carriers we were able to distinguish a specific beta cell glucose-sensing defect. Our data suggest a reduction in potentiation of insulin secretion by glucose that is independent of differences in incretin hormone release.

14.
《JACC: Cardiovascular Imaging》2020,13(12):2498-2509
Objectives
This study sought to evaluate left ventricular (LV) structure and function in pheochromocytoma and paraganglioma (PPGL) patients before and after curative surgery.

Background
Data on catecholamine-induced effects on LV structure and function in patients with PPGL are limited and conflicting.

Methods
The study evaluated 81 consecutive patients with a PPGL, among whom 66 were re-evaluated 12 months after tumor removal. Fifty patients matched for age, sex, presence of hypertension, and blood pressure (BP) levels served as a control group (non-PPGL group). Echocardiography was used to assess the LV mass index (LVMI), systolic function (including speckle-tracking echocardiography), and diastolic function.

Results
Patients with PPGL had a higher LVMI (median 103 [interquartile range (IQR): 88 to 132] g/m2 vs. median 94 [IQR: 74 to 106] g/m2; p = 0.006) and a higher frequency of LV hypertrophy (44.4% vs. 24.0%; p = 0.018) than the non-PPGL group. Patients with PPGLs also had lower global longitudinal strain (GLS) and early diastolic mitral annular velocity than patients in the non-PPGL group (median –17.2% [IQR: 15.6% to 18.9%] vs. median –19.3% [IQR: 17.7% to 20.6%]; p < 0.001; and median 11.1 [IQR: 8.3 to 13.0] cm/s vs. median 12.3 [IQR: 10.6 to 14.6] cm/s; p = 0.018, respectively). The presence of LV hypertrophy and GLS were independently associated with plasma free metanephrine concentrations. In operated patients, the frequency of LV hypertrophy decreased (39.4% before vs. 22.7% after surgery; p = 0.003), as did LVMI (median 98 [IQR: 85 to 115] g/m2 vs. 90 [IQR: 76 to 109] g/m2; p < 0.001) and the ratio of transmitral early diastolic velocity to early diastolic mitral annular velocity (median 6.8 [IQR: 5.5 to 8.6] vs. 6.0 [IQR: 5.0 to 7.6]; p = 0.005), whereas GLS values were higher after surgery (median –17.4 [IQR: –15.8 to 19.1] before vs. −18.5 [IQR: –17.1 to 20.1] after; p < 0.001).

Conclusions
Catecholamine excess in patients with PPGLs can lead not only to LV hypertrophy but also to impairment of systolic LV function and subclinical alterations of diastolic LV function, independently of BP levels. These structural and functional changes are reversible after surgical intervention.

15.
Background and aims
The senses of taste and smell are essential determinants of food choice, which in turn may contribute to the development of chronic diseases, including diabetes. Although past studies have evaluated the relationship between type 2 diabetes mellitus (DM2) and sensory disorders, this relationship remains controversial. In this study, we evaluated taste and smell perception in DM2 patients and healthy controls (HC). Moreover, we analyzed the association of chemosensory impairments with anthropometric and clinical outcomes (e.g. body mass index (BMI), fasting blood glucose (FBG), drugs, cardiovascular diseases (CVD), and hypertension) in DM2 patients.

Methods and results
The study included 94 DM2 patients and 244 HC. Taste recognition of 6-n-propylthiouracil (PROP), quinine, citric acid, sucrose, and sodium chloride (NaCl) was assessed using a filter-paper method, while smell recognition of 12 odorants was assessed using a Sniffin' Sticks test. We found that a higher percentage of DM2 patients showed an identification impairment for salt taste (22% vs. 5%, p < 0.0009) and smell recognition (55% vs. 27%, p = 0.03) compared with HC. We also observed that 65% of hypertensive DM2 subjects presented a smell identification impairment compared with 18% of non-hypertensive patients (p = 0.019). Finally, patients with impairments in both taste and smell showed elevated FBG compared with patients without impairment (149.6 vs. 124.3 mg/dL, p = 0.04).

Conclusion
The prevalence of taste and smell identification impairments was higher in DM2 patients than in HC, and a possible relationship with glycemic levels emerged.

16.
Background
The study aimed to evaluate the clinical outcomes of tailored adjuvant chemotherapy according to human equilibrative nucleoside transporter 1 (hENT1) expression in resected pancreatic ductal adenocarcinoma (PDA).

Methods
Patients who underwent pancreatectomy for PDA were enrolled prospectively. According to intra-tumoral hENT1 expression, the high hENT1 (≥50%) group received gemcitabine and the low hENT1 (<50%) group received 5-fluorouracil plus folinic acid (5-FU/FA). The propensity score-matched control group consisted of patients who received hENT1-independent adjuvant chemotherapy. The primary outcome was recurrence-free survival (RFS) and the secondary outcomes were overall survival (OS) and toxicities.

Results
Between May 2015 and June 2017, we enrolled 44 patients with resected PDA. During a median follow-up period of 28.5 months, the intention-to-treat population showed significantly longer median RFS [22.9 (95% CI, 11.3–34.5) vs. 10.9 (95% CI, 6.9–14.9) months, P = 0.043] and median OS [36.2 (95% CI, 26.5–45.9) vs. 22.1 (95% CI, 17.7–26.6) months, P = 0.001] compared with the controls. Among the 5 patients in the low hENT1 group who discontinued treatment, 2 receiving 5-FU/FA did so because of drug toxicities (febrile neutropenia and toxic epidermal necrolysis).

Conclusion
Tailored adjuvant chemotherapy based on hENT1 staining provides excellent clinical outcomes in patients with resected PDA.

Clinical trial registration
clinicaltrials.gov identifier: NCT02486497.

17.
Background and Aims
There is inconsistent evidence supporting the self-monitoring of blood glucose (SMBG) in people with non-insulin-treated type 2 diabetes (T2D). Structured SMBG protocols have a greater impact on glycaemic control than unstructured SMBG and may improve measures of glycaemic variability (GV), though few previous studies have reported on specific GV outcomes. Our aim was to determine the impact of structured SMBG on simple measures of GV in people with T2D.

Methods
Participants undertook structured SMBG over 12 months, with HbA1c recorded at baseline and at 3-monthly follow-up. For each participant, the mean blood glucose (MBG), fasting blood glucose (FBG), standard deviation of BG (SD-BG), coefficient of variation of BG (CV-BG), mean absolute glucose change (MAG) and HbA1c were determined for each 3-month period. Responders were participants with an improvement in HbA1c of ≥5 mmol/mol (0.5%) over 12 months.

Results
Data from 231 participants were included in the analysis. Participants had a baseline median [interquartile range] HbA1c of 68.0 [61.5–75.5] mmol/mol (8.4%). Participants demonstrated significant improvements in MBG (−1.25 mmol/L), FBG (−0.97 mmol/L), SD-BG (−0.44 mmol/L), CV-BG (−1.43%), MAG (−0.97 mmol/L), and HbA1c (−7.0 mmol/mol) (all p < 0.001) at 12 months compared with these measures collected within the first 3 months of SMBG. Responders had a significantly higher baseline median [interquartile range] HbA1c of 70.0 [63.0–78.0] mmol/mol compared with 61.0 [56.5–66.0] mmol/mol in non-responders (P < 0.001).

Conclusions
Structured SMBG improved all the observed measures of GV. These results support the use of structured SMBG in people with non-insulin-treated T2D.
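The GV measures named above are simple summary statistics of an SMBG series. The sketch below shows one plausible way to compute them; it is illustrative only. In particular, MAG is computed here as the mean absolute difference between consecutive readings, whereas the study may normalize it by time, and the example series is invented.

import numpy as np

def glycaemic_variability(glucose_mmol_l):
    """Summary glycaemic-variability measures for a series of SMBG readings (mmol/L)."""
    g = np.asarray(glucose_mmol_l, dtype=float)
    mbg = g.mean()                      # mean blood glucose (MBG)
    sd_bg = g.std(ddof=1)               # standard deviation of BG (SD-BG)
    cv_bg = 100.0 * sd_bg / mbg         # coefficient of variation of BG (CV-BG, %)
    mag = np.mean(np.abs(np.diff(g)))   # mean absolute change between consecutive readings
    return {"MBG": mbg, "SD-BG": sd_bg, "CV-BG%": cv_bg, "MAG": mag}

# Illustrative SMBG series in mmol/L (not study data)
series = [9.8, 7.2, 11.4, 6.5, 8.9, 12.1, 7.8]
for name, value in glycaemic_variability(series).items():
    print(f"{name}: {value:.2f}")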

18.
Background
Among patients with acute coronary syndrome following transcatheter aortic valve replacement (TAVR), those presenting with ST-segment elevation myocardial infarction (STEMI) are at highest risk.

Objectives
The goal of this study was to determine the clinical characteristics, management, and outcomes of STEMI after TAVR.

Methods
This was a multicenter study including 118 patients presenting with STEMI at a median of 255 days (interquartile range: 9 to 680 days) after TAVR. Procedural features of STEMI after TAVR managed with primary percutaneous coronary intervention (PCI) were compared with those of an all-comer STEMI cohort: 439 non-TAVR patients who underwent primary PCI within the 2 weeks before and after each post-TAVR STEMI case in 5 participating centers from different countries.

Results
Median door-to-balloon time was longer in TAVR patients (40 min [interquartile range: 25 to 57 min] vs. 30 min [interquartile range: 25 to 35 min]; p = 0.003). Procedural time, fluoroscopy time, dose-area product, and contrast volume were also higher in TAVR patients (p < 0.01 for all). PCI failure occurred more frequently in patients with previous TAVR (16.5% vs. 3.9%; p < 0.001), including 5 patients in whom the culprit lesion was not revascularized owing to failure to cannulate the coronary ostia. In-hospital and late (median of 7 months [interquartile range: 1 to 21 months]) mortality rates were 25.4% and 42.4%, respectively (20.6% and 38.2% in primary PCI patients). Estimated glomerular filtration rate <60 ml/min (hazard ratio [HR]: 3.02; 95% confidence interval [CI]: 1.42 to 6.43; p = 0.004), Killip class ≥2 (HR: 2.74; 95% CI: 1.37 to 5.49; p = 0.004), and PCI failure (HR: 3.23; 95% CI: 1.42 to 7.31; p = 0.005) were associated with an increased risk of mortality.

Conclusions
STEMI after TAVR was associated with very high in-hospital and mid-term mortality. Longer door-to-balloon times and a higher PCI failure rate were observed in TAVR patients, partially due to coronary access issues specific to the TAVR population, and this was associated with poorer outcomes.

19.
Background
A pre-operative neutrophil-to-lymphocyte ratio (NLR) ≥5 has been associated with reduced survival in patients with various gastrointestinal tract cancers; however, its prognostic value in patients with periampullary tumours has not been reported to date.

Objectives
To determine the prognostic value of the pre-operative NLR in terms of survival and recurrence of resected periampullary carcinomas.

Methods
This was a retrospective cohort study of consecutive patients undergoing pancreatoduodenectomy (PD) for periampullary carcinoma (pancreatic, ampullary, cholangiocarcinoma) identified from a departmental database. The effect of NLR upon survival and recurrence was explored.

Results
Overall median survival amongst 228 patients was 24 months (inter-quartile range [IQR]: 12–43). The median survival of those whose NLR was <5 was not significantly greater than that of patients whose NLR was ≥5 (24 months [IQR: 14–42] versus 13 months [IQR: 8–48], respectively; p = 0.234). However, amongst those who developed recurrence, survival was greater in those with an NLR <5 (20 months [IQR: 12–27] versus 11 months [IQR: 7–22], respectively; p = 0.038). This effect was most marked in patients with cholangiocarcinoma (p = 0.019), whilst a trend towards worse survival was seen in those with pancreatic adenocarcinoma. No effect was seen in patients with ampullary carcinoma (p = 0.516).

Conclusions
This study provides further evidence that the pre-operative NLR offers important prognostic information regarding disease-free survival. This effect, however, is dependent upon tumour type amongst patients undergoing PD.

20.
Background and aims
The Tepehuanos Indians, a traditional Mexican ethnic group, followed a vegetarian diet and exhibited a low prevalence of obesity and an absence of diabetes. However, from the year 2000 the traditional diet of the Tepehuanos was modified by the introduction of western food. In this study we examined the changes in their customary diet and the impact on the prevalence of cardiovascular risk factors in this group.

Methods and results
Individuals from 12 Tepehuanos communities were randomly enrolled during 1995–1996 and 2006–2007. Using a 64-item semiquantitative food frequency questionnaire, macronutrient intakes were calculated from values in Mexican food-composition tables. Cardiovascular risk factors such as obesity, hypertension, hyperglycemia and dyslipidemia were determined. The median (25th, 75th percentile) total caloric intake (1476 [1083, 1842] to 2100 [1366, 2680] kcal/day, p < 0.001), as well as the percentage of energy consumed from saturated fat (3.0 [2.7, 4.1] to 7.2 [3.9, 7.4], p < 0.0001) and protein (8.2 [7.8, 8.9] to 16.8 [16.3, 17.1], p < 0.0001), increased, whereas the percentage of total calorie intake from carbohydrates (66.4 [61.3, 69.5] to 61.3 [61, 68.8], p < 0.0001) and polyunsaturated fat (11.2 [10.3, 12.1] to 4.0 [3.9, 4.3], p < 0.0001), and the polyunsaturated:saturated fat ratio (3.84 to 0.53%, p < 0.0001), decreased during the period of study. The prevalence of obesity (11.1 to 21.9%, p = 0.04), impaired fasting glucose (5.9 to 14.9%, p = 0.04), diabetes (0.0 to 0.88%, p = 0.48), hypertension (1.7 to 3.4%, p = 0.43), elevated triglycerides (2.6 to 16.7%, p = 0.0006), and low HDL-cholesterol (10.2 to 71.1%, p < 0.0001) increased.

Conclusions
The changes introduced into the customary diet of the Tepehuanos communities are related to the increase in cardiovascular risk factors.
