Similar Articles
20 similar articles retrieved.
1.
The associations between serum concentrations of zinc, copper, or iron and the risk of metabolic syndrome are inconclusive. We therefore conducted a case-control study to explore the relationship between serum levels of zinc, copper, or iron and metabolic syndrome, as well as each metabolic factor and insulin resistance. We enrolled 1165 adults, aged ≥ 40 (65.8 ± 10) years, from a hospital-based population to compare the serum levels of zinc, copper, and iron between subjects with and without metabolic syndrome using multivariate logistic regression analyses. The least square means were computed by general linear models to compare serum concentrations of zinc, copper, and iron in relation to the number of metabolic factors. The mean serum concentrations of zinc, copper, and iron were 941.91 ± 333.63 μg/L, 1043.45 ± 306.36 μg/L, and 1246.83 ± 538.13 μg/L, respectively. The odds ratios (ORs) of metabolic syndrome for the highest versus the lowest quartile were 5.83 (95% CI: 3.35–10.12; p for trend < 0.001) for zinc, 2.02 (95% CI: 1.25–3.25; p for trend: 0.013) for copper, and 2.11 (95% CI: 1.24–3.62; p for trend: 0.021) for iron after adjusting for age, sex, personal habits, body mass index, and homeostatic model assessment insulin resistance. Additionally, serum zinc, copper, and iron concentrations increased as the number of metabolic factors rose (p for trend < 0.001). This was the first study to clearly demonstrate that higher serum levels of zinc, copper, and iron were associated with the risk of metabolic syndrome and the number of metabolic factors, independent of BMI and insulin resistance.
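A quartile-based multivariable logistic regression with a p-for-trend, of the kind reported above, could be sketched as follows; the variable names and simulated data are hypothetical, not the study's dataset.

```python
# Hypothetical sketch: odds of metabolic syndrome across quartiles of serum zinc,
# adjusted for covariates, with a p-for-trend from an ordinal quartile score.
# Variable names and simulated data are illustrative, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "zinc": rng.normal(940, 330, n),      # serum zinc, ug/L
    "age": rng.normal(66, 10, n),
    "male": rng.integers(0, 2, n),
    "bmi": rng.normal(25, 4, n),
    "homa_ir": rng.lognormal(0.5, 0.5, n),
})
# Outcome loosely tied to zinc and BMI so the example yields nonzero ORs.
logit_p = -1.5 + 0.002 * (df["zinc"] - 940) + 0.05 * (df["bmi"] - 25)
df["mets"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Quartiles of the exposure (Q1 = reference) and an ordinal score for the trend test.
df["zinc_q"] = pd.qcut(df["zinc"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
df["zinc_score"] = df["zinc_q"].cat.codes

adjust = "age + male + bmi + homa_ir"
m_quartiles = smf.logit(f"mets ~ C(zinc_q) + {adjust}", data=df).fit(disp=0)
m_trend = smf.logit(f"mets ~ zinc_score + {adjust}", data=df).fit(disp=0)

ors = np.exp(m_quartiles.params.filter(like="zinc_q"))
ci = np.exp(m_quartiles.conf_int().filter(like="zinc_q", axis=0))
print(pd.concat([ors.rename("OR"), ci], axis=1))   # Q2-Q4 vs Q1
print("p for trend:", m_trend.pvalues["zinc_score"])
```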

2.
Background: Childhood lead exposure has been associated with growth delay. However, the association between blood lead levels (BLLs) and insulin-like growth factor 1 (IGF-1) has not been characterized in a large cohort with low-level lead exposure. Methods: We recruited 394 boys 8–9 years of age from an industrial Russian town in 2003–2005 and followed them annually thereafter. We used linear regression models to estimate the association of baseline BLLs with serum IGF-1 concentration at two follow-up visits (ages 10–11 and 12–13 years), adjusting for demographic and socioeconomic covariates. Results: At study entry, median BLL was 3 μg/dL (range, < 0.5–31 μg/dL), most boys (86%) were prepubertal, and mean ± SD height and BMI z-scores were 0.14 ± 1.0 and –0.2 ± 1.3, respectively. After adjustment for covariates, the mean follow-up IGF-1 concentration was 29.2 ng/mL lower (95% CI: –43.8, –14.5) for boys with high versus low BLL (≥ 5 μg/dL or < 5 μg/dL); this difference persisted after further adjustment for pubertal status. The association of BLL with IGF-1 was stronger for mid-pubertal than prepubertal boys (p = 0.04). Relative to boys with BLLs < 2 μg/dL, adjusted mean IGF-1 concentrations decreased by 12.8 ng/mL (95% CI: –29.9, 4.4) for boys with BLLs of 3–4 μg/dL; 34.5 ng/mL (95% CI: –53.1, –16.0) for BLLs 5–9 μg/dL; and 60.4 ng/mL (95% CI: –90.9, –29.9) for BLLs ≥ 10 μg/dL. Conclusions: In peripubertal boys with low-level lead exposure, higher BLLs were associated with lower serum IGF-1. Inhibition of the hypothalamic–pituitary–growth axis may be one possible pathway by which lead exposure leads to growth delay.

3.
Mushrooms are rich in bioactive compounds. The potential health benefits associated with mushroom intake have gained recent research attention. We thus conducted a systematic review and meta-analysis to assess the association between mushroom intake and risk of cancer at any site. We searched MEDLINE, Web of Science, and Cochrane Library to identify relevant studies on mushroom intake and cancer published from 1 January, 1966, up to 31 October, 2020. Observational studies (n = 17) with RRs, HRs, or ORs and 95% CIs of cancer risk for ≥2 categories of mushroom intake were eligible for the present study. Random-effects meta-analyses were conducted. Higher mushroom consumption was associated with lower risk of total cancer (pooled RR for the highest compared with the lowest consumption groups: 0.66; 95% CI: 0.55, 0.78; n = 17). Higher mushroom consumption was also associated with lower risk of breast cancer (pooled RR for the highest compared with the lowest consumption groups: 0.65; 95% CI: 0.52, 0.81; n = 10) and nonbreast cancer (pooled RR for the highest compared with the lowest consumption groups: 0.80; 95% CI: 0.66, 0.97; n = 13). When site-specific cancers were examined, a significant association with mushroom consumption was only observed with breast cancer; this could be due to the small number of studies that were conducted with other cancers. There was evidence of a significant nonlinear dose–response association between mushroom consumption and the risk of total cancer (P-nonlinearity = 0.001; n = 7). Limitations included the potential for recall and selection bias in case-control designs, which comprised 11 out of the 17 studies included in this meta-analysis, and the large variation in the adjustment factors used in the final models from each study. The association between higher mushroom consumption and lower risk of cancer, particularly breast cancer, may indicate a potential protective role for mushrooms in the diet.
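The random-effects pooling of study-level relative risks described above is commonly done with a DerSimonian–Laird estimator; a minimal sketch on fabricated inputs:

```python
# Minimal DerSimonian-Laird random-effects pooling of study-level relative risks.
# The RRs and CIs below are fabricated placeholders, not the studies in this review.
import numpy as np

rr = np.array([0.55, 0.70, 0.62, 0.80, 0.68])
ci_low = np.array([0.40, 0.55, 0.45, 0.60, 0.50])
ci_high = np.array([0.76, 0.89, 0.85, 1.07, 0.92])

y = np.log(rr)                                     # log RR per study
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1.0 / se**2                                    # fixed-effect (inverse-variance) weights

# Between-study variance tau^2 (DerSimonian-Laird moment estimator).
y_fe = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fe) ** 2)
dof = len(y) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - dof) / c)

w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

pooled = np.exp(y_re)
lo, hi = np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)
i2 = 100 * max(0.0, (q - dof) / q)
print(f"Pooled RR = {pooled:.2f} (95% CI {lo:.2f}, {hi:.2f}); I^2 = {i2:.0f}%")
```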

4.
Selenium status of the Danish population is below that assumed optimal for the suggested protective effects against chronic diseases, including certain cancers. Fish and shellfish are important dietary sources of selenium in Denmark. We investigated the effect of increased fish and mussel intake on selenium blood concentrations in a population with relatively low habitual dietary selenium intake. We randomly assigned 102 healthy men and women (all non-smokers) aged 48–76 years to an intervention group (n = 51) or a control group (n = 51). Intervention participants received 1000 g fish and mussels/week for 26 weeks (~50 μg selenium/day). Controls received no intervention. Non-fasting blood samples were taken and whole blood selenium was determined using inductively coupled plasma-mass spectrometry (ICP-MS), and plasma selenoprotein P (SelP) was determined by high performance liquid chromatography coupled to ICP-MS. All available observations were included in linear multiple regression analysis to evaluate the effect of the intervention. The difference in mean change for intervention compared with control persons was 14.9 ng/mL (95% CI: 10.2, 19.7) for whole blood selenium, and 7.0 ng/mL (95% CI: 3.1, 10.9) for plasma SelP (Weeks 0–26). Selenium concentrations were significantly increased after 26 weeks of intervention, albeit to a lesser extent than expected.

5.
6.
Background: The correlation between microRNA, obesity, and glycemic intolerance in patients on peritoneal dialysis (PD) is unknown. We aimed to measure the adipose and plasma miR-221 and -222 levels, and to evaluate their association with adiposity, glucose intolerance, and new onset diabetes mellitus (NODM) after the commencement of PD. Methods: We prospectively recruited incident adult PD patients. miR-221 and -222 were measured from adipose tissue and plasma obtained during PD catheter insertion. These patients were followed for 24 months, and the outcomes were changes in adiposity, insulin resistance, and NODM after PD. Results: One hundred and sixty-five patients were recruited. Patients with pre-existing DM had higher adipose miR-221 (1.1 ± 1.2 vs. 0.7 ± 0.9-fold, p = 0.02) and -222 (1.9 ± 2.0 vs. 1.2 ± 1.3-fold, p = 0.01). High adipose miR-221 and -222 levels were associated with a greater increase in waist circumference (miR-221: beta 1.82, 95% CI 0.57–3.07, p = 0.005; miR-222: beta 1.35, 95% CI 0.08–2.63, p = 0.038), Homeostatic Model Assessment for Insulin Resistance (HOMA) index (miR-221: beta 8.16, 95% CI 2.80–13.53, p = 0.003; miR-222: beta 6.59, 95% CI 1.13–12.05, p = 0.018), and insulin requirements (miR-221: beta 0.05, 95% CI 0.006–0.09, p = 0.02; miR-222: beta 0.06, 95% CI 0.02–0.11, p = 0.002) after PD. The plasma miR-222 level predicted the onset of NODM (OR 8.25, 95% CI 1.35–50.5, p = 0.02). Conclusion: miR-221 and -222 are associated with the progression of obesity, insulin resistance, and NODM after PD.

7.
Objective: To compare non-tuberculosis (non-TB)-cause mortality risk overall and cause-specific mortality risks within the immigrant population of British Columbia (BC) with and without TB diagnosis, using time-dependent Cox regressions. Methods: All people immigrating to BC during 1985–2015 (N = 1,030,873) were included, of whom n = 2435 were TB patients and the remainder served as non-TB controls. Outcomes were time-to-mortality for all non-TB causes, respiratory diseases, cardiovascular diseases, cancers, and injuries/poisonings, and were ascertained using ICD-coded vital statistics data. Cox regressions were used, with a time-varying exposure variable for TB diagnosis. Results: The non-TB-cause mortality hazard ratio (HR) was 4.01 (95% CI 3.57–4.51), with a covariate-adjusted HR of 1.69 (95% CI 1.50–1.91). Cause-specific covariate-adjusted mortality risk was elevated for respiratory diseases (aHR = 2.96; 95% CI 2.18–4.00), cardiovascular diseases (aHR = 1.63; 95% CI 1.32–2.02), cancers (aHR = 1.40; 95% CI 1.13–1.75), and injuries/poisonings (aHR = 1.85; 95% CI 1.25–2.72). Conclusions: In any given year, if an immigrant to BC was diagnosed with TB, their risk of non-TB mortality was 69% higher than if they were not diagnosed with TB. Healthcare providers should consider multiple potential threats to the long-term health of TB patients during and after TB treatment. TB guidelines in high-income settings should address TB survivor health. Electronic supplementary material: The online version of this article (10.17269/s41997-020-00345-y) contains supplementary material, which is available to authorized users.
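The time-dependent Cox model above (TB diagnosis as a time-varying exposure) could be sketched with lifelines' CoxTimeVaryingFitter on person-period data; the toy data, column names, and effect sizes below are assumptions for illustration only.

```python
# Sketch of a Cox model with a time-varying exposure (TB diagnosis) on
# person-period ("long") data: each person's follow-up is split at the time the
# exposure switches on. Data, column names, and effect sizes are invented.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(1)
rows = []
for pid in range(400):
    age = rng.uniform(20, 70)
    tb_time = rng.uniform(1, 15) if rng.random() < 0.3 else None  # year of TB diagnosis, if any
    rate = 0.01 + 0.0005 * (age - 20)              # toy baseline mortality rate
    t = rng.exponential(1 / rate)
    if tb_time is not None and t > tb_time:        # higher rate after diagnosis
        t = tb_time + rng.exponential(1 / (2.0 * rate))
    stop = min(t, 20.0)                            # administrative censoring at 20 years
    died = int(t <= 20.0)
    if tb_time is None or stop <= tb_time:
        rows.append((pid, 0.0, stop, 0, age, died))
    else:
        rows.append((pid, 0.0, tb_time, 0, age, 0))
        rows.append((pid, tb_time, stop, 1, age, died))

long_df = pd.DataFrame(rows, columns=["id", "start", "stop", "tb_dx", "age", "event"])

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
print(ctv.summary[["coef", "exp(coef)", "p"]])     # exp(coef) for tb_dx ~ hazard ratio
```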

8.
Existing reports focus on zinc-associated immunity and infection in malnourished children; however, whether zinc also plays an important role in the immune homeostasis of non-zinc-deficient populations remains unknown. This study aimed to investigate the association between zinc status, toll-like receptor (TLR)-related innate immunity, and infectious outcomes in well-nourished children. A total of 961 blood samples were collected from 1 through 5 years of age. Serum zinc was analyzed, and mononuclear cells were isolated to assess TNF-α, IL-6, and IL-10 production by ELISA after stimulation with TLR ligands. Childhood infections were analyzed as binary outcomes with logistic regression. The prevalence of zinc deficiency was 1.4–9.6% throughout the first 5 years. There was a significant association between zinc and TLR-stimulated cytokine responses. Higher serum zinc was associated with a decreased risk of ever having had pneumonia (aOR: 0.94; 95% CI: 0.90, 0.99) at 3 years and enterocolitis (aOR: 0.96; 95% CI: 0.93, 0.99) at 5 years. Serum zinc was lower in children who had had pneumonia before 3 years of age (72.6 ± 9 vs. 81.9 ± 13 μg/dL) and enterocolitis before 5 years (89.3 ± 12 vs. 95.5 ± 13 μg/dL). We emphasize the importance of maintaining optimal serum zinc in the young population.

9.

Background

The phasing out of lead from gasoline has resulted in a significant decrease in blood lead levels (BLLs) in children during the last two decades. Tetraethyl lead was phased out in the DRC in 2009. The objective of this study was to test for a reduction in pediatric BLLs in Kinshasa by comparing BLLs collected in 2011 (2 years after use of leaded gasoline was phased out) to those collected in surveys conducted in 2004 and 2008 by Tuakuila et al. (when leaded gasoline was still in use).

Methods

We analyzed BLLs in a total of 100 children under 6 years of age (mean age ± SD: 2.9 ± 1.6 years; 64% boys) using inductively coupled argon plasma mass spectrometry (ICP-MS).

Results

The prevalence of elevated BLLs (≥ 10 μg/dL) in children tested was 63% in 2004 [n = 100, GM (95% CI) = 12.4 μg/dL (11.4–13.3)] and 71% in 2008 [n = 55, GM (95% CI) = 11.2 μg/dL (10.3–14.4)]. In the present study, this prevalence was 41%. The average BLL for the current study population [GM (95% CI) = 8.7 μg/dL (8.0–9.5)] was lower than those found by Tuakuila et al. (F = 10.38, p < 0.001), as well as the CDC level of concern (10 μg/dL), with 3% of children diagnosed with BLLs ≥ 20 μg/dL.

Conclusion

These results demonstrate a significant public health success in Kinshasa, DRC, achieved by the removal of lead from gasoline. However, given increasing evidence that adverse health effects occur at BLLs < 10 μg/dL and that no safe BLL in children has been identified, the BLLs measured in this study continue to constitute a major public health concern for Kinshasa. The emphasis should shift to examining the contributions of non-gasoline sources to children's BLLs: car battery recycling in certain residences, the traditional use of fired clay by pregnant women for the treatment of gastritis, and leaded paint.
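The geometric means with 95% CIs reported above are computed on the log scale and back-transformed; a minimal sketch on simulated blood lead values:

```python
# Geometric mean (GM) blood lead level with a 95% CI, computed on the log scale
# and back-transformed. Values are simulated for illustration, not the survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
bll = rng.lognormal(mean=np.log(8.7), sigma=0.45, size=100)   # ug/dL, illustrative

log_bll = np.log(bll)
gm = np.exp(log_bll.mean())
half_width = stats.t.ppf(0.975, df=len(bll) - 1) * log_bll.std(ddof=1) / np.sqrt(len(bll))
lo, hi = np.exp(log_bll.mean() - half_width), np.exp(log_bll.mean() + half_width)

pct_elevated = 100 * np.mean(bll >= 10)                       # % with BLL >= 10 ug/dL
print(f"GM = {gm:.1f} ug/dL (95% CI {lo:.1f}-{hi:.1f}); {pct_elevated:.0f}% with BLL >= 10 ug/dL")
```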

10.
Background: Human milk is a potential source of lead exposure. Yet lactational transfer of lead from maternal blood into breast milk and its contribution to infant lead burden remain poorly understood. Objectives: We explored the dose–response relationships between maternal blood, plasma, and breast milk to better understand lactational transfer of lead from blood and plasma into milk and, ultimately, to the breastfeeding infant. Methods: We measured lead in 81 maternal blood, plasma, and breast milk samples at 1 month postpartum and in 60 infant blood samples at 3 months of age. Milk-to-plasma (M/P) lead ratios were calculated. Multivariate linear, piecewise, and generalized additive models were used to examine dose–response relationships between blood, plasma, and milk lead levels. Results: Maternal lead levels (mean ± SD) were as follows: blood: 7.7 ± 4.0 μg/dL; plasma: 0.1 ± 0.1 μg/L; milk: 0.8 ± 0.7 μg/L. The average M/P lead ratio was 7.7 (range, 0.6–39.8), with 97% of the ratios being > 1. The dose–response relationship between plasma lead and M/P ratio was nonlinear (empirical distribution function = 6.5, p = 0.0006), with the M/P ratio decreasing by 16.6 and 0.6 per 0.1 μg/L of plasma lead, respectively, below and above 0.1 μg/L plasma lead. Infant blood lead level (3.4 ± 2.2 μg/dL) increased by 1.8 μg/dL per 1 μg/L milk lead (p < 0.0001, R² = 0.3). Conclusions: The M/P ratio for lead in humans is substantially higher than previously reported, and transfer of lead from plasma to milk may be higher at lower levels of plasma lead. Breast milk is an important determinant of lead burden among breastfeeding infants. Citation: Ettinger AS, Roy A, Amarasiriwardena CJ, Smith DR, Lupoli N, Mercado-García A, Lamadrid-Figueroa H, Tellez-Rojo MM, Hu H, Hernández-Avila M. 2014. Maternal blood, plasma, and breast milk lead: lactational transfer and contribution to infant exposure. Environ Health Perspect 122:87–92; http://dx.doi.org/10.1289/ehp.1307187
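The piecewise dose–response model described above (different M/P slopes below and above 0.1 µg/L plasma lead) can be sketched with a linear-spline parameterization; the data, knot, and slopes here are illustrative assumptions, not a re-analysis.

```python
# Sketch of a piecewise (linear-spline) regression in which the slope of the
# milk-to-plasma (M/P) lead ratio changes at a fixed knot (0.1 ug/L plasma lead).
# Data are simulated; the slopes and knot are illustrative, not refit study values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 81
plasma = rng.gamma(shape=2.0, scale=0.05, size=n)       # plasma lead, ug/L
knot = 0.1

# Toy outcome: steeper decline in the M/P ratio below the knot than above it.
mp_ratio = 10 - 20 * np.minimum(plasma, knot) - 1.0 * np.maximum(plasma - knot, 0)
mp_ratio = mp_ratio + rng.normal(0, 1.0, n)

df = pd.DataFrame({
    "mp_ratio": mp_ratio,
    "below": np.minimum(plasma, knot),                  # slope below the knot (per ug/L)
    "above": np.maximum(plasma - knot, 0.0),            # slope above the knot (per ug/L)
})
fit = smf.ols("mp_ratio ~ below + above", data=df).fit()
print(fit.params)      # divide by 10 to express slopes per 0.1 ug/L, as in the abstract
print(fit.conf_int())
```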

11.

Objective

Prostate cancer (PCa) is one of the major causes of death among men. Our study investigated the association of ESR1 and ESR2 genotypes with susceptibility to PCa in relation to smoking status in Japanese men.

Method

A case–control study was performed with 750 Japanese prostate cancer patients and 870 healthy controls. After age-matching of cases and controls, 352 cases and 352 controls were enrolled in this study. Using logistic regression analysis, ESR1 and ESR2 genotypes were analyzed according to case/control status.

Result

ESR2 rs4986938 AG and AG + AA genotypes were associated with significantly decreased risk of PCa (AG: OR = 0.68, 95% CI 0.47–0.97, P < 0.05 and AG + AA: OR = 0.67, 95% CI 0.47–0.94, P < 0.05). However, there was no significant association between ESR1 rs2234693 and PCa risk. When patients were grouped according to smoking status, the ESR2 rs1256049 AA genotype (OR = 0.48, 95% CI 0.25–0.95, P < 0.05) and ESR2 rs4986938 AG + AA genotype (OR = 0.64, 95% CI 0.41–1.00, P < 0.05) showed significantly decreased PCa risk in the ever-smoker group.

Conclusion

Our results suggest that the estrogen receptor gene ESR2 plays an important role in predicting PCa risk and that different SNPs have different predictive values. Smoking may modify estrogenic activity and, together with the estrogen receptor, may influence PCa risk.

12.
Background: The role of environmental exposure to lead as a risk factor for chronic kidney disease (CKD) and its progression remains controversial, and most studies have been limited by a lack of direct glomerular filtration rate (GFR) measurement. Objective: We evaluated the association between lead exposure and GFR in children with CKD. Methods: In this cross-sectional study, we examined the association between blood lead levels (BLLs) and GFR measured by the plasma disappearance of iohexol among 391 participants in the Chronic Kidney Disease in Children (CKiD) prospective cohort study. Results: Median BLL and GFR were 1.2 µg/dL and 44.4 mL/min per 1.73 m², respectively. The average percent change in GFR for each 1-µg/dL increase in BLL was –2.1 (95% CI: –6.0, 1.8). In analyses stratified by CKD diagnosis, the association between BLL and GFR was stronger among children with glomerular disease underlying CKD; in this group, each 1-µg/dL increase in BLL was associated with a –12.1 (95% CI: –22.2, –1.9) percent change in GFR. In analyses stratified by anemia status, each 1-µg/dL increase in BLL among those with and without anemia was associated with a –0.3 (95% CI: –7.2, 6.6) and –4.6 (95% CI: –8.9, –0.3) percent change in GFR, respectively. Conclusions: There was no significant association between BLL and directly measured GFR in this relatively large cohort of children with CKD, although associations were observed in some subgroups. Longitudinal analyses are needed to examine the temporal relationship between lead and GFR decline, and to further examine the impact of underlying cause of CKD and anemia/hemoglobin status among patients with CKD.
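A "percent change in GFR per 1-µg/dL increase in BLL", as reported above, is typically obtained by regressing log(GFR) on BLL and back-transforming the slope; a hedged sketch on simulated data with hypothetical covariate names:

```python
# Sketch: "percent change in GFR per 1-ug/dL increase in BLL" from a linear model
# on log(GFR), back-transforming the slope: percent change = (exp(beta) - 1) * 100.
# Simulated data with hypothetical covariates, not the CKiD dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 391
df = pd.DataFrame({
    "bll": rng.gamma(2.0, 0.8, n),              # blood lead, ug/dL
    "age": rng.uniform(1, 16, n),
    "glomerular": rng.integers(0, 2, n),        # 1 = glomerular cause of CKD
})
df["gfr"] = np.exp(np.log(45) - 0.02 * df["bll"] + rng.normal(0, 0.3, n))

fit = smf.ols("np.log(gfr) ~ bll + age + glomerular", data=df).fit()
beta = fit.params["bll"]
lo, hi = fit.conf_int().loc["bll"]
to_pct = lambda b: (np.exp(b) - 1) * 100
print(f"Percent change in GFR per 1 ug/dL BLL: {to_pct(beta):.1f}% "
      f"(95% CI: {to_pct(lo):.1f}, {to_pct(hi):.1f})")
```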

13.
14.
Previous meta-analysis studies have indicated inverse associations between some carotenoids and risks of metabolic syndrome, cardiovascular disease, cancer, and all-cause mortality. However, the results for associations between carotenoids and type 2 diabetes (T2D) remain inconsistent and no systematic assessment has been done on this topic. We conducted a systematic review and meta-analysis to examine the associations of dietary intakes and circulating concentrations of carotenoids with risk of T2D. We searched PubMed and Ovid Embase from database inception to July 2020. Prospective observational studies of carotenoids and T2D risk were included. Random-effects models were used to summarize the RRs and 95% CIs. Thirteen publications were included. Dietary intake of β-carotene was inversely associated with the risk of T2D, and the pooled RR comparing the highest with the lowest categories was 0.78 (95% CI: 0.70, 0.87; I² = 13.7%; n = 6); inverse associations were also found for total carotenoids (n = 2), α-carotene (n = 4), and lutein/zeaxanthin (n = 4), with pooled RRs ranging from 0.80 to 0.91, whereas no significant associations were observed for β-cryptoxanthin and lycopene. Circulating concentration of β-carotene was associated with a lower risk of T2D, and the pooled RR comparing extreme categories was 0.60 (95% CI: 0.46, 0.78; I² = 56.2%; n = 7); inverse associations were also found for total carotenoids (n = 3), lycopene (n = 4), and lutein (n = 2), with pooled RRs ranging from 0.63 to 0.85, whereas no significant association was found for circulating concentrations of α-carotene and zeaxanthin when comparing extreme categories. Dose-response analysis indicated that nonlinear relations were observed for circulating concentrations of α-carotene, β-carotene, lutein, and total carotenoids (all P-nonlinearity < 0.05), but not for other carotenoids or dietary exposures. In conclusion, higher dietary intakes and circulating concentrations of total carotenoids, especially β-carotene, were associated with a lower risk of T2D. More studies are needed to confirm the causality and explore the role of foods rich in carotenoids in prevention of T2D. This systematic review was registered at www.crd.york.ac.uk/prospero as CRD42020196616.

15.
We developed a confidence interval (CI)-assessing model for multivariable normal tissue complication probability (NTCP) modeling to predict radiation-induced liver disease (RILD) in primary liver cancer patients using clinical and dosimetric data. Both the mean NTCP and the difference in mean NTCP (ΔNTCP) between two treatment plans of different radiotherapy modalities were further evaluated and their CIs were assessed. Clinical data were retrospectively reviewed in 322 patients with hepatocellular carcinoma (n = 215) and intrahepatic cholangiocarcinoma (n = 107) treated with photon therapy. Dose–volume histograms of normal liver were reduced to mean liver dose (MLD) based on the fraction size-adjusted equivalent uniform dose. The most predictive variables were used to build the model based on multivariable logistic regression analysis with bootstrapping. Internal validation was performed using the leave-one-out cross-validation method. Both the mean NTCP and the mean ΔNTCP with 95% CIs were calculated from computationally generated multivariate random sets of NTCP model parameters using variance–covariance matrix information. RILD occurred in 108/322 patients (33.5%). The NTCP model with three clinical parameters and one dosimetric parameter (tumor type, Child–Pugh class, hepatitis infection status, and MLD) was most predictive, with an area under the receiver operating characteristic curve (AUC) of 0.79 (95% CI 0.74–0.84). In eight clinical subgroups based on the three clinical parameters, both the mean NTCP and the mean ΔNTCP with 95% CIs could be estimated computationally. The multivariable NTCP model with the assessment of 95% CIs has the potential to improve the reliability of the NTCP model-based approach to selecting the appropriate radiotherapy modality for each patient.
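The CI-assessing step described above, propagating the fitted logistic NTCP model's parameter uncertainty through its variance–covariance matrix, could be sketched as follows; the coefficients, covariance matrix, and plan doses are placeholders, not the fitted model.

```python
# Sketch: 95% CI for a logistic NTCP prediction (and for a delta-NTCP between two
# plans) by sampling model parameters from a multivariate normal with the fitted
# variance-covariance matrix. Coefficients and covariance are placeholders.
import numpy as np

rng = np.random.default_rng(0)

beta_hat = np.array([-4.0, 0.12, 1.1])     # intercept, per-Gy MLD term, clinical factor
cov_hat = np.diag([0.25, 0.0004, 0.09])    # placeholder variance-covariance matrix

def ntcp(params, mld, clinical):
    """Logistic NTCP for a given mean liver dose (Gy) and a binary clinical factor."""
    eta = params[..., 0] + params[..., 1] * mld + params[..., 2] * clinical
    return 1.0 / (1.0 + np.exp(-eta))

draws = rng.multivariate_normal(beta_hat, cov_hat, size=20000)

# Compare two hypothetical plans for the same patient (clinical factor = 1).
ntcp_plan_a = ntcp(draws, mld=22.0, clinical=1)
ntcp_plan_b = ntcp(draws, mld=15.0, clinical=1)
delta = ntcp_plan_a - ntcp_plan_b

for name, x in [("NTCP plan A", ntcp_plan_a), ("NTCP plan B", ntcp_plan_b), ("dNTCP", delta)]:
    lo, hi = np.percentile(x, [2.5, 97.5])
    print(f"{name}: mean {x.mean():.3f} (95% CI {lo:.3f}, {hi:.3f})")
```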

16.
Our study aimed to estimate the relationship between dietary patterns and hyperuricemia among Chinese adults over 60 years of age. All data were obtained from the China Nutrition and Health Surveillance conducted during 2015–2017. A total of 18,691 participants who completed the whole survey were included in our statistical analysis. Hyperuricemia was defined as a serum uric acid level of 420 μmol/L (7 mg/dL) for males and 360 μmol/L (6 mg/dL) for females. Exploratory factor analysis was applied to derive a posteriori dietary patterns in our sample, and five dietary patterns were identified, namely "Typical Chinese", "Modern Chinese", "Western", "Animal products and alcohol", and "Tuber and fermented vegetables". After multiple adjusted logistic regression, participants in the highest quartile of the "typical Chinese" (Q4 vs. Q1, OR = 0.32, 95% CI: 0.28–0.37, p-trend < 0.0001), "modern Chinese" (Q4 vs. Q1, OR = 0.81, 95% CI: 0.71–0.93, p-trend = 0.0021), and "tuber and fermented vegetables" (Q4 vs. Q1, OR = 0.78, 95% CI: 0.69–0.88, p-trend < 0.0001) patterns showed a lower risk of hyperuricemia, while the animal products and alcohol pattern was positively associated with hyperuricemia (Q4 vs. Q1, OR = 1.49, 95% CI: 1.31–1.7, p-trend < 0.0001). We also found that participants who mainly ate a modern Chinese diet tended to meet the RNIs/AIs of the nutrients discussed in this paper, which may inform dietary approaches to the prevention and management of hyperuricemia.
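Deriving a posteriori dietary patterns by exploratory factor analysis, as described above, can be sketched with scikit-learn; the food groups, latent patterns, and data below are invented for illustration.

```python
# Sketch: deriving a posteriori dietary patterns by exploratory factor analysis on
# standardized food-group intakes, then splitting one pattern's scores into quartiles
# for use in a logistic model. Food groups, loadings, and data are invented.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 2000
food_groups = ["rice", "wheat", "vegetables", "red_meat", "poultry",
               "fish", "alcohol", "tubers", "fermented_veg", "dairy"]

# Toy data with two latent eating patterns plus noise.
latent = rng.normal(size=(n, 2))
load = np.zeros((2, len(food_groups)))
load[0, [0, 2, 5, 7, 8]] = 1.0          # plant/fish-leaning toy pattern
load[1, [3, 4, 6]] = 1.0                # meat-and-alcohol toy pattern
intakes = pd.DataFrame(latent @ load + rng.normal(size=(n, len(food_groups))),
                       columns=food_groups)

z = StandardScaler().fit_transform(intakes)
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
scores = fa.fit_transform(z)            # per-participant factor scores

loadings = pd.DataFrame(fa.components_.T, index=food_groups,
                        columns=["pattern_1", "pattern_2"])
print(loadings.round(2))                # which foods load on each pattern

# Quartiles of one pattern's score, ready to enter a logistic model for hyperuricemia.
quartile = pd.qcut(scores[:, 0], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(pd.Series(quartile).value_counts().sort_index())
```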

17.

Background

Pneumococcal conjugate vaccines (PCV) reduce disease due to Streptococcus pneumoniae. We aimed to determine the efficacy of different PCV schedules in Gambian children.

Methods

We reanalysed data from a randomised placebo-controlled trial. Infants aged 6–51 weeks were allocated to three doses of nine-valent PCV (n = 8718) or placebo (n = 8719) and followed until age 30 months. We categorised participants to compare: (a) a first dose at age 6 or 10 weeks, (b) intervals of 1 or 2 months between doses, and (c) different intervals between second and third doses. The primary endpoint was first episode of radiologic pneumonia; other endpoints were hospitalisation and mortality. Using the placebo group as the reference population, Poisson regression models were used with follow-up after the first dose to estimate the efficacy of each schedule and from age 6 weeks to estimate the incidence rate difference between schedules.

Results

Predicted efficacies in the groups aged 6 weeks (n = 2467, 154 events) or 10 weeks (n = 2420, 106 events) at first dose were 32% (95% CI 19–43%) and 33% (95% CI 21–44%) against radiologic pneumonia, 14% (95% CI 3–23%) and 17% (95% CI 7–26%) against hospitalisation, and 17% (95% CI −3 to 33%) and 16% (95% CI −3 to 32%) against mortality, respectively. Predicted efficacies in the groups with intervals of 1 month (n = 2701, 133 events) or 2 months (n = 1351, 58 events) between doses were 33% (95% CI 20–44%) and 36% (95% CI 24–46%) against radiologic pneumonia, 15% (95% CI 5–24%) and 18% (95% CI 8–27%) against hospitalisation, and 17% (95% CI −2 to 33%) and 13% (95% CI −8 to 29%) against mortality, respectively. Efficacy did not differ by the interval between second and third doses, nor did the incidence rate difference between schedules.

Conclusions

We found no evidence that efficacy or effectiveness of PCV9 differed when doses were given with modest variability around the scheduled ages or intervals between doses.
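Vaccine efficacy from a Poisson regression of the kind described in the Methods above is typically 1 minus the incidence rate ratio, estimated with a log person-time offset; a minimal sketch on invented counts:

```python
# Sketch: vaccine efficacy as 1 - IRR from a Poisson regression with a log
# person-time offset, comparing schedule groups with the placebo group.
# Event counts and person-years are invented, not the trial's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "group": ["placebo", "pcv_dose1_6wk", "pcv_dose1_10wk"],
    "events": [300, 100, 98],             # first episodes of radiologic pneumonia (toy counts)
    "person_years": [10000, 5000, 5000],  # follow-up after the first dose (toy)
})

fit = smf.glm("events ~ C(group, Treatment(reference='placebo'))",
              data=df, family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()

irr = np.exp(fit.params)
ci = np.exp(fit.conf_int())
for term in fit.params.index:
    if term == "Intercept":
        continue
    eff = (1 - irr[term]) * 100
    eff_lo, eff_hi = (1 - ci.loc[term, 1]) * 100, (1 - ci.loc[term, 0]) * 100
    print(f"{term}: efficacy {eff:.0f}% (95% CI {eff_lo:.0f}% to {eff_hi:.0f}%)")
```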

18.
Objective: To determine whether preventive dental visits are associated with fewer subsequent nonpreventive dental visits and lower dental expenditures. Data Sources: Indiana Medicaid enrollment and claims data (2015–2018) and the Area Health Resource File. Study Design: A repeated measures design with individual and year fixed effects examining the relationship between preventive dental visits (PDVs) and nonpreventive dental visits (NPVs) and dental expenditures. Data Collection/Extraction Methods: Not applicable. Principal Findings: Of 28,152 adults (108,349 observation-years) meeting inclusion criteria, 36.0% had a dental visit, 27.8% a PDV, and 22.1% an NPV. Compared to no PDV in the prior year, at least one was associated with fewer NPVs (β = −0.13; 95% CI −0.12, −0.11), lower NPV expenditures (β = −$29.12.53; 95% CI −28.07, −21.05), and lower total dental expenditures (β = −$70.12; 95% CI −74.92, −65.31), as well as fewer PDVs (β = −0.24; 95% CI −0.26, −0.23). Conclusions: Our findings suggest that prior-year PDVs are associated with fewer subsequent NPVs and lower dental expenditures among Medicaid-enrolled adults. Thus, from a public insurance program standpoint, supporting preventive dental care use may translate into improved population oral health outcomes and lower dental costs among certain low-income adult populations, but barriers to consistent utilization of PDVs prohibit definitive findings.
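The repeated measures design with individual and year fixed effects described above can be sketched, for a small simulated panel, with dummy-variable fixed effects; at the real study's scale an absorbing (within) estimator would be preferable. All names and effect sizes are illustrative.

```python
# Sketch: two-way fixed-effects (person and year) regression of current-year
# nonpreventive dental visits (NPVs) on a prior-year preventive visit indicator,
# using dummy-variable fixed effects on a small simulated panel. At the scale of
# the actual claims data an absorbing/within estimator would be used instead.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
rows = []
for person in range(200):
    frailty = rng.normal()                 # person-specific propensity for dental problems
    for year in (2016, 2017, 2018):
        pdv_prior = rng.binomial(1, 0.3)   # any preventive visit in the prior year
        npv = rng.poisson(1.2) + 0.3 * frailty - 0.13 * pdv_prior + rng.normal(0, 0.2)
        rows.append((person, year, pdv_prior, max(npv, 0.0)))
panel = pd.DataFrame(rows, columns=["person", "year", "pdv_prior", "npv"])

# C(person) and C(year) absorb time-invariant individual differences and year shocks;
# cluster-robust standard errors by person would typically be added.
fe = smf.ols("npv ~ pdv_prior + C(person) + C(year)", data=panel).fit()
print(fe.params["pdv_prior"], fe.conf_int().loc["pdv_prior"].values)
```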

19.
Campylobacteriosis is a disease of worldwide importance, but aspects of its transmission dynamics, particularly risk factors, are still poorly understood. We used data from a matched case-control study of 4,269 men who have sex with men (MSM) and 26,215 controls, combined with national surveillance data on Campylobacter spp., Salmonella spp., and Shigella spp., to calculate matched odds ratios (mORs) for infection among MSM and controls. MSM had higher odds of Campylobacter (mOR 14, 95% CI 10–21) and Shigella (mOR 74, 95% CI 27–203) infections, but not Salmonella (mOR 0.2, 95% CI 0–13), and were less likely than controls to have acquired Campylobacter infection abroad (χ² = 21; p < 0.001). Our results confirm that sexual contact is a risk factor for campylobacteriosis and also suggest explanations for unique features of Campylobacter epidemiology. These findings provide a baseline for updating infection risk guidelines for the general population.
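Matched odds ratios of the kind reported above come from conditional logistic regression within matched case-control sets; a hedged sketch on simulated 1:5 matched data using statsmodels' ConditionalLogit:

```python
# Sketch: matched odds ratio (mOR) via conditional logistic regression, with each
# case sharing a matched-set id with its controls. Simulated 1:5 matched data;
# exposure prevalences are invented, not the surveillance dataset.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(5)
rows = []
for matched_set in range(800):                  # one case + 5 matched controls per set
    for is_case in [1, 0, 0, 0, 0, 0]:
        exposed = rng.binomial(1, 0.25 if is_case else 0.03)   # e.g., MSM status (toy)
        rows.append((matched_set, is_case, exposed))
df = pd.DataFrame(rows, columns=["matched_set", "case", "msm"])

res = ConditionalLogit(df["case"], df[["msm"]], groups=df["matched_set"]).fit()
coef = np.asarray(res.params)[0]
lo, hi = np.asarray(res.conf_int())[0]
print(f"matched OR = {np.exp(coef):.1f} (95% CI {np.exp(lo):.1f}, {np.exp(hi):.1f})")
```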

20.
The etiology of multifactorial morbidities such as undernutrition and anemia in children living with human immunodeficiency virus (HIV; HIV+) on antiretroviral therapy (ART) is poorly understood. Our objective was to examine associations of HIV and iron status with nutritional and inflammatory status, anemia, and dietary intake in school-aged South African children. Using a two-way factorial case-control design, we compared four groups of 8 to 13-year-old South African schoolchildren: (1) HIV+ and low iron stores (inflammation-unadjusted serum ferritin ≤ 40 µg/L), n = 43; (2) HIV+ and iron sufficient non-anemic (inflammation-unadjusted serum ferritin > 40 µg/L, hemoglobin ≥ 115 g/L), n = 41; (3) children without HIV (HIV-ve) and low iron stores, n = 45; and (4) HIV-ve and iron sufficient non-anemic, n = 45. We assessed height, weight, plasma ferritin (PF), soluble transferrin receptor (sTfR), plasma retinol-binding protein, plasma zinc, C-reactive protein (CRP), α-1-acid glycoprotein (AGP), hemoglobin, mean corpuscular volume, and selected nutrient intakes. Both HIV and low iron stores were associated with lower height-for-age Z-scores (HAZ, p < 0.001 and p = 0.02, respectively), while both HIV and sufficient iron stores were associated with significantly higher CRP and AGP concentrations. HIV+ children with low iron stores had significantly lower HAZ, significantly higher sTfR concentrations, and a significantly higher prevalence of subclinical inflammation (CRP 0.05 to 4.99 mg/L) (54%) than both HIV-ve groups. HIV was associated with 2.5-fold higher odds of iron deficient erythropoiesis (sTfR > 8.3 mg/L) (95% CI: 1.03–5.8, p = 0.04), 2.7-fold higher odds of subclinical inflammation (95% CI: 1.4–5.3, p = 0.004), and 12-fold higher odds of macrocytosis (95% CI: 6–27, p < 0.001). Compared to HIV-ve counterparts, HIV+ children reported significantly lower daily intake of animal protein, muscle protein, heme iron, calcium, riboflavin, and vitamin B12, and significantly higher proportions of HIV+ children did not meet vitamin A and fiber requirements. Compared to iron sufficient non-anemic counterparts, children with low iron stores reported significantly higher daily intake of plant protein, lower daily intake of vitamin A, and lower proportions of inadequate fiber intake. Along with best treatment practices for HIV, optimizing dietary intake in HIV+ children could improve nutritional status and anemia in this vulnerable population. This study was registered at clinicaltrials.gov as NCT03572010.
