Similar Articles
20 similar articles found.
1.
Background

Breast cancer-related lymphedema (BCRL) represents a major source of morbidity among breast cancer survivors. Increasing data support early detection of subclinical BCRL followed by early intervention. A randomized controlled trial is being conducted to compare lymphedema progression rates using arm volume calculated from circumference measured with a tape measure (TM) versus bioimpedance spectroscopy (BIS).

Methods

Patients were enrolled and randomized to either TM or BIS surveillance. Patients requiring early intervention were prescribed a compression sleeve and gauntlet for 4 weeks and then re-evaluated. The primary endpoint of the trial was the rate of progression to clinical lymphedema requiring complex decongestive physiotherapy (CDP), with progression defined as a TM volume change in the at-risk arm ≥ 10% above the presurgical baseline. This prespecified interim analysis was performed when at least 500 trial participants had ≥ 12 months of follow-up.

Results

A total of 508 patients were included in this analysis, with 109 (21.9%) patients triggering prethreshold interventions. Compared with TM, BIS had a lower rate of trigger (15.8% vs. 28.5%, p < 0.001) and longer times to trigger (9.5 vs. 2.8 months, p = 0.002). Twelve triggering patients progressed to CDP (10 in the TM group [14.7%] and 2 in the BIS group [4.9%]), representing a 67% relative reduction and a 9.8% absolute reduction (p = 0.130).
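As a quick arithmetic check on the figures above, the relative and absolute reductions follow directly from the progression rates reported for the two triggering groups (a minimal sketch; the rates are taken verbatim from the Results):

```python
# Progression-to-CDP rates among triggering patients, as reported above.
tm_rate = 0.147    # 10 progressions in the TM triggering group (14.7%)
bis_rate = 0.049   # 2 progressions in the BIS triggering group (4.9%)

absolute_reduction = tm_rate - bis_rate              # ~0.098 -> ~9.8 percentage points
relative_reduction = absolute_reduction / tm_rate    # ~0.667 -> ~67%

print(f"absolute reduction: {absolute_reduction:.1%}")
print(f"relative reduction: {relative_reduction:.0%}")
```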

Conclusions

Interim results demonstrated that post-treatment surveillance with BIS reduced the absolute rates of progression of BCRL requiring CDP by approximately 10%, a clinically meaningful improvement. These results support the concept of post-treatment surveillance with BIS to detect subclinical BCRL and initiate early intervention.


2.
Background: Obesity has long been considered a risk factor for breast cancer-related lymphedema (BCRL), but the benefits of weight reduction in managing BCRL have not been clearly established.
Objective: To evaluate the beneficial effects of weight loss interventions (WLIs) on the reduction and prevention of BCRL.
Methods: We conducted a systematic review and meta-analysis by searching the PubMed, Scopus, and Embase databases from their earliest records to October 1, 2019. We included randomized and non-randomized controlled trials involving adult patients with a history of breast cancer that compared WLI groups with no-WLI groups and provided quantitative measurements of lymphedema.
Results: The initial literature search yielded 461 nonduplicate records. After exclusion based on title, abstract, and full-text review, four randomized controlled trials involving 460 participants were included in the quantitative analysis. Our meta-analysis revealed a significant between-group mean difference (MD) in the volume of the affected arm (MD = 244.7 mL, 95% confidence interval [CI]: 145.3–344.0) and the volume of the unaffected arm (MD = 234.5 mL, 95% CI: 146.9–322.1). However, a nonsignificant between-group MD of −0.07% (95% CI: −1.22 to 1.08) was observed for the interlimb volume difference at the end of the WLIs.
Conclusions: In patients with BCRL, WLIs are associated with decreased volume of the affected and unaffected arms but not with decreased severity of BCRL as measured by the interlimb difference in arm volume.
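For readers unfamiliar with how such pooled mean differences are obtained, a standard inverse-variance (fixed-effect) combination is shown below as a generic reference; the authors' exact model (for example, a random-effects variant) may differ:

```latex
\hat{\theta} = \frac{\sum_{i=1}^{k} w_i \hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad w_i = \frac{1}{\mathrm{SE}_i^{2}},
\qquad \mathrm{SE}(\hat{\theta}) = \frac{1}{\sqrt{\sum_{i=1}^{k} w_i}},
\qquad 95\%\ \mathrm{CI} = \hat{\theta} \pm 1.96\,\mathrm{SE}(\hat{\theta})
```

where each study i contributes its reported mean difference in arm volume and its standard error.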

3.
Transplantation Proceedings, 2021, 53(10): 2879-2887
Background: The aim of the study was to assess the influence of pretransplant body mass index (BMI, calculated as weight in kilograms divided by height in meters squared) on 5- and 10-year graft and patient survival.
Methods: Our study group consisted of 706 patients who received a kidney transplant after the year 2000.
Results: About half of the patients, 51.9% (n = 372), had a BMI < 25, and 47.6% (n = 336) had a BMI ≥ 25. Patients who were overweight or obese were significantly older than the other groups (P = .01). Five-year recipient survival was significantly better in the BMI < 25 group (n = 291, 79.5%) than in the BMI ≥ 25 group (n = 238, 70.2%; P < .05). In addition, 10-year recipient survival was better in the BMI < 25 group (n = 175, 47.8%) than in the BMI ≥ 25 group (n = 127, 37.5%; P < .05). Similarly, 5-year graft survival was better in the BMI < 25 group (66.9%, n = 242) than in the BMI ≥ 25 group (61.1%, n = 204; P < .05). However, the difference in 10-year graft survival was not statistically significant (P = .08). Regarding the impact of diabetes on survival, patients with diabetes mellitus had worse survival in all groups (P = .009).
Conclusions: Graft survival was affected by diabetes mellitus independently of overweight status. In the current study, we demonstrated that pretransplant obesity or overweight affects short-term recipient and graft survival, but long-term comparison of overweight or obese patients with patients with normal BMI revealed minimal differences in recipient survival and no difference in graft survival. Although many studies report that obesity and overweight predict a poor outcome for kidney transplant recipient survival, our research did not fully confirm this. Patients with diabetes mellitus had worse outcomes in all groups.
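The BMI definition spelled out in this abstract (and used throughout this listing) is simply:

```latex
\mathrm{BMI} = \frac{\text{weight in kilograms}}{\left(\text{height in meters}\right)^{2}} \quad \left[\mathrm{kg/m^{2}}\right]
```

For example, an 80 kg recipient who is 1.75 m tall has BMI = 80 / 1.75² ≈ 26.1 kg/m², falling in the ≥ 25 group under the split used above.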

4.
Transplantation Proceedings, 2021, 53(9): 2756-2759
Background: The aim of this study was to determine the effects of the Kidney Donor Profile Index (KDPI) and body mass index (BMI) of the deceased donor on kidney allograft outcome 1 year after transplantation.
Methods: We retrospectively studied 98 deceased kidney allograft donors with a mean age of 56 ± 12 years. The donors were divided into 5 groups according to their BMI: normal (BMI < 25; n = 25), overweight (BMI 25 to 29; n = 33), obese class I (BMI 30 to 34.9; n = 19), obese class II (BMI 35 to 39; n = 11), and obese class III (BMI > 40; n = 10). We examined the impact of the deceased donor's BMI and KDPI on delayed graft function (DGF) and estimated glomerular filtration rate (eGFR, calculated with the Chronic Kidney Disease Epidemiology Collaboration equation) 1 year after transplantation.
Results: Donor BMI significantly increased the prevalence of DGF (P = .031) and was associated with longer cold ischemia time (P = .021). However, there was no significant association between the aforementioned BMI groups and 1-year eGFR (P = .57), as grafts from deceased donors with increased BMI (BMI > 40) gained sufficient renal function during the first year after transplantation. Moreover, high KDPI was associated not only with DGF (P = .015) but also with decreased eGFR values (P = .033).
Conclusion: In this population, we identified no significant association between donor BMI and long-term clinical outcomes in deceased donor kidney transplants. The KDPI, and not the BMI, of the deceased donor seems to be a good prognostic factor of renal function at the end of the first year after kidney transplantation, whereas high BMI and high KDPI markedly increase the risk of DGF.
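The abstract cites the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation for eGFR without giving its form. The sketch below implements the widely used 2009 creatinine-based version from memory; the authors may have used a different variant (for example, the 2021 race-free refit), so treat it as illustrative rather than as the study's exact calculation:

```python
def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool = False) -> float:
    """Estimated GFR (mL/min/1.73 m^2) via the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Illustrative values only: a 56-year-old male with serum creatinine 1.2 mg/dL.
print(round(ckd_epi_2009(1.2, 56, female=False), 1))
```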

5.
The Journal of Arthroplasty, 2023, 38(2): 314-322.e1
Background: Obesity is associated with component malpositioning and increased revision risk after total hip arthroplasty (THA). With anterior approaches (AAs) becoming increasingly popular, the goal of this study was to assess whether clinical outcome after AA-THA is affected by body mass index (BMI).
Methods: This multicenter, multisurgeon, consecutive case series used a prospective database of 1,784 AA-THAs (1,597 patients) performed through bikini (n = 1,172) or standard (n = 612) incisions. Mean age was 63 years (range, 20-94 years), 57.5% of patients were women, and mean follow-up was 2.7 years (range, 2.0-4.1 years). Patients were classified into the following BMI groups: normal (BMI < 25.0; n = 572), overweight (BMI 25.0-29.9; n = 739), obese (BMI 30.0-34.9; n = 330), and severely obese (BMI ≥ 35.0; n = 143). Outcomes evaluated included hip reconstruction (inclination/anteversion and leg length), complications, revision rates, and patient-reported outcomes including the Oxford Hip Score (OHS).
Results: Mean postoperative leg-length difference was 2.0 mm (range, −17.5 to 39.0 mm), with a mean cup inclination of 34.8° (range, 14.0-58.0°) and anteversion of 20.3° (range, 8.0-38.6°). Radiographic measurements were similar between BMI groups (P = .1-.7). Complication and revision rates were 2.5% and 1.7%, respectively. The most common complications were fracture (0.7%), periprosthetic joint infection (PJI) (0.5%), and dislocation (0.5%). There was no difference in dislocation (P = .885) or fracture rates (P = .588) between BMI groups. There was a higher rate of wound complications (1.8%; P = .053) and PJIs (2.1%; P = .029) among obese and severely obese patients. Wound complications were less common among obese patients with the bikini incision (odds ratio 2.7). Preoperative OHS was worse among the severely obese (P < .001), who nonetheless showed similar improvements (change in OHS; P = .144).
Conclusion: AA-THA is a credible option for obese patients, with low dislocation and fracture risk and excellent ability to reconstruct the hip, leading to comparable functional improvements across BMI groups. Obese patients have a higher risk of PJI. A bikini incision for AA-THA can help minimize the risk of wound complications.

6.
The Journal of Arthroplasty, 2022, 37(11): 2171-2177
Background: Higher body mass index (BMI) has been associated with higher rates of aseptic loosening following cemented total knee arthroplasty (TKA). However, there is a paucity of evidence on the effect of BMI on the durability of modern cementless TKA. We aimed to assess the association between BMI and clinical outcomes following cementless TKA and to determine whether there is a BMI threshold beyond which the risk of revision significantly increases.
Methods: We identified 1,408 cementless TKAs of a modern design from an institutional registry. Patients were classified into BMI categories: normal (n = 136), overweight (n = 476), obese class I (n = 423), class II (n = 258), and class III (n = 115). Knee Injury and Osteoarthritis Outcome Score for Joint Replacement and 12-item Short Form Health Survey scores were collected preoperatively and 2 years postoperatively. Survivorship was recorded at a minimum of 2 years (range, 24 to 88 months). BMI was analyzed as both a continuous and a categorical variable.
Results: The improvement in patient-reported outcomes was similar across the groups. Thirty-four knees (2.4%) were revised, 14 (1.0%) of them for aseptic failure. Mean time to revision was 1.2 ± 1.3 years and did not differ across BMI categories (P = .455). Survivorship free from all-cause and aseptic revision was 97.1% and 99.0%, respectively, at a mean of 4 years. Using Cox regression to control for demographics and bilateral procedures, BMI had no association with all-cause revision (P = .612) or aseptic revision (P = .186). Receiver operating characteristic curve analysis found no relationship between BMI and revision risk (c-statistic = 0.51).
Conclusion: BMI did not influence functional outcomes or survivorship of modern cementless TKA, possibly owing to improved biological fixation at the bone-implant interface. Longer follow-up is necessary to confirm these findings.
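To make the two analyses concrete, the sketch below shows how a registry export might be run through a Cox proportional hazards model (BMI as a continuous covariate, adjusted for demographics and bilateral procedures) and a ROC c-statistic. The file name and column names are hypothetical, not from the study:

```python
# A minimal sketch (assumed column names) of the analyses described above.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_auc_score

df = pd.read_csv("cementless_tka_registry.csv")   # hypothetical registry export
# Columns assumed numeric: sex and bilateral coded 0/1, revised = 1 if revised.

cph = CoxPHFitter()
cph.fit(df[["followup_years", "revised", "bmi", "age", "sex", "bilateral"]],
        duration_col="followup_years", event_col="revised")
cph.print_summary()                               # hazard ratio per unit of BMI

c_stat = roc_auc_score(df["revised"], df["bmi"])  # c-statistic near 0.5 => no discrimination
print(f"c-statistic: {c_stat:.2f}")
```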

7.
This single-institution experience evaluated the use of bioimpedance spectroscopy to facilitate early detection and treatment of breast cancer-related lymphedema (BCRL) in a cohort of 596 patients (79.6% high risk). Seventy-three patients (12%) developed an elevated L-Dex score, with axillary lymph node dissection (P < .001), taxane chemotherapy (P = .008), and regional nodal irradiation (P < .001) identified as associated factors. At last follow-up, only 18 patients (3%) had unresolved clinically significant BCRL requiring complete decongestive physiotherapy. This rate of BCRL is lower than reported in contemporary studies, supporting recent NCCN guidelines promoting prospective screening, education, and intervention for BCRL.

8.
Transplantation Proceedings, 2021, 53(7): 2238-2241
Background: The purpose of this study was to identify factors influencing changes in the body mass index (BMI) of kidney transplant (KT) patients and to provide data for the management of BMI in patients who have undergone KT.
Methods: The participants were 106 patients who underwent KT at a single center from August 2014 to June 2017. BMIs at 6 months and 24 months after KT were compared and analyzed, and survey details were collected through medical records. The analysis compared 2 groups, one with increased BMI and the other without. Multivariate logistic regression analysis was performed to identify factors related to an increase in BMI.
Results: BMI increased from 22.60 ± 2.72 kg/m2 at 6 months to 23.18 ± 3.06 kg/m2 at 2 years after KT. The group with increased BMI (n = 39) had more patients with high low-density lipoprotein cholesterol levels at the time of KT (low-density cholesterol ≥ 100 mg/dL; 34 [54.0%] vs 10 [26.3%]; P = .008) and more patients without statin use than the other group (n = 67) (statin drug use, 48 [70.6%] vs 34 [87.2%]; P = .044). Multiple logistic regression analysis showed that age > 50 years (odds ratio [OR] = 2.942; 95% confidence interval [CI], 1.075-8.055; P = .036), low-density lipoprotein > 100 mg/dL at KT (OR = 6.618; 95% CI, 2.225-19.682; P = .001), and no statin use (OR = 5.094; 95% CI, 1.449-17.911; P = .011) were risk factors for an increased BMI after KT.
Conclusions: To prevent an increase in BMI after KT, clinicians should strongly recommend drug treatment of hyperlipidemia, especially in elderly patients with high low-density lipoprotein levels before KT.
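A minimal sketch of the kind of multivariate logistic regression reported above, producing odds ratios with 95% confidence intervals; the data file and predictor encodings (binary indicators for age > 50 years, LDL > 100 mg/dL, and no statin use) are assumptions for illustration, not the study's actual dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("kt_bmi_cohort.csv")             # hypothetical data export
X = sm.add_constant(df[["age_over_50", "ldl_over_100", "no_statin"]].astype(float))
y = df["bmi_increased"]                           # 1 if BMI increased between 6 and 24 months

fit = sm.Logit(y, X).fit(disp=0)
odds_ratios = np.exp(fit.params)                  # exponentiate coefficients to get ORs
conf_int = np.exp(fit.conf_int())                 # 95% CI on the OR scale
conf_int.columns = ["2.5%", "97.5%"]
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```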

9.
Background: The axillary reverse mapping (ARM) technique, which identifies and preserves arm nodes during sentinel lymph node biopsy (SLNB) or axillary lymph node dissection (ALND), was developed to prevent breast cancer-related lymphedema (BCRL), but its value remains controversial.
Methods: A comprehensive search of the Medline Ovid, PubMed, Web of Science, and Cochrane CENTRAL databases was conducted from inception until January 2020, using the key words "breast cancer", "axillary reverse mapping", and "lymphedema". Stata 15.1 software was used for the meta-analysis.
Results: Twenty-nine related studies involving 4,954 patients met our inclusion criteria. The pooled overall lymphedema incidence was 7% (95% CI 4%–11%, I² = 90.35%, P < 0.05), with SLNB showing a lower pooled incidence of lymphedema (2%, 95% CI 1%–3%, I² = 26.06%, P = 0.23) than ALND (14%, 95% CI 5%–26%, I² = 93.28%, P < 0.05) or SLNB and ALND combined (11%, 95% CI 1%–30%). ARM preservation during the ALND procedure significantly reduced upper extremity lymphedema compared with ARM resection (OR = 0.27, 95% CI 0.20–0.36, I² = 31%, P = 0.161). Intriguingly, the results favored ALND-ARM over standard ALND in preventing lymphedema (OR = 0.21, 95% CI 0.14–0.31, I² = 43%, P = 0.153). The risk of metastases in the ARM nodes was not significantly lower in patients who had received neoadjuvant chemotherapy than in those without neoadjuvant treatment (OR = 1.20, 95% CI 0.74–1.94, I² = 49.4%, P = 0.095).
Conclusions: ARM was found to significantly reduce the incidence of BCRL. Selection of patients for this procedure should be based on their axillary nodal status. Preoperative neoadjuvant chemotherapy had no significant impact on the ARM lymph node metastasis rate.
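The I² statistic quoted repeatedly above is Higgins' heterogeneity measure, derived from Cochran's Q across the k pooled studies:

```latex
Q = \sum_{i=1}^{k} w_i\left(\hat{\theta}_i - \hat{\theta}\right)^{2},
\qquad
I^{2} = \max\!\left(0,\ \frac{Q - (k - 1)}{Q}\right)\times 100\%
```

An I² of 90.35% for the overall pooled incidence therefore indicates that most of the observed variation reflects between-study heterogeneity rather than sampling error.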

10.
Purpose: In recent years, the increasing number of obese individuals in Japan has sometimes forced transplant teams to select candidates with a high body mass index (BMI) as marginal donors for living donor liver transplantation. However, data are lacking regarding the impact of a high BMI on the outcome for liver donors, particularly over the long term. Here, we aimed to clarify the impact of a high BMI on postoperative short- and long-term outcomes in liver donors.
Methods: We selected 80 cases with complete 5-year data from hepatectomies performed between 2005 and 2015 at our institute. We divided donors into overweight (BMI ≥ 25 kg/m2, n = 16) and normal-weight (BMI < 25, n = 64) groups.
Results: Preoperatively, the overweight group had significantly higher serum alanine aminotransferase and γ-glutamyl transpeptidase levels and a larger liver volume than the normal-weight group. Although the overweight group had significantly greater intraoperative blood loss (660 ± 455 vs 312 ± 268 mL, P = .0018) and longer operation times (463 ± 88 vs 386 ± 79 min, P = .0013), the groups showed similar frequencies of postoperative complications. At 1 year post hepatectomy, liver regeneration and spleen enlargement ratios did not differ significantly between the 2 groups. Remarkably, the overweight group showed significantly higher serum γ-glutamyl transpeptidase levels over the long term.
Conclusions: Overweight status alone was not a risk factor for either short- or long-term postoperative outcomes after donor hepatectomy. However, donors with elevated γ-glutamyl transpeptidase levels, which were frequent among overweight donors, may require special attention.

11.
The Journal of Arthroplasty, 2023, 38(6): 1089-1095
Background: Data remain inconsistent about the association between surgical approach and periprosthetic joint infection (PJI). We sought to evaluate the risk of reoperation for superficial infection and PJI after primary total hip arthroplasty (THA) in a multivariate model.
Methods: We reviewed 16,500 primary THAs, collecting data on surgical approach and all reoperations within 1 year for superficial infection (n = 36) or PJI (n = 70). Considering superficial infection and PJI separately, we used Kaplan-Meier survivorship to assess survival free from reoperation and Cox proportional hazards multivariate models to assess risk factors for reoperation.
Results: Between the direct anterior approach (DAA; n = 3,351) and posterolateral approach (PLA; n = 13,149) cohorts, rates of superficial infection (0.4% vs 0.2%) and PJI (0.3% vs 0.5%) were low, and survivorship free from reoperation for superficial infection (99.6% vs 99.8%) and PJI (99.4% vs 99.7%) was excellent at both 1 and 2 years. The risk of developing superficial infection increased with higher body mass index (BMI) (hazard ratio [HR] = 1.1 per unit increase, P = .003), DAA (HR = 2.7, P = .01), and smoking status (HR = 2.9, P = .03). The risk of developing PJI increased with higher BMI (HR = 1.04, P = .03) but not with surgical approach (HR = 0.68, P = .3).
Conclusion: In this study of 16,500 primary THAs, the DAA was independently associated with an elevated risk of reoperation for superficial infection compared with the PLA, but there was no association between surgical approach and PJI. Elevated patient BMI was the strongest risk factor for superficial infection and PJI in our cohort.
Level of Evidence: III, retrospective cohort study.
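A minimal sketch of the survivorship analysis described above, computing Kaplan-Meier survival free from reoperation for superficial infection by surgical approach; the file and column names are hypothetical:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("primary_tha_cohort.csv")        # hypothetical data export
kmf = KaplanMeierFitter()

for approach, group in df.groupby("approach"):    # e.g., "DAA" vs "PLA"
    kmf.fit(group["followup_years"],
            event_observed=group["reop_superficial_infection"],  # 1 = reoperated
            label=approach)
    # Survival probability at 1 and 2 years, i.e., survivorship free from reoperation.
    print(approach, kmf.predict([1.0, 2.0]))
```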

12.
Aim: This study aims to validate the Japanese diagnostic criteria for acute-on-chronic liver failure (ACLF) and confirm the feasibility of performing transplantation.
Methods: We included 60 patients with acute liver injury. Demographic and clinical features were retrospectively collected, and the primary outcome was compared among 4 types: acute liver failure (ALF) with hepatic coma (n = 23), ALF without hepatic coma (n = 12), acute liver injury (n = 20), and ACLF (n = 5). Moreover, 80 transplanted patients were enrolled to compare the difficulty of transplantation between ALF (n = 8) and non-ALF (n = 72) patients.
Results: Seven patients in the ALF with hepatic coma group and 1 patient in the ACLF group were transplanted. Ten patients who could not be registered for transplantation died. In univariate analysis, liver failure type (P < .0001), total bilirubin level (P = .05), and prothrombin time international normalized ratio (P < .0001) were associated with patient survival. In multivariate analysis, liver failure type was associated with patient survival (P < .0001). The respective 1-, 3-, and 5-year patient survival rates were 45.9%, 45.9%, and 45.9% for ALF patients with hepatic coma; 100.0%, 100.0%, and 100.0% for ALF patients without hepatic coma and for acute liver injury; and 80.0%, 80.0%, and 80.0% for ACLF patients (P < .0001). Chronic liver disease did not affect operation time (P = .46) or bleeding volume (P = .49).
Conclusion: Patients diagnosed with ACLF according to the Japanese criteria presented significantly higher survival rates than ALF patients with hepatic coma.

13.
Background: Acute graft-versus-host disease (aGVHD) is one of the leading causes of morbidity and mortality after allogeneic hematopoietic stem cell transplantation (allo-HSCT). Numerous studies have shown that changes in gut microbiome diversity increase post-transplant problems, including the occurrence of aGVHD. Probiotics and prebiotics can reconstitute the gut microbiota and thus increase bacterial metabolites such as short-chain fatty acids (SCFAs), which have immunomodulatory effects that may prevent aGVHD in allo-HSCT recipients.
Methods/Study Design: We conducted a pilot randomized clinical trial to investigate whether oral synbiotics are associated with prevention or reduction of the occurrence and severity of aGVHD and mitigate its complications following allo-HSCT. A commercially available synbiotic mixture containing high levels of 7 safe bacterial strains plus fructo-oligosaccharides as a prebiotic was administered to allo-HSCT recipients. Of 40 allo-HSCT patients, 20 received the synbiotic daily for 21 days before transplantation (day −21 to day 0). The 20 allo-HSCT recipients in the control group did not receive synbiotic therapy.
Results: Within the first 100 days of observation, the incidence of severe (grade III/IV) aGVHD in the synbiotic-therapy group was 0% (0 of 20 patients), whereas it was 25% (5 of 20 patients) in the control group (P = 0.047). The median percentage of CD4+CD25+Foxp3+ regulatory T cells (Tregs) among CD4+ lymphocytes on day 28 after HSCT was higher in the synbiotic group (2.54%) than in the control group (1.73%; P = 0.01). There was no difference in Treg cells on day 7 after HSCT between the two groups. However, the median percentage and absolute count of Tregs in patients who experienced aGVHD were significantly lower on days 7 and 28 after HSCT (both P < 0.05). The overall 12-month survival rate was higher (90%) in synbiotic-treated patients than in the control group (75%), but the difference was not statistically significant (P = 0.234).
Conclusion: Our preliminary findings suggest that synbiotic intake before and during the conditioning regimen of allo-HSCT patients may reduce the incidence and severity of aGVHD through induction of CD4+CD25+Foxp3+ regulatory T cells, thus contributing to improved transplant outcomes. Much larger studies are needed to confirm our observations.
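Assuming the severe-aGVHD comparison above was a two-sided Fisher exact test (the abstract does not name the test), the reported P = 0.047 can be reproduced from the 2×2 table of 0/20 versus 5/20 events:

```python
from scipy.stats import fisher_exact

table = [[0, 20],   # synbiotic group: severe aGVHD, no severe aGVHD
         [5, 15]]   # control group:   severe aGVHD, no severe aGVHD

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"P = {p_value:.3f}")   # ~0.047, matching the reported value
```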

14.
The Journal of Arthroplasty, 2022, 37(9): 1776-1782.e4
Background: Simultaneous bilateral total knee arthroplasty (BTKA) is associated with higher risk but may be perceived to afford faster improvement and lower costs than staged BTKA. We aimed to explore (1) health care utilization, (2) surgical supply costs of simultaneous BTKA, and (3) 1-year improvement in patient-reported pain, function, and quality of life (QOL) versus staged BTKA.
Methods: A prospective cohort of 198 simultaneous and 625 staged BTKAs was obtained (2016-2020). The simultaneous BTKA cohort was propensity score-matched (1:2) to a similar group of staged patients (simultaneous = 198 versus staged = 396). Outcomes included length of stay, discharge disposition, 90-day readmission, 1-year reoperation, surgical-episode supply cost, Knee Injury and Osteoarthritis Outcome Score (KOOS)-Pain, KOOS-Physical Function Short Form, and KOOS-QOL. Rates of attaining the minimal clinically important difference (MCID) and Patient Acceptable Symptom State (PASS) were calculated.
Results: Compared with both staged BTKA surgeries combined, simultaneous BTKA demonstrated a shorter median net length of stay (2.00 [2.00, 3.00] days versus 2.00 [2.00, 4.00] days; P < .001) but higher rates of nonhome discharge (n = 56 [28.3%] versus n = 32 [4.04%]; P < .001) and 90-day readmission (n = 20 [10.1%] versus n = 48 [6.06%]; P = .047), with similar reoperation rates (P = .44). Simultaneous BTKA afforded a slight reduction in net surgical cost compared with both staged BTKAs combined ($643; P = .028). There was no significant difference in 1-year improvement or MCID attainment rates between simultaneous and staged BTKA for KOOS-Pain (P = .137 and P = .99), KOOS-QOL (P = .095 and P = .81), or KOOS-Physical Function Short Form (P = .75 and P = .49, respectively), nor in PASS attainment (P = .12).
Conclusion: Staged BTKA is associated with similar 1-year pain, function, and QOL, with a better safety profile and a minimal increase in surgical supply cost, compared with simultaneous BTKA.
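A minimal sketch of 1:2 propensity-score matching of the kind described in the Methods: a logistic model estimates each patient's probability of undergoing simultaneous BTKA, and every simultaneous case is then greedily matched to its two nearest staged controls without replacement. The file name and covariate list are assumptions for illustration, not the study's actual matching variables:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("btka_cohort.csv")               # hypothetical data export
covariates = ["age", "sex", "bmi", "asa_class"]   # assumed matching covariates (numeric)

# Propensity score: probability of simultaneous BTKA given the covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["simultaneous"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

cases = df[df["simultaneous"] == 1]
controls = df[df["simultaneous"] == 0].copy()

matched_control_idx = []
for _, case in cases.iterrows():
    # Two nearest still-available controls on the propensity score.
    nearest = (controls["ps"] - case["ps"]).abs().nsmallest(2).index
    matched_control_idx.extend(nearest)
    controls = controls.drop(nearest)             # match without replacement

matched = pd.concat([cases, df.loc[matched_control_idx]])
print(matched["simultaneous"].value_counts())     # expect a 1:2 case-to-control ratio
```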

15.
Background: Bioelectrical impedance analysis is a simple, noninvasive method of assessing body composition. Dialysis modality and choice of buffer type may have an impact on body composition. The aim of our study was to compare the body compositions of patients on the waiting list for cadaveric renal transplantation according to dialysis modality.
Methods: We examined a total of 152 patients (110 hemodialysis [HD] and 42 continuous ambulatory peritoneal dialysis [CAPD]). Demographic data were collected from patient charts. Routine laboratory evaluations from the last 6 months, including hemoglobin, serum creatinine, intact parathyroid hormone, albumin, C-reactive protein, calcium, and phosphorus, were collected. Body compositions were measured using the Tanita BC-420MA Body Composition Analyzer (Tanita, Tokyo, Japan). We performed a subanalysis of the CAPD group according to buffer choice: lactate-buffered (n = 16) and bicarbonate/lactate-buffered (n = 26) solution users.
Results: Body weight (P = .022), body mass index (BMI; 25.8 ± 4.7 vs 23.4 ± 4.9 kg/m2, P = .009), muscle mass (P = .01), fat-free mass (P = .013), and visceral fat ratio (9.5 ± 5.4 vs 7.3 ± 4.1%, P = .022) were significantly higher in the CAPD group. Total body water of CAPD patients was also higher (P = .003), but total body water ratios of the HD and CAPD groups were similar. Fat and fat-free mass ratios of the patient groups were also similar. Comparing CAPD subgroups, we observed that patients using bicarbonate/lactate-buffered solutions had higher body weights (P = .038), BMI values (27.1 ± 5 vs 23.7 ± 3.5 kg/m2, P = .018), and visceral fat ratios (8.0 ± 5.2 vs 4.6 ± 2.5%, P = .023). These patients also tended to have higher fat mass, without statistical significance (P = .074). Fat, muscle, fat-free mass, and total body water ratios of the peritoneal dialysis subgroups were similar.
Conclusion: We believe that body composition analysis should be used as a complementary method for assessing the nutritional status of HD and CAPD patients, as body weight or BMI measurements do not reflect fat mass, muscle mass, or visceral fat ratio in these patients. Stable, well-nourished CAPD patients should be closely observed and encouraged to increase daily exercise and/or decrease calorie intake from other sources to reduce the risks associated with abdominal obesity.

16.
Transplantation Proceedings, 2022, 54(7): 1786-1794
Background: The aim of this study was to evaluate the effect of recipient obesity on posttransplant complications and on patient and graft survival.
Methods: A single-institution, retrospective study was performed on obese renal transplant recipients (BMI ≥ 30 kg/m2, n = 102) from January 2010 to December 2018, matched with nonobese recipients (BMI < 30 kg/m2, n = 204). For comparison, for every obese patient we selected 2 nonobese patients with similar age, sex, and period of transplantation. The comparative analysis included patient and graft survival as primary outcomes and graft function and postoperative complications as secondary outcomes.
Results: Recipient demographics were comparable in both groups except for diabetic nephropathy (P = .0006). Obesity was strongly related to poorer patient survival (risk ratio [RR] = 2.83; 95% CI 1.14-7.04; P = .020), but no difference was observed in graft survival (P = .6). While early graft function was inferior in the obese population (RR = 2.41; 95% CI 1.53-3.79; P = .00016), no statistically significant differences were observed between the groups during late follow-up (P = .36). Obese recipients had a significantly higher risk of delayed graft function (RR = 1.93; 95% CI 1.19-3.1; P = .0077), myocardial infarction (RR = 7; 95% CI 1.68-29.26; P = .0042), wound infection (RR = 8; 95% CI 1.96-32.87; P = .0015), aggravation of diabetes (RR = 3.13; 95% CI 1.29-7.6; P = .011), and surgical revision for eventration (RR = 8; 95% CI 1.22-52.82; P = .026) compared with nonobese recipients.
Conclusions: Despite inferior early kidney graft function in obese recipients, no difference was observed at long-term follow-up. However, recipient obesity had a negative effect on patient survival and postoperative complications.
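The risk ratios and confidence intervals above follow the usual 2×2-table construction; as a generic reference (not necessarily the authors' exact software output):

```latex
\mathrm{RR} = \frac{a/(a+b)}{c/(c+d)},
\qquad
95\%\ \mathrm{CI} = \exp\!\left(\ln \mathrm{RR} \pm 1.96\sqrt{\frac{1}{a}-\frac{1}{a+b}+\frac{1}{c}-\frac{1}{c+d}}\right)
```

where a/(a+b) is the event rate among the 102 obese recipients and c/(c+d) the corresponding rate among the 204 matched nonobese recipients.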

17.
Transplantation Proceedings, 2022, 54(5): 1345-1348
Introduction: Donor hepatic artery thrombosis (dHAT) identified during liver procurement or backtable preparation is a rare and little-reported event that can make liver transplantation unfeasible.
Methods: This is a retrospective study of dHAT identified during liver graft procurement or backtable procedures. All grafts were recovered from brain-dead donors. The demographic characteristics of the donors and the incidence of dHAT were analyzed, and the data were compared with a cohort of donors without dHAT.
Results: There were a total of 486 donors during the study period. The incidence of dHAT was 1.85% (n = 9). The diagnosis of dHAT was made during procurement in 5 cases (55.5%) and during the backtable procedure in 4 (44.4%). Most donors were female (n = 5), with an average BMI of 28.14 ± 6.9 kg/m2, hypertension (n = 5), and stroke as the cause of brain death (n = 8). The most prevalent site of dHAT was a left hepatic artery originating from the left gastric artery (n = 4). Of the 9 cases reported, 2 livers were used for transplantation and 7 were discarded. Comparing these cases with a cohort of 260 donors without dHAT, we found a higher incidence of anatomic variations of the hepatic artery (P = .01) and of stroke as the cause of brain death (P = .05).
Conclusion: The occurrence of dHAT before liver procurement is a rare event; however, it may become a treacherous pitfall if the diagnosis is made late. Grafts with anatomic variations recovered from women with brain death due to stroke and a history of hypertension appear to be at higher risk of presenting dHAT.

18.
Objective: The relationship between intraluminal thrombus (ILT) and abdominal aortic aneurysm (AAA) growth and rupture risk remains ambiguous. Studies have shown a limited effect of antiplatelet therapy on ILT size, whereas the impact of anticoagulant therapy on ILT is unresolved. This study aims to evaluate the association between antithrombotic therapy and ILT size assessed with three-dimensional contrast-enhanced ultrasound (3D-CEUS) examination in a cohort of patients with AAA.
Methods: In a cross-sectional study, 309 patients with small AAAs were examined with 3D-CEUS. Patients were divided into three groups based on prescribed antithrombotic therapy: anticoagulant (n = 36), antiplatelet (n = 222), and no antithrombotic therapy (n = 51). ILT size was calculated as volume and thickness and compared between the three groups.
Results: Patients on anticoagulants had a significantly lower estimated marginal mean ILT volume of 16 mL (standard error [SE], ±3.2) compared with 28 mL (SE, ±2.7) in the no-antithrombotic group and 30 mL (SE, ±1.3) in the antiplatelet group when adjusting for AAA volume (P < .001) and comorbidities (P < .001). In addition, patients on anticoagulant therapy had a significantly lower estimated marginal mean ILT thickness of 10 mm (SE, ±1.1) compared with 13 mm (SE, ±0.9) in the no-antithrombotic group and 13 mm (SE, ±0.4) in the antiplatelet group when adjusting for AAA diameter (P = .03) and comorbidities (P = .035).
Conclusions: A 3D-CEUS examination is applicable for ILT assessment and demonstrates that patients with AAA on anticoagulant therapy have lower ILT thickness and volume than patients with AAA on antiplatelet therapy and those without antithrombotic therapy. Causality between anticoagulants and ILT size, and extrapolation to AAA growth and rupture risk, remain unknown and merit further investigation to refine US-based AAA surveillance strategies.

19.
Transplantation Proceedings, 2023, 55(5): 1239-1244
Aim: This study aimed to evaluate the course of bone and mineral metabolism after liver transplantation (LT) in patients with chronic liver disease.
Methods: One hundred four patients who had undergone LT and had a minimum of 6 months of follow-up after LT were included in this prospective cohort study. The following parameters were evaluated for each patient preoperatively and postoperatively (postoperative day [POD] 30, POD90, POD180): osteocalcin, bone-specific alkaline phosphatase (BALP), type 1 collagen beta-C-terminal telopeptide (β-CTx), vitamin D, parathyroid hormone (PTH), ALP, calcium, phosphate, sedimentation rate, and bone mineral densitometry scores (L2, L4, L total, and F total). The parameters were compared by sex, presence of liver tumor (hepatocellular carcinoma [HCC; n = 19] vs non-HCC [n = 85]), and presence of autoimmune liver disease (ALD; n = 8 vs non-ALD; n = 96).
Results: The median age of the patients (n = 81 men and n = 23 women) was 52 years (95% CI, 50-56). There was a significant change over the defined time intervals in osteocalcin (P < .001), BALP (P < .001), β-CTx (P < .001), vitamin D (P < .001), PTH (P < .001), ALP (P = .001), calcium (P < .001), phosphate (P = .001), L2 (P = .038), L total (P = .026), and F total (P < .001) scores. There was a significant difference between the sexes in POD90 ALP (P = .033), POD180 calcium (P = .011), POD180 phosphate (P = .011), preoperative sedimentation rate (P = .032), and POD180 F total (P = .013) scores. There was a significant difference in POD180 osteocalcin (P = .023), POD180 β-CTx (P = .017), and preoperative calcium (P = .003) between the HCC and non-HCC groups. Furthermore, we found significant differences in preoperative ALP (P = .008), preoperative sedimentation rate (P = .019), POD90 (P = .037) and POD180 L2 (P = .005) scores, preoperative (P = .049) and POD180 L4 (P = .017) scores, and POD180 L total (P = .010) and F total (P = .022) scores between patients with and without ALD.
Conclusion: This study shows that the bone and mineral metabolism of LT recipients was negatively affected after LT. In addition, changes in bone and mineral metabolism were more prominent in patients with HCC, and bone mineral density scores were higher in patients with ALD.

20.
The Journal of Arthroplasty, 2023, 38(9): 1793-1801
Background: The primary aim was to assess whether a short (125 mm) stem offered equivalent hip-specific function compared with the standard (150 mm) stem when used for cemented total hip arthroplasty. Secondary aims were to evaluate health-related quality of life, patient satisfaction, stem height and alignment, radiographic loosening, and complications between the two stems.
Methods: A prospective, twin-center, double-blind randomized controlled trial was conducted. During a 15-month period, 220 patients undergoing total hip arthroplasty were randomized to either a standard (n = 110) or a short (n = 110) stem. There were no significant (P ≥ .065) differences in preoperative variables between the groups. Functional outcomes and radiographic assessment were undertaken at a mean of 1 and 2 years.
Results: There were no differences in hip-specific function according to the mean Oxford Hip Score at 1 year (primary endpoint; P = .428) or at 2 years (P = .622) between the groups. The short-stem group had greater varus angulation (0.9 degrees, P = .003) than the standard group and was more likely (odds ratio 2.42, P = .002) to have varus stem alignment beyond one standard deviation from the mean. There were no significant (P ≥ .083) differences in the Forgotten Joint Score, EuroQol-5-Dimension, EuroQol visual analogue scale, Short Form 12, patient satisfaction, complications, stem height, or radiolucent zones at 1 or 2 years between the groups.
Conclusion: The cemented short stem used in this study provided equivalent hip-specific function, health-related quality of life, and patient satisfaction compared with the standard stem at a mean of 2 years postoperatively. However, the short stem was associated with a greater rate of varus malalignment, which may influence future implant survival.
