Similar Articles (20 results)
1.

Background

Rituximab is frequently used off-label in solid organ transplantation, especially in patients with renal allografts. Few data are available on the safety of rituximab in solid organ transplant recipients, and long-term follow-up data, particularly on infectious complications, are lacking.

Patients and methods

A retrospective observational registry study (German Registry on Autoimmune Diseases) comprising a total of 681 patients was conducted. The data of 63 adult kidney transplant recipients who received rituximab between 2006 and 2013 were used in this analysis.

Results

Median follow-up was 42 (1–109) months. At least 1 severe infection occurred in 57% of patients. The median time between the first rituximab infusion and the first infection was 4 (1–48) months. Of the overall 88 infections, 74 were severe bacterial infections, 5 were severe viral infections, 3 were severe fungal infections, 2 were combined severe bacterial and fungal infections, and 4 were combined severe viral, fungal and bacterial infections. Seven patients died during the observational period, 2 of them due to infectious complications. In the observational period, 1 case of squamous cell carcinoma but no other malignancies were observed.

Conclusion

Consistent with previous data, a high incidence of infections was observed after rituximab treatment in kidney transplant recipients. Most infections occurred within 6 months of rituximab initiation. With more than 3 years of follow-up, we documented a low incidence of secondary malignancies after rituximab, with only 1 case in our cohort.

2.

Background

Few data exist on recurrence rates, treatment response, and long-term outcomes in kidney transplant recipients (KTR) with primary focal segmental glomerulosclerosis (FSGS).

Methods

This retrospective, observational study included 1218 consecutive KTR during 2002 to 2016. All patients with primary idiopathic FSGS were identified through application of strict diagnostic criteria. Outcomes were followed over an average of 70.4 months.

Results

We identified 48 KTR (3.9%) with primary FSGS. Seven-year death-censored graft survival rate was 81% (primary FSGS) versus 85% (control) (P = .297). Eighteen KTR had FSGS recurrence (predicted incidence, 50% after 7 years). Seven-year death-censored graft survival rate in KTR with FSGS recurrence was significantly worse than in FSGS KTR without recurrence (63% versus 96%, P = .010). In the case of FSGS recurrence, a multi-modal treatment approach was applied, including plasma exchange (PE) (100% of patients), intravenous cyclosporine (50%), rituximab (61%), and the “Multiple Target Treatment” (39%). The median number of PE sessions was 27. Proteinuria decreased significantly and persistently during the course of treatment. Complete remission of FSGS was observed in 7 patients (39%); another 7 patients (39%) had partial remission (PE dependence was observed in 4 patients [22%]). Four patients (22%) with FSGS recurrence had early graft loss (<6 months after transplant) despite all treatment efforts.
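The death-censored graft survival rates reported above are product-limit (Kaplan-Meier) estimates in which death with a functioning graft counts as censoring rather than graft loss. As a minimal pure-Python sketch of that estimator (illustrative toy data only, not the study's), one might write:

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times: follow-up (e.g., months); events: 1 = graft loss,
    0 = censored (e.g., death with a functioning graft).
    Returns [(time, survival)] at each event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    for t, grp in groupby(data, key=lambda p: p[0]):
        grp = list(grp)
        losses = sum(e for _, e in grp)
        if losses:
            surv *= (at_risk - losses) / at_risk
            curve.append((t, surv))
        at_risk -= len(grp)
    return curve

# Toy cohort of 5 grafts: survival drops only at event times.
print(kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0]))
```

Comparing two such curves (recurrence vs no recurrence) is what the reported log-rank P value of .010 summarizes.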

Conclusions

In KTR with primary FSGS, recurrence occurred in a high proportion of patients and was associated with significantly worse death-censored graft survival rates. However, a multi-modal treatment approach led to improvement of proteinuria and full or partial remission in most patients. Importantly, the overall death-censored graft survival rate in KTR with primary FSGS was comparable with that in the control group.

3.

Background

The aim of this study is to analyze the long-term immunologic outcomes of living-related kidney transplantations depending on the donor-recipient relationship.

Methods

This retrospective single-center study included adult kidney transplant recipients (KTR) transplanted between 2000 and 2014. Among 1117 KTRs, 178 patients (15.9%) received living-related donations. Those patients were further categorized according to the donor-recipient relationship: 65 transplantations between siblings, 39 father-to-child (F-t-C), and 74 mother-to-child (M-t-C) donations. Allograft biopsies were performed for clinically suspected rejections. Data analysis included patient and graft survival, biopsy-proven rejections (T-cell mediated [TCMR] or antibody mediated), and development of de novo donor-specific antibody. Outcome data were assessed over a period of a maximum of 14 years.

Results

There was no significant difference between the groups (F-t-C, M-t-C, and siblings) with regard to HLA-mismatches, prior kidney transplantations, time on dialysis, and cold ischemia time. Among KTRs with related donors, the type of relationship had no significant influence on graft survival. F-t-C and M-t-C pairs showed comparable incidences of TCMR at 7 years post-transplantation, both significantly exceeding the rate in sibling-to-sibling pairs (26.2% and 26.8% vs 10%, respectively; P = .043). A multivariate Cox regression analysis adjusted for recipient age, donor age, and HLA (A, B, DR)–mismatches identified both M-t-C- and F-t-C-donations as important independent risk factors for TCMR (hazard ratio: 8.13; P < .001 and hazard ratio: 8.09; P = .001, respectively). There was no significant difference between the groups concerning the incidence of antibody-mediated rejection and de novo donor-specific antibody.

Conclusion

Our results indicate that parent-to-child kidney donation is an independent risk factor for TCMR.

4.

Background

Conversion to belatacept late after kidney transplantation (KT) as a rescue therapy has been shown to be beneficial in an increasing number of patients, but prognostic factors for a favorable outcome have not been investigated.

Methods

The present study analyzed all KT patients after late conversion to belatacept in a single center regarding graft survival and changes in estimated glomerular filtration rate (eGFR), proteinuria, and mean fluorescence intensity (MFI) of donor-specific antibodies (DSA).

Results

A total of 69 KT patients were converted to belatacept. eGFR increased from 28.9 ± 18.2 mL/min/1.73 m2 at the time of conversion to 34.8 ± 20.1 mL/min/1.73 m2 after 18 months (P = .025). After conversion, 26/69 patients (37.7%) showed a sustained increase in eGFR of >5 mL/min/1.73 m2 after 12 months and were defined as responders. All other patients (43/69, 62.3%) were defined as nonresponders. In multivariate analysis, nonresponders presented with significantly higher proteinuria (552 ± 690 vs 165 ± 158 mg/L; P = .004) at the time of conversion. Changes in eGFR before conversion and eGFR at the time of conversion were similar in both subgroups (−5.7 ± 9.2 and 29.2 ± 17.3 mL/min/1.73 m2 in responders and −4.6 ± 10.7 and 28.7 ± 19.0 mL/min/1.73 m2 in nonresponders). HLA antibody panel reactivity did not change after conversion. DSA-MFI was higher in nonresponders (7,155 ± 6,785) than in responders (2,336 ± 2,173; P = .001). One patient (1/69, 1.4%) developed de novo DSA after conversion, and no antibody-mediated rejection was diagnosed within 1,540 treatment months.

Conclusions

Late conversion to belatacept is beneficial for a subgroup of patients, with lower proteinuria at the time of conversion being an indicator of a favorable outcome.

5.

Background

Prevalence of extended-spectrum beta-lactamase–producing Enterobacteriaceae (ESBL-E) has risen in kidney transplant (KT) patients, with no long-term data so far on graft function or survival.

Methods

KT patients with ESBL-E–positive urine culture were retrospectively analyzed regarding initial adequate antimicrobial therapy, recurrent infection, transplant function, and survival compared with an ESBL-E–negative KT control cohort.

Results

ESBL-E–positive KT patients (n = 93) were older (55.5 ± 16.1 vs 49.5 ± 16.8 y; P = .001), presented with higher trough levels of cyclosporine and tacrolimus (121 ± 71 vs 102 ± 32 ng/mL [P = .04]; and 7.9 ± 3.3 vs 7.0 ± 2.3 ng/mL [P = .04], respectively), higher dosages of mycophenolate (1,533 ± 670 vs 1,493 ± 436; P = .001), and more acute rejection episodes within 3 months before diagnosis (12.9% vs 0.8%; P < .0001) compared with control subjects (n = 591). Five-year patient survival was superior in control subjects compared with ESBL-E–positive patients (91.2% vs 83.5%; P = .034) but long-term graft function was similar. Hospitalization rates were higher in patients presenting with ESBL-E–related urinary tract infection (UTI) compared with control subjects with ESBL-E–negative UTI (60.3% vs 31.3%; P = .002) but 5-year graft survival was superior in patients presenting with ESBL-E–related UTI (88.6% vs 69.8%; P = .035) compared with control subjects with ESBL-E–negative UTI. Recurrence rates were similar in patients with or without ESBL-E–related UTI. Initial antibiotic treatment was adequate in 41.2% of patients presenting with ESBL-E–related urosepsis, resulting in a reevaluation of antibiotic stewardship in our clinic.

Conclusions

ESBL-E detection in general was associated with higher mortality, but graft survival in patients with ESBL-E–related UTI was significantly better compared with ESBL-E–negative UTI.

6.

Introduction

Infection by cytomegalovirus (CMV) is a major cause of morbidity among immunosuppressed patients, especially after solid organ transplantation. The risk of CMV infection after organ transplantation is strongly related to the serology of the donor and the recipient. The objective of this study was to analyze the outcomes and costs of pre-emptive therapy in patients after liver transplantation with donor-positive/recipient-negative (D+/R−) serostatus.

Methods

This retrospective study analyzed all patients who underwent liver transplantation with CMV serostatus D+/R− between January 2012 and December 2015. The service protocol adopts pre-emptive therapy. The outcomes and costs of this therapy are described.

Results

Of the 119 patients undergoing liver transplantation, 19 were D+/R− and entered the main analysis. Of these, 7 had positive polymerase chain reaction (PCR) results, and 1 developed CMV disease. None of the 6 patients who received no treatment developed CMV disease. In terms of costs, pre-emptive therapy for these patients generated savings of R$32,346.00 for the service.

Conclusions

Although outcomes of universal prophylaxis and pre-emptive therapy are similar, pre-emptive therapy saves costs and should be considered in patients at high risk of CMV disease after liver transplantation.

7.

Objectives

Using a strategy of routine surgical drain placement after kidney transplantation, we retrospectively analyzed the duration of lymphatic fluid leakage and the prevalence of symptomatic lymphocele. Risk factors for persistent lymphatic fluid leakage or symptomatic lymphocele were evaluated using multivariate analysis to estimate the origin of the lymphatic fluid leakage.

Materials and methods

Patients with persistent lymphatic fluid leakage and symptomatic lymphocele were defined as those with lymphatic fluid drainage >50 mL for more than 15 days and those who required a percutaneous drainage of the lymphocele, respectively.

Results

Persistent lymphatic fluid leakage and symptomatic lymphocele were observed in 40 (16.4%) and 10 (4.1%) of 244 patients, respectively. The maximum durations of lymphatic fluid drainage from the initial drain tube and from the second drainage of the symptomatic lymphocele were 48 and 28 days, respectively. Anastomosis of the graft artery to the external iliac artery was an independent risk factor for persistent lymphatic fluid leakage or symptomatic lymphocele after kidney transplantation (odds ratio = 2.597, P = .008).

Conclusion

The findings of the study suggest that the lymphatic fluid originates from the recipient's iliac lymph trunk rather than from the graft kidney.

8.

Study design

A retrospective single-center and single-surgeon study.

Objectives

This study investigated the clinical and radiological results of skip pedicle screw fixation for adolescent idiopathic scoliosis (AIS).

Summary of background data

At present, the generally used technique for pedicle screw fixation for the surgical correction of AIS entails inserting a pedicle screw into every segment on the corrective side and into every or every other segment on the supportive side. To reduce operation time, blood loss, and cost, we developed skip pedicle screw fixation to achieve correction of AIS using fewer pedicle screws.

Methods

We evaluated 62 consecutive patients who had undergone computer-assisted skip pedicle screw fixation from August 2005 to June 2014. All patients were followed up for at least two years. We investigated the clinical results of skip pedicle screw fixation for AIS.

Results

The mean number of fused vertebrae was 10.3 ± 2.0, the mean surgical time was 242 ± 78 min, and the mean blood loss volume was 1060 ± 688 ml. The mean Cobb angle of main thoracic (MT) curve two years after surgery improved significantly compared with that before surgery (p < 0.01). The mean correction rate of MT curve immediately after surgery was 62.4 ± 12.4% and correction loss of MT curve at two years after surgery was 1.9 ± 5.8°. The SRS-22 subtotal score two years after surgery improved significantly compared to that before surgery (p < 0.01). Although no patients experienced major complications, eight (12.9%) encountered minor complications (two [3.2%] had massive blood loss [>3000 ml], three [4.8%] had a broken screw, one [1.6%] had a set-screw that dropped out, one [1.6%] experienced deep vein thrombosis, one [1.6%] experienced acute renal failure, and one [1.6%] experienced intercostal neuralgia). Revision surgery was not performed.

Conclusions

Subjects with AIS who underwent skip pedicle screw fixation had significantly improved clinical and radiological parameters at two years after surgery, indicating that skip pedicle screw fixation could be used to successfully treat AIS.

Level of evidence

Level 4

9.

Background

Despite recommendations on how to prevent baseball injuries in youths by the Japanese Society of Clinical Sports Medicine, shoulder and elbow pain still frequently occurs in young baseball players. We conducted a nationwide questionnaire survey of elementary school baseball players to understand their practice conditions and to examine the risk factors for shoulder and elbow pain.

Methods

The questionnaire survey was conducted in September 2015 among elementary school baseball players who were members of the Baseball Federation of Japan.

Results

A total of 8354 players belonging to 412 teams (average age, 8.9 years) responded to the survey. Among the 7894 players who had no shoulder and/or elbow pain in September 2014, 12.3% experienced elbow pain, 8.0% shoulder pain, and 17.4% shoulder and/or elbow pain during the previous year. A total of 2835 players (39.9%) practiced four days or more per week, and 97.6% practiced 3 h or more per day on Saturdays and Sundays. Risk factors associated with shoulder and elbow pain included male sex, older age, playing as pitcher or catcher, and throwing more than 50 balls per day.

Conclusions

These findings reveal that Japanese elementary school baseball players train too much. To prevent shoulder and elbow pain, coaches should pay particular attention to older players, male players, pitchers, and catchers. Furthermore, elementary school baseball players should not be allowed to throw more than 50 balls per day.

Study design

Retrospective cohort study.

10.

Background

Sequencing technology has markedly reduced the cost and time needed to read the human genome. Genome-wide association studies have successfully identified a number of disease risk genes.

Contribution to understanding of disease pathophysiology

Recent advancements in genomic technology have substantially furthered our understanding of the pathophysiology of many diseases, such as rheumatoid arthritis.

Toward drug discovery and future direction

Accumulating genomic information is expected to accelerate the discovery of novel drugs. The rapidly growing multi-dimensional information in the life sciences will make human genetics increasingly important in the near future.

11.

Background

The Meridional Hospital liver transplant unit has been the only active unit in Espírito Santo State, Brazil, since 2004.

Objective

The aim of this study is to analyze data from the first 250 transplants performed by the team.

Methods

This retrospective study reviewed files from patients transplanted in the Meridional Hospital from January 2005 to December 2015.

Results

There were 250 liver transplants in 236 patients, including 14 retransplants. Of the recipients, 72.4% were male, with an average age of 51.1 years (range, 1–70 years), and the main etiology was alcoholic cirrhosis (33.6% of cases). Surgical reintervention occurred in 58 patients (including retransplantations) during the same hospitalization, with revision of hemostasis and retransplantation as the main indications. In the retransplant group, 73.3% of patients died within 2 months. Hepatic artery thrombosis accounted for 40% of the indications for retransplantation. The average time between the first and second transplant was 223 days (median, 14 days). Currently, 152 of 236 patients are alive, with 1-year survival of approximately 71%. The mortality peak occurred from the immediate postoperative period to 2 months post-transplant (63.8% of deaths). Intraoperative blood transfusion was not needed in 32% of subjects. The average intensive care unit stay was 8.52 days, and the overall hospital stay was 21.7 days (median, 15 days).

Conclusion

Despite logistic difficulties and a shortage of donors, our unit continues to advance, with survival comparable to other national centers (68% to 74% at 1 year).

12.

Background

Atypical femoral fractures (AFFs) have been reported to occur in patients with bone metastases who received long-term bisphosphonate treatment. However, the incidence of AFFs in breast cancer patients with bone metastases who received intravenous bisphosphonate is unclear. The purpose of this study is to examine the incidence of AFFs in breast cancer patients with bone metastases who received intravenous bisphosphonate. In addition, we estimated the number of doses and the duration of intravenous bisphosphonate treatment at the time of occurrence of AFFs.

Methods

We identified 356 female breast cancer patients with bone metastases who received intravenous bisphosphonate between November 2004 and October 2013 in our institution. The median number of doses of intravenous bisphosphonate was 18 (range, 1–103). The median duration of intravenous bisphosphonate treatment was 16 months (range, 1–102 months). We estimated the incidence of AFFs in patients who received intravenous bisphosphonate and used a Poisson regression model to obtain the incidence rates of AFFs.

Results

Three AFFs in two patients were identified, and the estimated incidence of AFFs was 2.99 per 1000 person-years. At the time of occurrence of the AFFs, the patients had received 41 and 83 doses of intravenous bisphosphonate, over 37 and 79 months, respectively. Both patients underwent open reduction and internal fixation with an intramedullary nail. The frequency and incidence of AFFs in patients who received intravenous bisphosphonate for at least 41 or 83 doses or for more than 37 or 79 months were 2/60 (3.3%), 1/7 (14.3%), 2/70 (2.9%), and 1/9 (11.1%), respectively.
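A rate of 2.99 per 1000 person-years is the crude person-time incidence (the quantity the Poisson model estimates). A minimal sketch of the calculation follows; the person-years denominator here is an assumed illustration, since the abstract does not report total person-time:

```python
def incidence_per_1000_py(events, person_years):
    """Crude incidence rate per 1000 person-years:
    number of events divided by total follow-up time at risk."""
    return 1000.0 * events / person_years

# Assumed denominator for illustration only: 3 AFFs over ~1003
# person-years yields a rate close to the abstract's 2.99/1000 PY.
rate = incidence_per_1000_py(3, 1003.3)
```

Person-time rates, unlike simple proportions, account for patients contributing different lengths of follow-up.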

Conclusions

The incidence of AFFs is low in breast cancer patients with bone metastases who received intravenous bisphosphonate. Careful observation is warranted and radiography should be performed to investigate AFFs when clinical signs such as thigh pain appear.

Study design

Clinical study.

13.

Background

Lumbar decompression surgery is often used to treat neurological symptoms of the lower extremities resulting from lumbar disease. The procedure can also improve accompanying low back pain (LBP). We studied the extent of LBP improvement after lumbar decompression surgery without fusion and the associated preoperative factors.

Methods

Patients (n = 140) with lumbar spinal stenosis (n = 90) or lumbar disc herniation (n = 50) were included. To evaluate changes in LBP, visual analog scale (VAS) scores and Oswestry Disability Index scores were measured before surgery and at 2 weeks, 3 months, and 6 months after surgery. Predictors of residual LBP were investigated using logistic regression analyses.

Results

In total, 140 patients were examined. The VAS scores for LBP before surgery and at 2 weeks, 3 months, and 6 months after surgery were 4.4 ± 3.0 (mean ± standard deviation), 1.1 ± 1.5, 1.3 ± 1.8, and 1.9 ± 2.2, respectively. LBP significantly improved by 2 weeks after surgery (P < 0.001), stabilized between 2 weeks and 3 months after surgery, but was significantly aggravated between 3 and 6 months after surgery (P < 0.001). At 6 months after surgery, 67 (47.9%) patients had a VAS score of >1. Predictors of residual LBP included severe preoperative LBP, degenerative scoliosis, and the size of the Cobb angle. The independent predictors, determined by multivariate analysis, were degenerative scoliosis and the size of the Cobb angle.

Conclusions

LBP was alleviated at 2 weeks after lumbar decompression surgery for lumbar disc herniation and lumbar spinal stenosis. The predictors of residual LBP after decompression included more severe LBP at baseline, degenerative scoliosis, and the size of the Cobb angle.

Level of evidence

Level 3.

14.

Background

In an autologous hematopoietic cell transplantation (AHCT) setting, routine cytomegalovirus (CMV) surveillance is not indicated except in high-risk situations. On the other hand, some studies reported increased CMV reactivation in AHCT setting as a result of incorporation of novel agents into treatment algorithms, such as bortezomib and rituximab. We retrospectively analyzed CMV reactivation and infection rates in patients with no high-risk features, who were treated with AHCT.

Methods

All consecutive CMV-seropositive patients treated between January 2010 and November 2015 were included. Viral copy numbers were measured twice a week from the start of the conditioning regimen until engraftment, once a week for the remaining period until day 30 after AHCT, and once weekly during days 31 to 60 after AHCT only for patients who had previously been diagnosed with CMV reactivation and who developed primary/secondary engraftment failure.

Results

One hundred one (61.6%) men and 63 (38.4%) women were included in the study. The median age of study cohort was 51 years (range, 16–71 years). The indications for AHCT were Hodgkin lymphoma, non-Hodgkin lymphoma, and multiple myeloma in 44 (26.8%), 41 (25%), and 79 (48.2%) patients, respectively. CMV reactivation occurred in 60 (37%) patients, and 13 patients (8%) received pre-emptive ganciclovir treatment.

Conclusions

On the basis of our results, CMV surveillance may be recommended during the first 40 days after AHCT in countries with a high CMV prevalence, even in patients without high-risk features for reactivation. Additionally, the risk conditions necessitating CMV screening after AHCT must be redefined in the era of novel agents.

15.

Background

Outcomes of patients with end-stage renal disease are mainly affected by their comorbidities. Detailed data evaluating the impact of pre-transplant comorbidities on long-term outcome after kidney transplantation are largely missing.

Methods

In a long-term retrospective analysis, we investigated 839 deceased donor kidney transplant recipients (KTRs) who received transplants between 1999 and 2014. The prevalence and impact of the most relevant comorbidities were studied in detail.

Results

At the time of transplantation, 25% of KTRs had coronary artery disease (CAD), 16% had diabetes mellitus (DM), 11% had peripheral arterial disease (PAD), 8% had chronic heart failure (CHF), and 7% had cerebrovascular disease (CVD). KTRs with pre-existing CAD, DM, PAD, and CHF showed significantly inferior patient survival. Multivariate analysis adjusting for all relevant factors and comorbidities confirmed CAD as the most hazardous independent risk factor for premature death (hazard ratio [HR], 1.70; P = .002). Multivariate analysis also revealed CHF and PAD as independent risk factors for death-censored graft loss (HR, 2.20; P = .003 and HR, 1.80; P = .013, respectively). Diabetes was independently and significantly associated with T-cell–mediated (HR, 1.46; P = .020) and antibody-mediated rejections (HR, 2.27; P = .030).

Conclusions

Detailed quantification of the impact of pre-transplant comorbidities may facilitate the evaluation of transplant candidates, guide post-transplant follow-up, and may help to further refine prediction algorithms and allocation systems.

16.

Background

Systemic inflammation affects kidney function in a wide range of diseases. Even in kidney transplant recipients, higher levels of C-reactive protein (CRP) are invariably associated with both worse short- and long-term graft outcomes. However, little is known about systemic inflammation in kidney donors and, notably, brain death causes a strong systemic inflammatory response.

Objective

To analyze the role of systemic inflammation in brain-dead donors on short-term kidney graft outcomes (ie, delayed graft function [DGF], defined as the need for dialysis during the first week after transplantation).

Materials and methods

Retrospective analysis of the clinical and biochemical characteristics of all brain-dead kidney donors generated at the Hospital Clínic of Barcelona from 2006 to 2015 (n = 194). Donors who were tested for CRP in the 24 hours before brain death declaration were included (n = 97; 50% of the initial population). The clinical and biochemical features of their respective recipients (n = 165) were analyzed, comparing recipients who developed DGF (n = 30) with those who did not (n = 135).

Results

Donors whose recipients later developed DGF had much higher CRP values (10.58 [5.1–18.21] vs 4.81 [1.42–12.2] mg/dL; P = .025). Other characteristics associated with the development of DGF were renal biopsy score and recipient dialysis vintage (P = .025 and P = .002, respectively). In logistic regression analysis, CRP remained significant in the non–expanded criteria donor (ECD) group (odds ratio [OR], 1.102; P = .027) but lost significance in the ECD group (P = .67).

Conclusions

Terminal donor CRP was associated with DGF in kidney transplant recipients and proved to be significant mainly in younger donors.

17.

Background

Data on drug-resistant cytomegalovirus (CMV) infection in solid organ transplantation (SOT) are not often reported from resource-limited settings. We aimed to investigate the epidemiology and outcomes of this infection in SOT recipients at our institution.

Methods

This was a retrospective study conducted from January 2012 to May 2015. We included all SOT recipients who were suspected for drug-resistant CMV infection. Genotypic assay for UL97 gene mutation was analyzed by real-time polymerase chain reaction. Patients were reviewed for demographic data, clinical presentation, virologic data, treatment, and outcomes.

Results

The population consisted of 18 SOT recipients (12 kidney, 6 liver) with a median age of 20 years (interquartile range [IQR], 1–49 years); 44% were male. Anti-CMV resistance testing was performed at a median of 23 days (IQR, 14–33) after initiation of anti-CMV therapy, with a median CMV load of log 3.79 copies/mL (IQR, 3.37–4.58). Over a median period of 2 years (IQR, 1–3), 6 SOT recipients were identified with a UL97 gene mutation in codon 460, conferring ganciclovir (GCV) resistance. Patients with a UL97 gene mutation had a longer mean duration of CMV DNAemia than those without the mutation (263 vs 107 days; P = .04). All patients received high-dose GCV. Two patients received foscarnet and cidofovir. Two patients died (non–CMV-related), and 4 patients.

Conclusions

GCV-resistant CMV infection in SOT recipients is an emerging clinical problem in resource-limited countries. Patients with UL97-mutant CMV infection have a prolonged duration of CMV DNAemia. Clinicians should be aware of this condition when caring for SOT recipients.

18.

Objective

The neutrophil-to-lymphocyte ratio (NLR) has been used as a surrogate marker of systemic inflammation. We sought to investigate the association between NLR and wound healing in diabetic wounds.
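The NLR itself is simple arithmetic: the absolute neutrophil count divided by the absolute lymphocyte count. A trivial sketch of the calculation (the counts below are illustrative, not patient data from the study):

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute counts
    (both in the same units, e.g., cells/uL)."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

# Illustrative counts: 7500 neutrophils/uL and 1500 lymphocytes/uL.
example = nlr(7500, 1500)
```

Because it is a ratio of two counts from a routine complete blood count, NLR is a cheap, widely available inflammation surrogate.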

Methods

The outcomes of 120 diabetic foot ulcers in 101 patients referred from August 2011 to December 2014 were examined retrospectively. Demographic, patient-specific, and wound-specific variables as well as NLR at baseline visit were assessed. Outcomes were classified as ulcer healing, minor amputation, major amputation, and chronic ulcer.

Results

The subjects' mean age was 59.4 ± 13.0 years, and 67 (66%) were male. At the last follow-up (median follow-up time, 6.8 months), the final outcome was complete healing in 24 ulcers (20%), minor amputation in 58 (48%), major amputation in 16 (13%), and chronic ulcer in 22 (18%). In multivariate analysis, a higher NLR (odds ratio, 13.61; P = .01) was associated with higher odds of nonhealing.

Conclusions

NLR can predict odds of complete healing in diabetic foot ulcers independent of wound infection and other factors.

19.

Objective

This study evaluated whether the use of a staged Hemodialysis Reliable Outflow (HeRO; Merit Medical, South Jordan, Utah) implantation strategy incurs increased early infection risk compared with conventional primary HeRO implantation.

Methods

A retrospective review was performed of 192 hemodialysis patients who underwent HeRO graft implantation: 105 patients underwent primary HeRO implantation in the operating room, and 87 underwent a staged implantation where a previously inserted tunneled central venous catheter was used for guidewire access for the venous outflow component. Within the staged implantation group, 32 were performed via an existing tunneled hemodialysis catheter (incidentally staged), and 55 were performed via a tunneled catheter inserted across a central venous occlusion in an interventional radiology suite specifically for HeRO implantation (intentionally staged). Early infection was defined as episodes of bacteremia or HeRO infection requiring resection ≤30 days of HeRO implantation.

Results

For staged HeRO implantations, the median interval between tunneled catheter insertion and conversion to a HeRO graft was 42 days. The overall HeRO-related infection rate ≤30 days after implantation was 8.6% for primary HeRO implantation and 2.3% for staged implantations (P = .12). The rates of early bacteremia and HeRO infection requiring surgical resection were not significantly different between groups (P = .19 and P = .065, respectively), nor were age, gender, laterality, anastomosis to an existing arteriovenous access, human immunodeficiency virus status, diabetes, steroid use, chemotherapy, body mass index, or graft location. None of the patient, technique, or graft-related variables correlated significantly with the early infection rate.

Conclusions

The staged HeRO implantation strategy did not result in an increased early infection risk compared with conventional primary implantation and is thus a reasonable strategy for HeRO insertion in hemodialysis patients with complex central venous disease.

20.

Background

We aimed to determine correlations between the hip joint center position and pelvic dimensions and whether the three-dimensional position of the original hip joint center could be estimated from pelvic landmarks in dysplastic and normal hips.

Methods

We reviewed the pelvic CT scans of 70 patients (70 hips) with hip dysplasia. Seventy-seven normal hips were used as controls. The hip joint center coordinates (Cx, Cy, and Cz) and pelvic dimensions were measured with reference to the anterior pelvic plane coordinate system. Multiple regression formulas were used to estimate the original hip joint center.

Results

The hip joint center in both dysplastic and normal hips was highly correlated with the distance between the anterior superior iliac spines (ASIS) in the coronal plane (r = 0.76 and 0.84, respectively), the distance from the ASIS to the pubic tubercle in the sagittal plane (r = 0.81 and 0.76), and the distance from the pubic tubercle to the most posterior point of the ischium in the transverse plane (r = 0.76 and 0.78). The hip joint center could be estimated within a 5-mm error for more than 80% of hips on the respective axes in both dysplastic and normal hips.
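The study's estimation formulas are multiple regressions over several pelvic dimensions. As a simplified, hypothetical single-predictor sketch of the same idea (a least-squares fit of one hip-center coordinate against one pelvic distance, with made-up numbers, not the study's regression coefficients), one might write:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x (single predictor).
    The study used multiple regression over several pelvic dimensions;
    this one-variable version only illustrates the approach."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical inter-ASIS distances (mm) vs one hip-center coordinate (mm).
a, b = fit_line([200.0, 220.0, 240.0], [100.0, 110.0, 120.0])
predicted = a + b * 230.0  # estimate for a new pelvis
```

With all three pelvic distances as predictors, the same least-squares machinery yields the per-axis estimation formulas the authors describe.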

Conclusions

The three-dimensional position of the original hip joint center was correlated with pelvic dimensions, and can be estimated with substantial accuracy using pelvic landmarks as references. Although these results are preliminary, this estimation method may be useful for surgeons planning total hip arthroplasties.
