Similar Documents
20 similar documents retrieved.
1.

Background

The Great East Japan Earthquake and the devastating tsunami that followed struck the northeastern coast of Japan with enormous force. This study aimed to identify socio-psychological factors associated with "subjective shoulder pain" among survivors at 2 years after the disaster, evaluated by a self-report questionnaire.

Methods

Between November 2012 and February 2013, survivors replied to the self-report questionnaire, and 2275 people consented to join this study. Living status was divided into 5 categories (1. same house as before the earthquake (reference group), 2. temporary small house, 3. apartment, 4. house of relatives or acquaintances, 5. new house), and economic hardship was divided into 4 categories (1. normal (reference group), 2. a little bit hard, 3. hard, 4. very hard). Gender, age, body mass index, living area, smoking and drinking habits, comorbid diabetes mellitus and cerebral stroke, working status, and walking time were treated as confounding factors. A Kessler Psychological Distress Scale score of ≥10/24 and an Athens Insomnia Scale score of ≥6/24 were defined as the presence of psychological distress and sleep disturbance, respectively. Multiple logistic regression analysis was used to examine the association of shoulder pain with living environment, economic hardship, psychological distress, and sleep disturbance at 2 years after the earthquake.
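
As an editorial illustration of the modeling described above, the sketch below fits a multiple logistic regression with categorical exposures against reference levels, using Python's statsmodels; the data file and all column names are hypothetical, not the study data.

```python
# Minimal sketch of the multiple logistic regression described above.
# The data file and all column names are hypothetical, not the study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_2yr.csv")  # hypothetical questionnaire data

# Binary indicators from the cutoffs described above
df["distress"] = (df["k6"] >= 10).astype(int)     # Kessler scale >= 10/24
df["sleep_dist"] = (df["ais"] >= 6).astype(int)   # Athens Insomnia Scale >= 6/24

model = smf.logit(
    "shoulder_pain ~ C(living, Treatment('same_house'))"
    " + C(hardship, Treatment('normal')) + distress + sleep_dist"
    " + C(gender) + age + bmi + C(area) + C(smoking) + C(drinking)"
    " + C(diabetes) + C(stroke) + C(working) + walking_time",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals, as reported in the abstract
print(np.exp(model.params.to_frame("OR").join(model.conf_int())))
```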

Results

The risk of shoulder pain was significantly higher for those living in an "apartment" (OR = 1.74, 95% CI = 1.03–2.96) or in a "house of relatives or acquaintances" (OR = 2.98, 95% CI = 1.42–6.25), for economic hardship rated "hard" (OR = 1.71, 95% CI = 1.08–2.7) or "very hard" (OR = 2.51, 95% CI = 1.47–4.29), and for those with sleep disturbance (OR = 2.96, 95% CI = 2.05–4.27).

Conclusions

Living status of "apartment" and "house of relatives or acquaintance", economic hardship of "hard" and "very hard", and "sleep disturbance" were significantly associated with shoulder pain.

2.

Background

The Great East Japan Earthquake and subsequent tsunami devastated the northeastern part of Japan. Low back pain is thought to increase after a natural disaster and is related to various factors. The aim of this study was to examine the influence of "Living environment" and "Subjective economic hardship" on new-onset low back pain in the chronic phase among survivors of the earthquake, evaluated by a self-report questionnaire.

Methods

A panel study was conducted with survivors of the Great East Japan Earthquake at 2 and 3 years after the disaster. New-onset low back pain was defined as low back pain absent at the 1st period (2 years after the earthquake) and present at the 2nd period (3 years after the earthquake). Living environment was divided into 4 categories (1. Living in the same house as before the earthquake, 2. Living in a prefabricated house, 3. Living in a new house, 4. Others: living in an apartment or in the house of relatives or acquaintances). Subjective economic hardship was assessed using the following self-report question: "How do you feel about the current economic situation of your household?" The response alternatives were "Normal", "A little bit hard", "Hard", and "Very hard". Univariate and multivariate logistic regression models were used.
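
The new-onset definition above amounts to a simple boolean derivation on two survey waves; here is a minimal self-contained sketch with hypothetical data.

```python
import pandas as pd

# Hypothetical wide-format panel: one row per survivor, one LBP flag per wave
df = pd.DataFrame({
    "id":     [1, 2, 3, 4],
    "lbp_y2": [0, 0, 1, 0],   # low back pain at 2 years (1st period)
    "lbp_y3": [0, 1, 1, 1],   # low back pain at 3 years (2nd period)
})

# New onset = absent at the 1st period AND present at the 2nd period
df["new_onset_lbp"] = ((df["lbp_y2"] == 0) & (df["lbp_y3"] == 1)).astype(int)
print(df)   # survivors 2 and 4 count as new-onset cases
```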

Results

In total, 1357 survivors consented to join this study. There was no significant association between new-onset low back pain and living environment. New-onset low back pain was significantly associated with subjective economic hardship: "A little bit hard" (OR = 1.6, 95% CI = 1.07–2.40), "Hard" (OR = 2.2, 95% CI = 1.56–3.74), and "Very hard" (OR = 3.19, 95% CI = 1.84–5.53).

Conclusions

Subjective economic hardship was significantly associated with new-onset low back pain in the chronic phase among survivors of the Great East Japan Earthquake.

3.

Background

Postoperative outcomes following pancreaticoduodenectomy are well described for pancreatic cancers. Because these tumours are less common, complication rates and the corresponding predictive factors are less well documented for ampullary, bile duct and duodenal cancers.

Methods

Medical charts of patients operated on between 2001 and 2011 for an ampullary, bile duct or duodenal cancer were reviewed. Data were retrospectively studied with respect to demographics, surgical management, postoperative complications and histological findings. Specific complication rates were reported, and predictive factors for severe morbidity and mortality were determined by multivariate analysis.

Results

135 patients were identified: 55 ampullary, 55 bile duct and 25 duodenal cancers. Twelve patients (8.9%) died postoperatively, and 36 others (26.7%) developed severe complications. The pancreas was soft in 67% of patients, and pancreatic hardness was found to be the main protective factor against severe morbidity (HR = 0.36, 95% CI = 0.14–0.94, P = 0.037). Age and postpancreatectomy haemorrhage were independent predictors of death (HR = 14.63, 95% CI = 1.57–135.77, P = 0.018, and HR = 14.71, 95% CI = 2.86–75.62, P = 0.001, respectively). Only the use of an external transanastomotic duct stent significantly reduced both the morbidity (HR = 0.37, 95% CI = 0.16–0.83, P = 0.016) and the mortality (HR = 0.12, 95% CI = 0.02–0.69, P = 0.017).

Conclusions

Pancreaticoduodenectomy for ampullary, bile duct and duodenal cancers is a high-risk procedure. The systematic use of transanastomotic duct stents may significantly decrease the complication rate. Older patients should benefit from a specific preoperative evaluation using an adapted index. Omental flap techniques may be effective in preventing postpancreatectomy haemorrhage. The effect of preoperative octreotide in hardening the pancreas remains to be clarified.

4.

Objective

Bone morphogenetic proteins (BMP) belong to the transforming growth factor beta superfamily of proteins. This study was performed to evaluate the association of BMP gene polymorphisms with acute renal allograft rejection (AR) and graft dysfunction (GD) in Koreans.

Methods

Three hundred thirty-one patients who underwent kidney transplantation were recruited. Transplantation outcomes were determined in terms of AR and GD criteria. We selected six single nucleotide polymorphisms (SNPs): rs1979855 (5′ near gene), rs1049007 (Ser87Ser), rs235767 (intron), rs1005464 (intron), rs235768 (Arg190Ser), and rs3178250 (3′ untranslated region).

Results

Among the six SNPs tested, the rs235767, rs1005464, and rs3178250 SNPs were significantly associated with AR (P < .05). The rs1049007 and rs235768 SNPs also showed an association with GD (P < .05).

Conclusions

These results suggest that BMP2 gene polymorphisms may be related to the development of AR and GD in kidney transplant recipients.

5.

Purpose

Studies have shown that arecoline, the major alkaloid of betel nuts, alters the activity of enzymes in the cytochrome P450 (CYP-450) family. Tacrolimus, an immunosuppressant that protects against organ rejection in transplant recipients, is mainly metabolized by CYP3A enzymes and has a narrow therapeutic range. We aimed to investigate whether dose-adjusted blood trough levels of tacrolimus differed over time between betel nut-chewing and non–betel nut-chewing liver transplant recipients.

Methods

In this retrospective case-control study, 14 active betel nut-using liver recipients were matched at a 1:2 ratio to 28 non-betel nut-using liver recipients by sex, age, graft source, duration of follow-up after liver transplantation, and estimated glomerular filtration rate. Differences in liver function index, renal function index, and dose-adjusted blood trough levels of tacrolimus over an 18-month period were compared between the 2 groups using the generalized estimating equation (GEE) approach.
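
A minimal sketch of the repeated-measures comparison described above, using the GEE implementation in statsmodels; the data file and column names are hypothetical, and the exchangeable working correlation is an assumption (the abstract does not state the structure used).

```python
# Minimal sketch of a GEE comparison of repeated dose-adjusted trough levels
# between two matched groups; file and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("tacrolimus_levels.csv")  # one row per patient-visit

# Columns assumed: patient_id, chewer (0/1), month (0..18),
# cd_ratio (dose-adjusted tacrolimus trough level)
model = smf.gee(
    "cd_ratio ~ chewer + month",
    groups="patient_id",                      # repeated measures per patient
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),  # assumed within-patient correlation
    family=sm.families.Gaussian(),
)
print(model.fit().summary())
```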

Results

Dose-adjusted blood trough levels of tacrolimus were significantly lower (P = .04) in betel nut chewers (mean = 0.81, median = 0.7, 95% confidence interval [CI] = 0.73 to 0.90) than in nonchewers (mean = 1.12, median = 0.88, 95% CI = 1.03 to 1.22) over the 18-month study period. However, there were no significant differences in renal or liver function indices between the 2 groups.

Conclusion

Liver transplant recipients receiving tacrolimus tend to have lower blood trough levels of the drug over time if they chew betel nuts.

6.

Objective

The objective of this study was to identify possible biopsychosocial predictors of organizational complexity in patients referred to the consultant psychiatrist for assessment before liver transplantation.

Methods

This was a case-control study. All psychiatric consultations performed before and after liver transplantation from January 1, 2008, to December 31, 2013, were included. Complexity was operationalized as "undergoing two or more psychiatric consultations": cases were patients who underwent two or more consultations, and controls were patients assessed only once by the consultant. Statistical analysis was performed with STATA 13.1, using logistic regressions.

Results

In this study, 515 consultations were requested for 309 patients potentially eligible for liver transplantation. Controls were 209 (67.6%); cases were 100 (32.4%). Positive psychiatric history (odds ratio [OR] = 2.44; 95% confidence interval [CI], 1.43–4.16), viral or toxic (alcohol- or drug-related) liver disease (OR = 1.93; 95% CI, 1.09–3.42), use of psychotropic medications at the baseline (OR = 2.15; 95% CI, 1.14–4.07), and female gender (OR = 1.77; 95% CI, 1.01–3.11) were significantly associated with an increased probability of being cases.

Conclusions

Positive psychiatric history, viral or toxic liver disease, use of psychotropic medications at the index referral, and female gender are possible biopsychosocial predictors of complexity in patients eligible for liver transplantation.

7.

Background

Periprosthetic joint infection (PJI) is a serious complication of total hip arthroplasty (THA). Although the number of revision cases is increasing, the prevalence of PJI as an indication for revision surgery, and the variability of this indication among surgeons and hospitals, are unclear.

Methods

The New York Statewide Planning and Research Cooperative System was used to identify 33,582 patients undergoing revision THA between 2000 and 2013. PJI was identified using International Classification of Diseases, Ninth Revision diagnosis codes. Volume was defined as the mean number of revision THAs performed annually by each hospital and surgeon.
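
A minimal sketch of how such volume measures can be derived from a discharge-level table with pandas; the file and column names are hypothetical, and the low/medium/high tertile split is an assumption, since the abstract does not state the cutoffs used.

```python
# Minimal sketch of deriving mean annual revision-THA volume per hospital and
# surgeon from a discharge-level table; file and column names are hypothetical.
import pandas as pd

cases = pd.read_csv("revision_tha.csv")  # one row per revision THA

hospital_volume = (
    cases.groupby(["hospital_id", "year"]).size()   # cases per hospital-year
         .groupby("hospital_id").mean()             # mean annual volume
)
surgeon_volume = (
    cases.groupby(["surgeon_id", "year"]).size()
         .groupby("surgeon_id").mean()
)

# One common way to form low/medium/high categories; the abstract does not
# state the exact cutoffs used.
hospital_group = pd.qcut(hospital_volume, 3, labels=["low", "medium", "high"])
```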

Results

PJI was the indication for 13.0% of all revision THAs. The percentage of revision THAs for PJI increased between years 2000 and 2007 (odds ratio [OR] = 1.05, P < .001), but decreased between years 2008 and 2013 (OR = 0.96, P = .001). Compared to medium-volume hospitals, the PJI burden at high-volume hospitals decreased during years 2000-2007 (OR = 0.58, P < .001) and 2008-2013 (OR = 0.57, P < .001). Compared to medium-volume surgeons, the PJI burden for high-volume surgeons increased during years 2000-2007 (OR = 1.39, P < .001), but did not differ during years 2008-2013 (P = .618).

Conclusion

The burden of PJI as an indication for revision THA may be plateauing. High-volume institutions saw decreases in the percentage of revisions performed for PJI over the entire study period. Surgeon-specific factors may contribute to the plateau, as high-volume surgeons in 2008-2013 were no longer at increased risk of PJI as an indication for revision THA.

8.

Background

Venous thromboembolic disease (VTED) is a serious complication of primary and revision total knee arthroplasty (TKA). However, the incidence and risk of VTED after revision compared with primary TKA have not been well described.

Methods

We identified 225,584 TKAs (208,954 primaries, 16,630 revisions) in the 2003-2012 Statewide Planning and Research Cooperative System database. Odds ratios (ORs) expressed the risk of VTED for revision vs primary TKA, and models were adjusted for age, gender, race, and Charlson comorbidity scores. Outcome analyses were further stratified into deep venous thromboses (DVTs) and pulmonary emboli (PEs).

Results

The incidence of VTED within 30 days was 2.24% for primary and 1.84% for revision. In multivariable-adjusted regression, the OR of VTED within 30 days for revision compared with primary was 0.81 (95% confidence interval = 0.72-0.91; P < .001). The incidence of VTED within 90 days was 2.42% for primary and 2.13% for revision (P = .022), with a multivariable-adjusted OR of 0.87 (95% confidence interval = 0.78-0.97; P = .010) for revision compared with primary. The association was stronger for PE (OR = 0.63; P < .001) than DVT (OR = 0.87; P = .035) at 30 days, and significant for PE (OR = 0.69; P < .001), but not DVT (OR = 0.94; P = .284) at 90 days.
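
As a point of reference, the crude (unadjusted) 30-day odds ratio implied by the incidences above can be computed directly; it happens to land close to the adjusted estimate of 0.81.

```python
# Crude 30-day OR from the reported incidences: 2.24% (primary), 1.84% (revision)
p_primary, p_revision = 0.0224, 0.0184

odds_primary = p_primary / (1 - p_primary)      # ~0.0229
odds_revision = p_revision / (1 - p_revision)   # ~0.0187

print(f"crude OR = {odds_revision / odds_primary:.2f}")   # ~0.82
```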

Conclusion

In a large statewide database, the risk of VTED was lower for revision TKA compared with primary TKA. The reasons for this observation are not known, but might be related to aggressive prophylactic management of patients undergoing revision procedures. Future studies should attempt to clarify differences in patient selection and management for primary vs revision procedures.

9.

Background

In a large prospective cohort, we recently showed that only 66.1% of total knee arthroplasties (TKAs) with a perfect outcome according to the Knee Society Knee Score were completely forgotten in all everyday activities. The main objective of this study was to identify clinical and orthopedic factors associated with acquisition of a "forgotten knee" (FK).

Methods

Patients undergoing TKA were enrolled between January 2001 and January 2008. Preoperative medical history, anthropometric data, and clinical data were recorded, and composite scores (Knee Society Score, Lequesne index) were assessed. Radiography was performed before and after surgery. At each follow-up, FK acquisition was assessed by the closed question "Does the operated knee always feel normal in all everyday activities?"

Results

We included 510 TKAs performed in 423 patients, followed up for a mean of 76.6 ± 28.5 months. On multivariate analysis, baseline depression and postoperative patellar subluxation were negatively associated with FK acquisition (odds ratio [OR] = 0.28 [95% confidence interval {CI}, 0.13-0.61], P = .001; and OR = 0.31 [0.12-0.79], P = .01, respectively), whereas increased active flexion at last follow-up was positively associated (OR = 1.07 [1.03-1.10], P < .0001). In patients with a perfect outcome (Knee Society Knee Score = 100), preoperative patellar pain and postoperative patellar subluxation were negatively associated with FK acquisition (OR = 0.41 [0.18-0.93], P = .03 and OR = 0.21 [0.05-0.90], P = .04, respectively). Gender, age, body mass index, preoperative pain and functional limitation, and patellar resurfacing were not significantly related to FK.

Conclusion

Depression and patellar maltracking may be associated with failure to acquire FK after TKA, whereas a postoperative increase in flexion may have a positive impact.

10.

Aim

To investigate the efficacy of cerebral oximetry (CO) as an auxiliary diagnostic tool in brain death (BD).

Materials and Methods

This observational case-control study was performed on patients with suspected BD. Patients with a diagnosis of BD confirmed by the brain death committee were enrolled as the BD group, and the remaining patients formed the non-BD group. CO monitoring was performed for at least 6 hours, and cerebral tissue oxygen saturation (ScO2) parameters were compared.

Results

Mean ScO2 in the BD group was lower than in non-brain-dead patients: mean difference for the right lobe = 6.48 (95% confidence interval [CI] 0.08–12.88) and for the left lobe = 6.09 (95% CI −0.22 to 12.41). Maximum ScO2 values in the BD group were significantly lower than in the non-BD group: mean difference for the right lobe = 8.20 (95% CI 1.64–14.77) and for the left lobe = 9.54 (95% CI 3.06–16.03). The area under the curve for right lobe maximum ScO2 was 0.69 (95% CI 0.55–0.81), and for the left lobe it was 0.72 (95% CI 0.58–0.84).
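
A minimal sketch of the AUC computation behind the 0.69/0.72 figures above, using scikit-learn; the labels and ScO2 values are hypothetical, not the study data.

```python
from sklearn.metrics import roc_auc_score

# 1 = brain death confirmed, 0 = non-BD; maximum ScO2 (%) during monitoring
is_bd    = [1, 1, 1, 1, 0, 0, 0, 0]
max_sco2 = [48, 52, 45, 55, 63, 50, 70, 61]

# roc_auc_score expects scores that rise with the positive class, so negate:
# lower maximum ScO2 should point toward brain death.
auc = roc_auc_score(is_bd, [-v for v in max_sco2])
print(f"AUC = {auc:.2f}")
```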

Conclusion

Maximum ScO2 on CO monitoring is significantly lower in brain-dead patients. However, it cannot be used to differentiate brain-dead from non-brain-dead patients. CO monitoring is therefore not an appropriate auxiliary diagnostic tool for confirming BD.

11.

Background

Although many risk factors for graft rejection after heart transplantation (HTx) have been reported, the effect of HLA mismatch (MM) remains unknown, especially in the Japanese population. The aim of the present study was to investigate the influence of HLA MM on graft rejection among HTx recipients in Japan.

Methods

We retrospectively investigated the association between the number of HLA MMs at the class I (A, B) and class II (DR) loci (0 to 2 per locus, 0 to 6 in total) and the incidence of moderate to severe acute cellular rejection (ACR) confirmed by endomyocardial biopsy (International Society for Heart and Lung Transplantation grade ≥ 3A/2R) within 1 year after HTx.
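
The mismatch count itself is mechanical; here is a minimal sketch under the usual antigen-level convention (each donor antigen absent from the recipient's antigens at that locus counts as one mismatch), with hypothetical typings.

```python
# Minimal sketch of the mismatch count described above: 0-2 per locus,
# 0-6 over A, B, and DR. The antigen labels below are hypothetical.
def count_mismatches(donor: dict, recipient: dict) -> dict:
    mm = {}
    for locus in ("A", "B", "DR"):
        mm[locus] = sum(
            1 for antigen in donor[locus] if antigen not in recipient[locus]
        )
    mm["total"] = mm["A"] + mm["B"] + mm["DR"]
    return mm

donor     = {"A": ["A2", "A24"], "B": ["B7", "B52"], "DR": ["DR4", "DR15"]}
recipient = {"A": ["A2", "A26"], "B": ["B52", "B54"], "DR": ["DR8", "DR9"]}
print(count_mismatches(donor, recipient))
# {'A': 1, 'B': 1, 'DR': 2, 'total': 4}
```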

Results

Between 2007 and 2014, we had 49 HTx cases at our institute. After excluding cases with insufficient data or positive donor-specific antibodies, 35 patients were enrolled. Moderate to severe ACR was observed in 16 (45.7%) patients. The number of HLA-DR MMs was significantly associated with the development of ACR (ACR+: 1.50 ± 0.63 vs ACR−: 1.11 ± 0.46, P = .029). In univariate analysis, DR MM = 2 was the only significant risk factor for ACR episodes (P = .017). The frequency of ACR within 1 year was significantly higher in those with DR MM = 2 (DR MM = 0 to 1: 0.30 ± 0.47 vs DR MM = 2: 1.17 ± 1.34 episodes, P = .007).

Conclusions

The number of HLA-DR MMs was associated with the development and recurrence of ACR episodes among HTx recipients within 1 year after transplantation in the Japanese population.

12.

Background

Lumbar decompression surgery is often used to treat neurological symptoms of the lower extremities caused by lumbar disease. The procedure can also improve accompanying low back pain (LBP). We studied the extent of LBP improvement after lumbar decompression surgery without fusion and the preoperative factors associated with residual LBP.

Methods

Patients (n = 140) with lumbar spinal stenosis (n = 90) or lumbar disc herniation (n = 50) were included. To evaluate the change in LBP, visual analog scale (VAS) scores and Oswestry Disability Index scores were measured before surgery and at 2 weeks, 3 months, and 6 months after surgery. Predictors of residual LBP were investigated using logistic regression analyses.

Results

In total, 140 patients were examined. The VAS scores for LBP before surgery and at 2 weeks, 3 months, and 6 months after surgery were 4.4 ± 3.0 (mean ± standard deviation), 1.1 ± 1.5, 1.3 ± 1.8, and 1.9 ± 2.2, respectively. LBP improved significantly by 2 weeks after surgery (P < 0.001), remained stable between 2 weeks and 3 months, but was significantly aggravated between 3 and 6 months after surgery (P < 0.001). At 6 months after surgery, 67 patients (47.9%) had a VAS score of >1. Predictors of residual LBP included more severe preoperative LBP, degenerative scoliosis, and the magnitude of the Cobb angle; on multivariate analysis, the independent predictors were degenerative scoliosis and the magnitude of the Cobb angle.

Conclusions

LBP was alleviated at 2 weeks after lumbar decompression surgery for lumbar disc herniation and lumbar spinal stenosis. Predictors of residual LBP after decompression included more severe LBP at baseline, degenerative scoliosis, and the magnitude of the Cobb angle.

Level of evidence

Level 3.

13.

Background

Locomotive disorders are among the main causative pathologies of conditions requiring assistance with activities of daily living (ADL). Although psychological concerns such as feelings of depression and anxiety are prevalent in elderly people, the causal relationships among motor function, ADL disability, and psychological concerns remain controversial.

Purpose

The purpose of this study was to investigate the causal relationships among motor function, ADL disability, and psychological concerns in elderly people with locomotive disorders.

Methods

The data for this study came from a community-dwelling sample of 314 elderly persons with locomotive disorders, aged 65 and older, who visited orthopedic clinics and/or affiliated institutions. Motor function was assessed by one-leg standing time with eyes open, leg extension power, and grip power. ADL disability was assessed using the 25-question Geriatric Locomotive Function Scale (GLFS-25), and psychological concerns by three self-reported questions. We constructed two models and tested their fit to the data using structural equation modeling (SEM). In Model 1, motor function affects ADL disability, and ADL disability affects psychological concerns; in Model 2, motor function affects psychological concerns, and psychological concerns affect ADL disability. A sketch of this model comparison appears below.
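
To make the two-model comparison concrete, here is a rough sketch using the semopy package (one of several SEM libraries for Python); the data file, variable names, and model syntax are assumptions, and fit-statistic labels may differ between semopy versions.

```python
# Minimal sketch of the two-model SEM comparison described above; the data
# file and variable names are hypothetical, not the study data.
import pandas as pd
import semopy

df = pd.read_csv("locomotive_survey.csv")

MODEL_1 = """
motor =~ one_leg_stand + leg_power + grip_power
adl_disability ~ motor
psych_concerns ~ adl_disability
"""

MODEL_2 = """
motor =~ one_leg_stand + leg_power + grip_power
psych_concerns ~ motor
adl_disability ~ psych_concerns
"""

for name, desc in [("Model 1", MODEL_1), ("Model 2", MODEL_2)]:
    model = semopy.Model(desc)
    model.fit(df)
    print(name)
    print(semopy.calc_stats(model).T)   # includes chi2, RMSEA, GFI, AGFI, CFI
```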

Results

The fit indices were chi-square = 23.152 (p = 0.081), RMSEA = 0.042, GFI = 0.981, AGFI = 0.955, and CFI = 0.987 for Model 1, and chi-square = 84.583 (p < 0.001), RMSEA = 0.119, GFI = 0.935, AGFI = 0.854, and CFI = 0.892 for Model 2, indicating a good fit for Model 1 and an inadequate fit for Model 2.

Conclusion

Decline in motor function contributes to psychological concerns via ADL disability in elderly people with locomotive disorders.

14.

Background

Autosomal-dominant polycystic kidney disease (ADPKD) is characterized by disruption of tubular integrity with increased cellular proliferation and apoptosis. Several tubular membrane proteins are implicated in the pathogenesis of ADPKD; one of these is neutrophil gelatinase-associated lipocalin (NGAL), a protein expressed on renal tubular cells whose production is markedly increased in response to harmful stimuli such as ischemia or toxicity.

Objective

We aimed to study whether urinary NGAL levels could serve as a marker of disease severity in patients with ADPKD.

Methods

Urinary NGAL levels were measured in 30 patients with ADPKD and in 30 control patients matched by age, gender, and glomerular filtration rate (GFR). All patients with ADPKD were diagnosed using both phenotypic and genotypic criteria, which showed that all cases were caused by PKD1 gene mutations. Urinary NGAL was measured using the Roche NGAL Test, with an analytic range of 25–1000 ng/mL.

Results

In the ADPKD group, there was a significant negative correlation between urinary NGAL and GFR (Pearson r = −0.472; P = .008) and a significant positive correlation between urinary NGAL and serum creatinine (Pearson r = 0.718; P < .01); urinary NGAL rose as GFR declined.
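
A minimal sketch of the correlation analysis above with SciPy; the measurements are hypothetical, not the study data.

```python
from scipy.stats import pearsonr

ngal = [120, 85, 240, 60, 310, 150]   # urinary NGAL, ng/mL
gfr  = [72, 88, 45, 95, 30, 64]       # estimated GFR, mL/min/1.73 m^2

r, p = pearsonr(ngal, gfr)
print(f"Pearson r = {r:.3f}, P = {p:.3f}")   # negative r: NGAL rises as GFR falls
```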

Conclusion

Urinary NGAL might play a role in the pathway of renal tubular damage in ADPKD and might be useful for predicting progression to chronic kidney disease in these patients.

15.

Background

Systemic inflammation affects kidney function in a wide range of diseases. Even in kidney transplant recipients, higher levels of C-reactive protein (CRP) are consistently associated with worse short- and long-term graft outcomes. However, little is known about systemic inflammation in kidney donors, even though brain death causes a strong systemic inflammatory response.

Objective

To analyze the effect of systemic inflammation in brain-dead donors on short-term kidney graft outcomes (ie, delayed graft function [DGF], defined as the need for dialysis during the first week after transplantation).

Materials and methods

Retrospective analysis of the clinical and biochemical characteristics of all brain-dead kidney donors generated at the Hospital Clínic of Barcelona between 2006 and 2015 (n = 194). Donors tested for CRP in the 24 hours before brain death declaration were included (n = 97, 50% of the initial population). The clinical and biochemical features of their respective recipients (n = 165) were analyzed, comparing recipients who developed DGF (n = 30) with those who did not (n = 135).

Results

Donors whose recipients later developed DGF had much higher CRP values (10.58 [5.1-18.21] vs 4.81 [1.42-12.2] mg/dL, P = .025). Other characteristics associated with the development of DGF were renal biopsy score and recipient dialysis vintage (P = .025 and P = .002, respectively). In logistic regression analysis, CRP remained significant in the non–expanded criteria donor (ECD) group (odds ratio [OR], 1.102; P = .027) but lost significance in the ECD group (P = .67).

Conclusions

Terminal donor CRP was associated with DGF in kidney transplant recipients, and the association was significant mainly among younger (non-ECD) donors.

16.

Objective

Liver resection (LR) and living-donor liver transplantation (LDLT) are considered the two potentially curative treatments for hepatocellular carcinoma (HCC). The aim of this study was to investigate whether there is a difference in the oncologic outcomes between LR and LDLT according to tumor biology.

Methods

Patients (137 LDLTs and 199 LRs) were stratified into four groups by tumor biology, according to the number of risk factors for recurrence (preoperative alpha-fetoprotein >200 ng/mL, Edmondson grade 3 or 4, tumor size >3 cm, and presence of microvascular invasion).
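
A minimal sketch of the risk-factor count behind this stratification; the data file and column names are hypothetical, and the mapping of counts to groups I-IV is an assumption, since the abstract does not spell out the cutoffs.

```python
# Minimal sketch of counting recurrence risk factors per patient;
# file and column names are hypothetical.
import pandas as pd

df = pd.read_csv("hcc_cohort.csv")

df["n_risk"] = (
      (df["afp_ng_ml"] > 200).astype(int)
    + df["edmondson_3_4"].astype(int)
    + (df["tumor_cm"] > 3).astype(int)
    + df["microvascular_invasion"].astype(int)
)

# One plausible mapping of counts to the four biology groups (assumed,
# not stated in the abstract): I=0, II=1, III=2, IV=3-4 risk factors.
df["biology_group"] = pd.cut(
    df["n_risk"], bins=[-1, 0, 1, 2, 4], labels=["I", "II", "III", "IV"]
)
```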

Results

In patients with favorable tumor biology (groups I and II), recurrence-free survival was significantly worse after LR than after LDLT (group I, P = .002; group II, P = .001), whereas overall survival did not differ between the LR and LDLT groups (group I, P = .798; group II, P = .981). In patients with poor tumor biology (groups III and IV), there was no significant difference in recurrence-free survival between the two groups (group III, P = .342; group IV, P = .616), but the LDLT group showed a significantly lower overall survival rate (group III, P = .001; group IV, P = .025).

Conclusions

Primary LDLT should not be recommended in early stage HCC patients with poor tumor biology because of lower survival rates and a high chance of HCC recurrence.

17.

Introduction

The relative torsional angle of the distal tibia depends on deformity of the proximal tibia and is a commonly used parameter for describing torsional deformities of the tibia; however, it cannot show the location and direction of torsional deformity along the entire tibia. This study aimed to identify detailed deformity of the entire tibia, using a coordinate system based on the tibial diaphysis, by comparing varus osteoarthritic knees with healthy knees.

Methods

In total, 61 limbs of 58 healthy subjects (age: 54 ± 18 years) and 55 limbs of 50 subjects with varus osteoarthritis (OA) (age: 72 ± 7 years) were evaluated. An original coordinate system based on anatomic points of the tibial diaphysis alone was established. The evaluation parameters were 1) the torsion of the distal tibia relative to the proximal tibia, 2) the torsion of the proximal tibia relative to the tibial diaphysis, and 3) the torsion of the distal tibia relative to the tibial diaphysis.
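
Each torsion parameter above reduces to an angle between two bony axes measured in the plane perpendicular to the diaphyseal axis; here is a generic sketch of that computation with NumPy (the landmark axes shown are hypothetical, not the study's definitions).

```python
# Minimal sketch of measuring torsion as the signed angle between two bony
# axes projected onto the plane perpendicular to the diaphyseal (z) axis.
import numpy as np

def torsion_deg(axis_a: np.ndarray, axis_b: np.ndarray) -> float:
    """Signed angle (degrees) from axis_a to axis_b in the x-y plane,
    assuming z is the diaphyseal axis of the coordinate system."""
    a, b = axis_a[:2], axis_b[:2]                  # drop the z component
    angle = np.arctan2(b[1], b[0]) - np.arctan2(a[1], a[0])
    return float(np.degrees((angle + np.pi) % (2 * np.pi) - np.pi))

proximal_axis = np.array([0.97, 0.24, 0.05])   # e.g. a posterior condylar axis
distal_axis   = np.array([0.80, -0.60, 0.02])  # e.g. a transmalleolar axis

print(f"distal vs proximal torsion: {torsion_deg(proximal_axis, distal_axis):.1f} deg")
```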

Results

The torsion of the distal tibia relative to the proximal tibia was external in both groups, but this external torsion was lower in the OA group than in the healthy group (p < 0.0001). Compared with the healthy group, the OA group showed higher external torsion of the proximal tibia relative to the diaphysis (p = 0.012) and higher internal torsion of the distal tibia relative to the diaphysis (p = 0.004).

Conclusion

A reverse torsional deformity, with higher external torsion in the proximal tibia and higher internal torsion in the distal tibia, occurred independently in the OA group compared with the healthy group. Clinically, this finding may represent a pathogenic factor in varus osteoarthritic knees.

Level of evidence

Level III.

18.

Background

This study was performed to assess the impact of soft tissue imbalance on the knee flexion angle 2 years after posterior stabilized total knee arthroplasty (TKA).

Methods

A total of 329 consecutive varus knees were included to assess the association of the knee flexion angle 2 years after TKA with preoperative, intraoperative, and postoperative variables. All intraoperative soft tissue measurements were performed by a single surgeon under spinal anesthesia in a standardized manner, including the subvastus approach, a reduced patella, and no use of a pneumatic tourniquet.

Results

Multiple linear regression analysis showed no significant association of the 2-year knee flexion angle with intraoperative valgus imbalance at 90-degree flexion or with the difference in soft tissue tension between 90-degree flexion and 0-degree extension (β = −0.039; 95% confidence interval [CI], −0.88 to 0.80; P = .93 and β = 0.015; 95% CI, −0.29 to 0.32; P = .92, respectively). Preoperative flexion angle was significantly correlated with knee flexion angle 2 years after TKA (β = 0.42; 95% CI, 0.33 to 0.51; P < .0001).

Conclusion

Avoiding valgus imbalance at 90-degree flexion and aiming for strictly equal soft tissue tension between 90-degree flexion and 0-degree extension had little practical value with regard to knee flexion angle 2 years after posterior stabilized TKA.

19.

Background

Acute kidney injury (AKI) is a common complication in the early period after lung transplantation (LTx). We aimed to describe the incidence of, and perioperative risk factors for, AKI following LTx.

Methods

Clinical data of 30 patients who underwent LTx were retrospectively reviewed. Primary outcomes were the development of AKI and patient mortality within 30 postoperative days. Postoperative AKI was determined using the creatinine criteria of the Acute Kidney Injury Network (AKIN) classification. Secondary outcomes included associations between AKI and patients' demographic and clinical parameters and treatment modalities in the pre- and postoperative periods.
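
For reference, a minimal sketch of the AKIN creatinine criteria named above (an abrupt rise of ≥0.3 mg/dL or to ≥1.5 times baseline within 48 hours); the urine-output criteria and the stage 3 clause for creatinine ≥4.0 mg/dL with an acute rise ≥0.5 mg/dL are omitted for brevity.

```python
# Minimal sketch of AKIN staging from serum creatinine alone (mg/dL, within
# 48 h). Stage 1: rise >= 0.3 mg/dL or to 1.5-2x baseline; stage 2: >2-3x;
# stage 3: >3x. Urine-output criteria are omitted here.
def akin_stage(baseline_cr: float, current_cr: float) -> int:
    """Return AKIN stage 0-3 (0 = no AKI) from two creatinine values."""
    ratio = current_cr / baseline_cr
    if ratio > 3.0:
        return 3
    if ratio > 2.0:
        return 2
    if ratio >= 1.5 or (current_cr - baseline_cr) >= 0.3:
        return 1
    return 0

print(akin_stage(0.9, 1.3))   # absolute rise of 0.4 mg/dL -> stage 1
print(akin_stage(0.8, 2.0))   # 2.5x baseline -> stage 2
```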

Results

Of the 30 LTx recipients included, AKI occurred in 16 patients (53.4%) within the first 30 days. Lengths of intensive care unit stay (P = .06) and hospital stay (P = .008) and duration of mechanical ventilation (P = .03) were greater in patients with AKI than in those without. Factors independently associated with AKI were intraoperative hypotension (odds ratio [OR] 0.500; 95% confidence interval [CI], 1.145 to 26.412, P = .02), longer duration of mechanical ventilation (OR 1.204; 95% CI 0.870 to 1.665, P = .03), and postoperative systemic infection (OR 8.067; 95% CI 1.538 to 42.318, P = .014). Short-term mortality was similar in patients with and without AKI.

Conclusion

By the AKIN definition, AKI occurred in half of the patients following LTx. Several variables, including intraoperative hypotension, longer duration of mechanical ventilation, and postoperative systemic infection, independently predicted AKI in LTx recipients.

20.

Study Design

Multicenter retrospective study.

Background

Postoperative surgical site infection is one of the most serious complications following spine surgery. Previous studies do not appear to have investigated pyogenic discitis following lumbar laminectomy without discectomy. This study aimed to identify risk factors for postoperative pyogenic discitis following lumbar decompression surgery.

Methods

We examined data from 2721 patients who underwent lumbar laminectomy without discectomy at five hospitals from April 2007 to March 2012. Patients who developed postoperative discitis following laminectomy (Group D) and a 4:1 matched control cohort (Group C) were included. Fisher's exact test was used to identify risk factors, with p < 0.05 considered statistically significant.
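
A minimal sketch of the Fisher's exact test used above, on a hypothetical 2 × 2 table contrasting Modic type 1 changes in Group D (n = 8 cases, per the Results) against the 4:1 matched Group C (n = 32); the cell counts are illustrative only.

```python
from scipy.stats import fisher_exact

#                Modic 1   no Modic 1
table = [[5,  3],    # Group D (discitis cases, n = 8)
         [6, 26]]    # Group C (matched controls, n = 32)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```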

Results

The cumulative incidence of postoperative discitis was 0.29% (8/2721 patients). All patients in Group D were male, with a mean age of 71.6 ± 7.2 years. Postoperative discitis occurred at L1/2 in 1 patient, at L3/4 in 3 patients, and at L4/5 in 4 patients. Except for the patient with discitis at L1/2, every patient developed discitis at the level of decompression. The associated pathogens were methicillin-resistant Staphylococcus aureus (n = 3, 37.5%), methicillin-susceptible Staphylococcus epidermidis (n = 1, 12.5%), methicillin-susceptible S. aureus (n = 1, 12.5%), and unknown (n = 3, 37.5%). In the risk factor analysis, Group D had a significantly lower proportion of patients who underwent surgery in winter and a significantly higher proportion of patients with Modic type 1 changes in the lumbar vertebrae compared with Group C.

Conclusions

Although further prospective studies using other preoperative evaluation modalities are needed, our data suggest that Modic type 1 changes are a risk factor for discitis following laminectomy. Latent pyogenic discitis should be carefully ruled out in patients with Modic type 1 changes, and if lumbar laminectomy is performed in such patients, more careful observation is necessary to prevent postoperative discitis.
