Similar articles
1.

Background and purpose

The appropriate fixation method for hemiarthroplasty of the hip as it relates to implant survivorship and patient mortality is a matter of ongoing debate. We examined the influence of fixation method on revision rate and mortality.

Methods

We analyzed approximately 25,000 hemiarthroplasty cases from the AOA National Joint Replacement Registry. Deaths at 1 day, 1 week, 1 month, and 1 year were compared for all patients and among subgroups based on implant type.

Results

Patients treated with cemented monoblock hemiarthroplasty had a 1.7-times higher day-1 mortality than those treated with uncemented monoblock components (p < 0.001). This finding was reversed at 1 week, 1 month, and 1 year after surgery (p < 0.001). For modular hemiarthroplasties, there was no difference in mortality between fixation methods at any time point.

Interpretation

This study shows lower (or similar) overall mortality with cemented hemiarthroplasty of the hip.

The frequency of hip fractures is increasing with our ageing population, with an annual incidence of between 1.4 and 5 per 1,000 per year (Lonnroos et al. 2006, Icks et al. 2008, Varez-Nebreda et al. 2008). Health model projections have estimated that 6.3 million hip fractures will occur annually worldwide within the next 40 years (Cooper et al. 1992), imposing a significant economic health burden. The reported perioperative mortality rate in this population is high, ranging from 2.4% to 8.2% at 1 month (Parvizi et al. 2001, Radcliff et al. 2008) and over 25% at 1 year (Elliott et al. 2003, Jiang et al. 2005). Furthermore, it was recently reported that the mortality rate is higher now than it was 25 years ago (Vestergaard et al. 2007a). Today, it is generally accepted that displaced intracapsular fractures are best treated with arthroplasty rather than internal fixation (Keating et al. 2006, Leighton et al. 2007). In the at-risk population, however, multiple comorbidities are common and the best form of component fixation is in question.

Bone cement implantation syndrome is a well-described complication of cemented hip arthroplasty. It is characterized by a systemic drop in systolic blood pressure, hypoxemia, pulmonary hypertension, cardiac dysrhythmias, and occasionally cardiac arrest and death (Rinecker 1980, Orsini et al. 1987, Parvizi et al. 1999). The prevailing theory to explain the pathophysiology of this phenomenon is embolism of fat, marrow contents, bone, and to some degree methylmethacrylate to the lung (Rinecker 1980, Elmaraghy et al. 1998, Parvizi et al. 1999, Koessler et al. 2001). An increased degree of pulmonary insult with fat microemboli has been demonstrated (mostly in randomized controlled trials) during insertion of a cemented femoral stem rather than an uncemented implant (Orsini et al. 1987, Ries et al. 1993, Christie et al. 1994, Pitto et al. 1999), presumably due to increased intramedullary femoral canal pressures in the cemented group (Kallos et al. 1974, Orsini et al. 1987). These pressures can be reduced by the use of distal venting holes in the femur during stem insertion (Engesæter et al. 1984). It has been shown previously by single-institutional review that patients undergoing cemented hip arthroplasty have a higher intraoperative mortality rate relative to uncemented arthroplasty, presumably due to a reduced incidence of fat embolism in the latter group (Parvizi et al. 1999). The increased mortality risk was also present at 30 days in the treatment of acute fractures with cemented arthroplasty, also from a single-institutional review (Parvizi et al. 2004). Although cement-related mortality is rare (Dearborn and Harris 1998, Parvizi et al. 1999, 2001, 2004, Weinrauch et al. 2006), it is a devastating complication, often reported through observational studies or literature reviews. Proponents of uncemented hip arthroplasty often cite this concern to support their reluctance to use cemented hip arthroplasty in both elective procedures and fracture management. However, many different types of studies have been unable to identify any increased mortality risk with the use of cement (Lausten and Vedel 1982 (observational), Emery et al. 1991 (RCT), Lo et al. 1994 (observational), Khan et al. 2002a,b (literature review), Parker and Gurusamy 2004 (literature review)) and others have shown a decrease in mortality at 30 days when cement is used (Foster et al. 2005).

Cemented hip hemiarthroplasty appears to offer an improved rate of return to baseline function, reduced postoperative pain, and superior long-term survivorship relative to uncemented arthroplasty (Khan et al. 2002a, b, Parker and Gurusamy 2004). We reasoned that failure to return to baseline function after hemiarthroplasty may be another risk factor for perioperative mortality (Hannan et al. 2001, Braithwaite et al. 2003). Because revision surgery itself carries increased mortality, the lower revision rates of cemented prostheses would further reduce the overall mortality risk. We evaluated the relationship between the method of fixation of hip arthroplasty and perioperative mortality using a large national joint replacement registry.
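The day-1 mortality comparison reported above is, at its core, a comparison of two proportions from registry counts. The sketch below illustrates that kind of calculation with statsmodels, using hypothetical counts (the registry's actual numbers are not reproduced here).

```python
# Sketch of a two-group mortality comparison at a fixed time point,
# assuming hypothetical registry counts (not the actual registry data).
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

deaths = np.array([34, 20])         # day-1 deaths: cemented, uncemented (hypothetical)
at_risk = np.array([10000, 10000])  # procedures at risk in each group (hypothetical)

risk = deaths / at_risk
relative_risk = risk[0] / risk[1]   # a value near 1.7 would match the reported ratio
z_stat, p_value = proportions_ztest(deaths, at_risk)

print(f"day-1 mortality: cemented {risk[0]:.4%}, uncemented {risk[1]:.4%}")
print(f"relative risk {relative_risk:.2f}, z = {z_stat:.2f}, p = {p_value:.4f}")
```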

2.

Background and purpose

Joint replacement with metal-on-metal (MOM) bearings has gained popularity in recent decades in young and active patients. However, the possible effects of MOM wear debris and its corrosion products are still the subject of debate. Alongside potential disadvantages such as toxicity, the influence of metal particles and metal ions on infection risk is unclear.

Methods

We reviewed the available literature on the influence of degradation products of MOM bearings in total hip arthroplasties on infection risk.

Results

Wear products were found to influence the risk of infection by hampering the immune system, by inhibiting or accelerating bacterial growth, and by a possible antibiotic resistance and heavy metal co-selection mechanism.

Interpretation

Whether or not the combined effects of MOM wear products make MOM bearings less or more prone to infection requires investigation in the near future.

Many young patients with painful coxarthrosis want to return to a high level of activity and require an implant that provides durability. The low wear rates of metal-on-metal (MOM) bearings have led to a resurgence in the use of MOM bearings (Wagner and Wagner 2000, Silva et al. 2005, Pollard et al. 2006, Vendittoli et al. 2007, Delaunay et al. 2008). 35% of all prostheses in the United States in 2006 (Bozic et al. 2009) and 16% of all prostheses implanted in Australia from 1999 through 2007 had MOM bearings (Graves et al. 2008).

Metal alloys used in MOM bearings degrade through wear, from corrosion, or by a combination of the two (Yan et al. 2006, Jacobs et al. 2008). Consequently, MOM bearings produce nanometer- to submicrometer-sized metal particles (Campbell et al. 1996, Doorn et al. 1998). The high number of these very small particles presents a large cumulative surface area for corrosion. The biological effects of these particles and their corrosion products in the human body are for the most part unclear. Since the renewed interest in MOM bearings, extensive research has been done to determine the consequences of local and systemic exposure to wear particles and accompanying biologically active corrosion products (Amstutz and Grigoris 1996). It is well known that metal debris can induce pathological changes such as the release of inflammatory cytokines from macrophages, histiocytosis, fibrosis, and necrosis (Basle et al. 1996, Granchi et al. 1998, Caicedo et al. 2008, 2009). Metal debris is also thought to be associated with hypersensitivity and osteolysis (Hallab et al. 2000, 2010, Goodman 2007b, Carr and DeSteiger 2008, Huber et al. 2009). However, there is very little literature on the bacteriological effects of these degradation products (Anwar et al. 2007, Hosman et al. 2009). It is therefore unclear whether they can influence the risk of infection.

The Australian and New Zealand joint registries have shown that between 9% and 15% of all total hip arthroplasty (THA) revisions are carried out because of infections related to the primary prosthesis (Rothwell et al. 2007, Graves et al. 2008). In cases of infection, bacteria adopt a biofilm mode of growth on the surface of the prosthesis, thus increasing the antibiotic resistance and resulting in major difficulties in treatment (Trampuz and Widmer 2006). Removal and replacement of an infected implant is usually required to eliminate the infection (Bozic and Ries 2005, Vincent et al. 2006). Recent research has suggested that particulate debris of any composition promotes bacterial growth by providing a scaffold for bacterial adhesion and biofilm growth (Anwar et al. 2007). On the other hand, high concentrations of metal ions have been shown to have bacteriostatic properties (Hosman et al. 2009).

Considering the paucity of publications on the effects of MOM particles on infection, we performed a review of the literature on the influence of MOM wear particles and their corrosion products on the risk of infection.

3.

Background and purpose

There is considerable uncertainty about the optimal treatment of displaced 4-part fractures of the proximal humerus. Within the last decade, locking plate technology has been considered a breakthrough in the treatment of these complex injuries.

Methods

We systematically identified and reviewed clinical studies on the benefits and harms after osteosynthesis with locking plates in displaced 4-part fractures.

Results

We included 14 studies with 374 four-part fractures. There were 10 case series, 3 retrospective observational comparative studies, 1 prospective observational comparative study, and no randomized trials. Small studies with a high risk of bias precluded reliable estimates of functional outcome. High rates of complications (16–64%) and reoperations (11–27%) were reported.

Interpretation

The empirical foundation for the value of locking plates in displaced 4-part fractures of the proximal humerus is weak. We emphasize the need for well-conducted randomized trials and observational studies.

There is considerable uncertainty about the optimal treatment of displaced 4-part fractures of the proximal humerus (Misra et al. 2001, Handoll et al. 2003, Bhandari et al. 2004, Lanting et al. 2008). Only 2 small inconclusive randomized trials have been published (Stableforth 1984, Hoellen et al. 1997). A large number of interventions are used routinely, ranging from a non-operative approach to open reduction and internal fixation (ORIF) and primary hemiarthroplasty (HA).

In the last decade, locking plate technology has been developed and has been heralded as a breakthrough in the treatment of fractures in osteoporotic bone (Gautier and Sommer 2003, Sommer et al. 2003, Haidukewych 2004, Miranda 2007). Locking plate technique is based on the elimination of friction between the plate and cortex, and relies on stability between the subchondral bone and screws. Multiple multidirectional convergent and divergent locking screws enhance the angular stability of the osteosynthesis, possibly resulting in better postoperative function with reduced pain. Reported complications include screw cut-out, varus fracture collapse, tuberosity re-displacement, humeral head necrosis, plate impingement, and plate or screw breakage (Hall et al. 2006, Tolat et al. 2006, van Rooyen et al. 2006, Agudelo et al. 2007, Gardner et al. 2007, Khunda et al. 2007, Ring 2007, Smith et al. 2007, Voigt et al. 2007, Egol et al. 2008, Kirchhoff et al. 2008, Owsley and Gorczyca 2008, Brunner et al. 2009, Micic et al. 2009, Sudkamp et al. 2009). The balance between the benefits and harms of the intervention seems delicate.

Several authors of narrative reviews and clinical series have strongly recommended fixation of displaced 4-part fractures of the humerus with locking plates (Bjorkenheim et al. 2004, Hente et al. 2004, Hessler et al. 2006, Koukakis et al. 2006, Kilic et al. 2008, Korkmaz et al. 2008, Shahid et al. 2008, Papadopoulos et al. 2009, Ricchetti et al. 2009) and producers of implants unsurprisingly strongly advocate them (aap Implantate 2010, Stryker 2010, Synthes 2010, Zimmer 2010). Despite the increasing use of locking plates (Illert et al. 2008, Ricchetti et al. 2009), we have been unable to identify systematic reviews on the benefits and harms of this new technology in displaced 4-part fractures. Thus, we systematically identified and reviewed clinical studies on the benefits and harms after osteosynthesis with locking plates in displaced 4-part fractures of the proximal humerus.

4.

Background and purpose

The choice of either all-polyethylene (AP) tibial components or metal-backed (MB) tibial components in total knee arthroplasty (TKA) remains controversial. We therefore performed a meta-analysis and systematic review of randomized controlled trials that have evaluated MB and AP tibial components in primary TKA.

Methods

The search strategy included a computerized literature search (Medline, EMBASE, Scopus, and the Cochrane Central Register of Controlled Trials) and a manual search of major orthopedic journals. A meta-analysis and systematic review of randomized or quasi-randomized trials that compared the performance of tibial components in primary TKA was performed using a fixed- or random-effects model. We assessed the methodological quality of the studies using the Detsky quality scale.

Results

9 randomized controlled trials (RCTs) published between 2000 and 2009 met the inclusion quality standards for the systematic review. The mean standardized Detsky score was 14 (SD 3). We found that the frequency of radiolucent lines in the MB group was significantly higher than that in the AP group. There were no statistically significant differences between the MB and AP tibial components regarding component positioning, knee score, knee range of motion, quality of life, and postoperative complications.

Interpretation

Based on evidence obtained from this study, the AP tibial component was comparable with or better than the MB tibial component in TKA. However, high-quality RCTs are required to validate the results.

The design of the tibial component is an important factor in implant failure in total knee arthroplasty (TKA) (Pagnano et al. 1999, Forster 2003, Gioe et al. 2007b, Willie et al. 2008, Garcia et al. 2009, KAT Trial Group 2009). The metal-backed (MB) design of the tibial component has become predominant in TKA because it is thought to perform better than the all-polyethylene (AP) design (Muller et al. 2006, Gioe et al. 2006, 2007a,b). In theory, the MB tibial component reduces bending strains in the stem, reduces compressive stresses in the cement and cancellous bone beneath the baseplate (especially during asymmetric loading), and distributes load more evenly across the interface (Bartel et al. 1982, 1985, Taylor et al. 1998). However, critics of the MB tibial component point to higher implant costs, reduced polyethylene thickness for the same amount of bone resection, backside wear, and increased tensile stresses at the interface during eccentric loading (Bartel et al. 1982, 1985, Pomeroy et al. 2000, Rodriguez et al. 2001, Li et al. 2002, Muller et al. 2006, Blumenfeld and Scott 2010, Gioe and Maheshwari 2010).

In the past decade, several randomized controlled trials (RCTs) have been performed to assess the effectiveness of the MB tibial component (Adalberth et al. 2000, 2001, Gioe and Bowman 2000, Norgren et al. 2004, Hyldahl et al. 2005a, b, Muller et al. 2006, Gioe et al. 2007, Bettinson et al. 2009, KAT Trial Group 2009). However, data have not been formally and systematically analyzed using quantitative methods in order to determine whether the MB tibial component is indeed optimal for patients in TKA. In this study, we wanted (1) to determine the scientific quality of published RCTs comparing the AP and MB tibial components in TKA using the Detsky score (Detsky et al. 1992) and (2) to conduct a meta-analysis and systematic review of all published RCTs that have compared the effects of AP and MB tibial components on the radiographic and clinical outcomes of TKA.
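The pooling described in the Methods (fixed- or random-effects models across RCTs) can be illustrated with a DerSimonian-Laird random-effects calculation for a binary outcome such as radiolucent lines. The trial counts below are placeholders, not the data of the included RCTs; the sketch only shows the mechanics of the pooling.

```python
# DerSimonian-Laird random-effects pooling of risk ratios (sketch, hypothetical counts).
import numpy as np

# events / totals for metal-backed (MB) and all-polyethylene (AP) arms in each trial
e_mb, n_mb = np.array([12, 8, 15]), np.array([50, 40, 60])
e_ap, n_ap = np.array([5, 6, 9]),  np.array([50, 42, 58])

log_rr = np.log((e_mb / n_mb) / (e_ap / n_ap))
var = 1/e_mb - 1/n_mb + 1/e_ap - 1/n_ap            # variance of each log risk ratio

w = 1 / var                                        # fixed-effect (inverse-variance) weights
fixed = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - fixed) ** 2)              # Cochran's Q heterogeneity statistic
k = len(log_rr)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (var + tau2)                            # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
rr, lo, hi = np.exp([pooled, pooled - 1.96 * se, pooled + 1.96 * se])
print(f"pooled RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.3f}")
```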

5.

Background and purpose

There is no consensus regarding the clinical relevance of gender-specific prostheses in total knee arthroplasty (TKA). We summarize the current best evidence in a comparison of clinical and radiographic outcomes between gender-specific prostheses and standard unisex prostheses in female patients.

Methods

We used the PubMed, Embase, Cochrane, Science Citation Index, and Scopus databases. We included randomized controlled trials published up to January 2013 that compared gender-specific prostheses with standard unisex prostheses in female patients who underwent primary TKAs.

Results

6 trials involving 423 patients with 846 knee joints met the inclusion criteria. No statistically significant differences were observed between the 2 designs regarding pain, range of motion (ROM), knee scores, satisfaction, preference, complications, and radiographic results. The gender-specific design (Gender Solutions; Zimmer Inc, Warsaw, Indiana) reduced the prevalence of overhang. However, it had less overall coverage of the femoral condyles compared to the unisex group. In fact, the femoral prosthesis in the standard unisex group matched better than that in the gender-specific group.

Interpretation

Gender-specific prostheses do not appear to confer any benefit in terms of clinician- and patient-reported outcomes for the female knee.

Women account for almost two-thirds of knee arthroplasties (Kurtz et al. 2007). Recently, a possible effect of gender on functional outcomes and implant survivorship has been identified (Vincent et al. 2006, Ritter et al. 2008, Kamath et al. 2010, Parsley et al. 2010, O’Connor 2011). Gender differences in the anatomy of the distal femur are well documented (Conley et al. 2007, Yue et al. 2011a, b, Yan et al. 2012, Zeng et al. 2012). Women tend to have a less prominent anterior condyle (Conley et al. 2007, Fehring et al. 2009), a higher quadriceps angle (Q-angle) (Hsu et al. 1990, Woodland et al. 1992), and a reduced mediolateral to anteroposterior aspect ratio (Chin et al. 2002, Chaichankul et al. 2011). Investigators have found that standard unisex knee prostheses may not equally match the native anatomy in male and female knees (Clarke and Hentz 2008, Yan et al. 2012). A positive association between the femoral component size and the amount of overhang was observed in females, and femoral component overhang (≥ 3 mm) may result in postoperative knee pain or reduced ROM (Hitt et al. 2003, Lo et al. 2003, Mahoney et al. 2010).

The concept of gender-specific knee prostheses was introduced to match these 3 anatomic differences in the female population (Conley et al. 2007). It includes a narrower mediolateral diameter for a given anteroposterior dimension, to match the female knee more closely. Additionally, the anterior flange of the prosthesis was modified to include a recessed patellar sulcus and reduced anterior condylar height (to avoid “overstuffing” during knee flexion) and a lateralized patellar sulcus (to accommodate the increased Q-angle associated with a wider pelvis).

Several randomized controlled trials (RCTs) have failed to establish the superiority of the gender-specific prosthesis over the unisex knee prosthesis in the female knee (Kim et al. 2010a, b, Song et al. 2012a, Thomsen et al. 2012, von Roth et al. 2013). In contrast, other studies have found higher patient satisfaction and better radiographic fit in the gender-specific TKAs than in the standard unisex TKAs (Clarke and Hentz 2008, Parratte et al. 2011, Yue et al. 2014). We therefore performed a systematic review and meta-analysis to compare the clinical and radiographic results of TKA in female patients receiving gender-specific prostheses or standard unisex prostheses.

6.

Background and purpose

18F-FDG PET is a widely used tool for molecular imaging of oncological, cardiovascular, and neurological disorders. We evaluated 18F-FDG microPET as an implant osteomyelitis imaging tool using a Staphylococcus aureus-induced peroperative implant infection in rabbits.

Methods

Intramedullary titanium nails were implanted in contaminated and uncontaminated (control) proximal right tibiae of rabbits. Tibiae were quantitatively assessed with microPET for 18F-FDG uptake before and sequentially at 1, 3, and 6 weeks after surgery. Tracer uptake was assessed in soft tissue and bone in both treatment groups with an additional comparison between the operated and unoperated limb. MicroPET analysis was combined with radiographic assessment and complementary histology of the tibiae.

Results

At the first postoperative week, the 18F-FDG uptake in the contaminated implant group was significantly higher than the preoperative measurement, without a significant difference between the contaminated and uncontaminated tibiae. From the third postoperative week onward, 18F-FDG uptake allowed discrimination between osteomyelitis and postoperative aseptic bone healing, as well as quantification of the infection at distinct locations around the implant.

Interpretation

18F-FDG-based microPET imaging allows differentiation between deep infection and undisturbed wound healing after implantation of a titanium intramedullary nail in this rabbit model. Furthermore, our results indicate that 18F-FDG PET may provide a tool in human clinical diagnostics and for the evaluation of antimicrobial strategies in animal models of orthopedic implant infection.

With more prostheses and osteosyntheses being implanted every year and a suggested increase in infection rate, the absolute number of implant infections will increase (Dale et al. 2009, Acklin et al. 2011, Kurtz et al. 2012). Deep orthopedic implant infections are difficult to diagnose in the early postoperative weeks, while diagnosis of infection in this period is important for optimal treatment and implant survival. A specific diagnostic tool to monitor implant infections is therefore imperative.

Current diagnostics to detect orthopedic implant infections are based on clinical symptoms, hematological parameters, radiology, and nuclear scintigraphy. However, as in low-grade infections, changes in the early postoperative phase such as periosteal reactions and cortical thickening (Calhoun and Mader 1997, Smeltzer et al. 1997, Odekerken et al. 2013) or osteolysis and calcifications (Calhoun and Mader 1997, Smeltzer et al. 1997, Vogely et al. 2000, Odekerken et al. 2013) are not specific enough to differentiate between implant/soft tissue infection and aseptic wound problems. More discriminative power is needed to distinguish aseptic wound healing from bacterial infection and to follow implant infection quantitatively over time. 18F-fluorodeoxyglucose (18F-FDG) is widely used as a positron emission tomography (PET) tracer to diagnose and monitor several pathological conditions in the clinic (Stumpe et al. 2000, Toyama et al. 2004a,b, van der Bruggen et al. 2010, Huang et al. 2012, Marsboom et al. 2012). The use of 18F-FDG as a tracer is based on a local increase in metabolic turnover of glucose. Since the presence of bacteria and increased leukocyte infiltration in an infected area generates such a local increase in glucose turnover and leads to increased 18F-FDG uptake (Stumpe et al. 2000, Koort et al. 2004, Makinen et al. 2005a), local detection of bacterial infections is possible.

We hypothesized that 18F-FDG PET scanning would be capable of providing the discriminative power needed to distinguish aseptic wound healing from orthopedic implant infection. To test this hypothesis, we sequentially determined the development of implant osteomyelitis by 18F-FDG microPET scanning of contaminated and uncontaminated rabbit tibiae and explored its possible use in diagnosis of implant infection.
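One common way to quantify the kind of group separation described above is to express tracer uptake in the operated limb relative to the unoperated limb and compare contaminated and control animals at a given time point. The sketch below assumes hypothetical per-animal uptake values (for example, mean region-of-interest values from the microPET scans); it is not a re-analysis of the study data.

```python
# Sketch: compare week-3 FDG uptake ratios (operated / unoperated tibia)
# between contaminated and control rabbits. All values are hypothetical.
import numpy as np
from scipy.stats import mannwhitneyu

contaminated = {"operated": np.array([2.8, 3.1, 2.5, 3.4]),
                "unoperated": np.array([1.0, 1.1, 0.9, 1.2])}
control      = {"operated": np.array([1.4, 1.5, 1.3, 1.6]),
                "unoperated": np.array([1.0, 1.0, 0.9, 1.1])}

ratio_cont = contaminated["operated"] / contaminated["unoperated"]
ratio_ctrl = control["operated"] / control["unoperated"]

stat, p = mannwhitneyu(ratio_cont, ratio_ctrl, alternative="two-sided")
print(f"median uptake ratio: contaminated {np.median(ratio_cont):.2f}, "
      f"control {np.median(ratio_ctrl):.2f}")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")
```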

7.
Results

The greater the volume of the hospital, the shorter the average LOS and LUIC. Smaller hospital volume was not unambiguously associated with increased revision, re-admission, or MUA rates. The smaller the annual hospital volume, the more often patients were discharged home.

Interpretation

LOS and LUIC ought to be shortened in lower-volume hospitals. There is potential for a reduction in length of stay in extended institutional care facilities.

Total knee replacement (TKR) is one of the most common orthopedic procedures, and it is expected to increase markedly in volume (Kurtz et al. 2007). Due to the potentially severe complications and the high economic impact of the procedure, efforts to minimize the risks and optimize perioperative efficiency are important.

It has been suggested that increased hospital volume and reduction in length of stay (LOS) at the operating hospital after TKR are related, but there is no consensus (Yasunaga et al. 2009, Marlow et al. 2010, Paterson et al. 2010, Bozic et al. 2010, Styron et al. 2011). In addition, results on the association of hospital volume with re-admission rates (Soohoo et al. 2006b, Judge et al. 2006, Bozic et al. 2010, Cram et al. 2011) and revision risk have been inconclusive (Shervin et al. 2007, Manley et al. 2009, Bozic et al. 2010, Paterson et al. 2010). No one has studied the association between length of uninterrupted institutional care (LUIC), incidence of manipulation under anesthesia (MUA) after TKR, and hospital volume.

By combining 5 national-level registries, we examined possible associations between hospital volume and LOS, LUIC, discharge disposition, number of re-admissions within 14 and 42 days, MUA, and revisions after TKR for all knee arthroplasties performed in Finland between 1998 and 2010.
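The volume-outcome association summarized above can be illustrated by grouping hospitals into annual-volume categories and comparing LOS and LUIC across them. This is a sketch with a hypothetical data frame; the column names and category cut-offs are assumptions, not the registries' actual fields.

```python
# Sketch: median LOS and LUIC by annual hospital-volume category (hypothetical data).
import pandas as pd
from scipy.stats import kruskal

df = pd.DataFrame({
    "annual_volume_group": ["<100", "<100", "100-249", "100-249", ">=250", ">=250"] * 20,
    "los_days":  [9, 8, 7, 6, 5, 5] * 20,
    "luic_days": [12, 11, 9, 8, 7, 6] * 20,
})

# Median length of stay and length of uninterrupted institutional care per volume group
print(df.groupby("annual_volume_group")[["los_days", "luic_days"]].median())

# Non-parametric test for a difference in LOS across the volume groups
groups = [g["los_days"].values for _, g in df.groupby("annual_volume_group")]
h_stat, p = kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p:.4f}")
```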

8.

Purpose

We wanted to improve the diagnosis of implant-related infection using molecular biological techniques after sonication.

Methods

We studied 258 retrieved implant components (185 prosthetic implants and 73 osteosynthesis implants) from 126 patients. 47 patients had a clinical diagnosis of infection (108 components) and 79 patients did not (150 components). The fluids from sonication of retrieved implants were tested in culture and were also analyzed using a modified commercial PCR kit for detection of Gram-positive and Gram-negative bacteria (GenoType BC; Hain Lifescience) after extraction of the DNA.

Results

38 of 47 patients with a clinical diagnosis of infection were also diagnosed as being infected using culture and/or PCR (35 by culture alone). Also, 24 of the 79 patients with no clinical diagnosis of infection were identified microbiologically as being infected (4 by culture, 16 by PCR, and 4 by both culture and PCR). Comparing culture and PCR, positive culture results were obtained in 28 of the 79 patients and positive PCR results were obtained in 35. There were 21 discordant results in patients who were originally clinically diagnosed as being infected and 28 discordant results in patients who had no clinical diagnosis of infection.

Interpretation

For prosthetic joint infections, molecular detection can increase the number of patients diagnosed as having an infection by about one tenth relative to culture. Positive results from patients who have no clinical diagnosis of infection must be interpreted carefully.

Management of orthopedic implant-related infections starts with a proper etiological diagnosis, which is required for specific antibiotic treatment. Different approaches are used to obtain such a diagnosis (Trampuz et al. 2006, Del Pozo and Patel 2009) and these must take into account the importance of the development of bacterial biofilms in the pathogenesis and management of implant-related infections (Trampuz et al. 2003, 2006, Costerton 2005).

The use of low-intensity ultrasound to release biofilms is an alternative to classical culture methods from implants, and several protocols have been developed for this purpose (Trampuz et al. 2007, Dora et al. 2008, Esteban et al. 2008, Piper et al. 2009, Achermann et al. 2010). In these reports, the use of sonication of retrieved implants was reported to have similar sensitivity to or higher sensitivity than conventional techniques. Nevertheless, there are still patients with a clinical diagnosis of infection and negative cultures (Berbari et al. 2007). Previous use of antibiotics has been implicated as one of the main causes of this problem (Trampuz et al. 2007), but other causes are also possible. To solve the problem, molecular biological techniques have been proposed in order to obtain faster and more accurate results than conventional culture (Tunney et al. 1999, Sauer et al. 2005, Dempsey et al. 2007, Fihman et al. 2007, Moojen et al. 2007, Gallo et al. 2008, Kobayashi et al. 2008, Vandercam et al. 2008, De Man et al. 2009, Piper et al. 2009, Achermann et al. 2010, Riggio et al. 2010, Marin et al. 2012). Most of these reports were based on protocols that were developed in-house, which are difficult to integrate into clinical microbiology routines, even though they may give good results. Recently, however, commercial kits have been designed to work under common routine laboratory conditions. Here, we describe a study on the diagnosis of infection in a broad range of orthopedic implant-related infections, comparing conventional culture with detection of microbial DNA using a commercial kit, in both cases after sonication of retrieved implants.
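Because culture and PCR were applied to the same sonication fluids, their agreement can be examined with a paired 2x2 table and McNemar's test on the discordant cells. The sketch below uses hypothetical counts only to show the layout; it is not a re-analysis of the study's results.

```python
# Sketch: paired comparison of culture vs PCR positivity per patient (hypothetical counts).
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

#                  PCR +   PCR -
table = np.array([[30,      5],    # culture +
                  [12,     79]])   # culture -

result = mcnemar(table, exact=True)   # exact binomial test on the discordant cells
print(f"discordant pairs: culture+/PCR- = {table[0, 1]}, culture-/PCR+ = {table[1, 0]}")
print(f"McNemar p = {result.pvalue:.4f}")
```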

9.
Results

Unadjusted 10-year survival with the endpoint revision of any component for any reason was 92.1% (CI: 91.8–92.4). Unadjusted 10-year survival with the endpoint stem revision due to aseptic loosening varied between the stem brands investigated and ranged from 96.7% (CI: 94.4–99.0) to 99.9% (CI: 99.6–100). Stems both with and without HA coating were found among the brands with the best survival. The presence of HA coating was not associated with statistically significant effects on the adjusted risk of stem revision due to aseptic loosening, with an HR of 0.8 (CI: 0.5–1.3; p = 0.4). The adjusted risk of revision due to infection was similar in the groups of THAs using HA-coated and non-HA-coated stems, with an HR of 0.9 (CI: 0.8–1.1; p = 0.6) for the presence of HA coating. The commonly used Bimetric stem (n = 25,329) was available both with and without HA coating, and the adjusted risk of stem revision due to aseptic loosening was similar for the 2 variants, with an HR of 0.9 (CI: 0.5–1.4; p = 0.5) for the HA-coated Bimetric stem.

Interpretation

Uncemented HA-coated stems had similar results to those of uncemented stems with porous coating or rough sand-blasted stems. The use of HA coating on stems available both with and without this surface treatment had no clinically relevant effect on their outcome, and we thus question whether HA coating adds any value to well-functioning stem designs.

Hydroxyapatite (HA) is thought to improve early implant ingrowth and long-term stability in bone (Overgaard et al. 1997), and many stems intended for uncemented total hip arthroplasty (THA) are thus manufactured with HA coating. Several uncemented stems are only available with HA coating. Some HA-coated stems have excellent long-term outcomes in terms of the risk of revision, both for any reason and due to aseptic loosening (Capello et al. 2003, Shah et al. 2009). Registry data from Norway and Finland also indicate that certain HA-coated stems have excellent survivorship up to 10 years (Eskelinen et al. 2006, Hallan et al. 2007, Makela et al. 2008).

On the other hand, a number of studies on stem survival in the setting of randomized trials or smaller observational studies have failed to show beneficial effects of HA coating on clinical outcome and implant survival when compared to alternatives such as porous coating and sand-blasted rough surfaces (McPherson et al. 1995, Tanzer et al. 2001, Kim et al. 2003, Parvizi et al. 2004, Sanchez-Sotelo et al. 2004). Meta-analyses that have pooled data from randomized or cohort studies have come to the conclusion that there is “[…] no clinically beneficial effect to the addition of HA to porous coating alone in primary uncemented hip arthroplasty” (Gandhi et al. 2009, Li et al. 2013). In addition, a Danish registry analysis found that the use of HA coating does not reduce the risk of stem revision (Paulsen et al. 2007). Furthermore, a comparison of 4,772 uncemented Bimetric stems with or without HA coating implanted between 1992 and 2009 did not reveal any difference in survival between the 2 variants (Lazarinis et al. 2011).

HA was initially introduced as an implant coating to speed up and facilitate ongrowth and ingrowth of bone and thereby improve fixation, based on comprehensive preclinical and promising clinical documentation (Geesink et al. 1987, Bauer et al. 1991, Overgaard et al. 1997, Karrholm et al. 1998). Later on, concerns were raised due to findings of delamination and generation of HA particles originating from the coating with the potential to trigger osteolysis, acceleration of polyethylene wear, and subsequent implant loosening (Bloebaum and Dupont 1993, Morscher et al. 1998, Lazarinis et al. 2010). Today, there is renewed interest in HA coatings due to possible properties as a carrier for agents aimed at preventing infection (Ghani et al. 2012). Theoretical arguments for and against the use of HA coating can therefore be found. Given the renewed interest in uncemented stems, instigated by favorable outcomes after uncemented stem fixation in younger patients, the question of whether HA coating is beneficial or not is highly relevant (Eskelinen et al. 2006, Hooper et al. 2009, Swedish Hip Arthroplasty Register 2011). We therefore investigated uncemented stems with and without HA coating that are in frequent use in the Nordic countries, regarding early and long-term survival.
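The registry analysis described here rests on standard survival methods: Kaplan-Meier estimates of unadjusted implant survival and a Cox model giving an adjusted hazard ratio for HA coating. A minimal sketch with the lifelines package and a small hypothetical data frame (column names and values are assumptions, chosen only to illustrate the API calls) is shown below.

```python
# Sketch: Kaplan-Meier stem survival and a Cox-adjusted HR for HA coating.
# Toy data only; estimates from 8 rows are meaningless.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "years_to_event_or_censor": [10.0, 4.2, 8.5, 10.0, 2.1, 9.3, 10.0, 6.7],
    "revised": [0, 1, 0, 0, 1, 0, 0, 1],   # 1 = stem revision for aseptic loosening
    "ha_coated": [1, 1, 0, 0, 1, 0, 1, 0],
    "age_at_surgery": [61, 55, 70, 66, 49, 58, 63, 72],
})

kmf = KaplanMeierFitter()
kmf.fit(df["years_to_event_or_censor"], event_observed=df["revised"], label="all stems")
print(kmf.survival_function_.tail(1))      # unadjusted survival at end of follow-up

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_event_or_censor", event_col="revised")
cph.print_summary()                        # exp(coef) for ha_coated is the adjusted HR
```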

10.

Background and purpose

The natural history of, and predictive factors for, the outcome of cartilage restoration in chondral defects are poorly understood. We investigated the natural history of cartilage filling and subchondral bone changes, comparing defects at two locations in the rabbit knee.

Animals and methods

In New Zealand rabbits aged 22 weeks, a 4-mm pure chondral defect (ICRS grade 3b) was created in the patella of one knee and in the medial femoral condyle of the other. A stereo microscope was used to optimize the preparation of the defects. The animals were killed 12, 24, and 36 weeks after surgery. Defect filling and the density of subchondral mineralized tissue were estimated using Analysis Pro software on micrographed histological sections.

Results

The mean filling of the patellar defects was more than twice that of the medial femoral condylar defects at 24 and 36 weeks of follow-up. There was a statistically significant increase in filling from 24 to 36 weeks after surgery at both locations. The density of subchondral mineralized tissue beneath the defects decreased with time in the patellae, in contrast to the density in the medial femoral condyles, which remained unchanged.

Interpretation

The intraarticular location is a predictive factor for spontaneous filling and subchondral bone changes of chondral defects corresponding to ICRS grade 3b. Disregarding location, the spontaneous filling increased with long-term follow-up. This should be considered when evaluating aspects of cartilage restoration.

Focal articular cartilage injuries of the knee are common (Hjelle et al. 2002, Aroen et al. 2004) and they can impair patients' quality of life as much as severe osteoarthritis (Heir et al. 2010). The literature concerning the natural history of focal cartilage defects in patients, and the intrinsic factors affecting it, is limited (Linden 1977, Messner and Gillquist 1996, Drogset and Grontvedt 2002, Shelbourne et al. 2003, Loken et al. 2010). In experimental studies evaluating cartilage restoration in general, the importance of intrinsic factors such as the depth and size of the lesion and the time from when the lesion was made to evaluation has been emphasized (Shapiro et al. 1993, Hunziker 1999, Lietman et al. 2002). Which part of the joint is affected and whether or not the defect is weight-bearing are also of interest (Hurtig 1988, Frisbie et al. 1999). Most of these studies have, however, concerned defects penetrating the subchondral mineralized tissues corresponding to ICRS grade 4 (Brittberg and Winalski 2003). Access to bone marrow elements in these defects might be one of the strongest predictive factors for filling of the defect, making the importance of other factors difficult to evaluate (Hunziker 1999).

In experimental studies on pure chondral defects that do not penetrate the subchondral mineralized tissues, corresponding to ICRS grade 3b (Brittberg and Winalski 2003), the type of animal studied, the size of the lesion, and the location of the defects vary, and there are limited data on the influence of these parameters on outcome (Breinan et al. 2000). The information on spontaneous filling comes mainly from observations of untreated defects serving as controls (Grande et al. 1989, Brittberg et al. 1996, Breinan et al. 1997, 2000, Frisbie et al. 1999, 2003, Dorotka et al. 2005) and the information on subchondral bone changes is even more limited (Breinan et al. 1997, Frisbie et al. 1999). Although most human focal cartilage lesions are located on the medial femoral condyle (Aroen et al. 2004), there have been few experimental studies involving untreated ICRS grade 3b defects on the medial femoral condyle (Dorotka et al. 2005). According to a PubMed search, the rabbit knee is the most widely used experimental animal model for cartilage restoration (Årøen 2005). The locations of ICRS grade 3 chondral defects in the rabbit knee evaluated for spontaneous changes have included the patella (Grande et al. 1989, Brittberg et al. 1996) and, in one study, defects at the distal surface of the femur (Mitchell and Shepard 1976). The latter report did not, however, include quantitative data.

To our knowledge, the influence of the intraarticular location on the outcome of cartilage restoration and subchondral bone changes has not been thoroughly studied. Thus, the main purpose of our study was to test the hypothesis that the intraarticular location influences the spontaneous filling of a chondral defect that does not penetrate the subchondral bone. Secondly, we wanted to evaluate whether the intraarticular location would influence changes in the subchondral bone and degenerative changes as evaluated from macroscopic appearance and proteoglycan content of synovial fluid (Messner et al. 1993a).
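Defect filling on the micrographed sections was estimated with image-analysis software; the underlying computation is essentially the fraction of the original defect area that is occupied by repair tissue. The sketch below shows that calculation on binary masks, with synthetic arrays standing in for segmented histology images (a hedged illustration of the idea, not the study's actual measurement pipeline).

```python
# Sketch: percent filling of a chondral defect from two binary masks (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
defect_mask = np.zeros((200, 200), dtype=bool)
defect_mask[50:150, 60:160] = True                          # segmented original defect area

repair_mask = defect_mask & (rng.random((200, 200)) < 0.55)  # segmented repair tissue

filling_percent = 100.0 * repair_mask.sum() / defect_mask.sum()
print(f"defect filling: {filling_percent:.1f}%")
```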

11.

Background and purpose

Adequate depth of cement penetration and cement mantle thickness are important for the durability of cemented cups. A flanged cup, as opposed to an unflanged one, has been suggested to give a more uniform cement mantle and superior cement pressurization, thus improving the depth of cement penetration. This hypothesis was tested experimentally.

Materials and methods

The same cup design with and without flange (both without cement spacers) was investigated regarding intraacetabular pressure, cement mantle thickness, and depth of cement penetration. With machine control, the cups were inserted into open-pore ceramic acetabular models (10 flanged, 10 unflanged) and into paired cadaver acetabuli (10 flanged, 10 unflanged) with prior pressurization of the cement.

Results

No differences in intraacetabular pressures during cup insertion were found, but unflanged cups tended to migrate more towards the acetabular pole. Flanged cups resulted in thicker cement mantles because of less bottoming out, whereas no differences in cement penetration into the bone were observed.

Interpretation

Flanged cups do not generate higher cementation pressure or better cement penetration than unflanged cups. A possible advantage of the flange, however, may be to protect the cup from bottoming out, and there is possibly better closure of the periphery around the cup, sealing off the cement-bone interface.

The main causes of aseptic loosening are inadequate surgical technique and inferior prosthetic implants (Herberts and Malchau 2000). Sufficient cement penetration (3–5 mm) into cancellous bone and prevention of bottoming out of the cup, as seen from a uniform cement mantle that is at least 2 mm thick (i.e. cement penetration excluded), have been said to be crucial for cup fixation (Huiskes and Slooff 1981, Noble and Swarts 1983, Schmalzried et al. 1993, Mjöberg 1994, Ranawat et al. 1997, Lichtinger and Muller 1998). A clean bony surface with partly exposed cancellous bone together with cement pressurization before prosthetic implantation improves the depth of cement penetration, thus creating a stronger cement-bone interface (Krause et al. 1982, Rey, Jr. et al. 1987, Mann et al. 1997, Flivik et al. 2006, Abdulghani et al. 2007).

Absence of postoperative demarcation at the acetabular cement-bone interface has been related to a reduced risk of aseptic cup loosening (Ranawat et al. 1995, Garcia-Cimbrelo et al. 1997, Ritter et al. 1999, Flivik et al. 2005). The use of a flanged polyethylene cup has demonstrated both less postoperative demarcation at the above interface (Hodgkinson et al. 1993) and less loosening (Garellick et al. 2000). This may be due to its ability to increase cement pressurization at the time of implantation and thereby the depth of cement penetration, though conflicting experimental findings have been reported (Oh et al. 1985, Shelley and Wroblewski 1988, Parsch et al. 2004, Lankester et al. 2007). The previous studies addressing the use of flanged cups have all had cups inserted without prior pressurization of cement, and only Parsch et al. (2004) implanted the cup into a porous material (cadaveric bone).

Accordingly, we decided to compare the intraacetabular pressures, cement mantle thickness, and depth of cement penetration obtained using flanged and unflanged cups inserted in an open-pore ceramic acetabular model as well as in paired cadaveric acetabuli, using pressurization of the cement before implantation.

12.
Methods

Before surgery, hip pain (THA) or knee pain (TKA), lower-extremity muscle power, functional performance, and physical activity were assessed in a sample of 150 patients and used as independent variables to predict the outcome (dependent variable), readiness for hospital discharge, for each type of surgery. Discharge readiness was assessed twice daily by blinded assessors.

Results

Median discharge readiness and actual length of stay until discharge were both 2 days. Univariate linear regression followed by multiple linear regression revealed that age was the only independent predictor of discharge readiness in THA and TKA, but the standardized coefficients were small (≤ 0.03).

Interpretation

These results support the idea that fast-track THA and TKA with a length of stay of about 2–4 days can be achieved for most patients independently of preoperative functional characteristics.

Over the last decade, length of stay (LOS) with discharge to home after primary THA and TKA has declined from about 5–10 days to about 2–4 days in selected series and larger nationwide series (Malviya et al. 2011, Raphael et al. 2011, Husted et al. 2012, Kehlet 2013, Hartog et al. 2013, Jørgensen and Kehlet 2013). However, there is a continuing debate about whether selected patients only or all patients should be scheduled for “fast-track” THA and TKA in relation to psychosocial factors and preoperative pain and functional status (Schneider et al. 2009, Hollowell et al. 2010, Macdonald et al. 2010, Antrobus and Bryson 2011, Jørgensen and Kehlet 2013), or whether organizational or pathophysiological factors in relation to the surgical trauma may determine the length of stay (Husted et al. 2011, Husted 2012).

We studied the role of THA and TKA patients’ preoperative pain and functional characteristics in discharge from 2 orthopedic departments with well-established fast-track recovery regimens (Husted et al. 2010).
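The analysis outlined in the Methods (univariate screening followed by multiple linear regression, reported as standardized coefficients) can be sketched with statsmodels as below; the data frame, variable names, and simulated values are placeholders, not the study's dataset.

```python
# Sketch: standardized univariate and multivariable linear regression for
# discharge readiness (hypothetical data and variable names).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "readiness_days": rng.normal(2.5, 0.8, 150),
    "age": rng.normal(68, 9, 150),
    "pain_score": rng.normal(5, 2, 150),
    "muscle_power": rng.normal(1.2, 0.3, 150),
})

z = (df - df.mean()) / df.std()   # standardize so coefficients are comparable

# Univariate models, one candidate predictor at a time
for predictor in ["age", "pain_score", "muscle_power"]:
    model = sm.OLS(z["readiness_days"], sm.add_constant(z[predictor])).fit()
    print(predictor, round(model.params[predictor], 3), round(model.pvalues[predictor], 3))

# Multivariable model with all predictors entered together
X = sm.add_constant(z[["age", "pain_score", "muscle_power"]])
print(sm.OLS(z["readiness_days"], X).fit().summary())
```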

13.

Background and purpose

Computer navigation in total knee arthroplasty is somewhat controversial. We have previously shown that femoral component positioning is more accurate with computer navigation than with conventional implantation techniques, but the clinical impact of this is unknown. We now report the 5-year outcome of our previously reported 2-year outcome study.

Methods

78 of initially 84 patients (80 of 86 knees) were clinically and radiographically reassessed 5 (5.1–5.9) years after conventional, image-based, and image-free total knee arthroplasty. The methodology was identical to that used preoperatively and at 2 years, including the Knee Society score (KSS) and the functional score (FS), and AP and true lateral standard radiographs.

Results

Although femoral component positioning was more accurate in the navigated groups, clinical outcome, number of reoperations, KSS, FS, and range of motion were similar between the groups.

Interpretation

The increased costs and time for navigated techniques did not translate into better functional and subjective medium-term outcome compared to conventional techniques.

Abnormal wear patterns and component loosening are mainly the result of component malalignment, which, together with complications of the extensor mechanism, is among the most common reasons for early failure of TKA (Ritter et al. 1994, Rand et al. 2003, Vince 2003, Bathis et al. 2004). It has been suggested that a varus or valgus malalignment of more than 3° leads to faster wear and debris formation, followed by early failure of TKA (Ecker et al. 1987, Archibeck and White 2003, Nizard et al. 2004).

Several surgical navigation systems for TKA have been introduced to optimize component positioning (Delp et al. 1998, DiGioia et al. 1998, Krackow et al. 1999). It has been shown that navigation provides a more precise component positioning and fewer outliers (Bathis et al. 2004, Nabeyama et al. 2004, Stockl et al. 2004, Victor and Hoste 2004, Anderson et al. 2005, Zumstein et al. 2006). Nevertheless, comparing computer-navigated total knee arthroplasty with conventional implantation techniques, there is no evidence in the current literature of any significant improvement in clinical outcome and in component loosening (Bathis et al. 2004, Jenny et al. 2005, Yau et al. 2005, Bonutti et al. 2008, Molfetta and Caldo 2008).

In a prospective study involving 86 patients in 3 different groups (image-based navigation, image-free navigation, and conventional), we showed that femoral component positioning was more accurate with navigation than with conventional implantation techniques, but tibial positioning showed similar results (Zumstein et al. 2006).

Although other medium-term data on navigated total knee arthroplasty have already been reported (Ishida et al. 2011, Schmitt et al. 2011), there has been no prospective cohort series with reporting of the clinical, functional, and radiographic outcome with all 3 techniques: image-based navigated, image-free navigated, or conventional TKA. We therefore determined the clinical, functional, and radiographic 5-year results after each of the 3 techniques.
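Comparing KSS, FS, and ROM across the three implantation techniques at 5 years is a three-group comparison; a one-way ANOVA sketch with hypothetical scores is shown below (the study's own statistics are not reproduced).

```python
# Sketch: one-way ANOVA of 5-year Knee Society scores across the three groups
# (image-based navigation, image-free navigation, conventional). Hypothetical values.
from scipy.stats import f_oneway

kss_image_based  = [88, 92, 85, 90, 87, 91]
kss_image_free   = [86, 89, 90, 84, 92, 88]
kss_conventional = [87, 85, 91, 89, 86, 90]

f_stat, p = f_oneway(kss_image_based, kss_image_free, kss_conventional)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```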

14.

Background and purpose

Glenoid reconstruction and inverted glenoid re-implantation are strongly advocated in revisions of failed reverse shoulder arthroplasty (RSA). Nevertheless, severe glenoid deficiency may preclude glenoid reconstruction and may dictate less favorable solutions, such as conversion to hemiarthroplasty or resection arthroplasty. The CAD/CAM shoulder (Stanmore Implants, Elstree, UK), a hip arthroplasty-inspired implant, may facilitate glenoid component fixation in these challenging revisions where glenoid reconstruction is not feasible. We questioned (1) whether revision arthroplasty with the CAD/CAM shoulder would alleviate pain and improve shoulder function in patients with failed RSA not amenable to glenoid reconstruction, and (2) whether the CAD/CAM hip-inspired glenoid shell would enable secure and durable glenoid component fixation in these challenging revisions.

Patients and methods

11 patients with failed RSAs and unreconstructable glenoids underwent revision with the CAD/CAM shoulder and were followed up for a mean of 35 (28–42) months. Clinical outcomes included the Oxford shoulder score, subjective shoulder value, pain rating, physical examination, and shoulder radiographs.

Results

The average Oxford shoulder score and subjective shoulder value improved statistically significantly after the revision from 50 to 33 points and from 17% to 48% respectively. Pain rating at rest and during activity improved significantly from 5.3 to 2.3 and from 8.1 to 3.8 respectively. Active forward flexion increased from 25 to 54 degrees and external rotation increased from 9 to 21 degrees. 4 patients required reoperation for postoperative complications. No cases of glenoid loosening occurred.

Interpretation

The CAD/CAM shoulder offers an alternative solution for the treatment of failed RSA that is not amenable to glenoid reconstruction.

Reverse shoulder arthroplasty (RSA) has become an established treatment for painful and debilitating shoulder pathologies associated with rotator-cuff insufficiency (Boileau et al. 2005, 2006, Frankle et al. 2005). The preoperative condition of shoulders requiring RSA and the technically demanding nature of the procedure make RSA challenging, with an overall complication rate of 15–50% in recently reported series (Guery et al. 2006, Gerber et al. 2009, Kempton et al. 2011). Complications related to the glenoid component (e.g. loosening, mechanical baseplate failure, dissociation) have been reported in 4–16% of cases (Guery et al. 2006, Fevang et al. 2009, Farshad and Gerber 2010). Aseptic loosening is the most common glenoid-sided complication requiring revision following RSA (Fevang et al. 2009), and is often associated with considerable scapular bone loss (e.g. inferior scapular notching, glenoid deficiency after implant removal), which further complicates surgical revision (Antuna et al. 2001, Boileau et al. 2005, Elhassan et al. 2008, Gerber et al. 2009).

Re-implantation of a glenoid component has been found to provide better clinical results than conversion to hemiarthroplasty or resection arthroplasty in revisions of both anatomical (Antuna et al. 2001, Elhassan et al. 2008) and reverse shoulder arthroplasties (Farshad et al. 2012, Favard 2013), and it is strongly advocated. However, achievement of secure fixation of a glenoid implant may not be feasible in the presence of severe glenoid bone loss. Glenoid reconstruction with bone graft has been used to facilitate glenoid implant fixation in poor glenoid bone stock in primary shoulder arthroplasty (Hill and Norris 2001) and revision shoulder arthroplasty (Holcomb et al. 2009, Patel et al. 2012). The inconsistent clinical results and durability of fixation achieved with this technique have led to increasing interest in more reliable surgical alternatives for this challenging problem.

The CAD/CAM (computer-assisted design/computer-assisted manufacture) shoulder (Stanmore Implants, Elstree, UK) is a constrained hip arthroplasty-inspired shoulder implant that was designed to facilitate glenoid implant fixation by securing a large glenoid shell to the scapula around the deficient glenoid, rather than to the deficient glenoid itself. Unlike Grammont-type implants, the CAD/CAM shoulder has an increased glenohumeral offset (less medialized implant), which has been shown to improve rotational movements of the shoulder (by recruiting anterior and posterior deltoid fibers and re-tensioning of the remaining rotator cuff) and to minimize scapular notching (Holcomb et al. 2009, Valenti et al. 2011).

The purpose of this study was (1) to determine whether revision arthroplasty with the CAD/CAM shoulder would alleviate pain and improve shoulder function in patients with failed RSA and severe glenoid deficiency that is not amenable to reconstruction and inverted glenoid re-implantation; and (2) to determine whether the CAD/CAM hip-inspired glenoid shell would enable secure and durable glenoid component fixation in these challenging revisions. To our knowledge, no previous study has evaluated the use of such implants in revision surgery for failed glenoid-deficient RSA.
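With 11 patients and pre- and postoperative scores, the improvements reported above correspond to paired comparisons. The sketch below applies a Wilcoxon signed-rank test to hypothetical Oxford shoulder scores (in the scoring version consistent with the reported drop from about 50 to 33, a lower score indicates a better outcome); the numbers are illustrative only.

```python
# Sketch: paired pre/post comparison of Oxford shoulder scores in a small series
# (hypothetical values for 11 patients; lower score = better shoulder here).
import numpy as np
from scipy.stats import wilcoxon

pre  = np.array([52, 48, 55, 49, 51, 47, 53, 50, 46, 54, 50])
post = np.array([35, 30, 40, 28, 36, 31, 38, 33, 27, 39, 32])

stat, p = wilcoxon(pre, post)
print(f"median change: {np.median(post - pre):.0f} points")
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```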

15.

Background and purpose

Alignment of the glenoid component with the scapula during total shoulder arthroplasty (TSA) is challenging due to glenoid erosion and lack of both bone stock and guiding landmarks. We determined the extent to which the implant position is governed by the preoperative erosion of the glenoid. Also, we investigated whether excessive erosion of the glenoid is associated with perforation of the glenoid vault.

Methods

We used preoperative and postoperative CT scans of 29 TSAs to assess version, inclination, rotation, and offset of the glenoid relative to the scapula plane. The position of the implant keel within the glenoid vault was classified into three types: centrally positioned, component touching vault cortex, and perforation of the cortex.

Results

Preoperative glenoid erosion was statistically significantly linked to the postoperative placement of the implant regarding all position parameters. Retroversion of the eroded glenoid was on average 10° (SD 10) and retroversion of the implant after surgery was 7° (SD 11). The implant keel was centered within the vault in 7 of 29 patients and the glenoid vault was perforated in 5 patients. Anterior cortex perforation was most frequent and was associated with severe preoperative posterior erosion, causing implant retroversion.

Interpretation

The position of the glenoid component reflected the preoperative erosion and “correction” was not a characteristic of the reconstructive surgery. Severe erosion appears to be linked to vault perforation. If malalignment and perforation are associated with loosening, our results suggest reorientation of the implant relative to the eroded surface.

Based on 2,540 shoulder arthroplasties, Bohsali et al. (2006) reported the aseptic loosening rate to be 39%. Many other studies have shown that implant malalignment may cause high radiographic loosening rates (Franklin et al. 1988, Nyffeler et al. 2003, Farron et al. 2006, Habermeyer et al. 2006, Hopkins et al. 2007, Shapiro et al. 2007).

Glenoid implant positioning is a challenging procedure. Reasons include poor intraoperative glenoid exposure, lack of reference landmarks, and the surgeon being (mis)guided by the orientation of the eroded glenoid surface. Friedman et al. (1992) and Walch et al. (1999) found that due to osteoarthritic erosions, the preoperative glenoid was retroverted by more than 10°. Walch et al. (1999) observed that in 24% of total shoulder arthroplasties (TSAs) the preoperative retroversion was excessive due to arthritic changes showing on average 23° of retroversion. It seems likely that such deformed glenoid bone will cause malpositioning of glenoid implants. A particular consequence of this is that erosion may lead to an implant position that perforates the glenoid vault (Yian et al. 2005).

In anatomical studies, normal glenoid version has been found to vary within a range of about 20°, with an average retroversion of 1–2° (Churchill et al. 2001, Kwon et al. 2005, Codsi et al. 2008). Without knowing the patient's native version, the aim of TSA is to position the prosthesis in a neutral orientation, correcting for pre-existing erosion of the glenoid when possible.

We hypothesized that in routine surgical practice, the position of the implant is determined by the preoperative orientation of the glenoid and surgery does not achieve neutral positioning. A second hypothesis was that excessive erosion of the glenoid would be associated with perforation of the glenoid vault by the implant, which may have important implications for the success of the arthroplasty.
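Glenoid version on CT is commonly measured with the Friedman method cited above: the angle between the glenoid line (anterior to posterior rim) and the perpendicular to the scapular axis. The sketch below works from hypothetical 2-D landmark coordinates; it is a geometric illustration only, and the sign convention (ante- versus retroversion) depends on laterality and landmark ordering, as noted in the comments.

```python
# Sketch: Friedman-style glenoid version from landmarks on one axial CT slice
# (hypothetical coordinates in mm; not taken from the study).
import numpy as np

def angle_deg(u, v):
    """Unsigned angle between two 2-D vectors, in degrees."""
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

medial_scapula = np.array([0.0, 0.0])     # tip of the medial scapular border
glenoid_center = np.array([95.0, 4.0])    # center of the glenoid fossa
anterior_rim   = np.array([98.0, 18.0])
posterior_rim  = np.array([88.0, -14.0])

scapular_axis = glenoid_center - medial_scapula    # Friedman line
glenoid_line  = anterior_rim - posterior_rim

# Version = deviation of the glenoid line from perpendicular to the scapular axis.
# Whether a positive value means retro- or anteversion depends on side and axis orientation.
version = 90.0 - angle_deg(glenoid_line, scapular_axis)
print(f"glenoid version: {version:.1f} degrees")
```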

16.

Background

Metal-on-metal (MOM) total hip arthroplasties were reintroduced because of the problems with osteolysis and aseptic loosening related to polyethylene wear of early metal-on-polyethylene (MOP) arthroplasties. The volumetric wear rate has been greatly reduced with MOM arthroplasties; however, because of nano-sized wear particles, the absolute number of particles has greatly increased. Thus, a source of metal ion exposure with the potential to sensitize patients is present. We hypothesized that higher amounts of wear particles result in increased release of metal ions and ultimately lead to an increased incidence of metal allergy.

Methods

52 hips in 52 patients (median age 60 (51–64) years, 30 women) were randomized to either a MOM hip resurfacing system (ReCap) or a standard MOP total hip arthroplasty (Mallory Head/Exeter). Spot urine samples were collected preoperatively, postoperatively, after 3 months, and after 1, 2, and 5 years and tested with inductively coupled plasma-sector field mass spectrometry. After 5 years, hypersensitivity to metals was evaluated by patch testing and lymphocyte transformation assay. In addition, the patients answered a questionnaire about hypersensitivity.

Results

A statistically significant 10- to 20-fold increase in urinary levels of cobalt and chromium was observed throughout the entire follow-up in the MOM group. The prevalence of metal allergy was similar between groups.

Interpretation

Although we observed significantly increased levels of metal ions in the urine during the entire follow-up period, no difference in the prevalence of metal allergy was observed between the groups. However, the effect of long-term metal exposure remains uncertain.

In the 1960s and 1970s, the articulations of hip implants were mainly metal-on-metal (MOM). The implants released cobalt, chromium, and nickel, which could be found in high levels in the blood, hair, and urine (Coleman et al. 1973, Benson et al. 1975, Elves et al. 1975, Gawkrodger 2003). Furthermore, the patients became sensitized to the metals released, and an association with early loosening was observed (Coleman et al. 1973, Benson et al. 1975, Elves et al. 1975, Gawkrodger 2003, Jacobs et al. 2009). Gradually, MOM implants were abandoned, and the work by Sir John Charnley with the metal-on-polyethylene (MOP) bearing advanced hip replacement substantially. However, the MOM articulation was reintroduced in the 1990s, as it became clear that polyethylene debris caused osteolysis, which was a significant clinical issue—especially in young and active patients (Marshall et al. 2008). The MOM hip resurfacing system has been proposed to have advantages such as enhanced longevity (Chan et al. 1999, Sieber et al. 1999, Firkins et al. 2001), enhanced implant fixation (Grigoris et al. 2006), a lower dislocation rate (Scifert et al. 1998), better reproduction of hip mechanics, and more native femoral shaft bone stock left for revision surgery (Shimmin et al. 2008).

MOM articulations have greatly reduced the volumetric wear rate of hip prostheses; however, because the metal wear particles are nano-sized, the absolute number of wear particles has greatly increased (Doorn et al. 1998, Chan et al. 1999, Sieber et al. 1999, Firkins et al. 2001, Rieker and Kottig 2002). Also, Hallab et al. (2004) suggested that the prevalence of metal allergy could be higher in patients with implant failure. In both cases, a source of metal ion exposure with the potential to sensitize patients is present, but the long-term biological effect of the metal wear debris remains unknown.

Metal hypersensitivity is a well-established and common phenomenon, affecting about 10–15% of the general population (Thyssen and Menne 2010). Metal allergy can develop after prolonged or repeated cutaneous exposure to metal, usually from consumer products. Affected individuals typically suffer from allergic contact dermatitis and react with cutaneous erythema, papules, and vesicles after skin contact. This reaction is categorized as a type-4, T-cell-mediated hypersensitivity reaction. Metal hypersensitivity may also develop following internal exposure to metal-releasing implants. Theoretically, metal hypersensitivity could lead to a powerful reaction to prosthesis implantation (Pandit et al. 2008).

We hypothesized that an increased number of wear particles from MOM hip resurfacing arthroplasty (HRA) would lead to increased blood levels and urinary excretion of metal ions, and ultimately to an increased prevalence of metal allergy.
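As an illustration of the kind of between-group comparison reported above, the sketch below contrasts urinary cobalt levels in a MOM group and a MOP group with a non-parametric test. The values are invented and the specific test is an assumption for the example, not necessarily the study's statistical method:

```python
# Illustrative example only: comparing urinary cobalt between a MOM group and
# a MOP group with a non-parametric test. Values (µg/L) are invented, and the
# choice of test is an assumption for the sketch.
from scipy.stats import mannwhitneyu

urinary_co_mom = [1.8, 2.5, 3.1, 4.0, 2.2, 5.6, 3.8, 2.9]
urinary_co_mop = [0.20, 0.10, 0.30, 0.15, 0.25, 0.20, 0.10, 0.30]

stat, p = mannwhitneyu(urinary_co_mom, urinary_co_mop, alternative="two-sided")
fold = (sum(urinary_co_mom) / len(urinary_co_mom)) / (sum(urinary_co_mop) / len(urinary_co_mop))
print(f"U = {stat}, p = {p:.4g}, mean fold difference ≈ {fold:.0f}")
```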

17.

Background and purpose

Length of stay (LOS) following total hip and knee arthroplasty (THA and TKA) has been reduced to about 3 days in fast-track setups with functional discharge criteria. Earlier studies have identified patient characteristics predicting LOS, but little is known about specific reasons for being hospitalized following fast-track THA and TKA.

Patients and methods

To determine clinical and logistical factors that keep patients in hospital for the first postoperative 24–72 hours, we performed a cohort study of consecutive, unselected patients undergoing unilateral primary THA (n = 98) or TKA (n = 109). Median length of stay was 2 days. Patients were operated with spinal anesthesia and received multimodal analgesia with paracetamol, a COX-2 inhibitor, and gabapentin—with opioid only on request. Fulfillment of functional discharge criteria was assessed twice daily and specified reasons for not allowing discharge were registered.

Results

Pain, dizziness, and general weakness were the main clinical reasons for being hospitalized at 24 and 48 hours postoperatively, while nausea, vomiting, confusion, and sedation delayed discharge only to a minimal extent. Waiting for blood transfusion (when needed), for the start of physiotherapy, and for postoperative radiographic examination delayed discharge in one fifth of the patients.

Interpretation

Future efforts to enhance recovery and reduce length of stay after THA and TKA should focus on analgesia, prevention of orthostatism, and rapid recovery of muscle function.

Total hip and total knee arthroplasty (THA and TKA) are frequent operations, with an average length of stay (LOS) of about 6–12 days in the United Kingdom, Germany, and Denmark (Husted et al. 2006, Bundesauswertung 2009, NHS 2010).

During the last decade, however, there has been increased interest in optimal multimodal perioperative care to enhance recovery (the fast-track methodology). Improvement of analgesia; reduction of surgical stress responses and organ dysfunctions including nausea, vomiting, and ileus; early mobilization; and oral nutrition have been of particular interest (Kehlet 2008, Kehlet and Wilmore 2008). These principles have also been applied to THA and TKA, resulting in improvements in pain treatment with multimodal opioid-sparing regimens including a local anesthetic infiltration technique (LIA) or peripheral nerve blocks to facilitate early mobilization (Ilfeld et al. 2006a, b, 2010a, Andersen et al. 2008, Kerr and Kohan 2008), and allowing functional rehabilitation to be initiated a few hours postoperatively (Holm et al. 2010)—ultimately leading to a reduction in LOS (Husted et al. 2008, Barbieri et al. 2009, Husted et al. 2010a, b). Using these evidence-based regimens combined with an improved logistical setup, LOS has been reduced to about 2–4 days (Kerr and Kohan 2008, Husted et al. 2010a, b, c, Lunn et al. 2011).

Having well-defined functional discharge criteria is imperative in order to ensure a safe discharge—and it is mandatory if meaningful comparison of LOS is to be done following alterations in the track (Husted et al. 2008). In the same fast-track setting, an earlier study focused on patient characteristics predicting LOS (Husted et al. 2008). However, little is known about the specific reasons why patients remain hospitalized during the first 1–3 days after THA or TKA; i.e. why can patients not be discharged?

We therefore analyzed the clinical and organizational factors responsible for patients being hospitalized in a well-defined prospective setup in a fast-track unit. This unit had previously documented a LOS of about 2–3 days (Andersen et al. 2008, Holm et al. 2010, Husted et al. 2010b, c, Lunn et al. 2011).
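The registration described above (twice-daily assessment of discharge criteria with specified reasons recorded) lends itself to a simple tabulation by time point. Below is a minimal Python sketch with invented records; the column and category names are assumptions for the example, not the unit's actual registration scheme:

```python
# Illustrative example only: tallying registered reasons for not fulfilling the
# functional discharge criteria at each assessment time. Column names and
# categories are assumptions for the sketch, not the unit's registration scheme.
import pandas as pd

assessments = pd.DataFrame({
    "hours_postop": [24, 24, 24, 48, 48, 48, 48, 72],
    "reason": ["pain", "dizziness", "awaiting radiograph",
               "pain", "general weakness", "awaiting transfusion",
               "pain", "dizziness"],
})

tally = (assessments
         .groupby(["hours_postop", "reason"])
         .size()
         .unstack(fill_value=0))
print(tally)  # rows: assessment time (h); columns: number of patients per reason
```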

18.

Background and purpose

Sclerostin is produced by osteocytes and is an inhibitor of bone formation. Thus, inhibition of sclerostin by a monoclonal antibody increases bone formation and improves fracture repair. Sclerostin expression is upregulated in unloaded bone and is downregulated by loading. We wanted to determine whether an anti-sclerostin antibody would stimulate metaphyseal healing in unloaded bone in a rat model.

Methods

10-week-old male rats (n = 48) were divided into 4 groups of 12. In 24 rats, the right hind limb was unloaded by paralyzing the calf and thigh muscles with an injection of botulinum toxin A (Botox). 3 days later, all animals had a steel screw inserted into the right proximal tibia. Starting 3 days after screw insertion, either anti-sclerostin antibody (Scl-Ab) or saline was given twice weekly. The other 24 rats did not receive Botox injections and were treated with Scl-Ab or saline, serving as normally loaded controls. Screw pull-out force was measured 4 weeks after insertion as an indicator of the regenerative response of bone to trauma.

Results

Unloading reduced the pull-out force. Scl-Ab treatment increased the pull-out force, both with and without unloading. The response to the antibody was similar in both groups, and no statistically significant interaction between unloading and antibody treatment was found. The cancellous bone at a distance from the screw showed changes in bone volume fraction that followed the same pattern as the pull-out force.

Interpretation

Scl-Ab increases bone formation and screw fixation to a similar degree in loaded and unloaded bone.

The secreted glycoprotein sclerostin is the product of the SOST gene. Sclerostin is an important negative regulator of bone, and naturally occurring mutations of the SOST gene in humans lead to the high bone mass condition sclerosteosis (Balemans et al. 2001, Brunkow et al. 2001). This high bone mass phenotype is also present in animal models of SOST deficiency (Li et al. 2008). Sclerostin exerts its function, in part, by inhibiting canonical Wnt signaling (Li et al. 2005). This signaling is important for osteoblast differentiation (Galli et al. 2010) and also for bone healing and regeneration (Chen et al. 2007, Kim et al. 2007). The SOST gene is expressed almost exclusively in osteocytes (Poole et al. 2005), and sclerostin expression is thought to be a means for osteocytes to locally regulate bone formation (Galli et al. 2010). Sclerostin appears to be vital for bone to respond to mechanical loading (Robling et al. 2008), and lack of sclerostin prevents osteopenia due to unloading (Lin et al. 2009). One therapeutic option has been to block sclerostin with an antibody. Such treatment has increased bone mass in animal models of postmenopausal osteoporosis (Li et al. 2009) and disuse-induced bone loss (Tian et al. 2011), and in gonad-intact aged male rats and non-human primates (Li et al. 2010, Ominsky et al. 2010). Furthermore, fracture healing has been found to be improved in rodents and non-human primates treated with an anti-sclerostin antibody (Agholme et al. 2010, Ominsky et al. 2011).

We have previously shown that inhibition of sclerostin improves bone regeneration and implant fixation under normal loading conditions (Agholme et al. 2010). However, in contrast to laboratory animals, many patients do not bear weight on fractured limbs for a long time. It is therefore important to determine the effect of sclerostin inhibition on bone healing under unloaded conditions.

Paralysis of hind limb muscles using botulinum toxin A (Botox) causes rapid bone loss due to reduced weight bearing (Chappard et al. 2001, Warner et al. 2006). We examined the effect of sclerostin inhibition on metaphyseal bone healing in a rat model with Botox injections. Fully weight-bearing animals were included as controls.
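The 2 × 2 design described above (unloading yes/no crossed with Scl-Ab yes/no) is naturally analyzed with a two-factor model that includes an interaction term. The sketch below is a minimal Python example with invented pull-out forces; it illustrates the type of analysis, not the study's actual statistics:

```python
# Illustrative example only: a 2 x 2 analysis of pull-out force with unloading
# (Botox) and antibody treatment (Scl-Ab) as factors, including their
# interaction. The forces (N) are invented for demonstration.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

data = pd.DataFrame({
    "botox": ["yes"] * 6 + ["no"] * 6,
    "sclab": (["yes"] * 3 + ["no"] * 3) * 2,
    "pullout_N": [55, 60, 58, 35, 32, 38,     # unloaded limbs
                  80, 85, 78, 60, 62, 58],    # normally loaded limbs
})

model = smf.ols("pullout_N ~ C(botox) * C(sclab)", data=data).fit()
print(anova_lm(model, typ=2))
# The C(botox):C(sclab) row tests whether the antibody effect depends on loading;
# a non-significant interaction corresponds to a similar response in both groups.
```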

19.

Background and purpose

Obesity is a risk factor for osteoarthritis in the lower limb, yet the cardiovascular risks associated with obesity in hip or knee replacement surgery are unknown. We examined associations between body mass index (BMI) and the risk of a major adverse cardiovascular event (MACE: ischemic stroke, acute myocardial infarction, or cardiovascular death) or the risk of all-cause mortality in a nationwide Danish cohort of patients who underwent primary hip or knee replacement surgery.

Methods

Using Danish nationwide registries, we identified 34,744 patients aged ≥ 20 years who underwent elective primary hip or knee replacement surgery between 2005 and 2011. We used multivariable Cox regression models to calculate the 30-day risks of MACE and mortality associated with 5 BMI groups: underweight (BMI < 18.5 kg/m²), normal weight (18.5–24 kg/m²), overweight (25–29 kg/m²), obese 1 (30–34 kg/m²), and obese 2 (≥ 35 kg/m²).

Results

In total, 232 patients (0.7%) had a MACE and 111 (0.3%) died. Compared with the overweight group, adjusted hazard ratios (HRs) for MACE were 1.2 (95% CI: 0.4–3.3) for underweight, 1.3 (0.95–1.8) for normal weight, 1.6 (1.1–2.2) for obese 1, and 1.0 (0.6–1.9) for obese 2. For mortality, the corresponding HRs were 7.0 (2.8–15), 2.0 (1.2–3.2), 1.5 (0.9–2.7), and 1.9 (0.9–4.2). Cubic splines suggested a significant U-shaped relationship between BMI and risk, with a nadir around a BMI of 27–28.

Interpretation

In an unselected cohort of patients undergoing elective primary hip or knee replacement surgery, U-shaped risks of perioperative MACE and mortality were found in relation to BMI. Patients within the extreme ranges of BMI may warrant further attention.

Obesity is one of the most prominent risk factors for the development and progression of osteoarthritis in the lower limb, especially in the knee (Felson et al. 1988, Sturmer et al. 2000). As a result, overweight people are overrepresented among patients undergoing joint replacement surgery (Bostman 1994, Cooper et al. 1998, Karlson et al. 2003, Jain et al. 2005). With an increasing proportion of elderly people and the high prevalence of overweight and obesity in the general population, the demand for joint replacement surgery is expected to rise (Kurtz et al. 2007). Considerable risks of peroperative and postoperative complications have been reported for obese patients undergoing hip or knee replacement surgery (Winiarsky et al. 1998, Foran et al. 2004a, Schwarzkopf et al. 2012), although with conflicting results (Pritchett and Bortel 1991, Griffin et al. 1998, Hawker et al. 1998, Winiarsky et al. 1998, Spicer et al. 2001, Foran et al. 2004a, b, Flegal et al. 2005, Davis et al. 2011). The majority of previous studies have focused on orthopedic-related outcomes, e.g. risks of infection and prosthesis dislocation (Smith et al. 1992, Griffin et al. 1998, Deshmukh et al. 2002, Amin et al. 2006, Hamoui et al. 2006). Major surgical procedures, including joint replacement surgery, also carry a significant risk of adverse cardiovascular events and mortality. Previous research has suggested that obesity may increase perioperative cardiovascular and mortality risks, but it has not concentrated specifically on elective hip and knee replacement surgery (Bamgbade et al. 2007).

We therefore evaluated the relationship between body mass index (BMI) and perioperative cardiovascular events and mortality, as well as 1-year mortality, in patients undergoing elective total hip or knee replacement in a nationwide setting. We hypothesized that obese patients would have a higher risk of adverse cardiovascular events than patients who were not obese.
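As an illustration of the modeling approach described above, the sketch below fits a Cox proportional-hazards model for 30-day MACE with BMI group as a categorical covariate and the overweight group as reference. It uses simulated data and the lifelines library; the variable names and numbers are assumptions for the example, not the study's data or code:

```python
# Illustrative example only: Cox proportional-hazards regression for 30-day MACE
# with BMI group as a categorical covariate (overweight as reference), using
# simulated data and the lifelines library.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
bmi_group = rng.choice(
    ["underweight", "normal", "overweight", "obese1", "obese2"],
    size=n, p=[0.08, 0.27, 0.35, 0.20, 0.10])

df = pd.DataFrame({
    "followup_days": rng.integers(1, 31, size=n),   # time to MACE or censoring
    "mace": rng.binomial(1, 0.15, size=n),          # 1 = event within 30 days
    "age": rng.normal(68, 10, size=n),
    "bmi_group": bmi_group,
})

# Dummy-code BMI with 'overweight' as the reference category
dummies = (pd.get_dummies(df["bmi_group"], prefix="bmi")
             .drop(columns="bmi_overweight")
             .astype(float))
model_df = pd.concat([df[["followup_days", "mace", "age"]], dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_days", event_col="mace")
cph.print_summary()  # exp(coef) gives adjusted hazard ratios versus overweight
# With purely simulated data the hazard ratios are uninformative; the point is
# only to show the structure of the model.
```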

20.

Background and purpose

A considerable number of patients who undergo surgery for spinal stenosis have residual symptoms, inferior function, and reduced health-related quality of life after surgery. There have been few studies on factors that may predict outcome. We searched for predictors of outcome in surgery for spinal stenosis among patient-related and imaging-related factors.

Patients and methods

109 patients in the Swedish Spine Register with central spinal stenosis who were operated on by decompression without fusion were prospectively followed up 1 year after surgery. Clinical outcome scores included the EQ-5D, the Oswestry disability index, self-estimated walking distance, and leg and back pain levels (VAS). Central dural sac area, number of levels with stenosis, and spondylolisthesis were included in the MRI analysis. Multivariable analyses were performed to search for correlations between patient-related and imaging factors and clinical outcome at the 1-year follow-up.

Results

Several factors were statistically significant predictors of outcome. Duration of leg pain exceeding 2 years predicted inferior outcome in terms of leg and back pain, function, and HRQoL. Regular and intermittent preoperative users of analgesics had higher levels of back pain at follow-up than those not using analgesics. Low preoperative function predicted low function and dissatisfaction at follow-up. Low preoperative EQ-5D scores predicted a high degree of leg and back pain. A narrow dural sac area predicted greater improvement in back pain at follow-up and lower absolute leg pain.

Interpretation

Multiple factors predict outcome in spinal stenosis surgery, most importantly duration of symptoms and preoperative function. Some of these are modifiable and can be targeted. Our findings can be used in preoperative patient information and can aid the surgeon and the patient in a shared decision-making process.

Decompressive surgery for lumbar spinal stenosis is the most frequently performed spine operation in many countries (Weinstein et al. 2006, Strömqvist et al. 2009). However, one third of patients are not satisfied with the outcome because of residual leg and back pain, inferior function, and poor health-related quality of life (Katz et al. 1995, Airaksinen et al. 1997, Jönsson et al. 1997, Jansson et al. 2009, Strömqvist et al. 2009, Hara et al. 2010).

2 recent randomized studies have shown surgery to be superior to nonoperative treatment in lumbar spinal stenosis (Malmivaara et al. 2007, Weinstein et al. 2008), but many patients improve without surgical treatment (Malmivaara et al. 2007). The question remains as to who benefits most from surgery. Identification of prognostic factors that can aid in the selection of patients for surgery is therefore important. Prognostic factors in lumbar spinal stenosis surgery have been studied, but they are not well defined (Turner et al. 1992, Aalto et al. 2006). Aalto et al. (2006) reviewed studies of lumbar spinal stenosis surgery and found that only 21 of 885 studies were of sufficient quality to merit identification of prognostic factors. The main reasons for exclusion were a retrospective study design and a limited number of predictors. Cardiovascular and overall comorbidity, disorders influencing walking ability, self-rated health, income, severity of central stenosis, and severity of scoliosis were found to be predictors of outcome, but no single study could identify more than one of these predictors. More recently, smoking, depression, psychiatric illness, and high body mass index have been found to be predictive of negative outcome, as have long duration of symptoms and preoperative resting numbness (Ng et al. 2007, Hara et al. 2010, Athiviraham et al. 2011, Radcliff et al. 2011, Sandén et al. 2011, Sinikallio et al. 2011).

Cross-sectional imaging (most often MRI) has an important role in confirming the diagnosis of spinal stenosis and is essential for surgical planning. Even so, the prognostic value of the narrowness of the dural sac area is not well established (Jönsson et al. 1997, Amundsen et al. 2000, Yukawa et al. 2002). Studies incorporating both imaging and patient-related factors in a systematic way have been exceedingly rare (Amundsen et al. 2000, Yukawa et al. 2002).

We used patient data from the Swedish Spine Register protocol (Strömqvist et al. 2009) and MRI measurements of central dural sac area, multilevel stenosis, and spondylolisthesis to find predictors of outcome in terms of function, HRQoL, and leg and back pain after decompression for lumbar spinal stenosis.
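As an illustration of the type of multivariable analysis used to search for such predictors, the sketch below regresses a 1-year outcome on a few patient-related and MRI-related factors. It is a minimal Python example with simulated data; the variable names are assumptions for the illustration, not the register's actual variables:

```python
# Illustrative example only: multivariable regression of a 1-year outcome on
# patient-related and MRI-related factors, using simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 109
df = pd.DataFrame({
    "leg_pain_1yr_vas": rng.uniform(0, 100, n),          # outcome at 1 year
    "leg_pain_gt_2yr": rng.integers(0, 2, n),            # leg pain duration > 2 years
    "preop_eq5d": rng.uniform(0, 1, n),
    "regular_analgesics": rng.integers(0, 2, n),
    "dural_sac_area_mm2": rng.normal(75, 25, n),
    "levels_with_stenosis": rng.integers(1, 4, n),
})

model = smf.ols(
    "leg_pain_1yr_vas ~ leg_pain_gt_2yr + preop_eq5d + regular_analgesics"
    " + dural_sac_area_mm2 + levels_with_stenosis",
    data=df,
).fit()
print(model.summary())  # coefficients indicate direction and strength of each predictor
```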
