Similar articles
1.

Background and purpose

Metal artifact reduction sequence (MARS) MRI is widely advocated for surveillance of metal-on-metal hip arthroplasties (MOM-HAs). However, its use is limited by susceptibility artifact at the prosthesis-bone interface, local availability, patient compliance, and cost (Hayter et al. 2011a). We wanted to determine whether CT is a suitable substitute for MARS MRI in evaluation of the painful MOM-HA.

Patients and methods

50 MOM-HA patients (30 female) with unexplained painful prostheses underwent MARS MRI and CT imaging. 2 observers, blinded to the clinical data, reported the following outcomes: soft tissue lesions (pseudotumors), muscle atrophy, and acetabular and femoral osteolysis. Diagnostic test characteristics were calculated.

Results

Pseudotumor was diagnosed in 25 of 50 hips by MARS MRI and in 11 of 50 by CT. Pseudotumors were classified as type 1 (n = 2), type 2A (n = 17), type 2B (n = 4), and type 3 (n = 2) by MARS MRI. CT did not permit pseudotumor classification. The sensitivity of CT for diagnosis of pseudotumor was 44% (95% CI: 25–65). CT had “slight” agreement with MARS MRI for quantification of muscle atrophy (κ = 0.23, CI: 0.16–0.29; p < 0.01). Osteolysis was identified in 15 of 50 patients by CT. 4 of these lesions were identified by MARS MRI.
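The diagnostic test characteristics above can be reproduced from the reported counts. A minimal sketch (Python, standard library only): the 44% sensitivity implies 11 true positives among the 25 MRI-positive hips. The Wilson score interval used here is an assumption, since the abstract does not state the CI method; it gives 27–63%, close to the reported 25–65% (which likely comes from an exact method).

```python
import math

def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# MARS MRI (reference standard) found 25 pseudotumors; a 44% sensitivity
# implies CT detected 11 of them (11/25 = 0.44).
tp, positives = 11, 25
sensitivity = tp / positives
lo, hi = wilson_ci(tp, positives)
print(f"sensitivity = {sensitivity:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```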

Interpretation

CT was found to be superior to MARS MRI for detection of osteolysis adjacent to MOM-HA, and should be incorporated into diagnostic algorithms. CT was unable to classify pseudotumors, failed to detect many of them, and was unreliable for assessment of muscle atrophy. Where MARS MRI is contraindicated or unavailable, CT is an unsuitable substitute, and other modalities such as ultrasound should be considered.

It is estimated that over 500,000 metal-on-metal (MOM) hip arthroplasties, including both hip resurfacings and total hip replacements (THRs), have been carried out worldwide in the last 15 years (Skinner et al. 2010). There are increasing reports of progressive soft tissue changes in response to metal debris, including solid or cystic non-malignant masses around the prostheses (termed pseudotumors) (Pandit et al. 2008), perivascular lymphocytic infiltration (Davies et al. 2005), musculotendinous pathology (in particular, wasting of the hip abductors) (Sabah et al. 2011), and periprosthetic osteolysis (Park et al. 2005, Milosev et al. 2006, Korovessis et al. 2006).

There is international agreement that the high failure rate of MOM hip arthroplasties (MOM-HAs) has created the need for surveillance of these devices with cross-sectional imaging (MHRA 2012). Both pseudotumors and muscle atrophy have been associated with high rates of major complications and poorer outcomes after revision surgery (Grammatopolous et al. 2009). Sensitive detection of periprosthetic changes is therefore vital: early detection and revision provide the best outcome for MOM-HA patients.

Cross-sectional imaging has been shown to be useful for providing a diagnosis in cases of unexplained pain and in planning of revision surgery (Hayter et al. 2011b). A recent European multidisciplinary consensus statement recommended the use of cross-sectional imaging with any of ultrasound, MARS MRI, or CT (Hannemann et al. 2013). The gold standard modality is not clear, which has resulted in a variety of diagnostic algorithms being used in different referral centers.

Both CT and MARS MRI offer complete, multi-planar cross-sectional images from which the extent of disease and the relationship of an abnormality to normal anatomy can readily be appreciated. MARS MRI has been reliably and extensively used to investigate MOM hip complications (Sabah et al. 2011, Hayter et al. 2012a, Thomas et al. 2013, Nawabi et al. 2013) and has been shown to permit early diagnosis of pseudotumor and other soft tissue pathologies (Toms et al. 2008) associated with pain, loss of function, and higher revision rates. However, the use of MARS MRI is limited by susceptibility artifact at the prosthesis-bone interface, local availability, patient compliance, and cost.

CT is more widely available than MARS MRI (Anderson et al. 2011) and has been used routinely at some centers for screening of periarticular masses (Bosker et al. 2012). CT has been proposed as an alternative to MARS MRI, for example in cases of claustrophobia or pacemakers, and where there are loose metal implants. CT has been shown to be useful in cases of suspected impingement and acetabular osteolysis (Cahir et al. 2007, Roth et al. 2012), and in identification of prostheses at risk of elevated wear (Hart et al. 2009). This notable success in detecting common complications of hip arthroplasty, coupled with widespread accessibility, has meant that some centers rely entirely on CT (McMinn 2012) to follow up patients with suspected MOM-associated bony and soft tissue changes, but to date there have been no published studies comparing CT with MARS MRI.

We investigated whether CT is a suitable substitute for MARS MRI in the evaluation of the painful MOM-HA. We wanted to provide measures of diagnostic accuracy of CT compared to the current gold standard (MARS MRI) for the detection of common periprosthetic complications. The primary outcome measure was detection of pseudotumors, owing to their high prevalence and strong association with adverse outcomes (Hart et al. 2009); secondary outcome measures were detection of muscle atrophy and osteolysis.

2.

Background and purpose

There is no consensus regarding the clinical relevance of gender-specific prostheses in total knee arthroplasty (TKA). We summarize the current best evidence in a comparison of clinical and radiographic outcomes between gender-specific prostheses and standard unisex prostheses in female patients.

Methods

We used the PubMed, Embase, Cochrane, Science Citation Index, and Scopus databases. We included randomized controlled trials published up to January 2013 that compared gender-specific prostheses with standard unisex prostheses in female patients who underwent primary TKAs.

Results

6 trials involving 423 patients with 846 knee joints met the inclusion criteria. No statistically significant differences were observed between the 2 designs regarding pain, range of motion (ROM), knee scores, satisfaction, preference, complications, and radiographic results. The gender-specific design (Gender Solutions; Zimmer Inc, Warsaw, Indiana) reduced the prevalence of overhang. However, it had less overall coverage of the femoral condyles compared to the unisex group. In fact, the femoral prosthesis in the standard unisex group matched better than that in the gender-specific group.
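Pooled comparisons like the ones above rest on standard meta-analytic machinery. A minimal sketch of inverse-variance fixed-effect pooling of risk ratios, using invented 2×2 counts rather than the trial data from this review (the review's exact model is not stated in the abstract):

```python
import math

def log_rr_and_var(e1: int, n1: int, e2: int, n2: int) -> tuple[float, float]:
    """Log risk ratio and its variance for one trial (events/total per arm)."""
    lrr = math.log((e1 / n1) / (e2 / n2))
    var = 1 / e1 - 1 / n1 + 1 / e2 - 1 / n2
    return lrr, var

# Hypothetical trials: (events_gender, n_gender, events_unisex, n_unisex)
trials = [(5, 50, 6, 50), (8, 60, 7, 60)]

num = den = 0.0
for e1, n1, e2, n2 in trials:
    lrr, var = log_rr_and_var(e1, n1, e2, n2)
    w = 1 / var                      # inverse-variance weight
    num += w * lrr
    den += w

pooled = math.exp(num / den)
se = math.sqrt(1 / den)              # SE of the pooled log risk ratio
ci = (math.exp(num / den - 1.96 * se), math.exp(num / den + 1.96 * se))
print(f"pooled RR = {pooled:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

With these made-up counts, the pooled risk ratio is close to 1 and its CI spans 1, mirroring the "no statistically significant difference" pattern reported above.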

Interpretation

Gender-specific prostheses do not appear to confer any benefit in terms of clinician- and patient-reported outcomes for the female knee.

Women account for almost two-thirds of knee arthroplasties (Kurtz et al. 2007). Recently, a possible effect of gender on functional outcomes and implant survivorship has been identified (Vincent et al. 2006, Ritter et al. 2008, Kamath et al. 2010, Parsley et al. 2010, O’Connor 2011). Gender differences in the anatomy of the distal femur are well documented (Conley et al. 2007, Yue et al. 2011a, b, Yan et al. 2012, Zeng et al. 2012). Women tend to have a less prominent anterior condyle (Conley et al. 2007, Fehring et al. 2009), a higher quadriceps angle (Q-angle) (Hsu et al. 1990, Woodland et al. 1992), and a reduced mediolateral to anteroposterior aspect ratio (Chin et al. 2002, Chaichankul et al. 2011). Investigators have found that standard unisex knee prostheses may not equally match the native anatomy in male and female knees (Clarke and Hentz 2008, Yan et al. 2012). A positive association between femoral component size and the amount of overhang has been observed in females, and femoral component overhang (≥ 3 mm) may result in postoperative knee pain or reduced ROM (Hitt et al. 2003, Lo et al. 2003, Mahoney et al. 2010).

The concept of gender-specific knee prostheses was introduced to match these 3 anatomic differences in the female population (Conley et al. 2007). The design includes a narrower mediolateral diameter for a given anteroposterior dimension, to match the female knee more closely. Additionally, the anterior flange of the prosthesis was modified to include a recessed patellar sulcus with reduced anterior condylar height (to avoid “overstuffing” during knee flexion) and a lateralized patellar sulcus (to accommodate the increased Q-angle associated with a wider pelvis).

Several randomized controlled trials (RCTs) have failed to establish the superiority of the gender-specific prosthesis over the unisex knee prosthesis in the female knee (Kim et al. 2010a, b, Song et al. 2012a, Thomsen et al. 2012, von Roth et al. 2013). In contrast, other studies have found higher patient satisfaction and better radiographic fit with gender-specific TKAs than with standard unisex TKAs (Clarke and Hentz 2008, Parratte et al. 2011, Yue et al. 2014). We therefore performed a systematic review and meta-analysis to compare the clinical and radiographic results of TKA in female patients receiving gender-specific prostheses or standard unisex prostheses.

3.

Background and purpose

Metal-on-metal hip implants have been widely used, especially in the USA, Australia, England and Wales, and Finland. We assessed risk of death and updated data on the risk of cancer related to metal-on-metal hip replacements.

Patients and methods

A cohort of 10,728 metal-on-metal hip replacement patients and a reference cohort of 18,235 conventional total hip replacement patients were extracted from the Finnish Arthroplasty Register for the years 2001–2010. Data on incident cancer cases and causes of death until 2011 were obtained from the Finnish Cancer Registry and Statistics Finland. The relative risks of cancer and death were expressed as the standardized incidence ratio (SIR) and the standardized mortality ratio (SMR). SIR/SIR ratios, SMR/SMR ratios, and Poisson regression were used to compare the risk of cancer and the risk of death between the cohorts.

Results

The overall risk of cancer in the metal-on-metal cohort was not higher than that in the non-metal-on-metal cohort (RR = 0.91, 95% CI: 0.82–1.02). The risk of soft-tissue sarcoma and basalioma in the metal-on-metal cohort was higher than in the non-metal-on-metal cohort (SIR/SIR ratio = 2.6, CI: 1.02–6.4 for soft-tissue sarcoma; SIR/SIR ratio = 1.3, CI: 1.1–1.5 for basalioma). The overall risk of death in the metal-on-metal cohort was less than that in the non-metal-on-metal cohort (RR = 0.78, CI: 0.69–0.88).
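The SIR/SIR comparisons above can be sketched in a few lines. An SIR is observed cases divided by expected cases, and the ratio of two SIRs is commonly given a log-normal CI with standard error sqrt(1/O1 + 1/O2). The counts below are invented for illustration; they are not the registry data, and the registry analysis may use a different CI method.

```python
import math

def sir(observed: int, expected: float) -> float:
    """Standardized incidence ratio: observed cases / expected cases."""
    return observed / expected

def sir_ratio_ci(o1: int, e1: float, o2: int, e2: float, z: float = 1.96):
    """Ratio of two SIRs with a log-normal 95% CI (SE = sqrt(1/O1 + 1/O2))."""
    ratio = sir(o1, e1) / sir(o2, e2)
    se = math.sqrt(1 / o1 + 1 / o2)
    return ratio, ratio * math.exp(-z * se), ratio * math.exp(z * se)

# Hypothetical counts: 8 observed cases vs 3.1 expected in the MOM cohort;
# 5 observed vs 5.0 expected in the reference cohort.
ratio, lo, hi = sir_ratio_ci(8, 3.1, 5, 5.0)
print(f"SIR/SIR ratio = {ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

Note how wide the interval is with single-digit case counts; this is why rare outcomes such as soft-tissue sarcoma yield CIs that barely exclude 1, as in the result quoted above.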

Interpretation

The overall risk of cancer, and the risk of death from cancer, is not increased after metal-on-metal hip replacement. The well-patient effect and selection bias contribute substantially to the findings concerning mortality. Arthrocobaltism does not increase mortality in patients with metal-on-metal hip implants in the short term. However, metal-on-metal hip implants should not be considered safe until data with longer follow-up are available.

Metal-on-metal hip implants have been widely used, especially in the USA, Australia, England and Wales, and Finland (AOANJRR 2010, NJR 2011, Cohen 2012, Seppänen et al. 2012). The theoretical health risks related to chronically elevated blood metal ion concentrations induced by abnormal wear and corrosion of metal-on-metal implants, apart from local symptoms around the failing implant, include systemic symptoms of poisoning (Steens et al. 2006, Oldenburg et al. 2009, Rizzetti et al. 2009, Tower 2010, 2012, Mao et al. 2011, Sotos and Tower 2013, Zyviel et al. 2013) and carcinogenesis (Mäkelä et al. 2012, Smith et al. 2012, Brewster et al. 2013). Systemic metal ion toxicity due to a failed hip replacement is rare. However, there have been several recent reports of systemic cobalt toxicity following revision of fractured ceramic components, and also in patients with a failed metal-on-metal hip replacement (Steens et al. 2006, Oldenburg et al. 2009, Rizzetti et al. 2009, Tower 2010, 2012, Mao et al. 2011, Sotos and Tower 2013, Zyviel et al. 2013). Possible clinical findings include fatigue, weakness, hypothyroidism, cardiomyopathy, polycythemia, visual and hearing impairment, cognitive dysfunction, and neuropathy. Fatal cardiomyopathy due to systemic cobalt toxicity after hip replacement has been reported (Zyviel et al. 2013).

Metal debris from hip replacement may be associated with chromosomal aberrations and DNA damage (Case et al. 1996, Bonassi et al. 2000, Daley et al. 2004). However, the risk of cancer is not increased after conventional metal-on-polyethylene total hip replacement or after first-generation metal-on-metal total hip arthroplasty (Visuri et al. 1996, 2010a). The short-term overall cancer risk after modern metal-on-metal hip arthroplasty is not increased either (Mäkelä et al. 2012, Smith et al. 2012, Brewster et al. 2013). However, recent linkage studies of overall cancer risk are based on hospital episode statistics, which may have less quality assurance than cancer registry data (Smith et al. 2012, Brewster et al. 2013). Annual updating of cancer registry data concerning the metal-on-metal issue is therefore advisable.

In this paper, we update our earlier published results on risk of cancer (Mäkelä et al. 2012) and assess overall and cause-specific mortality in primary metal-on-metal and non-metal-on-metal hip replacement patients operated on from 2001 to 2010, by combining data from the Finnish Arthroplasty Register, the Population Register Centre, and the Finnish Cancer Registry. The reason for this early update of the cancer data was to be able to detect any carcinogenic effect of metal-on-metal implants as early as possible.

4.

Background and purpose

There is considerable uncertainty about the optimal treatment of displaced 4-part fractures of the proximal humerus. Within the last decade, locking plate technology has been considered a breakthrough in the treatment of these complex injuries.

Methods

We systematically identified and reviewed clinical studies on the benefits and harms after osteosynthesis with locking plates in displaced 4-part fractures.

Results

We included 14 studies with 374 four-part fractures. There were 10 case series, 3 retrospective observational comparative studies, 1 prospective observational comparative study, and no randomized trials. Small studies with a high risk of bias precluded reliable estimates of functional outcome. High rates of complications (16–64%) and reoperations (11–27%) were reported.

Interpretation

The empirical foundation for the value of locking plates in displaced 4-part fractures of the proximal humerus is weak. We emphasize the need for well-conducted randomized trials and observational studies.

There is considerable uncertainty about the optimal treatment of displaced 4-part fractures of the proximal humerus (Misra et al. 2001, Handoll et al. 2003, Bhandari et al. 2004, Lanting et al. 2008). Only 2 small inconclusive randomized trials have been published (Stableforth 1984, Hoellen et al. 1997). A large number of interventions are used routinely, ranging from a non-operative approach to open reduction and internal fixation (ORIF), and primary hemiarthroplasty (HA).

In the last decade, locking plate technology has been developed and has been heralded as a breakthrough in the treatment of fractures in osteoporotic bone (Gautier and Sommer 2003, Sommer et al. 2003, Haidukewych 2004, Miranda 2007). The locking plate technique is based on the elimination of friction between the plate and cortex, and relies on stability between the subchondral bone and screws. Multiple multidirectional convergent and divergent locking screws enhance the angular stability of the osteosynthesis, possibly resulting in better postoperative function with reduced pain. Reported complications include screw cut-out, varus fracture collapse, tuberosity re-displacement, humeral head necrosis, plate impingement, and plate or screw breakage (Hall et al. 2006, Tolat et al. 2006, van Rooyen et al. 2006, Agudelo et al. 2007, Gardner et al. 2007, Khunda et al. 2007, Ring 2007, Smith et al. 2007, Voigt et al. 2007, Egol et al. 2008, Kirchhoff et al. 2008, Owsley and Gorczyca 2008, Brunner et al. 2009, Micic et al. 2009, Sudkamp et al. 2009). The balance between the benefits and harms of the intervention seems delicate.

Several authors of narrative reviews and clinical series have strongly recommended fixation of displaced 4-part fractures of the humerus with locking plates (Bjorkenheim et al. 2004, Hente et al. 2004, Hessler et al. 2006, Koukakis et al. 2006, Kilic et al. 2008, Korkmaz et al. 2008, Shahid et al. 2008, Papadopoulos et al. 2009, Ricchetti et al. 2009), and producers of implants unsurprisingly strongly advocate them (aap Implantate 2010, Stryker 2010, Synthes 2010, Zimmer 2010). Despite the increasing use of locking plates (Illert et al. 2008, Ricchetti et al. 2009), we have been unable to identify systematic reviews on the benefits and harms of this new technology in displaced 4-part fractures. Thus, we systematically identified and reviewed clinical studies on the benefits and harms after osteosynthesis with locking plates in displaced 4-part fractures of the proximal humerus.

5.

Background and purpose

The appropriate fixation method for hemiarthroplasty of the hip as it relates to implant survivorship and patient mortality is a matter of ongoing debate. We examined the influence of fixation method on revision rate and mortality.

Methods

We analyzed approximately 25,000 hemiarthroplasty cases from the AOA National Joint Replacement Registry. Deaths at 1 day, 1 week, 1 month, and 1 year were compared for all patients and among subgroups based on implant type.

Results

Patients treated with cemented monoblock hemiarthroplasty had a 1.7-times higher day-1 mortality compared to uncemented monoblock components (p < 0.001). This finding was reversed by 1 week, 1 month, and 1 year after surgery (p < 0.001). Modular hemiarthroplasties did not reveal a difference in mortality between fixation methods at any time point.
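A "1.7-times higher day-1 mortality" is a relative risk between two fixation groups at a fixed time point. A minimal sketch with invented counts (not the registry's data; the registry analysis may also adjust for covariates):

```python
import math

def relative_risk(d1: int, n1: int, d2: int, n2: int, z: float = 1.96):
    """Relative risk of group 1 vs group 2 with a log-normal 95% CI."""
    rr = (d1 / n1) / (d2 / n2)
    se = math.sqrt(1 / d1 - 1 / n1 + 1 / d2 - 1 / n2)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Hypothetical day-1 counts: 34 deaths among 10,000 cemented monoblock
# cases vs 20 deaths among 10,000 uncemented monoblock cases.
rr, lo, hi = relative_risk(34, 10_000, 20, 10_000)
print(f"day-1 RR = {rr:.1f}, 95% CI {lo:.2f}-{hi:.2f}")
```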

Interpretation

This study shows lower (or similar) overall mortality with cemented hemiarthroplasty of the hip.

The frequency of hip fractures is increasing with our ageing population, with an annual incidence of between 1.4 and 5 per 1,000 per year (Lonnroos et al. 2006, Icks et al. 2008, Varez-Nebreda et al. 2008). Health model projections have estimated that 6.3 million hip fractures will occur annually worldwide within the next 40 years (Cooper et al. 1992), imposing a significant economic health burden. There is a large reported perioperative mortality rate in this population, ranging from 2.4% to 8.2% at 1 month (Parvizi et al. 2001, Radcliff et al. 2008) and over 25% at 1 year (Elliott et al. 2003, Jiang et al. 2005). Furthermore, it was recently reported that the current mortality rate is higher now than 25 years ago (Vestergaard et al. 2007a). Today, it is generally accepted that displaced intracapsular fractures are best treated with arthroplasty rather than internal fixation (Keating et al. 2006, Leighton et al. 2007). In the at-risk population, however, multiple comorbidities are common and the best form of component fixation is in question.

Bone cement implantation syndrome is a well-described complication of cemented hip arthroplasty. It is characterized by a systemic drop in systolic blood pressure, hypoxemia, pulmonary hypertension, cardiac dysrhythmias, and occasionally cardiac arrest and death (Rinecker 1980, Orsini et al. 1987, Parvizi et al. 1999). The prevailing theory to explain the pathophysiology of this phenomenon is embolism of fat, marrow contents, bone, and to some degree methylmethacrylate to the lung (Rinecker 1980, Elmaraghy et al. 1998, Parvizi et al. 1999, Koessler et al. 2001). An increased degree of pulmonary insult with fat microemboli has been demonstrated (mostly in randomized controlled trials) during insertion of a cemented femoral stem rather than an uncemented implant (Orsini et al. 1987, Ries et al. 1993, Christie et al. 1994, Pitto et al. 1999), presumably due to increased intramedullary femoral canal pressures in the cemented group (Kallos et al. 1974, Orsini et al. 1987). These pressures can be reduced by the use of distal venting holes in the femur during stem insertion (Engesæter et al. 1984). A single-institution review showed that patients undergoing cemented hip arthroplasty have a higher intraoperative mortality rate than those undergoing uncemented arthroplasty, presumably due to a reduced incidence of fat embolism in the latter group (Parvizi et al. 1999). The increased mortality risk was also present at 30 days in the treatment of acute fractures with cemented arthroplasty, also in a single-institution review (Parvizi et al. 2004). Although cement-related mortality is rare (Dearborn and Harris 1998, Parvizi et al. 1999, 2001, 2004, Weinrauch et al. 2006), it is a devastating complication, often reported through observational studies or literature reviews. Proponents of uncemented hip arthroplasty often cite this concern to support their reluctance to use cemented hip arthroplasty in both elective procedures and fracture management. However, studies of many different designs have been unable to identify any increased mortality risk with the use of cement (Lausten and Vedel 1982 (observational), Emery et al. 1991 (RCT), Lo et al. 1994 (observational), Khan et al. 2002a, b (literature review), Parker and Gurusamy 2004 (literature review)), and others have shown a decrease in mortality at 30 days when cement is used (Foster et al. 2005).

Cemented hip hemiarthroplasty appears to offer an improved rate of return to baseline function, reduced postoperative pain, and superior long-term survivorship relative to uncemented arthroplasty (Khan et al. 2002a, b, Parker and Gurusamy 2004). We reasoned that failure to return to baseline function after hemiarthroplasty may be another risk factor for perioperative mortality (Hannan et al. 2001, Braithwaite et al. 2003). Lower revision rates for cemented prostheses, together with the mortality associated with revision surgery, would further reduce the relative overall mortality risk of cemented fixation. We evaluated the relationship between the method of fixation of hip arthroplasty and perioperative mortality using a large national joint replacement registry.

6.
Methods

Before surgery, hip pain (THA) or knee pain (TKA), lower-extremity muscle power, functional performance, and physical activity were assessed in a sample of 150 patients and used as independent variables to predict the outcome (dependent variable), readiness for hospital discharge, for each type of surgery. Discharge readiness was assessed twice daily by blinded assessors.

Results

Median discharge readiness and actual length of stay until discharge were both 2 days. Univariate linear regression followed by multiple linear regression revealed that age was the only independent predictor of discharge readiness in THA and TKA, but the standardized coefficients were small (≤ 0.03).

Interpretation

These results support the idea that fast-track THA and TKA with a length of stay of about 2–4 days can be achieved for most patients independently of preoperative functional characteristics.

Over the last decade, length of stay (LOS) with discharge to home after primary THA and TKA has declined from about 5–10 days to about 2–4 days, both in selected series and in larger nationwide series (Malviya et al. 2011, Raphael et al. 2011, Husted et al. 2012, Kehlet 2013, Hartog et al. 2013, Jørgensen and Kehlet 2013). However, there is a continuing debate about whether selected patients only or all patients should be scheduled for “fast-track” THA and TKA in relation to psychosocial factors and preoperative pain and functional status (Schneider et al. 2009, Hollowell et al. 2010, Macdonald et al. 2010, Antrobus and Bryson 2011, Jørgensen and Kehlet 2013), or whether organizational or pathophysiological factors related to the surgical trauma may determine the length of stay (Husted et al. 2011, Husted 2012).

We studied the role of THA and TKA patients’ preoperative pain and functional characteristics in discharge from 2 orthopedic departments with well-established fast-track recovery regimens (Husted et al. 2010).

7.
Results

The greater the volume of the hospital, the shorter the average LOS and LUIC. Smaller hospital volume was not unambiguously associated with increased revision, re-admission, or MUA rates. The smaller the annual hospital volume, the more often patients were discharged home.

Interpretation

LOS and LUIC ought to be shortened in lower-volume hospitals. There is potential for a reduction in length of stay in extended institutional care facilities.

Total knee replacement (TKR) is one of the most common orthopedic procedures, and its volume is expected to increase markedly (Kurtz et al. 2007). Due to the potentially severe complications and the high economic impact of the procedure, efforts to minimize the risks and optimize perioperative efficiency are important.

It has been suggested that increased hospital volume and reduced length of stay (LOS) at the operating hospital after TKR are related, but there is no consensus (Yasunaga et al. 2009, Marlow et al. 2010, Paterson et al. 2010, Bozic et al. 2010, Styron et al. 2011). In addition, results on the association of hospital volume with re-admission rates (Soohoo et al. 2006b, Judge et al. 2006, Bozic et al. 2010, Cram et al. 2011) and revision risk have been inconclusive (Shervin et al. 2007, Manley et al. 2009, Bozic et al. 2010, Paterson et al. 2010). No one has studied the association between length of uninterrupted institutional care (LUIC), incidence of manipulation under anesthesia (MUA) after TKR, and hospital volume.

By combining 5 national-level registries, we examined possible associations between hospital volume and LOS, LUIC, discharge disposition, number of re-admissions within 14 and 42 days, MUA, and revisions after TKR for all knee arthroplasties performed in Finland between 1998 and 2010.

8.

Background and purpose

Joint replacement with metal-on-metal (MOM) bearings has gained popularity in recent decades in young and active patients. However, the possible effects of MOM wear debris and its corrosion products are still the subject of debate. Alongside potential disadvantages such as toxicity, the influence of metal particles and metal ions on infection risk is unclear.

Methods

We reviewed the available literature on the influence of degradation products of MOM bearings in total hip arthroplasties on infection risk.

Results

Wear products were found to influence the risk of infection by hampering the immune system, by inhibiting or accelerating bacterial growth, and by a possible antibiotic resistance and heavy metal co-selection mechanism.

Interpretation

Whether the combined effects of MOM wear products make MOM bearings less or more prone to infection requires investigation in the near future.

Many young patients with painful coxarthrosis want to return to a high level of activity and require an implant that provides durability. The low wear rates of metal-on-metal (MOM) bearings have led to a resurgence in their use (Wagner and Wagner 2000, Silva et al. 2005, Pollard et al. 2006, Vendittoli et al. 2007, Delaunay et al. 2008). 35% of all prostheses in the United States in 2006 (Bozic et al. 2009) and 16% of all prostheses implanted in Australia from 1999 through 2007 had MOM bearings (Graves et al. 2008).

Metal alloys used in MOM bearings degrade through wear, from corrosion, or by a combination of the two (Yan et al. 2006, Jacobs et al. 2008). Consequently, MOM bearings produce nanometer- to submicrometer-sized metal particles (Campbell et al. 1996, Doorn et al. 1998). The high number of these very small particles presents a large cumulative surface area for corrosion. The biological effects of these particles and their corrosion products in the human body are for the most part unclear. Since the renewed interest in MOM bearings, extensive research has been done to determine the consequences of local and systemic exposure to wear particles and the accompanying biologically active corrosion products (Amstutz and Grigoris 1996). It is well known that metal debris can induce pathological changes such as the release of inflammatory cytokines from macrophages, histiocytosis, fibrosis, and necrosis (Basle et al. 1996, Granchi et al. 1998, Caicedo et al. 2008, 2009). Metal debris is also thought to be associated with hypersensitivity and osteolysis (Hallab et al. 2000, 2010, Goodman 2007b, Carr and DeSteiger 2008, Huber et al. 2009). However, there is very little literature on the bacteriological effects of these degradation products (Anwar et al. 2007, Hosman et al. 2009). It is therefore unclear whether they can influence the risk of infection.

The Australian and New Zealand joint registries have shown that between 9% and 15% of all total hip arthroplasty (THA) revisions are carried out because of infections related to the primary prosthesis (Rothwell et al. 2007, Graves et al. 2008). In cases of infection, bacteria adopt a biofilm mode of growth on the surface of the prosthesis, increasing antibiotic resistance and resulting in major difficulties in treatment (Trampuz and Widmer 2006). Removal and replacement of an infected implant is usually required to eliminate the infection (Bozic and Ries 2005, Vincent et al. 2006). Recent research has suggested that particulate debris of any composition promotes bacterial growth by providing a scaffold for bacterial adhesion and biofilm growth (Anwar et al. 2007). On the other hand, high concentrations of metal ions have been shown to have bacteriostatic properties (Hosman et al. 2009).

Considering the paucity of publications on the effects of MOM particles on infection, we performed a review of the literature on the influence of MOM wear particles and their corrosion products on the risk of infection.

9.

Background and purpose

It is controversial whether the transverse acetabular ligament (TAL) is a reliable guide for determining the cup orientation during total hip arthroplasty (THA). We investigated the variations in TAL anatomy and the TAL-guided cup orientation.

Methods

80 hips with osteoarthritis secondary to hip dysplasia (OA) and 80 hips with osteonecrosis of the femoral head (ON) were examined. We compared the anatomical anteversion of TAL and the TAL-guided cup orientation in relation to both disease and gender using 3D reconstruction of computed tomography (CT) images.

Results

Mean TAL anteversion was 11° (SD 10, range –12 to 35). The OA group (least-squares mean 16°, 95% confidence interval (CI): 14–18) had larger anteversion than the ON group (least-squares mean 6.2°, CI: 3.8–7.5). Females (least-squares mean 20°, CI: 17–23) had larger anteversion than males (least-squares mean 7.0°, CI: 4.6–9.3) in the OA group, while there was no difference between the sexes in the ON group. When the TAL was used for anteversion guidance with the radiographic cup inclination fixed at 40°, 39% of OA hips and 9% of ON hips deviated more than 10° from the target anteversion of 15°.
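A rough plausibility check of the 39% figure for OA hips: if TAL-guided anteversion in the OA group is modeled as normal with the reported OA mean (16°) and the overall SD (10°, used here as an assumption since a group-specific SD is not quoted), the share falling more than 10° from the 15° target comes out near one-third, the same order as the reported 39%.

```python
from math import erf, sqrt

def norm_cdf(x: float, mu: float, sigma: float) -> float:
    """CDF of a normal distribution N(mu, sigma) at x, via the error function."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

target, tol = 15.0, 10.0
mu, sigma = 16.0, 10.0   # OA-group mean; overall SD (group SD not reported)

# P(|anteversion - target| > tol) under the normal model
p_outside = norm_cdf(target - tol, mu, sigma) + (1 - norm_cdf(target + tol, mu, sigma))
print(f"modeled share outside ±{tol:.0f}°: {p_outside:.0%}")
```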

Interpretation

In ON hips, the TAL is a good guide for determining cup orientation during THA, but it is not a reliable guide in hips with OA secondary to dysplasia, because TAL orientation shows large individual variation and is influenced by disease and gender.

Malalignment of the acetabular cup may lead to dislocation (Jolles et al. 2002, Shon et al. 2005), accelerated wear or breakage of the bearing, and component loosening (Kennedy et al. 1998). The use of a mechanical guide for cup implantation may give inaccurate results because of pelvic rotation on the operating table (Sugano et al. 2007, Minoda et al. 2010).

Recently, the transverse acetabular ligament (TAL), which bridges the acetabular notch (Löhe et al. 1996) as part of the acetabular labrum, has been reported to be useful for determining proper orientation of the acetabular components (Archbold et al. 2006, 2008, Pearce et al. 2008, Kalteis et al. 2011). TAL-guided cup orientation has been reported to place the cup within Lewinnek’s safe zone (Lewinnek et al. 1978). Other studies have shown that the TAL is not a reliable guide (Epstein et al. 2010, Viste et al. 2011). We hypothesized that these divergent results could be explained by individual anatomical variation; in addition, orientation of the TAL may be affected by hip disease and gender. Furthermore, cup orientation is influenced by sagittal pelvic tilt (Nishihara et al. 2003, DiGioia et al. 2006).

We determined (1) the variation in TAL orientation and the influence of hip disease and gender on this variation, (2) the reliability of using the TAL for guiding cup orientation, and (3) the influence of pelvic tilt on TAL-guided cup orientation, using computed tomography (CT) scans and computer simulation.

10.

Background and purpose

Adverse reactions to metal debris have been reported to be a cause of pain in metal-on-metal hip arthroplasty. We assessed the incidence of both symptomatic and asymptomatic adverse reactions in a consecutive series of patients with a modern large-head metal-on-metal hip arthroplasty.

Methods

We studied the early clinical results and results of routine metal artifact-reduction MRI screening in a series of 79 large-head metal-on-metal hip arthroplasties (ASR; DePuy, Leeds, UK) in 68 patients. 75 hips were MRI scanned at mean 31 (12–52) months after surgery.

Results

27 of 75 hips had MRI-detected metal debris-related abnormalities, of which 5 were mild, 18 moderate, and 4 severe. 8 of these hips have been revised, 6 of them for an adverse reaction to metal debris diagnosed preoperatively with MRI and confirmed histologically. The mean Oxford hip score (OHS) was 21 for the whole cohort: 23 for patients with no MRI-based evidence of adverse reactions and 19 for those with adverse reactions detected by MRI. 6 of the 12 patients with a best possible OHS of 12 had MRI-based evidence of an adverse reaction.

Interpretation

We found a high early revision rate with a modern, large-head metal-on-metal hip arthroplasty. MRI-detected adverse reactions to metal debris were common and often clinically “silent”. We recommend that patients with this implant be closely followed up and undergo routine metal artifact-reduction MRI screening.

Metal-on-metal (MoM) total hip replacements have been used since the 1960s. Failure in early designs was attributed to mechanical loosening caused by poor bearing tolerances producing high friction (Amstutz and Grigoris 1996, Kothari et al. 1996). Improved manufacturing and engineering techniques enabled the development of a new generation of MoM hip replacements. In the 1990s, the Birmingham Hip Resurfacing (BHR) was developed, and good early to medium-term results have been published (Daniel et al. 2004, Treacy et al. 2005, Heilpern et al. 2008). Similar implants, both resurfacings and large MoM bearings coupled with standard femoral stems, were subsequently developed and marketed by other manufacturers.

The development of magnetic resonance imaging (MRI) metal artifact reduction (MAR) sequences has enabled good visualization of the periprosthetic tissues (Toms et al. 2008) and has been reported to be a clinically useful part of the assessment of painful MoM hip replacements (Hart et al. 2009). A number of authors have described the appearance of collections of fluid and inflammatory masses around painful MoM hip arthroplasties (Boardman et al. 2006, Pandit et al. 2008, Toms et al. 2008). These have been grouped under a variety of headings such as “aseptic lymphocyte-dominated vasculitis-associated lesions” (Willert et al. 2005), “pseudotumors” (Pandit et al. 2008), or “adverse reactions to metal debris (ARMD)” (Langton et al. 2010). 
Although these lesions have been previously described in patients investigated for pain, there have been no studies on the overall incidence of these lesions in an unselected series of patients, including those with no, or few, symptoms. It is not known whether these lesions may occur in the absence of symptoms.

At our institution, we have a policy of offering routine MAR MRI to patients who have undergone MoM total hip replacement or resurfacing. We determined the early clinical outcome, revision rate, and incidence of ARMD using MAR MRI screening in a consecutive series of patients with an ASR THR or resurfacing (ASR; DePuy, Leeds, UK).

11.

Background and purpose

A considerable number of patients who undergo surgery for spinal stenosis have residual symptoms, inferior function, and reduced health-related quality of life after surgery. There have been few studies on factors that may predict outcome. We tried to find predictors of outcome in surgery for spinal stenosis using patient-related and imaging-related factors.

Patients and methods

109 patients in the Swedish Spine Register with central spinal stenosis that were operated on by decompression without fusion were prospectively followed up 1 year after surgery. Clinical outcome scores included the EQ-5D, the Oswestry disability index, self-estimated walking distance, and leg and back pain levels (VAS). Central dural sac area, number of levels with stenosis, and spondylolisthesis were included in the MRI analysis. Multivariable analyses were performed to search for correlation between patient-related and imaging factors and clinical outcome at 1-year follow-up.

Results

Several factors were statistically significant predictors of outcome. Duration of leg pain exceeding 2 years predicted inferior outcome in terms of leg and back pain, function, and HRLQoL. Regular and intermittent preoperative users of analgesics had higher levels of back pain at follow-up than those not using analgesics. Low preoperative function predicted low function and dissatisfaction at follow-up. Low preoperative EQ-5D scores predicted a high degree of leg and back pain. Narrow dural sac area predicted greater improvement in back pain at follow-up and lower absolute leg pain.

Interpretation

Multiple factors predict outcome in spinal stenosis surgery, most importantly duration of symptoms and preoperative function. Some of these are modifiable and can be targeted. Our findings can be used in preoperative patient information and can aid the surgeon and the patient in a shared decision-making process.

Decompressive surgery for lumbar spinal stenosis is the most frequently performed spine operation in many countries (Weinstein et al. 2006, Strömqvist et al. 2009). However, one third of patients are not satisfied with the outcome because of residual leg and back pain, inferior function, and poor health-related quality of life (Katz et al. 1995, Airaksinen et al. 1997, Jönsson et al. 1997, Jansson et al. 2009, Strömqvist et al. 2009, Hara et al. 2010).

2 recent randomized studies have shown surgery to be superior to nonoperative treatment in lumbar spinal stenosis (Malmivaara et al. 2007, Weinstein et al. 2008), but many patients improve without surgical treatment (Malmivaara et al. 2007). The question remains as to who benefits most from surgery. Identification of prognostic factors that can aid in selection of patients for surgery is therefore important. Prognostic factors in lumbar spinal stenosis surgery have been studied, but they are not well defined (Turner et al. 1992, Aalto et al. 2006). Aalto et al. (2006) reviewed studies of lumbar spinal stenosis surgery and found that only 21 of 885 studies were of sufficient quality to merit identification of prognostic factors. The main reasons for exclusion were a retrospective study design and a limited number of predictors. Cardiovascular and overall comorbidity, disorders influencing walking ability, self-rated health, income, severity of central stenosis, and severity of scoliosis were found to be predictors of outcome, but no single study identified more than one of these predictors. 
More recently, smoking, depression, psychiatric illness, and high body mass index have been found to be predictive of negative outcome, as have long duration of symptoms and preoperative resting numbness (Ng et al. 2007, Hara et al. 2010, Athiviraham et al. 2011, Radcliff et al. 2011, Sandén et al. 2011, Sinikallio et al. 2011).

Cross-sectional imaging (most often MRI) has an important role in confirming the diagnosis of spinal stenosis, and is essential for surgical planning. Even so, the prognostic value of the narrowness of the dural sac area is not well established (Jönsson et al. 1997, Amundsen et al. 2000, Yukawa et al. 2002). Studies incorporating both imaging and patient-related factors in a systematic way have been exceedingly rare (Amundsen et al. 2000, Yukawa et al. 2002).

We used patient data from the Swedish Spine Register protocol (Strömqvist et al. 2009) and MRI measurements of central dural sac area, multilevel stenosis, and spondylolisthesis to find predictors of outcome in terms of function, HRLQoL, and leg and back pain after decompression for lumbar spinal stenosis.

12.

Background and purpose

The choice of either all-polyethylene (AP) tibial components or metal-backed (MB) tibial components in total knee arthroplasty (TKA) remains controversial. We therefore performed a meta-analysis and systematic review of randomized controlled trials that have evaluated MB and AP tibial components in primary TKA.

Methods

The search strategy included a computerized literature search (Medline, EMBASE, Scopus, and the Cochrane Central Register of Controlled Trials) and a manual search of major orthopedic journals. A meta-analysis and systematic review of randomized or quasi-randomized trials that compared the performance of tibial components in primary TKA was performed using a fixed- or random-effects model. We assessed the methodological quality of the studies using the Detsky quality scale.

Results

9 randomized controlled trials (RCTs) published between 2000 and 2009 met the inclusion quality standards for the systematic review. The mean standardized Detsky score was 14 (SD 3). We found that the frequency of radiolucent lines in the MB group was significantly higher than that in the AP group. There were no statistically significant differences between the MB and AP tibial components regarding component positioning, knee score, knee range of motion, quality of life, and postoperative complications.

Interpretation

Based on the evidence obtained in this study, the AP tibial component was comparable with or better than the MB tibial component in TKA. However, high-quality RCTs are required to validate the results.

The design of the tibial component is an important factor for implant failure in total knee arthroplasty (TKA) (Pagnano et al. 1999, Forster 2003, Gioe et al. 2007b, Willie et al. 2008, Garcia et al. 2009, KAT Trial Group 2009). The metal-backed (MB) design of tibial component has become predominant in TKA because it is thought to perform better than the all-polyethylene (AP) design (Muller et al. 2006, Gioe et al. 2006, 2007a,b). In theory, the MB tibial component reduces bending strains in the stem, reduces compressive stresses in the cement and cancellous bone beneath the baseplate (especially during asymmetric loading), and distributes load more evenly across the interface (Bartel et al. 1982, 1985, Taylor et al. 1998). However, critics of the MB tibial component point to higher implant costs, reduced polyethylene thickness for the same amount of bone resection, backside wear, and increased tensile stresses at the interface during eccentric loading (Bartel et al. 1982, 1985, Pomeroy et al. 2000, Rodriguez et al. 2001, Li et al. 2002, Muller et al. 2006, Blumenfeld and Scott 2010, Gioe and Maheshwari 2010).

In the past decade, several randomized controlled trials (RCTs) have been performed to assess the effectiveness of the MB tibial component (Adalberth et al. 2000, 2001, Gioe and Bowman 2000, Norgren et al. 2004, Hyldahl et al. 2005a, b, Muller et al. 2006, Gioe et al. 2007, Bettinson et al. 2009, KAT Trial Group 2009). However, the data have not been formally and systematically analyzed using quantitative methods in order to determine whether the MB tibial component is indeed optimal for patients in TKA. 
In this study, we wanted (1) to determine the scientific quality of published RCTs comparing the AP and MB tibial components in TKA using the Detsky score (Detsky et al. 1992) and (2) to conduct a meta-analysis and systematic review of all published RCTs that have compared the effects of AP and MB tibial components on the radiographic and clinical outcomes of TKA.
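For readers unfamiliar with the pooling model named in the Methods above, inverse-variance fixed-effect pooling can be sketched in a few lines. The effect sizes and standard errors below are made-up illustrative numbers, not data from the review:

```python
import math

# Inverse-variance fixed-effect pooling: each study is weighted by the
# reciprocal of its variance, so more precise studies count more.
effects = [0.30, 0.10, 0.25]  # hypothetical per-study effect sizes
ses = [0.15, 0.20, 0.10]      # hypothetical per-study standard errors

weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
print(f"pooled effect {pooled:.3f} (SE {pooled_se:.3f})")
# prints "pooled effect 0.241 (SE 0.077)"
```

A random-effects model would additionally add a between-study variance component to each study's variance before weighting.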

13.

Background and purpose —

Slipped capital femoral epiphysis is thought to result in cam deformity and femoroacetabular impingement. We examined: (1) cam-type deformity, (2) labral degeneration, chondrolabral damage, and osteoarthritic development, and (3) the clinical and patient-reported outcome after fixation of slipped capital femoral epiphysis (SCFE).

Methods —

We identified 28 patients who were treated with fixation of SCFE from 1991 to 1998. 17 patients with 24 affected hips were willing to participate and were evaluated 10–17 years postoperatively. Median age at surgery was 12 (10–14) years. Clinical examination, WOMAC, SF-36 measuring physical and mental function, a structured interview, radiography, and MRI examination were conducted at follow-up.

Results —

Median preoperative Southwick angle was 22° (IQR: 12–27). Follow-up radiographs showed cam deformity in 14 of the 24 affected hips and a Tönnis grade > 1 in 1 affected hip. MRI showed pathological alpha angles in 15 affected hips, labral degeneration in 13, and chondrolabral damage in 4. Median SF-36 physical score was 54 (IQR: 49–56) and median mental score was 56 (IQR: 54–58). These scores were comparable to those of a Danish population-based cohort of similar age and sex distribution. Median WOMAC score was 100 (IQR: 84–100).

Interpretation —

In 17 patients (24 affected hips), we found signs of cam deformity in 18 hips and early stages of joint degeneration in 10 hips. Our observations support the emerging consensus that SCFE is a precursor of cam deformity, FAI, and joint degeneration. Neither clinical examination nor SF-36 or WOMAC scores indicated physical compromise.

In femoroacetabular impingement (FAI), repeated trauma to the acetabular labrum and adjacent chondral structures may result in labral degeneration, tearing of the labrum, chondral delamination, and development of osteoarthritis. Cam-type deformity is characterized by loss of sphericity of the femoral head and decreased head/neck offset laterally and anteriorly (Figure 1A). This deformity has been identified in 17–24% of men and in 4% of women (Gosvig et al. 2008, Reichenbach et al. 2010) and is believed to be one of the main contributors to osteoarthritic development. The etiology of cam-type deformity remains unclear (Beck et al. 2005, Ganz et al. 2008, Jessel et al. 2009, Leunig et al. 2009, Barros et al. 2010, Klit et al. 2011), but high intensity of sports activity during adolescence has been associated with increased risk of cam-type deformity (Siebenrock et al. 2011).

Figure 1. A. Cam-type deformity with characteristic loss of sphericity of the femoral head and decreased head/neck offset laterally and anteriorly after SCFE with in situ screw fixation. B. SCFE with loss of sphericity of the femoral head and decreased head/neck offset laterally and anteriorly, and in situ screw fixation. C. Radial reconstruction of a 3T MRI scan showing anterior cam-type deformity with the characteristic loss of sphericity of the femoral head and decreased head/neck offset anteriorly 15 years after SCFE with in situ screw fixation.

Slipped capital femoral epiphysis (SCFE) is thought to be a precursor of cam-type deformity and therefore possibly also development of osteoarthritis (Murray 1965, Stulberg et al. 
1975, Harris 1986, Leunig et al. 2000, 2009, Beck et al. 2005, Ganz et al. 2008, Gosvig et al. 2008b, Murray and Wilson 2008, Mamisch et al. 2009, Barros et al. 2010, Klit et al. 2011). In typical SCFE (Figure 1B), the epiphysis stays in the acetabular socket and the femoral metaphysis is displaced anteriorly and superiorly, creating the impression of an epiphysis that has slipped posteriorly and inferiorly. The consequence is a reduced or complete loss of head/neck offset, which resembles a prototype of cam-type deformity (Harris 1986, Mintz et al. 2005, Lehmann et al. 2006, Jessel et al. 2009).

SCFE is the most common hip disorder in adolescence (Lehmann et al. 2006, Gholve et al. 2009), with a prevalence of asymptomatic, so-called silent SCFE of 3% in girls and 10% in boys (Lehmann 2008, personal communication), which is less than the population-based prevalence estimates of cam-type deformity in adults of 4–24% (Lehmann 2008, Gosvig et al. 2010, Reichenbach et al. 2010).

To test the hypothesis that SCFE results in cam-type deformity, we raised the following questions: (1) does cam-type deformity or (2) do labral degeneration, chondrolabral damage, and osteoarthritic development appear at 10–17 years of follow-up after fixation of SCFE; and (3) is the clinical and patient-reported outcome affected?

14.

Purpose

We wanted to improve the diagnosis of implant-related infection using molecular biological techniques after sonication.

Methods

We studied 258 retrieved implant components (185 prosthetic implants and 73 osteosynthesis implants) from 126 patients. 47 patients had a clinical diagnosis of infection (108 components) and 79 patients did not (150 components). The fluids from sonication of retrieved implants were tested in culture and were also analyzed using a modified commercial PCR kit for detection of Gram-positive and Gram-negative bacteria (GenoType BC; Hain Lifescience) after extraction of the DNA.

Results

38 of the 47 patients with a clinical diagnosis of infection were also diagnosed as being infected using culture and/or PCR (35 by culture alone). In addition, 24 of the 79 patients with no clinical diagnosis of infection were identified microbiologically as being infected (4 by culture, 16 by PCR, and 4 by both culture and PCR). Comparing culture and PCR, positive culture results were obtained in 28 of the 79 patients and positive PCR results in 35. There were 21 discordant results in patients who were originally clinically diagnosed as being infected and 28 discordant results in patients who had no clinical diagnosis of infection.

Interpretation

For prosthetic joint infections and relative to culture, molecular detection can increase (by one tenth) the number of patients diagnosed as having an infection. Positive results from patients who have no clinical diagnosis of infection must be interpreted carefully.

Management of orthopedic implant-related infections starts with a proper etiological diagnosis, which is required for specific antibiotic treatment. Different approaches are used to obtain such a diagnosis (Trampuz et al. 2006, Del Pozo and Patel 2009), and these must take into account the importance of the development of bacterial biofilms in the pathogenesis and management of implant-related infections (Trampuz et al. 2003, 2006, Costerton 2005).

The use of low-intensity ultrasound to release biofilms is an alternative to classical culture methods from implants, and several protocols have been developed for this purpose (Trampuz et al. 2007, Dora et al. 2008, Esteban et al. 2008, Piper et al. 2009, Achermann et al. 2010). In these reports, sonication of retrieved implants was reported to have sensitivity similar to or higher than that of conventional techniques. Nevertheless, there are still patients with a clinical diagnosis of infection and negative cultures (Berbari et al. 2007). Previous use of antibiotics has been implicated as one of the main causes of this problem (Trampuz et al. 2007), but other causes are also possible. To solve the problem, molecular biological techniques have been proposed in order to obtain faster and more accurate results than conventional culture (Tunney et al. 1999, Sauer et al. 2005, Dempsey et al. 2007, Fihman et al. 2007, Moojen et al. 2007, Gallo et al. 2008, Kobayashi et al. 2008, Vandercam et al. 2008, De Man et al. 2009, Piper et al. 2009, Achermann et al. 2010, Riggio et al. 2010, Marin et al. 2012). 
Most of these reports were based on protocols that were developed in-house, which are difficult to integrate into clinical microbiology routines, even though they may give good results. Recently, however, commercial kits have been designed to work under common routine laboratory conditions. Here, we describe a study on the diagnosis of infection in a broad range of orthopedic implant-related infections, comparing conventional culture with detection of microbial DNA using a commercial kit, in both cases after sonication of the retrieved implants.
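As a rough arithmetic sketch using only the counts reported in this abstract (35 patients diagnosed by culture versus 38 by culture and/or PCR among the 47 with a clinical diagnosis of infection), the "one tenth" increase quoted in the Interpretation can be reproduced:

```python
# Counts from the abstract: of 47 patients with a clinical diagnosis
# of infection, 35 were diagnosed by culture and 38 by culture and/or PCR.
culture_positive = 35
culture_or_pcr_positive = 38

added = culture_or_pcr_positive - culture_positive
relative_increase = added / culture_positive
print(f"PCR added {added} diagnoses ({relative_increase:.1%} relative increase)")
# prints "PCR added 3 diagnoses (8.6% relative increase)"
```

8.6% is roughly the "one tenth" stated in the Interpretation.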

15.

Background and purpose

Ceramic-on-ceramic (CoC) bearings have been in use in total hip replacement (THR) for more than 40 years, with excellent long-term survivorship. Although there have been several simulator studies describing the performance of these joints, there have only been a few retrieval analyses. The aim of this study was to investigate the wear patterns, the surface properties, and friction and lubrication regimes of explanted first-generation alumina bearings.

Materials and methods

We studied 9 explanted CoC bearings from Autophor THRs that were revised for aseptic loosening after a mean of 16 (range 7–19) years. The 3D surface roughness profiles of the femoral heads and acetabular cups (Srms, Sa, and Ssk) were measured to determine the microscopic wear. The bearings were imaged using an atomic-force microscope in contact mode, to produce a topographical map of the surfaces of the femoral heads. Friction tests were performed on the bearing couples to determine the lubrication regime under which they were operating during the walking cycle. The diametral clearances were also measured.

Results

3 femoral heads showed stripe wear and the remaining 6 bearings showed minimal wear. The femoral heads with stripe wear had significantly higher surface roughness than the minimally worn bearings (0.645 vs. 0.289, p = 0.04). High diametral clearances, higher than expected friction, and mixed/boundary lubrication regimes prevailed in these retrieved bearings.

Interpretation

Despite the less than ideal tribological factors, these first-generation CoC bearings still showed minimal wear in the long term compared to previous retrieval analyses.

Ceramic-on-ceramic (CoC) bearings for total hip replacement (THR) were developed in the early 1970s. The earliest designs, typified by the Ceraver-Osteal implant, failed because of inadequate fixation and high fracture rates of the ceramic (Boutin et al. 1988, Mittelmeier and Heisel 1992). Throughout the 1980s, the Mittelmeier Autophor ceramic prosthesis (Smith and Nephew, Memphis, TN) was widely used. The threaded external surface of the acetabular component gave primary stability, but it had no porous surface for bony ingrowth. This design did not improve the rate of aseptic loosening, but the fracture rate was notably reduced (Boutin et al. 1988, Sedel 2000, Tateiwa et al. 2008, Jeffers and Walter 2012). Since the early 1990s, the predominant design has been a rough or porous-coated titanium shell with a ceramic liner.

A recent systematic review of CoC THRs confirmed excellent survivorship of the modern implants of up to 97% at 10 years (Jeffers and Walter 2012). It is likely that improvements in acetabular fixation as well as in the manufacturing process, design, and quality control of the ceramic bearings have contributed to the excellent clinical results. Ceramic bearings are relatively inert and have excellent wear properties (Savarino et al. 2009). There have been only isolated case reports describing osteolysis around CoC bearings, and the preserved bone stock may make revision surgery easier (Yoon et al. 1998, Sedel 2000, Tateiwa et al. 2008, Hannouche et al. 2011). The fracture rates of modern alumina ceramic bearings have been reported to be as low as 1 in 25,000 (Nizard et al. 2005, Tateiwa et al. 2008, Jeffers and Walter 2012).

Hip simulator studies on CoC bearings have consistently shown very low wear rates (Nevelos et al. 2001, Rieker et al. 2001, Tipper et al. 
2002, Stewart et al. 2003), but this has not been reflected in the long-term retrieval analyses (Nevelos et al. 1999, 2001, Prudhommeaux et al. 2000, Affatato et al. 2012). It must be understood, however, that retrieval studies are performed on joints that have failed, not on well-functioning joints, so they do not give information on the larger proportion of successful CoC THRs. There have been only a few long-term retrieval analyses of explanted CoC bearings (Nevelos et al. 1999, 2001, Prudhommeaux et al. 2000) and even fewer retrieval analyses of modern CoC bearings (Affatato et al. 2012). Given the excellent clinical survivorship of the modern implants (Jeffers and Walter 2012), failed first-generation CoC bearings may well have to be studied to more fully understand the in vivo tribology.

The aim of this study was to investigate the wear patterns, surface properties, and friction and lubrication regimes in 9 explanted first-generation alumina CoC bearings. The tribological data from this study are likely to represent the worst-case scenario, which can be used for comparison in future retrieval studies featuring modern CoC bearings.

16.

Background and purpose

Computer navigation in total knee arthroplasty is somewhat controversial. We have previously shown that femoral component positioning is more accurate with computer navigation than with conventional implantation techniques, but the clinical impact of this is unknown. We now report the 5-year outcome of our previously reported 2-year outcome study.

Methods

78 of initially 84 patients (80 of 86 knees) were clinically and radiographically reassessed 5 (5.1–5.9) years after conventional, image-based, and image-free total knee arthroplasty. The methodology was identical to that used preoperatively and at 2 years, including the Knee Society score (KSS) and the functional score (FS), and AP and true lateral standard radiographs.

Results

Although femoral component positioning was more accurate in the navigated groups, clinical outcome, number of reoperations, KSS, FS, and range of motion were similar between the groups.

Interpretation

The increased costs and time required for navigated techniques did not translate into better functional and subjective medium-term outcomes compared to conventional techniques.

Abnormal wear patterns and component loosening are mainly results of component malalignment and are, together with complications of the extensor mechanism, the most common reasons for early failure of TKA (Ritter et al. 1994, Rand et al. 2003, Vince 2003, Bathis et al. 2004). It has been suggested that a varus or valgus malalignment of more than 3° leads to faster wear and debris formation, followed by early failure of TKA (Ecker et al. 1987, Archibeck and White 2003, Nizard et al. 2004).

Several surgical navigation systems for TKA have been introduced to optimize component positioning (Delp et al. 1998, DiGioia et al. 1998, Krackow et al. 1999). It has been shown that navigation provides more precise component positioning and fewer outliers (Bathis et al. 2004, Nabeyama et al. 2004, Stockl et al. 2004, Victor and Hoste 2004, Anderson et al. 2005, Zumstein et al. 2006). Nevertheless, when comparing computer-navigated total knee arthroplasty with conventional implantation techniques, there is no evidence in the current literature of any significant improvement in clinical outcome or in component loosening (Bathis et al. 2004, Jenny et al. 2005, Yau et al. 2005, Bonutti et al. 2008, Molfetta and Caldo 2008).

In a prospective study involving 86 patients in 3 different groups (image-based navigation, image-free navigation, and conventional), we showed that femoral component positioning was more accurate with navigation than with conventional implantation techniques, whereas tibial positioning showed similar results (Zumstein et al. 2006). Although other medium-term data on navigated total knee arthroplasty have already been reported (Ishida et al. 2011, Schmitt et al. 
2011), there have been no prospective cohort series reporting the clinical, functional, and radiographic outcomes of all 3 techniques: image-based navigated, image-free navigated, and conventional TKA. We therefore determined the clinical, functional, and radiographic 5-year results after each of the 3 techniques.

17.

Background and purpose

The natural history of, and predictive factors for, outcome of cartilage restoration in chondral defects are poorly understood. We investigated the natural history of cartilage filling and subchondral bone changes, comparing defects at two locations in the rabbit knee.

Animals and methods

In New Zealand rabbits aged 22 weeks, a 4-mm pure chondral defect (ICRS grade 3b) was created in the patella of one knee and in the medial femoral condyle of the other. A stereo microscope was used to optimize the preparation of the defects. The animals were killed 12, 24, and 36 weeks after surgery. Defect filling and the density of subchondral mineralized tissue were estimated using Analysis Pro software on micrographed histological sections.

Results

The mean filling of the patellar defects was more than twice that of the medial femoral condylar defects at 24 and 36 weeks of follow-up. There was a statistically significant increase in filling from 24 to 36 weeks after surgery at both locations. The density of subchondral mineralized tissue beneath the defects subsided with time in the patellas, in contrast to the density in the medial femoral condyles, which remained unchanged.

Interpretation

The intraarticular location is a predictive factor for spontaneous filling and subchondral bone changes of chondral defects corresponding to ICRS grade 3b. Regardless of location, the spontaneous filling increased with long-term follow-up. This should be considered when evaluating aspects of cartilage restoration.

Focal articular cartilage injuries of the knee are common (Hjelle et al. 2002, Aroen et al. 2004) and they can impair patients' quality of life as much as severe osteoarthritis (Heir et al. 2010). The literature concerning the natural history of focal cartilage defects in patients, and the intrinsic factors affecting it, is limited (Linden 1977, Messner and Gillquist 1996, Drogset and Grontvedt 2002, Shelbourne et al. 2003, Loken et al. 2010). In experimental studies evaluating cartilage restoration in general, the importance of intrinsic factors such as the depth and size of the lesion and the time from when the lesion was made to evaluation has been emphasized (Shapiro et al. 1993, Hunziker 1999, Lietman et al. 2002). Which part of the joint is affected and whether or not the defect is weight-bearing are also of interest (Hurtig 1988, Frisbie et al. 1999). Most of these studies have, however, concerned defects penetrating the subchondral mineralized tissues, corresponding to ICRS grade 4 (Brittberg and Winalski 2003). Access to bone marrow elements in these defects might be one of the strongest predictive factors for filling of the defect, making the importance of other factors difficult to evaluate (Hunziker 1999).

In experimental studies on pure chondral defects that do not penetrate the subchondral mineralized tissues, corresponding to ICRS grade 3b (Brittberg and Winalski 2003), the type of animal studied, the size of the lesion, and the location of the defects vary, and there are limited data on the influence of these parameters on outcome (Breinan et al. 2000). 
The information on spontaneous filling comes mainly from observations of untreated defects serving as controls (Grande et al. 1989, Brittberg et al. 1996, Breinan et al. 1997, 2000, Frisbie et al. 1999, 2003, Dorotka et al. 2005), and the information on subchondral bone changes is even more limited (Breinan et al. 1997, Frisbie et al. 1999). Although most human focal cartilage lesions are located on the medial femoral condyle (Aroen et al. 2004), there have been few experimental studies involving untreated ICRS grade 3b defects on the medial femoral condyle (Dorotka et al. 2005). According to a PubMed search, the rabbit knee is the most widely used experimental animal model for cartilage restoration (Årøen 2005). The locations of ICRS grade 3 chondral defects in the rabbit knee evaluated for spontaneous changes have included the patella (Grande et al. 1989, Brittberg et al. 1996) and, in one study, defects at the distal surface of the femur (Mitchell and Shepard 1976). The latter report did not, however, include quantitative data.

To our knowledge, the influence of the intraarticular location on the outcome of cartilage restoration and subchondral bone changes has not been thoroughly studied. Thus, the main purpose of our study was to test the hypothesis that the intraarticular location influences the spontaneous filling of a chondral defect that does not penetrate the subchondral bone. Secondly, we wanted to evaluate whether the intraarticular location would influence changes in the subchondral bone and degenerative changes, as evaluated from macroscopic appearance and proteoglycan content of synovial fluid (Messner et al. 1993a).

18.
Results

538 patients were available for analysis. The prevalence of persistent pain was 22% (CI: 18–25), and the prevalence of presumed neuropathic pain was 13% (CI: 10–16). Persistent pain was more frequent in fracture patients (29%) than in osteoarthritis patients (16%), while the prevalence of neuropathic pain was similar. Severe pain during the first postoperative week increased the risk of persistent pain. The risk also increased with hemiprosthesis (as compared to total prosthesis) in osteoarthritis patients, and with previous osteosynthesis and pain elsewhere in fracture patients.

Interpretation

Persistent pain after shoulder replacement is a daily burden for many patients. Further studies should address patient and prosthesis selection, postoperative pain management, and follow-up of these patients.

There is a substantial body of literature documenting a possible risk of persistent pain after almost any surgical procedure (Macrae 2001, Johansen et al. 2012). The prevalence rates depend heavily on the type of surgery and vary from 5% to 85% (Kehlet et al. 2006, Macrae 2008). The consequences of chronic or persistent postsurgical pain are significant, not only in terms of suffering and reduced quality of life for the individual patient, but also with regard to the subsequent costs to healthcare and social services. Many authors have reported putative risk factors for persistent pain, including genetic factors, age, psychosocial factors, type of anesthesia, pain elsewhere than at the surgical site, other comorbidities, preoperative pain, and acute postoperative pain (Althaus et al. 2012, VanDenKerkhof et al. 2013). Intraoperative nerve damage and the extent of surgery are also important risk factors (Katz and Seltzer 2009). In fact, many patients with persistent postsurgical pain present with characteristic symptoms of neuropathic pain in the affected area (Kehlet et al. 2006).

There is a scarcity of data on persistent postsurgical pain after orthopedic surgery. To our knowledge, previous studies focusing on persistent postsurgical pain in orthopedic patients have mainly concerned amputation or hip or knee replacement (Nikolajsen et al. 2006, Lundblad et al. 2008, Beswick et al. 2012, Liu et al. 2012, Jansen et al. 2014). Trials of shoulder replacement surgery have more commonly reported pain relief, or a composite score including pain, rather than the prevalence of pain at follow-up. There has been very little research on predictive factors for persistent postsurgical pain following shoulder replacement, but the general outcome has been shown to be associated with diagnosis and prosthesis type (Radnay et al. 2007, Fevang et al. 2013) and with previous shoulder surgery, age, and preoperative Short Form-36 mental score and DASH functional score (Simmen et al. 2008). Identification of subgroups at increased risk is important in order to establish interventions to prevent or minimize the impact of persistent postsurgical pain.

We investigated the prevalence of, characteristics of, and risk factors for persistent pain 1–2 years after more than 500 shoulder replacements performed in Denmark.
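As an illustrative check (not part of the study's reported methods), the confidence intervals above are consistent with a standard Wald 95% interval for a proportion: for a prevalence of 22% among 538 patients, the interval works out to roughly 18.5–25.5%, matching the reported CI of 18–25. A minimal sketch, assuming a simple Wald approximation rather than whatever exact method the authors used:

```python
import math

def wald_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wald 95% confidence interval for a proportion p_hat observed in n subjects."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
    return p_hat - z * se, p_hat + z * se

# Prevalence of persistent pain: 22% of 538 patients
lo, hi = wald_ci(0.22, 538)
print(f"95% CI: {lo:.3f} to {hi:.3f}")  # prints "95% CI: 0.185 to 0.255"
```

For proportions near 0 or 1, or small n, a Wilson or exact (Clopper–Pearson) interval would be preferable; here the Wald approximation is adequate because n is large and the proportion is not extreme.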

19.
Background

Fast-track has become a well-known concept, resulting in improved patient satisfaction and postoperative results. Concerns have been raised about whether increased efficiency could compromise safety, and whether early hospital discharge might result in an increased number of complications. We present 1-year follow-up results after implementing fast-track in a Norwegian university hospital.

Methods

This was a register-based study of 1,069 consecutive fast-track hip and knee arthroplasty patients who were operated on between September 2010 and December 2012. Patients were followed up until 1 year after surgery.

Results

987 primary and 82 revision hip or knee arthroplasty patients were included. 869 primary and 51 revision hip or knee patients attended 1-year follow-up. Mean patient satisfaction was 9.3 out of a maximum of 10. Mean length of stay was 3.1 days for primary patients, 4.2 days for revision hip patients, and 3.9 days for revision knee patients. Revision rates until 1-year follow-up were 2.9% and 3.3% for primary hip and knee patients, and 3.7% and 7.1% for revision hip and knee patients. Function scores and patient-reported outcome scores improved in all groups.

Interpretation

We found reduced length of stay, a high level of patient satisfaction, and low revision rates, together with improved health-related quality of life and functionality, when we introduced fast-track into an orthopedic department in a Norwegian university hospital.

The health service in Norway has been reorganized in the last decade. The number of available beds and the length of stay (LOS) in somatic hospitals have been reduced. Patients are increasingly being treated as outpatients rather than being admitted to hospital (SSB 2011). Changes in treatment modalities have contributed to this reorganization. Within elective surgery, the "fast-track" principles are increasingly being adopted, although there is still potential for improvement regarding both treatment and clinical results (Rostlund and Kehlet 2007, Kehlet and Soballe 2010). Fast-track originated in Denmark, in gastrointestinal surgery, and has been further developed and documented in joint replacement surgery in Danish hospitals over the last decade (Rasmussen et al. 2001, Husted et al. 2010a,d, 2012, Leonhardt et al. 2010). The fast-track concept is an evidence-based multimodal treatment that reduces convalescence time and improves clinical results, including reduction in morbidity and mortality (Kehlet and Wilmore 2008, Schneider et al. 2009). The particularly important elements are anesthesia, fluid therapy, pain therapy, and early postoperative mobilization (Husted and Holm 2006, Husted et al. 2010a, 2011a, 2012, Khan et al. 2014), as well as preoperative information and supervision (Kehlet 1997, Andersen et al. 2007, 2009, Holm et al. 2010).

It has been suggested that fast-track may result in increased complication rates and re-admissions (Mauerhan et al. 2003). However, several studies have found that reduced length of stay does not compromise patient safety (Pilot et al. 2006, Mahomed et al. 2008, Schneider et al. 2009) or increase complication rates compared to conventional treatment methods (Husted et al. 2010b). It has also been shown that fast-track surgery with early mobilization and short deep-vein thrombosis prophylaxis results in low rates of deep-vein thrombosis and pulmonary embolism (Husted et al. 2010c, Jorgensen et al. 2013).

A reorganization of the orthopedic department at Trondheim University Hospital in 2010 led to an increased number of knee and hip arthroplasty patients, from 7 to 17 a week (Egeberg et al. 2010). Based on the successful implementation of fast-track in several hospitals in Denmark (Husted et al. 2008, Kehlet and Wilmore 2008), this model was adopted in our department. To be able to continually monitor treatment quality and process data, we established an internal quality register (Bjorgen et al. 2012). We now present the 1-year follow-up results after implementation of this fast-track procedure.

20.

Background and purpose

Total knee replacement (TKR) is increasingly being performed in elderly patients, yet there is little information on the specific requirements and complication rates encountered in this group. We assessed whether elderly patients undergoing TKR had different length of stay, requirements, complication rates, and functional outcomes compared to their younger counterparts.

Patients and methods

We analyzed prospectively gathered data on 3,144 consecutive primary TKRs (in 2,092 patients aged less than 75 years, 694 patients aged between 75 and 80 years, and 358 patients aged over 80 years at the time of surgery).

Results

Incidence of blood transfusion, urinary catheterization, postoperative confusion, cardiac arrhythmia, and 1-year mortality increased with age, even after adjusting for confounding factors, whereas the incidences of chest infection and mortality at 1 month were highest in those aged 75–80. Rates of thromboembolism, prosthetic infection, and revision were similar in the 3 age groups. All groups showed similar substantial improvements in American Knee Society (AKS) knee scores, which were maintained at 5 years. Older patients had smaller improvements in AKS function score, which deteriorated between 3 and 5 years postoperatively, in contrast to the younger group.

Interpretation

Elderly people stand to gain considerably from TKR, particularly in terms of pain relief, and they should not be denied surgery based solely on age. However, they should be warned that they can expect a longer length of stay, a higher requirement for blood transfusion and/or urinary catheterization, and more medical complications postoperatively. Mortality was also higher in the older age groups. The risks have been quantified to assist in perioperative counselling, informed consent, and healthcare planning.

Healthcare systems and medical professionals will need to cater for increasing numbers of total knee replacements (TKRs) in elderly people in the coming years (Carr et al. 2012), but little is known about the inpatient requirements and postoperative complications of this particular patient group. Some studies have shown good joint-specific pain relief and functional benefits from TKR in the elderly (Anderson et al. 1996, Birdsall et al. 1999), although it has been suggested that elderly patients may attain lower global function than their younger counterparts (Clement et al. 2011, Kennedy et al. 2013). However, studies attempting to describe complications in the elderly undergoing TKR have been small (Zicat et al. 1993, Hosick et al. 1994, Joshi et al. 2003), have lacked comparator groups (Hosick et al. 1994, Joshi et al. 2003), or have failed to quantify the time scales within which complications occurred (Clement et al. 2011, Kennedy et al. 2013). Other studies and registries have been based on discharge-summary databases without specific patient follow-up (Kreder et al. 2005, Mahomed et al. 2005, Scottish Arthroplasty Project 2012). They therefore relied on third-party coding of discharge summaries and reported only on mortality during the index admission (Kreder et al. 2005) or selected complications requiring hospital re-admission within 30 or 90 days (Kreder et al. 2005, Mahomed et al. 2005, Scottish Arthroplasty Project 2012).

The aim of this study was to determine whether elderly patients undergoing TKR had different postoperative length of stay, inpatient requirements (i.e. blood transfusion and urinary catheterization), complication rates, and mortality rates from those of their younger counterparts. Functional outcomes were assessed as a secondary outcome measure, to determine whether elderly patients gained benefit comparable to that of their younger counterparts, independently of recorded admission requirements and complications.
