A randomized comparison of a 3-week and 6-week vascular surgery simulation course on junior surgical residents' performance of an end-to-side anastomosis
OBJECTIVE: We assessed the effect of an open vascular simulation course on the surgical skill of junior surgical residents in performing a vascular end-to-side anastomosis and determined the course length required for effectiveness. We hypothesized that a 6-week course would significantly increase the surgical skill of junior residents in performing an end-to-side anastomosis, while a 3-week course would not.
METHODS: We randomized 37 junior residents (postgraduate year 1 to 3) to a course consisting of three (short course, n = 18) or six (long course, n = 19) consecutive weekly 1-hour teaching sessions. Content focused on instrument recognition and performance of an end-to-side vascular anastomosis using a simulation model. A standardized 50-point vascular skills assessment (SVSA) measured knowledge and technical proficiency. Senior residents (postgraduate year 4 to 5) were tested at baseline. Junior residents were tested at baseline and at 1 and 16 weeks after course completion, and their scores were compared with baseline and senior resident scores. Residents and faculty completed a standardized anonymous evaluation of the course.
RESULTS: Baseline scores did not differ between short-course and long-course participants. At baseline, junior residents had significantly lower SVSA scores than senior residents (36+/-7 vs 41.4+/-2.5; P = .002). One week after course completion, SVSA scores for short-course (43.5+/-2.9 vs 34.2+/-7.5; P = .008) and long-course (43.9+/-5.6 vs 38.3+/-5.9; P = .006) participants were significantly improved from baseline. SVSA scores decreased slightly at 16 weeks but remained significantly above baseline in short-course participants (39+/-6.2 vs 34.2+/-7.5; P = .03), with a similar trend in long-course participants (40+/-4.5 vs 38.3+/-5.9; P = .08). Course length did not affect improvement in SVSA scores at 1 or 16 weeks. In both short-course and long-course participants, SVSA scores at 1 and 16 weeks were not significantly different from senior resident scores. Course ratings were high, and 95% of residents indicated the course "made them a better surgeon." Residents and faculty felt the educational benefit of the course merited the investment of resources.
CONCLUSIONS: An open vascular simulation course consisting of three weekly 1-hour sessions increased the surgical skill of junior residents in performing a vascular end-to-side anastomosis to that of senior residents on a standardized assessment. A 6-week course provided no additional benefit. This study supports the use of an open vascular simulation course to teach vascular surgical skills to junior residents. A course consisting of three 1-hour sessions is an effective and efficient component of a simulation program for junior surgical residents in a busy surgical center.
OBJECTIVE: Single-segment saphenous vein remains the optimal conduit for infrainguinal revascularization. In its absence, prosthetic conduit may be used. Existing data regarding the significance of anastomotic distal vein adjunct (DVA) usage with prosthetic grafts are based on small series.
METHODS: This is a retrospective cohort analysis derived from the regional Vascular Study Group of New England as well as the Brigham and Women's Hospital database. A total of 1018 infrainguinal prosthetic bypass grafts were captured in the dataset from 73 surgeons at 15 participating institutions. Propensity scoring and 3:1 matching were performed to create similar exposure groups for analysis. Outcome measures of interest included primary patency, freedom from major adverse limb events (MALEs), and amputation-free survival at 1 year as a function of vein patch utilization. Time-to-event data were compared with the log-rank test; multivariable Cox proportional hazards models were used to evaluate the adjusted association between vein cuff usage and the primary end points. DVA was defined as a vein patch, cuff, or boot in any configuration.
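As an editorial illustration of the matching step described above, greedy nearest-neighbor matching on propensity scores with a caliper can be sketched as follows. This is a simplified sketch with hypothetical scores; the study's actual matching procedure is not specified beyond the 3:1 ratio.

```python
def greedy_match(treated, controls, ratio=3, caliper=0.05):
    """Greedily match each treated unit to up to `ratio` unmatched
    controls whose propensity scores fall within `caliper`.

    treated, controls: dicts mapping unit id -> propensity score.
    Returns dict: treated id -> list of matched control ids.
    """
    available = dict(controls)          # controls not yet used
    matches = {}
    # Match hardest-to-match (highest score) treated units first
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: -kv[1]):
        picked = []
        for _ in range(ratio):
            if not available:
                break
            # closest remaining control by absolute score difference
            c_id = min(available, key=lambda c: abs(available[c] - t_ps))
            if abs(available[c_id] - t_ps) > caliper:
                break                   # nothing close enough remains
            picked.append(c_id)
            del available[c_id]
        matches[t_id] = picked
    return matches

# Hypothetical propensity scores for 2 treated and 7 control patients
treated = {"T1": 0.30, "T2": 0.70}
controls = {"C1": 0.29, "C2": 0.31, "C3": 0.28,
            "C4": 0.69, "C5": 0.71, "C6": 0.72, "C7": 0.10}
matches = greedy_match(treated, controls)
```

With 3:1 matching, each treated (DVA) patient draws up to three controls, which is how 88 DVA cases pair with 264 no-DVA controls in the matched cohort reported below.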
RESULTS: Of the 1018 bypass operations, 94 (9.2%) had a DVA whereas 924 (90.8%) did not (no DVA). After propensity score matching, 88 DVAs (25%) and 264 no DVAs (75%) were analyzed. On univariate analysis of the matched cohort, the DVA and no DVA groups were similar in terms of mean age (70.0 vs 69.0; P = .55), male sex (58.0% vs 58.3%; P > .99), and preoperative characteristics such as living at home (93.2% vs 94.3%; P = .79) and independent ambulatory status (72.7% vs 75.7%; P = .64). The DVA and no DVA groups had similar rates of major comorbidities such as hypertension, chronic obstructive pulmonary disease, diabetes mellitus, coronary artery disease, and dialysis dependence (P > .05 for all). Likewise, they had similar rates of distal origin grafts (13.6% vs 12.5%; P = .85), critical limb ischemia indications (P = .53), and prior arterial bypass (58% vs 47%; P = .08). The DVA group had a higher rate of completion angiogram performed (55.7% vs 37.5%; P = .002) and was more likely to be discharged on coumadin (53.4% vs 37.1%; P = .01). By multivariable analysis, use of a DVA was protective against MALEs (hazard ratio, 0.36; 95% confidence interval, 0.14-0.90; P = .03).
CONCLUSIONS: This contemporary multi-institutional propensity-matched study demonstrates that patients who receive distal anastomotic vein adjuncts as part of infrainguinal prosthetic bypass operations in general have more severe comorbidities and more technically challenging operations based on the level of the target vessel and prior bypass attempts. After propensity-matched analysis, the use of a DVA may protect against MALEs in prosthetic bypass surgery and should be considered when feasible.
OBJECTIVE: Smoking is the most important modifiable risk factor for patients with vascular disease. The purpose of this study was to examine smoking cessation rates after vascular procedures and delineate factors predictive of postoperative smoking cessation.
METHODS: The Vascular Study Group of New England registry was used to analyze smoking status preoperatively and at 1 year after carotid endarterectomy, carotid artery stenting, lower extremity bypass, and open and endovascular abdominal aortic aneurysm repair between 2003 and 2009. Of 10,734 surviving patients after one of these procedures, 1755 (16%) were lost to follow-up and 1172 (11%) lacked documentation of their smoking status at follow-up. The remaining 7807 patients (73%) were available for analysis. Patient factors independently associated with smoking cessation were determined using multivariate analysis. The relative contributions of patient and procedure factors, including treatment center, were measured by chi-pie analysis. Variation between treatment centers was further evaluated by calculating expected rates of cessation and by analysis of means. Vascular Study Group of New England surgeons were surveyed regarding their smoking cessation techniques (85% response rate).
RESULTS: At the time of their procedure, 2606 of 7807 patients (33%) were self-reported current smokers. Of these, 1177 (45%) quit within the first year of surgery, with significant variation by procedure type (open abdominal aortic aneurysm repair, 50%; endovascular repair, 49%; lower extremity bypass, 46%; carotid endarterectomy, 43%; carotid artery stenting, 27%). In addition to higher smoking cessation rates with more invasive procedures, age >70 years (odds ratio [OR], 1.90; 95% confidence interval [CI], 1.30-2.76; P < .001) and dialysis dependence (OR, 2.38; 95% CI, 1.04-5.43; P = .04) were independently associated with smoking cessation, whereas hypertension (OR, 1.23; 95% CI, 1.00-1.51; P = .051) demonstrated a trend toward significance. Treatment center was the greatest contributor to smoking cessation, and there was broad variation in smoking cessation rates, from 28% to 62%, between treatment centers. Cessation rates were higher than expected in three centers and significantly lower than expected in two centers. Among survey respondents, 78% offered pharmacologic therapy or referral to a smoking cessation specialist, or both. The smoking cessation rate for patients of these surgeons was 48% compared with 33% in those who did not offer medications or referral (P < .001).
CONCLUSIONS: Patients frequently quit smoking after vascular surgery, and multiple patient-related and procedure-related factors contribute to cessation. However, we noted a significant influence of treatment center on cessation, as well as broad variation in cessation rates between treatment centers. This variation indicates an opportunity for vascular surgeons to impact smoking cessation at the time of surgery.
Severity of chronic obstructive pulmonary disease is associated with adverse outcomes in patients undergoing elective abdominal aortic aneurysm repair
INTRODUCTION: Although chronic obstructive pulmonary disease (COPD) has been implicated as a risk factor for abdominal aortic aneurysm (AAA) rupture, its effect on surgical repair is less defined. Consequently, variation in practice persists regarding patient selection and surgical management. The purpose of this study was to analyze the effect of COPD on patients undergoing AAA repair.
METHODS: We reviewed a prospective regional registry of 3455 patients undergoing elective open AAA repair (OAR) and endovascular AAA repair (EVAR) from 23 centers in the Vascular Study Group of New England from 2003 to 2011. COPD was categorized as none, medical (medically treated but not oxygen [O2]-dependent), and O2-dependent. End points included in-hospital death, pulmonary complications, major postoperative adverse events (MAEs), extubation in the operating room, and 5-year survival. Survival was determined using life-table analysis based on the Social Security Death Index. Predictors of in-hospital and long-term mortality were determined by multivariate logistic regression and Cox proportional hazards analysis.
RESULTS: During the study interval, 2043 patients underwent EVAR and 1412 patients underwent OAR with a nearly equal prevalence of COPD (35% EVAR vs 36% OAR). O2-dependent COPD (4%) was associated with significantly increased in-hospital mortality, pulmonary complications, and MAEs and was also associated with significantly decreased extubation in the operating room among patients undergoing both EVAR and OAR. Five-year survival was significantly diminished among all patients undergoing AAA repair with COPD (none, 78%; medical, 72%; O2-dependent, 42%; P < .001). By multivariate analysis, O2-dependent COPD was independently associated with in-hospital mortality (odds ratio, 2.02; 95% confidence interval, 1.0-4.0; P = .04) and diminished 5-year survival (hazard ratio, 3.02; 95% confidence interval, 2.2-4.1; P < .001).
CONCLUSIONS: Patients with O2-dependent COPD undergoing AAA repair suffer increased pulmonary complications and overall MAEs and have diminished long-term survival. These risks must be carefully factored into the risk-benefit analysis before recommending elective AAA repair in these patients.
Management of a patient with Turner syndrome presenting with an isolated left subclavian artery aneurysm
We describe the case of a 52-year-old woman with Turner syndrome found to have an isolated 3.5-cm left subclavian artery aneurysm. Surgical intervention was performed to decrease the risk of compressive symptoms, distal embolization, and rupture. This entailed exclusion of the aneurysm proximally with a thoracic stent graft, carotid-subclavian bypass, and ligation of the subclavian artery distal to the aneurysm. One-year follow-up demonstrated exclusion of the aneurysm with a 5-mm reduction in maximum aneurysm sac diameter. This case represents the management of a rare isolated left subclavian artery aneurysm in the setting of Turner syndrome, treated successfully with an endovascular approach.
Optimal selection of asymptomatic patients for carotid endarterectomy based on predicted 5-year survival
OBJECTIVE: Although carotid endarterectomy (CEA) is performed to prevent stroke, long-term survival is essential to ensure benefit, especially in asymptomatic patients. We examined factors associated with 5-year survival following CEA in patients with asymptomatic internal carotid artery (ICA) stenosis.
METHODS: Prospectively collected data from 4114 isolated CEAs performed for asymptomatic stenosis across 24 centers in the Vascular Study Group of New England between 2003 and 2011 were used for this analysis. Late survival was determined with the Social Security Death Index. Cox proportional hazard models were used to identify risk factors for mortality within the first 5 years after CEA and to calculate a risk score for predicting 5-year survival.
RESULTS: Overall 3- and 5-year survival rates after CEA in asymptomatic patients were 90% (95% CI, 89%-91%) and 82% (95% CI, 81%-84%), respectively. By multivariate analysis, increasing age, diabetes, smoking history, congestive heart failure, chronic obstructive pulmonary disease, poor renal function (estimated glomerular filtration rate dependence), absence of statin use, and worse contralateral ICA stenosis were all associated with worse survival. Patients classified as low risk (27%), medium risk (68%), and high risk (5%) based on the number of risk factors had 5-year survival rates of 96%, 80%, and 51%, respectively (P < .001).
CONCLUSIONS: More than four out of five asymptomatic patients selected for CEA in the Vascular Study Group of New England achieved 5-year survival, demonstrating that, overall, surgeons in our region selected appropriate patients for carotid revascularization. However, there were patients selected for surgery with high-risk profiles, and our models suggest that the highest risk patients (such as those with multiple major risk factors, including age >= 80, insulin-dependent diabetes, dialysis dependence, and severe contralateral ICA stenosis) are unlikely to survive long enough to realize a benefit of prophylactic CEA for asymptomatic stenosis. Predicting survival is important for decision making in these patients.
Simulation-based training to teach open abdominal aortic aneurysm repair to surgical residents requires dedicated faculty instruction
OBJECTIVE: We assessed the impact of abdominal aortic aneurysm (AAA)-specific simulation training on resident performance in simulated open AAA repair (SOAAAR) and determined whether simulation training required dedicated faculty instruction.
METHODS: We randomized 18 residents (postgraduate years 3-5) to an AAA simulation course consisting of two mandatory practice sessions proctored either by a surgical skills lab coordinator (Group A, n = 8) or by a vascular surgery faculty instructor (Group B, n = 10). All residents received a detailed manual and video demonstrating the technique of open AAA repair. Using a validated tool, vascular faculty who were blinded to resident identity, level of training, and randomization status graded SOAAAR performance via videos that were recorded before and after the course.
RESULTS: Characteristics and baseline scores between Groups A and B were not different. Postcourse, there was no significant improvement in performance in Group A. Group B performance improved significantly from baseline with regard to task-specific checklist scores (44.1 +/- 6.3 vs 34.9 +/- .5; P = .02), global rating scores (28.4 +/- .6 vs 25.3 +/- 5.0; P = .049), and overall assessment of operative competence (P = .02). Time to complete SOAAAR improved in both groups (P = .02). Baseline performance varied significantly with year of training as measured by task-specific checklist scores, global rating scores, final product analysis, time to complete repair, and overall operative competence. Improvement varied inversely with year of training (P < .05), and postcourse scores were equivalent for postgraduate year 3-5 residents.
CONCLUSIONS: An AAA-specific simulation training course improved resident performance in simulated open AAA repair. Dedicated faculty instruction during the simulation training was required for significant improvement in resident performance. The impact of simulation training was greatest in more junior residents. Procedure-specific simulation training with dedicated faculty can be used to effectively teach simulated open AAA repair.
Designation as "unfit for open repair" is associated with poor outcomes after endovascular aortic aneurysm repair
BACKGROUND: Endovascular aortic aneurysm repair (EVAR) is often offered to patients with abdominal aortic aneurysms (AAAs) considered preoperatively to be unfit for open AAA repair (oAAA). This study describes the short- and long-term outcomes of patients undergoing EVAR with AAAs <6.5 cm who are considered unfit for oAAA.
METHODS AND RESULTS: We analyzed elective EVARs for AAAs <6.5 cm diameter in the Vascular Study Group of New England (2003-2011). Patients were designated as fit or unfit for oAAA by the treating surgeon. End points included in-hospital major adverse events and long-term mortality. We identified patient characteristics associated with being unfit for open repair and predictors of survival using multivariable analyses. Of 1653 EVARs, 309 (18.7%) patients were deemed unfit for oAAA. These patients were more likely to have advanced age, cardiac disease, chronic obstructive pulmonary disease, and larger aneurysms at the time of repair (54 versus 56 mm, P=0.001). Patients unfit for oAAA had higher rates of cardiac (7.8% versus 3.1%, P<0.01) and pulmonary (3.6% versus 1.6%, P<0.01) complications and worse survival rates at 5 years (61% versus 80%; log rank P<0.01) compared with those deemed fit for oAAA. Finally, patients designated as unfit for oAAA had worse survival even after adjusting for patient characteristics and aneurysm size (hazard ratio, 1.6; 95% confidence interval, 1.2-2.2; P<0.01).
CONCLUSIONS: In patients with AAAs <6.5 cm, designation by the operating surgeon as unfit for oAAA provides insight into both short- and long-term efficacy of EVAR. Patients unable to tolerate oAAA may not benefit from EVAR unless their risk of AAA rupture is very high.
OBJECTIVE: Acute limb ischemia remains one of the most challenging emergencies in vascular surgery. Historically, outcomes following interventions for acute limb ischemia have been associated with high rates of morbidity and mortality. The purpose of this study was to determine contemporary outcomes following lower extremity bypass performed for acute limb ischemia.
METHODS: All patients undergoing infrainguinal lower extremity bypass between 2003 and 2011 within hospitals comprising the Vascular Study Group of New England were identified. Patients were stratified according to whether the indication for lower extremity bypass was acute limb ischemia. Primary end points included bypass graft occlusion, major amputation, and mortality at 1 year postoperatively as determined by Kaplan-Meier life-table analysis. Multivariable Cox proportional hazards models were constructed to evaluate independent predictors of mortality and major amputation at 1 year.
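The Kaplan-Meier life-table analysis named in these methods reduces to the product-limit formula; the following is a minimal pure-Python sketch for illustration only, not the study's actual analysis code.

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times:  follow-up time for each subject
    events: 1 if the end point occurred at that time, 0 if censored
    Returns a list of (event_time, survival_probability) steps.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, steps, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        # events and tied observations at this time point
        deaths = sum(e for tt, e in data if tt == t)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1 - deaths / n_at_risk   # product-limit update
            steps.append((t, surv))
        n_at_risk -= ties
        i += ties
    return steps

# Toy example: 5 subjects, events at times 1, 2, 3; censoring at 2 and 4
steps = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
```

Each factor (1 - deaths/at-risk) is taken at an observed event time, and censored subjects simply leave the risk set, which is how differential follow-up is handled without excluding those patients.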
RESULTS: Of 5712 lower extremity bypass procedures, 323 (5.7%) were performed for acute limb ischemia. Patients undergoing lower extremity bypass for acute limb ischemia were similar in age (66 vs 67; P = .084) and sex (68% male vs 69% male; P = .617) compared with chronic ischemia patients, but were less likely to be on aspirin (63% vs 75%; P < .0001) or a statin (55% vs 68%; P < .0001). Patients with acute limb ischemia were more likely to be current smokers (49% vs 39%; P < .0001) and to have had a prior ipsilateral bypass (33% vs 24%; P = .004) or a prior ipsilateral percutaneous intervention (41% vs 29%; P = .001). Bypasses performed for acute limb ischemia were longer in duration (270 vs 244 minutes; P = .007), had greater blood loss (363 vs 272 mL; P < .0001), and more commonly utilized prosthetic conduits (41% vs 33%; P = .003). Acute limb ischemia patients experienced increased in-hospital major adverse events (20% vs 12%; P < .0001), including myocardial infarction, congestive heart failure exacerbation, deterioration in renal function, and respiratory complications. Patients who underwent lower extremity bypass for acute limb ischemia had no difference in rates of graft occlusion (18.1% vs 18.5%; P = .77), but did have significantly higher rates of limb loss (22.4% vs 9.7%; P < .0001) and mortality (20.9% vs 13.1%; P < .0001) at 1 year. On multivariable analysis, acute limb ischemia was an independent predictor of both major amputation (hazard ratio, 2.16; confidence interval, 1.38-3.40; P = .001) and mortality (hazard ratio, 1.41; confidence interval, 1.09-1.83; P = .009) at 1 year.
CONCLUSIONS: Patients who present with acute limb ischemia represent a less medically optimized subgroup within the population of patients undergoing lower extremity bypass. These patients may be expected to have more complex operations followed by increased rates of perioperative adverse events. Additionally, despite equivalent graft patency rates, patients undergoing lower extremity bypass for acute ischemia have significantly higher rates of major amputation and mortality at 1 year.
A contemporary comparative analysis of immediate postoperative prosthesis placement following below-knee amputation
BACKGROUND: Despite advances in the treatment of peripheral arterial disease, a significant number of patients ultimately require major amputations. Traditionally, postoperative management of a below-knee amputation involves soft compressive dressings to allow for complete stump healing before initial prosthesis fitting. This technique is associated with a prolonged period of limited mobility, placing patients at risk for deconditioning or falls with risk of injury to the stump. In contrast, immediate postoperative prosthesis (IPOP) placement allows patients to begin ambulation and rehabilitation on postoperative day 1, which may be of significant physiologic and psychological benefit. The purpose of this study was to compare the outcomes of patients undergoing IPOP placement with those of a historical control group managed with traditional soft compressive dressing placement.
METHODS: Medical records of all consecutive below-knee amputation patients who underwent IPOP (IPOP group; 37 patients, 2007-2010) and all patients who underwent traditional soft compressive dressing placement and were IPOP candidates (non-IPOP group; 35 patients, 2006-2007) were retrospectively reviewed. Patient comorbidities and preoperative ambulation status were compared between the IPOP and non-IPOP groups. Primary outcomes evaluated included perioperative systemic complications, wound complications, need for surgical revision, and time until placement of a definitive prosthesis. Data were analyzed using the chi-squared test and Student's t-test.
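As an illustration of the chi-squared comparison named above, the 2x2 test statistic and its 1-degree-of-freedom P value can be computed from first principles. The counts below are made up for the sketch and are not the study data.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared test (no continuity correction) for a
    2x2 table [[a, b], [c, d]]; returns (statistic, P value).

    With 1 degree of freedom, chi-squared is the square of a
    standard normal, so P = erfc(sqrt(stat / 2)).
    """
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return stat, math.erfc(math.sqrt(stat / 2))

# Hypothetical counts: 10/30 with the outcome in one group vs 20/30 in the other
stat, p = chi2_2x2(10, 20, 20, 10)
```

For small expected cell counts, a continuity correction or Fisher's exact test would be the more defensible choice; this sketch shows only the uncorrected statistic.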
RESULTS: Preoperative comorbidities and patient characteristics of the 2 groups were similar, although the IPOP group was younger (61.5 vs. 69.0 years; P=0.01). Immediate perioperative systemic complication rates were not significantly different between the 2 groups (IPOP 29.7% vs. non-IPOP 31.4%; P=0.876). Postoperative wound complication rates were as follows: wound infection (IPOP 18.9% vs. non-IPOP 25.0%; P=0.555), wound dehiscence (IPOP 29.7% vs. non-IPOP 25.0%; P=0.673), and skin breakdown separate from the incision (IPOP 18.9% vs. non-IPOP 3.6%; P=0.062). Patients in the IPOP group trended towards fewer postoperative falls (IPOP 10.8% vs. non-IPOP 21.4%; P=0.240). The need for revision was significantly greater in the non-IPOP group (IPOP 5.4% vs. non-IPOP 27.6%; P=0.013). The time from surgery to placement of the preparatory prosthesis was 51 days in the IPOP group.
CONCLUSIONS: Patients undergoing IPOP have similar perioperative systemic and wound complication rates compared to those patients undergoing conventional below-knee amputation, but are less likely to require surgical revision. The use of IPOP allows for early ambulation and rehabilitation, which may be of psychological benefit and may decrease the sequelae of prolonged immobilization. IPOP application should be considered for all appropriate candidates requiring below-knee amputation.
Comparison of graft patency, limb salvage, and antithrombotic therapy between prosthetic and autogenous below-knee bypass for critical limb ischemia
BACKGROUND: The autogenous vein is the preferred conduit in below-knee vascular reconstructions. However, many argue that prosthetic grafts can perform well in crural bypass with adjunctive antithrombotic therapy. We therefore compared outcomes of below-knee prosthetic versus autogenous vein bypass grafts for critical limb ischemia and the use of adjunctive antithrombotic therapy in both settings.
METHODS: Utilizing the registry of the Vascular Study Group of New England (2003-2009), we studied 1227 patients who underwent below-knee bypass for critical limb ischemia, 223 of whom received a prosthetic graft to the below-knee popliteal artery (70%) or a more distal target (30%). We used propensity matching to identify a cohort of patients who received single-segment saphenous vein grafts yet remained similar to the prosthetic cohort in terms of characteristics, graft origin/target, and antithrombotic regimen. Main outcome measures were graft patency and major limb amputation within 1 year. Secondary outcomes were bleeding complications (reoperation or transfusion) and mortality. We performed comparisons by conduit type and by antithrombotic therapy.
RESULTS: Patients receiving prosthetic conduit were more likely to be treated with warfarin than those with greater saphenous vein (57% vs. 24%, P<0.001). After propensity score matching, we found no significant difference in primary graft patency (72% vs. 73%, P=0.81) or major amputation rates (17% vs. 13%, P=0.31) between prosthetic and single-segment saphenous vein grafts. In a subanalysis of grafts to tibial versus popliteal targets, we noted equivalent primary patency and amputation rates between prosthetic and venous conduits. Whereas overall 1-year prosthetic graft patency rates varied from 51% (aspirin+clopidogrel) to 78% (aspirin+warfarin), no significant differences were seen in primary patency or major amputation rates by antithrombotic therapy (P=0.32 and 0.17, respectively). Further, the incidence of bleeding complications and 1-year mortality did not differ by conduit type or antithrombotic regimen in the propensity-matched analysis.
CONCLUSIONS: Although limited in size, our study demonstrates that, with appropriate patient selection and antithrombotic therapy, 1-year outcomes for below-knee prosthetic bypass grafting can be comparable to those for greater saphenous vein conduit.
BACKGROUND: Patients with peripheral arterial disease often experience treatment failure from restenosis at the site of a prior peripheral endovascular intervention (PVI) or lower extremity bypass (LEB). The impact of these treatment failures on the utilization and outcomes of secondary interventions is poorly understood.
METHODS AND RESULTS: In our regional vascular quality improvement collaborative, we compared 2350 patients undergoing primary infrainguinal LEB with 1154 patients undergoing secondary infrainguinal LEB (LEB performed after previous revascularization in the index limb) between 2003 and 2011. The proportion of patients undergoing secondary LEB increased by 72% during the study period (22% of all LEBs in 2003 to 38% in 2011, P<0.001). In-hospital outcomes, such as myocardial infarction, death, and amputation, were similar between primary and secondary LEB groups. However, in both crude and propensity-weighted analyses, secondary LEB was associated with significantly inferior 1-year outcomes, including major adverse limb event-free survival (composite of death, new bypass graft, surgical bypass graft revision, thrombectomy/thrombolysis, or above-ankle amputation; 61.6% for secondary LEB vs 67.5% for primary LEB, P=0.002) and reintervention- or amputation-free survival (composite of death, reintervention, or above-ankle amputation; 58.9% for secondary LEB vs 64.1% for primary LEB, P=0.003). Inferior outcomes for secondary LEB were observed regardless of the prior failed treatment type (PVI or LEB).
CONCLUSIONS: In an era of increasing utilization of PVI, a growing proportion of patients undergo LEB in the setting of a prior failed PVI or surgical bypass. When caring for patients with peripheral arterial disease, physicians should recognize that first treatment failure (PVI or LEB) affects the success of subsequent revascularizations.
INTRODUCTION: The impact of a postoperative troponin elevation on long-term survival after vascular surgery is not well-defined. We hypothesize that a postoperative troponin elevation is associated with significantly reduced long-term survival.
METHODS: The Vascular Study Group of New England registry was used to identify all patients who underwent carotid revascularization, open abdominal aortic aneurysm (AAA) repair, endovascular AAA repair, or infrainguinal lower extremity bypass (2003-2011). The association of postoperative troponin elevation and myocardial infarction (MI) with 5-year survival was evaluated. Multivariable models identified predictors of survival and of postoperative myocardial ischemia.
RESULTS: In the entire cohort (n = 16,363), the incidence of postoperative troponin elevation was 1.3% (n = 211) and for MI was 1.6% (n = 264). Incidences differed across procedures (P < .0001) with the highest incidences after open AAA: troponin elevation, 3.9% (n = 74); MI, 5.1% (n = 96). On Kaplan-Meier analysis, any postoperative myocardial ischemia predicted reduced survival over 5 years postoperatively: no ischemia, 73% (standard error [SE], 0.5%); troponin elevation, 54% (SE, 4%); MI, 33% (SE, 4%) (P < .0001). This pattern was observed for each procedure subgroup analysis (P < .0001). Troponin elevation (hazard ratio, 1.45; 95% confidence interval, 1.1-2.0; P = .02) and MI (hazard ratio, 2.9; 95% confidence interval, 2.3-3.8; P < .0001) were independent predictors of reduced survival at 5 years.
CONCLUSIONS: Postoperative troponin elevation and MI predict 26% and 55% relatively lower survival, respectively, over the 5 years following a vascular surgical procedure compared with patients who do not experience myocardial ischemia. This highlights the need to better characterize factors leading to postoperative myocardial ischemia. Postoperative troponin elevation, either alone or in combination with MI, may be a useful marker for identifying high-risk patients who might benefit from more aggressive optimization in hopes of reducing adverse long-term outcomes.
The Society for Vascular Surgery Lower Extremity Threatened Limb Classification System: risk stratification based on wound, ischemia, and foot infection (WIfI)
Critical limb ischemia, first defined in 1982, was intended to delineate a subgroup of patients with a threatened lower extremity primarily because of chronic ischemia. It was the intent of the original authors that patients with diabetes be excluded or analyzed separately. The Fontaine and Rutherford systems have been used to classify risk of amputation and likelihood of benefit from revascularization by subcategorizing patients into two groups: ischemic rest pain and tissue loss. Because of demographic shifts over the last 40 years, especially a dramatic rise in the incidence of diabetes mellitus, and rapidly expanding techniques of revascularization, it has become increasingly difficult to perform meaningful outcomes analysis for patients with threatened limbs using these existing classification systems. Particularly in patients with diabetes, limb threat is part of a broad disease spectrum. Perfusion is only one determinant of outcome; wound extent and the presence and severity of infection also greatly impact the threat to a limb. Therefore, the Society for Vascular Surgery Lower Extremity Guidelines Committee undertook the task of creating a new classification of the threatened lower extremity that reflects these important considerations. We term this new framework the Society for Vascular Surgery Lower Extremity Threatened Limb Classification System. Risk stratification is based on three major factors that impact amputation risk and clinical management: Wound, Ischemia, and foot Infection (WIfI). The implementation of this classification system is intended to permit more meaningful analysis of outcomes for various forms of therapy in this challenging but heterogeneous population.
Upper extremity injury management by non-physician emergency practitioners in rural Uganda: A pilot study
Introduction: Improper management of, and the resultant poor outcomes from, upper extremity injuries can be economically devastating to patients who rely on manual labour for survival. Our objective was to assess functional outcomes of patients with acute upper extremity injuries who were cared for by non-physician clinicians as part of a task-shifting programme. This pilot study used the Quick DASH (Disabilities of the Arm, Shoulder and Hand) survey, a validated outcome measurement tool.
Methods: This pilot study was performed at the Karoli Lwanga Hospital Emergency Centre (EC) in Uganda. Patients were identified retrospectively by querying the EC quality assurance database. An initial list of all patients who sustained traumatic injury (road traffic accident, assault) between March 2012 and February 2013 was narrowed to patients with upper extremity trauma, those 18 years and older, and those with cellular phone access. This subset of patients was called and administered the Quick DASH. The results were subsequently analysed using the standardised DASH metrics. These outcome measures were further analysed based upon injury type (simple laceration, complex laceration, fracture and subluxation).
Results: Of the 25 initial candidates, only 17 were able to complete the survey. Using the Quick DASH Outcome Measure, these 17 patients had a mean score of 28.86 (range 5.0–56.8).
Conclusions: When compared to the standardised Quick DASH outcomes (no work limitation, 27.5, vs. work limited by injury, 52.6), the non-physician clinicians appear to be performing upper extremity repairs with good outcomes. The key variable for a successful repair was the initial injury type. Although accommodations needed to be made to the standard Quick DASH protocol, the tool appears to be usable in non-traditional settings.
Populations of human cytomegalovirus (HCMV), a large DNA virus, are highly polymorphic in patient samples, which may allow for rapid evolution within human hosts. To understand HCMV evolution, longitudinally sampled genomic populations from the urine and plasma of 5 infants with symptomatic congenital HCMV infection were analyzed. Temporal and compartmental variability of viral populations was quantified using high-throughput sequencing and population genetics approaches. HCMV populations were generally stable over time, with ~88% of SNPs displaying similar frequencies. However, samples collected from the plasma and urine of the same patient at the same time were highly differentiated, with approximately 1700 consensus sequence SNPs (1.2% of the genome) identified between compartments. This inter-compartment differentiation was comparable to the differentiation observed between unrelated hosts. Models of demography (i.e., changes in population size and structure) and positive selection were evaluated to explain the observed patterns of variation. Evidence for strong bottlenecks (>90% reduction in viral population size) was consistent among all patients. From the timing of the bottlenecks, we conclude that fetal infection occurred between 13 and 18 weeks gestational age in the patients analyzed, while colonization of the urine compartment followed roughly 2 months later. The timing of these bottlenecks is consistent with the clinical histories of congenital HCMV infections. We next inferred that positive selection plays a small but measurable role in viral evolution within a single compartment. However, positive selection appears to be a strong and pervasive driver of evolution associated with compartmentalization, affecting at least 34 of the 167 open reading frames (~20%) of the genome. This work offers the most detailed map of HCMV in vivo evolution to date and provides evidence that viral populations can be stable or rapidly differentiate, depending on host environment.
The application of population genetic methods to these data provides clinically useful information, such as the timing of infection and compartment colonization.
Involvement of Escherichia coli DNA Replication Proteins in Phage Lambda Red-Mediated Homologous Recombination
The Red recombination system of bacteriophage lambda is widely used for genetic engineering because of its ability to promote recombination between bacterial chromosomes or plasmids and linear DNA species introduced by electroporation. The process is known to be intimately tied to replication, but the cellular functions that participate with Red in this process are largely unknown. Here, two such functions are identified: the GrpE-DnaK-DnaJ chaperone system and DNA polymerase I. Mutations in either function are found to decrease the efficiency of Red recombination. grpE and dnaJ mutations that greatly decrease Red recombination with electroporated DNA species have only small effects on Red-mediated transduction. This specificity for the type of recombination event suggests that the involvement of GrpE-DnaK-DnaJ is not simply an effect on Red structure or stability.
Serum cytokine profiles associated with specific adjuvants used in a DNA prime-protein boost vaccination strategy
In recent years, heterologous prime-boost vaccines have been demonstrated to be an effective strategy for generating protective immunity, consisting of both humoral and cell-mediated immune responses, against a variety of pathogens including HIV-1. Previous preclinical and clinical studies have shown the enhanced immunogenicity of viral vector or DNA vaccination followed by a heterologous protein boost, compared with using either the prime or the boost component alone. With such approaches, the selection of an adjuvant for inclusion in the protein boost component is expected to impact the immunogenicity and safety of the vaccine. In this study, we examined in a mouse model the serum cytokine and chemokine profiles of several candidate adjuvants: QS-21, Al(OH)3, monophosphoryl lipid A (MPLA), and ISCOMATRIX adjuvant, in the context of a previously tested pentavalent HIV-1 Env DNA prime-protein boost formulation, DP6-001. Our data revealed that the candidate adjuvants in the context of the DP6-001 formulation are characterized by unique serum cytokine and chemokine profiles. Such information will provide valuable guidance in the selection of an adjuvant for future AIDS vaccine development, with the ultimate goal of enhancing immunogenicity while minimizing the reactogenicity associated with the use of an adjuvant. More significantly, the results reported here add to the knowledge of how to include an adjuvant in a heterologous prime-protein boost vaccination strategy in general.
CYLD deubiquitinates RIP1 in the TNFalpha-induced necrosome to facilitate kinase activation and programmed necrosis
BACKGROUND: Necroptosis/programmed necrosis is initiated by a macromolecular protein complex termed the necrosome. Receptor-interacting protein kinase 1 (RIPK1/RIP1) and RIP3 are key components of the necrosome. TNFalpha is a prototypic inducer of necrosome activation, and it is widely believed that deubiquitination of RIP1 at the TNFR-1 signaling complex precedes the transition of RIP1 into the cytosol, where it forms the RIP1-RIP3 necrosome. Cylindromatosis (CYLD) is believed to promote programmed necrosis by facilitating RIP1 deubiquitination at this membrane receptor complex.
METHODOLOGY/PRINCIPAL FINDINGS: We demonstrate that RIP1 is indeed the primary target of CYLD in TNFalpha-induced programmed necrosis. We observed that CYLD does not regulate RIP1 ubiquitination at the TNF receptor. TNF and zVAD-induced programmed necrosis was highly attenuated in CYLD(-/-) cells. However, in the presence of cycloheximide or SMAC mimetics, programmed necrosis was only moderately reduced in CYLD(-/-) cells. Under the latter conditions, RIP1-RIP3 necrosome formation is only delayed, but not abolished in CYLD(-/-) cells. We further demonstrate that RIP1 within the NP-40 insoluble necrosome is ubiquitinated and that CYLD regulates RIP1 ubiquitination in this compartment. Hence, RIP1 ubiquitination in this late-forming complex is greatly increased in CYLD(-/-) cells. Increased RIP1 ubiquitination impairs RIP1 and RIP3 phosphorylation, a signature of kinase activation.
CONCLUSIONS/SIGNIFICANCE: Our results show that CYLD regulates RIP1 ubiquitination in the TNFalpha-induced necrosome, but not in the TNFR-1 signaling complex. In cells sensitized to programmed necrosis with SMAC mimetics, CYLD is not essential for necrosome assembly. Since SMAC mimetics induce the loss of the E3 ligases cIAP1 and cIAP2, the resulting reduction in RIP1 ubiquitination could lessen the requirement for CYLD to remove ubiquitin chains from RIP1 in the TNFR-1 complex. Because increased RIP1 ubiquitination in the necrosome correlates with impaired RIP1 and RIP3 phosphorylation and function, these results suggest that CYLD controls RIP1 kinase activity during necrosome assembly.
Transactive response DNA-binding protein 43 (TDP-43) is a major pathological protein in frontotemporal dementia (FTD) and amyotrophic lateral sclerosis (ALS). There are many disease-associated mutations in TDP-43, and several cellular and animal models with ectopic overexpression of mutant TDP-43 have been established. Here we sought to study altered molecular events in FTD and ALS by using induced pluripotent stem cell (iPSC) derived patient neurons. We generated multiple iPSC lines from an FTD/ALS patient with the TARDBP A90V mutation and from an unaffected family member who lacked the mutation. After extensive characterization, two to three iPSC lines from each subject were selected, differentiated into postmitotic neurons, and screened for relevant cell-autonomous phenotypes. Patient-derived neurons were more sensitive than control neurons to 100 nM staurosporine but not to other inducers of cellular stress. Three disease-relevant cellular phenotypes were revealed under staurosporine-induced stress. First, TDP-43 was localized in the cytoplasm of a higher percentage of patient neurons than control neurons. Second, the total TDP-43 level was lower in patient neurons with the A90V mutation. Third, the levels of microRNA-9 (miR-9) and its precursor pri-miR-9-2 decreased in patient neurons but not in control neurons. The latter is likely a consequence of reduced TDP-43, as shRNA-mediated TDP-43 knockdown in rodent primary neurons also decreased the pri-miR-9-2 level. The reduction in miR-9 expression was confirmed in human neurons derived from iPSC lines containing the more pathogenic TARDBP M337V mutation, suggesting that miR-9 downregulation might be a common pathogenic event in FTD/ALS. These results show that iPSC models of FTD/ALS are useful for revealing stress-dependent cellular defects of human patient neurons containing rare TDP-43 mutations in their native genetic contexts.