Designation as "unfit for open repair" is associated with poor outcomes after endovascular aortic aneurysm repair
BACKGROUND: Endovascular aortic aneurysm repair (EVAR) is often offered to patients with abdominal aortic aneurysms (AAAs) considered preoperatively to be unfit for open AAA repair (oAAA). This study describes the short- and long-term outcomes of patients undergoing EVAR with AAAs <6.5 cm who are considered unfit for oAAA.
METHODS AND RESULTS: We analyzed elective EVARs for AAAs <6.5 cm diameter in the Vascular Study Group of New England (2003-2011). Patients were designated as fit or unfit for oAAA by the treating surgeon. End points included in-hospital major adverse events and long-term mortality. We identified patient characteristics associated with being unfit for open repair and predictors of survival using multivariable analyses. Of 1653 EVARs, 309 (18.7%) patients were deemed unfit for oAAA. These patients were more likely to have advanced age, cardiac disease, chronic obstructive pulmonary disease, and larger aneurysms at the time of repair (54 mm for fit versus 56 mm for unfit, P=0.001). Patients unfit for oAAA had higher rates of cardiac (7.8% versus 3.1%, P<0.01) and pulmonary (3.6% versus 1.6%, P<0.01) complications and worse survival rates at 5 years (61% versus 80%; log-rank P<0.01) compared with those deemed fit for oAAA. Finally, patients designated as unfit for oAAA had worse survival even after adjusting for patient characteristics and aneurysm size (hazard ratio, 1.6; 95% confidence interval, 1.2-2.2; P<0.01).
CONCLUSIONS: In patients with AAAs <6.5 cm, designation by the operating surgeon as unfit for oAAA provides insight into both short- and long-term efficacy of EVAR. Patients unable to tolerate oAAA may not benefit from EVAR unless their risk of AAA rupture is very high.
OBJECTIVE: Acute limb ischemia remains one of the most challenging emergencies in vascular surgery. Historically, outcomes following interventions for acute limb ischemia have been associated with high rates of morbidity and mortality. The purpose of this study was to determine contemporary outcomes following lower extremity bypass performed for acute limb ischemia.
METHODS: All patients undergoing infrainguinal lower extremity bypass between 2003 and 2011 within hospitals comprising the Vascular Study Group of New England were identified. Patients were stratified according to whether or not the indication for lower extremity bypass was acute limb ischemia. Primary end points included bypass graft occlusion, major amputation, and mortality at 1 year postoperatively as determined by Kaplan-Meier life table analysis. Multivariable Cox proportional hazards models were constructed to evaluate independent predictors of mortality and major amputation at 1 year.
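The Kaplan-Meier life-table analysis named in the methods can be illustrated with a minimal estimator. This is a generic sketch, not the registry's actual analysis code; the (time, event) pairs below are hypothetical, with event=1 marking the outcome (e.g., graft occlusion) and event=0 marking censoring.

```python
def kaplan_meier(observations):
    """Return [(time, survival_probability)] at each distinct event time.

    Standard convention: subjects censored at time t are still counted as
    at risk for events occurring at t.
    """
    observations = sorted(observations)
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    i = 0
    while i < len(observations):
        t = observations[i][0]
        deaths = sum(1 for time, ev in observations if time == t and ev == 1)
        total_at_t = sum(1 for time, ev in observations if time == t)
        if deaths:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= total_at_t
        while i < len(observations) and observations[i][0] == t:
            i += 1  # skip past all observations at this time
    return curve

# Hypothetical follow-up data: three events (at t=1, 3, 4), two censorings.
curve = kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 1), (5, 0)])
```

Each step multiplies the running survival estimate by the fraction of at-risk subjects who survive that event time, which is how the 1-year occlusion, amputation, and mortality estimates in the results are produced in principle.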
RESULTS: Of 5712 lower extremity bypass procedures, 323 (5.7%) were performed for acute limb ischemia. Patients undergoing lower extremity bypass for acute limb ischemia were similar in age (66 vs 67; P = .084) and sex (68% male vs 69% male; P = .617) compared with chronic ischemia patients, but were less likely to be on aspirin (63% vs 75%; P < .0001) or a statin (55% vs 68%; P < .0001). Patients with acute limb ischemia were more likely to be current smokers (49% vs 39%; P < .0001), to have had a prior ipsilateral bypass (33% vs 24%; P = .004) or a prior ipsilateral percutaneous intervention (41% vs 29%; P = .001). Bypasses performed for acute limb ischemia were longer in duration (270 vs 244 minutes; P = .007), had greater blood loss (363 vs 272 mL; P < .0001), and more commonly utilized prosthetic conduits (41% vs 33%; P = .003). Acute limb ischemia patients experienced increased in-hospital major adverse events (20% vs 12%; P < .0001) including myocardial infarction, congestive heart failure exacerbation, deterioration in renal function, and respiratory complications. Patients who underwent lower extremity bypass for acute limb ischemia had no difference in rates of graft occlusion (18.1% vs 18.5%; P = .77), but did have significantly higher rates of limb loss (22.4% vs 9.7%; P < .0001) and mortality (20.9% vs 13.1%; P < .0001) at 1 year. On multivariable analysis, acute limb ischemia was an independent predictor of both major amputation (hazard ratio, 2.16; confidence interval, 1.38-3.40; P = .001) and mortality (hazard ratio, 1.41; confidence interval, 1.09-1.83; P = .009) at 1 year.
CONCLUSIONS: Patients who present with acute limb ischemia represent a less medically optimized subgroup within the population of patients undergoing lower extremity bypass. These patients may be expected to have more complex operations followed by increased rates of perioperative adverse events. Additionally, despite equivalent graft patency rates, patients undergoing lower extremity bypass for acute ischemia have significantly higher rates of major amputation and mortality at 1 year.
A contemporary comparative analysis of immediate postoperative prosthesis placement following below-knee amputation
BACKGROUND: Despite advances in the treatment of peripheral arterial disease, a significant number of patients ultimately require major amputations. Traditionally, postoperative management of a below-knee amputation involves soft compressive dressings to allow for complete stump healing before initial prosthesis fitting. This technique is associated with a prolonged period of limited mobility, placing patients at risk for deconditioning or fall with a risk of injury to the stump. In contrast, immediate postoperative prosthesis (IPOP) placement allows patients to begin ambulation and rehabilitation on postoperative day 1, which may be of significant physiologic and psychological benefit. The purpose of this study is to compare the outcomes of patients undergoing IPOP placement to those of a historical control group managed with traditional soft compressive dressing placement.
METHODS: Medical records of all consecutive below-knee amputation patients who underwent IPOP (IPOP group; 37 patients, 2007-2010) and all patients who underwent traditional soft compressive dressing placement and were IPOP candidates (non-IPOP group; 35 patients, 2006-2007) were retrospectively reviewed. Patient comorbidities and preoperative ambulation status were compared between the IPOP and non-IPOP groups. Primary outcomes evaluated included perioperative systemic complications, wound complications, need for surgical revision, and the time until placement of a definitive prosthesis. Data were analyzed using the chi-squared test and Student's t-test.
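For a 2x2 comparison such as wound infection yes/no by treatment group, the chi-squared test named in the methods reduces to a closed-form statistic. This is a generic sketch of the test, not the study's analysis code, and the counts used below are illustrative only.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic and p-value (1 df) for the 2x2 table
    [[a, b], [c, d]] -- e.g., rows = IPOP/non-IPOP, columns = infection yes/no.
    """
    n = a + b + c + d
    row1, row2, col1, col2 = a + b, c + d, a + c, b + d
    assert min(row1, row2, col1, col2) > 0, "table margins must be positive"
    stat = n * (a * d - b * c) ** 2 / (row1 * row2 * col1 * col2)
    # For 1 df, the chi-square upper tail equals erfc(sqrt(stat / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

stat, p = chi2_2x2(20, 10, 10, 20)  # hypothetical counts
```

A balanced table (equal cell counts) yields a statistic of 0 and p = 1; larger imbalances drive the statistic up and the p-value down.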
RESULTS: Preoperative comorbidities and patient characteristics of the 2 groups were similar, although the IPOP group was younger (61.5 vs. 69.0 years; P=0.01). Immediate perioperative systemic complication rates were not significantly different between the 2 groups (IPOP 29.7% vs. non-IPOP 31.4%; P=0.876). Postoperative wound complication rates were as follows: wound infection (IPOP 18.9% vs. non-IPOP 25.0%; P=0.555), wound dehiscence (IPOP 29.7% vs. non-IPOP 25.0%; P=0.673), and skin breakdown separate from the incision (IPOP 18.9% vs. non-IPOP 3.6%; P=0.062). Patients in the IPOP group trended towards fewer postoperative falls (IPOP 10.8% vs. non-IPOP 21.4%; P=0.240). The need for revision was significantly greater in the non-IPOP group (IPOP 5.4% vs. non-IPOP 27.6%; P=0.013). The time from surgery to placement of the preparatory prosthesis was 51 days in the IPOP group.
CONCLUSIONS: Patients undergoing IPOP have similar perioperative systemic and wound complication rates compared to those patients undergoing conventional below-knee amputation, but are less likely to require surgical revision. The use of IPOP allows for early ambulation and rehabilitation, which may be of psychological benefit and may decrease the sequelae of prolonged immobilization. IPOP application should be considered for all appropriate candidates requiring below-knee amputation.
Comparison of graft patency, limb salvage, and antithrombotic therapy between prosthetic and autogenous below-knee bypass for critical limb ischemia
BACKGROUND: The autogenous vein is the preferred conduit in below-knee vascular reconstructions. However, many argue that prosthetic grafts can perform well in crural bypass with adjunctive antithrombotic therapy. We therefore compared outcomes of below-knee prosthetic versus autologous vein bypass grafts for critical limb ischemia and the use of adjunctive antithrombotic therapy in both settings.
METHODS: Utilizing the registry of the Vascular Study Group of New England (2003-2009), we studied 1227 patients who underwent below-knee bypass for critical limb ischemia, 223 of whom received a prosthetic graft to the below-knee popliteal artery (70%) or a more distal target (30%). We used propensity matching to identify a cohort of patients who received single-segment saphenous vein but were otherwise similar to the prosthetic cohort in terms of patient characteristics, graft origin/target, and antithrombotic regimen. Main outcome measures were graft patency and major limb amputation within 1 year. Secondary outcomes were bleeding complications (reoperation or transfusion) and mortality. We performed comparisons by conduit type and by antithrombotic therapy.
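One common way to build such a matched cohort is greedy 1:1 nearest-neighbor matching on propensity scores within a caliper. This is a minimal sketch of that general technique, not the study's actual matching procedure; the patient identifiers, scores, and caliper below are hypothetical, and the scores are assumed to come from an upstream logistic model of treatment assignment.

```python
def greedy_match(treated, controls, caliper=0.05):
    """1:1 nearest-neighbor propensity-score matching without replacement.

    `treated` and `controls` map patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs whose scores differ
    by at most `caliper`.
    """
    available = dict(controls)
    pairs = []
    # Match highest-score (hardest-to-match) treated patients first.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]  # matching without replacement
    return pairs

# Hypothetical scores for two prosthetic-conduit and three vein patients.
pairs = greedy_match({"t1": 0.80, "t2": 0.30},
                     {"c1": 0.78, "c2": 0.31, "c3": 0.50})
```

Outcomes are then compared within the matched pairs, which is what allows the patency and amputation comparisons in the results to be read as conduit effects rather than case-mix differences.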
RESULTS: Patients receiving prosthetic conduit were more likely to be treated with warfarin than those with greater saphenous vein (57% vs. 24%, P<0.001). After propensity score matching, we found no significant difference in primary graft patency (72% vs. 73%, P=0.81) or major amputation rates (17% vs. 13%, P=0.31) between prosthetic and single-segment saphenous vein grafts. In a subanalysis of grafts to tibial versus popliteal targets, we noted equivalent primary patency and amputation rates between prosthetic and venous conduits. Whereas overall 1-year prosthetic graft patency rates varied from 51% (aspirin+clopidogrel) to 78% (aspirin+warfarin), no significant differences were seen in primary patency or major amputation rates by antithrombotic therapy (P=0.32 and 0.17, respectively). Further, the incidence of bleeding complications and 1-year mortality did not differ by conduit type or antithrombotic regimen in the propensity-matched analysis.
CONCLUSIONS: Although limited in size, our study demonstrates that, with appropriate patient selection and antithrombotic therapy, 1-year outcomes for below-knee prosthetic bypass grafting can be comparable to those for greater saphenous vein conduit.
BACKGROUND: Patients with peripheral arterial disease often experience treatment failure from restenosis at the site of a prior peripheral endovascular intervention (PVI) or lower extremity bypass (LEB). The impact of these treatment failures on the utilization and outcomes of secondary interventions is poorly understood.
METHODS AND RESULTS: In our regional vascular quality improvement collaborative, we compared 2350 patients undergoing primary infrainguinal LEB with 1154 patients undergoing secondary infrainguinal LEB (LEB performed after previous revascularization in the index limb) between 2003 and 2011. The proportion of patients undergoing secondary LEB increased by 72% during the study period (22% of all LEBs in 2003 to 38% in 2011, P<0.001). In-hospital outcomes, such as myocardial infarction, death, and amputation, were similar between primary and secondary LEB groups. However, in both crude and propensity-weighted analyses, secondary LEB was associated with significantly inferior 1-year outcomes, including major adverse limb event (MALE)-free survival (composite of death, new bypass graft, surgical bypass graft revision, thrombectomy/thrombolysis, or above-ankle amputation; 61.6% for secondary LEB vs 67.5% for primary LEB, P=0.002) and reintervention or amputation-free survival (composite of death, reintervention, or above-ankle amputation; 58.9% for secondary LEB vs 64.1% for primary LEB, P=0.003). Inferior outcomes for secondary LEB were observed regardless of the prior failed treatment type (PVI or LEB).
CONCLUSIONS: In an era of increasing utilization of PVI, a growing proportion of patients undergo LEB in the setting of a prior failed PVI or surgical bypass. When caring for patients with peripheral arterial disease, physicians should recognize that first treatment failure (PVI or LEB) affects the success of subsequent revascularizations.
INTRODUCTION: The impact of a postoperative troponin elevation on long-term survival after vascular surgery is not well-defined. We hypothesize that a postoperative troponin elevation is associated with significantly reduced long-term survival.
METHODS: The Vascular Study Group of New England registry identified all patients who underwent carotid revascularization, open abdominal aortic aneurysm repair (AAA), endovascular AAA repair, or infrainguinal lower extremity bypass (2003-2011). The association of postoperative troponin elevation and myocardial infarction (MI) with 5-year survival was evaluated. Multivariable models identified predictors of survival and of postoperative myocardial ischemia.
RESULTS: In the entire cohort (n = 16,363), the incidence of postoperative troponin elevation was 1.3% (n = 211) and for MI was 1.6% (n = 264). Incidences differed across procedures (P < .0001) with the highest incidences after open AAA: troponin elevation, 3.9% (n = 74); MI, 5.1% (n = 96). On Kaplan-Meier analysis, any postoperative myocardial ischemia predicted reduced survival over 5 years postoperatively: no ischemia, 73% (standard error [SE], 0.5%); troponin elevation, 54% (SE, 4%); MI, 33% (SE, 4%) (P < .0001). This pattern was observed for each procedure subgroup analysis (P < .0001). Troponin elevation (hazard ratio, 1.45; 95% confidence interval, 1.1-2.0; P = .02) and MI (hazard ratio, 2.9; 95% confidence interval, 2.3-3.8; P < .0001) were independent predictors of reduced survival at 5 years.
CONCLUSIONS: Postoperative troponin elevation and MI predict, respectively, a 26% and a 55% relative reduction in survival over the 5 years following a vascular surgical procedure, compared with patients who do not experience myocardial ischemia. This highlights the need to better characterize factors leading to postoperative myocardial ischemia. Postoperative troponin elevation, either alone or in combination with an MI, may be a useful marker for identifying high-risk patients who might benefit from more aggressive optimization in hopes of reducing adverse long-term outcomes.
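The relative reductions quoted in the conclusions follow directly from the Kaplan-Meier estimates in the results (73% survival with no ischemia, 54% with troponin elevation, 33% with MI). A one-line check of that arithmetic:

```python
def relative_reduction(s_reference, s_group):
    """Relative reduction in survival versus a reference group."""
    return (s_reference - s_group) / s_reference

troponin = relative_reduction(0.73, 0.54)  # (0.73 - 0.54) / 0.73
mi = relative_reduction(0.73, 0.33)        # (0.73 - 0.33) / 0.73
print(round(troponin, 2), round(mi, 2))    # prints: 0.26 0.55
```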
The Society for Vascular Surgery Lower Extremity Threatened Limb Classification System: risk stratification based on wound, ischemia, and foot infection (WIfI)
Critical limb ischemia, first defined in 1982, was intended to delineate a subgroup of patients with a threatened lower extremity primarily because of chronic ischemia. It was the intent of the original authors that patients with diabetes be excluded or analyzed separately. The Fontaine and Rutherford systems have been used to classify risk of amputation and likelihood of benefit from revascularization by subcategorizing patients into two groups: ischemic rest pain and tissue loss. Due to demographic shifts over the last 40 years, especially a dramatic rise in the incidence of diabetes mellitus and rapidly expanding techniques of revascularization, it has become increasingly difficult to perform meaningful outcomes analysis for patients with threatened limbs using these existing classification systems. Particularly in patients with diabetes, limb threat is part of a broad disease spectrum. Perfusion is only one determinant of outcome; wound extent and the presence and severity of infection also greatly impact the threat to a limb. Therefore, the Society for Vascular Surgery Lower Extremity Guidelines Committee undertook the task of creating a new classification of the threatened lower extremity that reflects these important considerations. We term this new framework the Society for Vascular Surgery Lower Extremity Threatened Limb Classification System. Risk stratification is based on three major factors that impact amputation risk and clinical management: Wound, Ischemia, and foot Infection (WIfI). The implementation of this classification system is intended to permit more meaningful analysis of outcomes for various forms of therapy in this challenging, but heterogeneous population.
Upper extremity injury management by non-physician emergency practitioners in rural Uganda: A pilot study
Introduction: Improper management of, and resultant poor outcomes from, upper extremity injuries can be economically devastating to patients who rely on manual labour for survival. This is a pilot study using the Quick DASH (Disabilities of the Arm, Shoulder and Hand) survey, a validated outcome measurement tool. Our objective was to assess functional outcomes of patients with acute upper extremity injuries who were cared for by non-physician clinicians as part of a task-shifting programme.
Methods: This pilot study was performed at the Karoli Lwanga Hospital Emergency Centre (EC) in Uganda. Patients were identified retrospectively by querying the EC quality assurance database. An initial list of all patients who sustained traumatic injury (road traffic accident, assault) between March 2012 and February 2013 was narrowed to patients with upper extremity trauma, those 18 years and older, and those with cellular phone access. This subset of patients was called and administered the Quick DASH. The results were subsequently analysed using the standardised DASH metrics. These outcome measures were further analysed based upon injury type (simple laceration, complex laceration, fracture and subluxation).
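The QuickDASH disability score used in this study follows a published scoring rule: at least 10 of the 11 items (each rated 1-5) must be answered, and the score maps the mean response onto a 0-100 scale. The sketch below is a generic implementation of that rule, not the study's analysis code; the response vectors are hypothetical.

```python
def quickdash_score(responses):
    """QuickDASH disability score, 0 (no disability) to 100 (most severe).

    `responses`: the answered items, each an integer 1-5. The instrument
    has 11 items; the standard rule requires at least 10 to be answered,
    otherwise no score is computed.
    """
    if len(responses) < 10:
        return None  # score not computable per the scoring rules
    assert all(1 <= r <= 5 for r in responses), "items are rated 1-5"
    return (sum(responses) / len(responses) - 1) * 25

# All-1 responses score 0; all-5 responses score 100.
score = quickdash_score([2] * 11)  # hypothetical patient -> 25.0
```

On this scale, the cohort's mean of 28.86 sits between the reference values for "no work limitation" (27.5) and "work limited by injury" (52.6) cited in the conclusions.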
Results: There were a total of 25 initial candidates, of whom 17 were able to complete the survey. Using the Quick DASH Outcome Measure, our 17 patients had a mean score of 28.86 (range 5.0–56.8).
Conclusions: When compared to the standardised Quick DASH outcomes (no work limitation at 27.5 vs. work limited by injury at 52.6) the non-physician clinicians appear to be performing upper extremity repairs with good outcomes. The key variable to successful repair was the initial injury type. Although accommodations needed to be made to the standard Quick DASH protocol, the tool appears to be usable in non-traditional settings.
Populations of human cytomegalovirus (HCMV), a large DNA virus, are highly polymorphic in patient samples, which may allow for rapid evolution within human hosts. To understand HCMV evolution, longitudinally sampled genomic populations from the urine and plasma of 5 infants with symptomatic congenital HCMV infection were analyzed. Temporal and compartmental variability of viral populations were quantified using high-throughput sequencing and population genetics approaches. HCMV populations were generally stable over time, with ~88% of SNPs displaying similar frequencies. However, samples collected from plasma and urine of the same patient at the same time were highly differentiated, with approximately 1700 consensus sequence SNPs (1.2% of the genome) identified between compartments. This inter-compartment differentiation was comparable to the differentiation observed in unrelated hosts. Models of demography (i.e., changes in population size and structure) and positive selection were evaluated to explain the observed patterns of variation. Evidence for strong bottlenecks (>90% reduction in viral population size) was consistent among all patients. From the timing of the bottlenecks, we conclude that fetal infection occurred between 13 and 18 weeks gestational age in the patients analyzed, while colonization of the urine compartment followed roughly 2 months later. The timing of these bottlenecks is consistent with the clinical histories of congenital HCMV infections. We next inferred that positive selection plays a small but measurable role in viral evolution within a single compartment. However, positive selection appears to be a strong and pervasive driver of evolution associated with compartmentalization, affecting at least 34 of the genome's 167 open reading frames (~20%). This work offers the most detailed map of HCMV in vivo evolution to date and provides evidence that viral populations can be stable or rapidly differentiate, depending on host environment. The application of population genetic methods to these data provides clinically useful information, such as the timing of infection and compartment colonization.
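The per-site comparison behind statements like "~88% of SNPs displaying similar frequencies" can be illustrated with a toy calculation. This sketch, with hypothetical allele frequencies and an arbitrary stability threshold, is not the study's actual pipeline; it only shows the kind of longitudinal frequency comparison involved.

```python
def fraction_stable(freqs_t1, freqs_t2, threshold=0.10):
    """Fraction of shared SNP sites whose minor-allele frequency changes
    by at most `threshold` between two samples of the same population.

    `freqs_t1`/`freqs_t2` map site identifier -> allele frequency (0-1).
    """
    shared = [site for site in freqs_t1 if site in freqs_t2]
    if not shared:
        return float("nan")
    stable = sum(
        1 for site in shared
        if abs(freqs_t1[site] - freqs_t2[site]) <= threshold
    )
    return stable / len(shared)

# Hypothetical frequencies at three SNP sites, two time points.
frac = fraction_stable({"s1": 0.50, "s2": 0.20, "s3": 0.90},
                       {"s1": 0.55, "s2": 0.60, "s3": 0.88})
```

The same comparison applied across compartments rather than across time would flag the highly differentiated sites that distinguish plasma from urine populations.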
Involvement of Escherichia coli DNA Replication Proteins in Phage Lambda Red-Mediated Homologous Recombination
The Red recombination system of bacteriophage lambda is widely used for genetic engineering because of its ability to promote recombination between bacterial chromosomes or plasmids and linear DNA species introduced by electroporation. The process is known to be intimately tied to replication, but the cellular functions that participate with Red in this process are largely unknown. Here two such functions are identified: the GrpE-DnaK-DnaJ chaperone system and DNA polymerase I. Mutations in either function are found to decrease the efficiency of Red recombination. grpE and dnaJ mutations that greatly decrease Red recombination with electroporated DNA species have only small effects on Red-mediated transduction. This recombination event specificity suggests that the involvement of GrpE-DnaJ-DnaK is not simply an effect on Red structure or stability.
Serum cytokine profiles associated with specific adjuvants used in a DNA prime-protein boost vaccination strategy
In recent years, heterologous prime-boost vaccines have been demonstrated to be an effective strategy for generating protective immunity, consisting of both humoral and cell-mediated immune responses against a variety of pathogens including HIV-1. Previous reports of preclinical and clinical studies have shown the enhanced immunogenicity of viral vector or DNA vaccination followed by heterologous protein boost, compared to using either prime or boost components alone. With such approaches, the selection of an adjuvant for inclusion in the protein boost component is expected to impact the immunogenicity and safety of a vaccine. In this study, we examined in a mouse model the serum cytokine and chemokine profiles for several candidate adjuvants: QS-21, Al(OH)3, monophosphoryl lipid A (MPLA) and ISCOMATRIX adjuvant, in the context of a previously tested pentavalent HIV-1 Env DNA prime-protein boost formulation, DP6-001. Our data revealed that the candidate adjuvants in the context of the DP6-001 formulation are characterized by unique serum cytokine and chemokine profiles. Such information will provide valuable guidance in the selection of an adjuvant for future AIDS vaccine development, with the ultimate goal of enhancing immunogenicity while minimizing reactogenicity associated with the use of an adjuvant. More significantly, results reported here will add to the knowledge on how to include an adjuvant in the context of a heterologous prime-protein boost vaccination strategy in general.
CYLD deubiquitinates RIP1 in the TNFalpha-induced necrosome to facilitate kinase activation and programmed necrosis
BACKGROUND: Necroptosis/programmed necrosis is initiated by a macro-molecular protein complex termed the necrosome. Receptor interacting protein kinase 1 (RIPK1/RIP1) and RIP3 are key components of the necrosome. TNFalpha is a prototypic inducer of necrosome activation, and it is widely believed that deubiquitination of RIP1 at the TNFR-1 signaling complex precedes transition of RIP1 into the cytosol where it forms the RIP1-RIP3 necrosome. Cylindromatosis (CYLD) is believed to promote programmed necrosis by facilitating RIP1 deubiquitination at this membrane receptor complex.
METHODOLOGY/PRINCIPAL FINDINGS: We demonstrate that RIP1 is indeed the primary target of CYLD in TNFalpha-induced programmed necrosis. We observed that CYLD does not regulate RIP1 ubiquitination at the TNF receptor. TNF- and zVAD-induced programmed necrosis was highly attenuated in CYLD(-/-) cells. However, in the presence of cycloheximide or SMAC mimetics, programmed necrosis was only moderately reduced in CYLD(-/-) cells. Under the latter conditions, RIP1-RIP3 necrosome formation is only delayed, but not abolished, in CYLD(-/-) cells. We further demonstrate that RIP1 within the NP-40 insoluble necrosome is ubiquitinated and that CYLD regulates RIP1 ubiquitination in this compartment. Hence, RIP1 ubiquitination in this late-forming complex is greatly increased in CYLD(-/-) cells. Increased RIP1 ubiquitination impairs RIP1 and RIP3 phosphorylation, a signature of kinase activation.
CONCLUSIONS/SIGNIFICANCE: Our results show that CYLD regulates RIP1 ubiquitination in the TNFalpha-induced necrosome, but not in the TNFR-1 signaling complex. In cells sensitized to programmed necrosis with SMAC mimetics, CYLD is not essential for necrosome assembly. Since SMAC mimetics induce the loss of the E3 ligases cIAP1 and cIAP2, the resulting reduction in RIP1 ubiquitination could lessen the requirement for CYLD to remove ubiquitin chains from RIP1 in the TNFR-1 complex. As increased RIP1 ubiquitination in the necrosome correlates with impaired RIP1 and RIP3 phosphorylation and function, these results suggest that CYLD controls RIP1 kinase activity during necrosome assembly.
Transactive response DNA-binding protein 43 (TDP-43) is a major pathological protein in frontotemporal dementia (FTD) and amyotrophic lateral sclerosis (ALS). There are many disease-associated mutations in TDP-43, and several cellular and animal models with ectopic overexpression of mutant TDP-43 have been established. Here we sought to study altered molecular events in FTD and ALS by using induced pluripotent stem cell (iPSC) derived patient neurons. We generated multiple iPSC lines from an FTD/ALS patient with the TARDBP A90V mutation and from an unaffected family member who lacked the mutation. After extensive characterization, two to three iPSC lines from each subject were selected, differentiated into postmitotic neurons, and screened for relevant cell-autonomous phenotypes. Patient-derived neurons were more sensitive than control neurons to 100 nM staurosporine but not to other inducers of cellular stress. Three disease-relevant cellular phenotypes were revealed under staurosporine-induced stress. First, TDP-43 was localized in the cytoplasm of a higher percentage of patient neurons than control neurons. Second, the total TDP-43 level was lower in patient neurons with the A90V mutation. Third, the levels of microRNA-9 (miR-9) and its precursor pri-miR-9-2 decreased in patient neurons but not in control neurons. The latter is likely because of reduced TDP-43, as shRNA-mediated TDP-43 knockdown in rodent primary neurons also decreased the pri-miR-9-2 level. The reduction in miR-9 expression was confirmed in human neurons derived from iPSC lines containing the more pathogenic TARDBP M337V mutation, suggesting miR-9 downregulation might be a common pathogenic event in FTD/ALS. These results show that iPSC models of FTD/ALS are useful for revealing stress-dependent cellular defects of human patient neurons containing rare TDP-43 mutations in their native genetic contexts.
Epidemiologic and clinical evidence suggests that virus infection plays an important role in human type 1 diabetes pathogenesis. We used the virus-inducible BioBreeding Diabetes Resistant (BBDR) rat to investigate the ability of sodium salicylate, a non-steroidal anti-inflammatory drug (NSAID), to modulate development of type 1 diabetes. BBDR rats treated with Kilham rat virus (KRV) and polyinosinic:polycytidylic acid (pIC, a TLR3 agonist) develop diabetes at nearly 100% incidence by ~2 weeks. We found distinct temporal profiles of the proinflammatory serum cytokines, IL-1beta, IL-6, IFN-gamma, IL-12, and haptoglobin (an acute phase protein) in KRV+pIC treated rats. Significant elevations of IL-1beta and IL-12, coupled with sustained elevations of haptoglobin, were specific to KRV+pIC and not found in rats co-treated with pIC and H1, a non-diabetogenic virus. Salicylate administered concurrently with KRV+pIC inhibited the elevations in IL-1beta, IL-6, IFN-gamma and haptoglobin almost completely, and reduced IL-12 levels significantly. Salicylate prevented diabetes in a dose-dependent manner, and diabetes-free animals had no evidence of insulitis. Our data support an important role for innate immunity in virus-induced type 1 diabetes pathogenesis. The ability of salicylate to prevent diabetes in this robust animal model demonstrates its potential use to prevent or attenuate human autoimmune diabetes.
Obesity places major demands on the protein folding capacity of the endoplasmic reticulum (ER), resulting in ER stress, a condition that promotes hepatic insulin resistance and steatosis. Here we identify the transcription factor Kruppel-like factor 15 (KLF15) as an essential mediator of ER stress-induced insulin resistance in the liver. Mice with a targeted deletion of KLF15 exhibit increased hepatic ER stress, inflammation, and JNK activation compared to WT mice; however, KLF15 (-/-) mice are protected against hepatic insulin resistance and fatty liver under high-fat feeding conditions and in response to pharmacological induction of ER stress. The mammalian target of rapamycin complex 1 (mTORC1), a key regulator of cellular energy homeostasis, has been shown to cooperate with ER stress signaling pathways to promote hepatic insulin resistance and lipid accumulation. We find that the uncoupling of ER stress and insulin resistance in KLF15 (-/-) liver is associated with the maintenance of a low energy state characterized by decreased mTORC1 activity, increased AMPK phosphorylation and PGC-1alpha expression, and activation of autophagy, an intracellular degradation process that enhances hepatic insulin sensitivity. Furthermore, in primary hepatocytes, KLF15 deficiency markedly inhibits activation of mTORC1 by amino acids and insulin, suggesting a mechanism by which KLF15 controls mTORC1-mediated insulin resistance. This study establishes KLF15 as an important molecular link between ER stress and insulin action.
Toll-like receptor induced pro-interleukin-1beta and interleukin-6 in monocytes are lower in healthy infants compared to adults
Infants have long been known to have higher infectious disease morbidity and mortality and suboptimal vaccination responses compared to older children and adults. A variety of differences in innate and adaptive immune responses have been described between these two groups. We compared Toll-like receptor (TLR)-induced production of pro-interleukin (IL)-1beta, IL-6, and tumor necrosis factor (TNF)-alpha between 2-month-old infants and adults. TLR 7/8-induced production of pro-IL-1beta and IL-6 in monocytes was lower in 2-month-old infants compared to adults. There was no difference in TLR 7/8-induced production of TNF-alpha. Lower TLR-induced production of pro-IL-1beta and IL-6 in innate immune cells during early infancy likely contributes to suboptimal vaccine responses and infectious disease susceptibility.
E-Science as a Catalyst for Transformational Change in University Research Libraries: A Dissertation
Changes in how research is conducted, from the growth of e-science to the emergence of big data, have led to new opportunities for librarians to become involved in the creation and management of research data, even as the duties and responsibilities of university libraries continue to evolve. This study examines those roles related to e-science while exploring the concept of transformational change and leadership issues in bringing about such a change. Using the framework established by Levy and Merry for first- and second-order change, four case studies of libraries whose institutions are members of the Association of Research Libraries (ARL) are developed. The case studies highlight why the libraries became involved in e-science, the role librarians are assuming related to data management education and policy, and the provision of e-science programs and services. Each case study documents the structural and programmatic changes that have occurred in a library to provide e-science services and programs, the future changes library leaders are working to implement, and the change management process used by managerial leaders to bring about, and permanently embed, those changes in the library culture. Themes such as vision, team leadership, the role of library
Roundtable discussion of: Berkowitz B. Studying the outcomes of community-based coalitions. Am J Community Psychol. 2001 Apr;29(2):213-27
Health Stop is a major chain of ambulatory care centers operating for profit. Until 1985 its physicians were paid a flat hourly wage. In the middle of that year, a new compensation plan was instituted to provide doctors with financial incentives to increase revenues. Physicians could earn bonuses the size of which depended on the gross incomes they generated individually. We compared the practice patterns of 15 doctors, each employed full time at a different Health Stop center in the Boston area, in the same winter months before and after the start of the new arrangement. During the periods compared, the physicians increased the number of laboratory tests performed per patient visit by 23 percent and the number of x-ray films per visit by 16 percent. The total charges per month, adjusted for inflation, grew 20 percent, mostly as a result of a 12 percent increase in the average number of patient visits per month. The wages of the seven physicians who regularly earned the bonus rose 19 percent. We conclude that substantial monetary incentives based on individual performance may induce a group of physicians to increase the intensity of their practice, even though not all of them benefit from the incentives.
Carrying out the Medicine/Public Health Initiative: the roles of preventive medicine and community-responsive care
Leaders in medicine and public health, recognizing the inherent interdependency of these fields, established the Medicine/Public Health Initiative in the mid-1990s as "an evolving forum in which representatives of both sectors can explore their mutual interests in improving health and [can] define collaborative mechanisms to achieve that goal." The Initiative's participants developed six goals that they and others in medicine and public health across the nation should implement: engage the community; change the education process; create joint research efforts by clinical, public health, and preventive medicine investigators; develop a shared view of illness between medicine and public health; work together to provide health care; and work jointly to develop health care assessment measures. The authors describe the six goals in depth and explain the important combined roles of clinically-oriented preventive medicine and community-oriented preventive medicine--as practiced in a model of health care delivery called community-oriented primary care (COPC)--in implementing the Initiative's goals. They then report recent efforts, including two in Boston and Dallas, to merge medicine and public health, and state that academic health centers, which are in the process of reshaping themselves, can help themselves as well as the public by embracing their key role in the effort to integrate medicine and public health. In particular, they can expand and strengthen existing training programs in preventive medicine and COPC or add these programs to their curricula.