Recent documents in eScholarship@UMMS

Human monoclonal antibodies to Plasmodium falciparum circumsporozoite protein for transient passive protection of malaria travelers to endemic areas

Fri, 05/20/2016 - 3:30pm

Plasmodium falciparum is a protozoan parasite that causes over 214 million cases of malaria worldwide, and the World Health Organization reported an estimated 438,000 deaths attributed to malaria in 2015. Current prevention strategies have reduced malaria cases, but they are costly, have poor efficacy, or are losing ground to emerging resistance. There is a global need for an effective pre-exposure prophylaxis treatment.

The leading malaria vaccine candidate is RTS,S, which contains a monovalent Plasmodium falciparum circumsporozoite protein (CSP). The goal of this vaccine is to induce anti-CSP antibodies that would block sporozoite invasion of hepatocytes and thereby hinder parasite development into the blood-stage infection that causes malaria morbidity and mortality. Antibodies isolated from individuals who have received the RTS,S vaccine have been shown to prevent infection of hepatocytes, suggesting that CSP antibodies could be used prophylactically. However, phase III trial results of the vaccine have shown underwhelming efficacy in children.

Growing resistance to transient protection strategies for travelers and low efficacy in vaccine trials suggest a need for a new treatment strategy. CSP-specific human monoclonal antibodies (mAbs) would be useful for prevention, especially for individuals temporarily exposed to malaria in endemic regions, such as travelers and military personnel.

Isolation and production of therapeutic mAbs traditionally relies on a handful of techniques, including antibody engineering, phage display, and hybridoma generation from transgenic mice. We sorted antigen-specific memory B cells from the peripheral blood of children naturally infected with malaria to isolate CSP-specific memory B cells. These cells were individually sorted, and PCR was performed to amplify the variable regions of each B cell's antibody mRNA. Samples that yielded both heavy- and light-chain antibody sequences were cloned and transiently expressed. We plan to characterize these mAbs for binding and neutralization of CSP to identify functional therapeutic mAbs.

The Use of Complementary and Alternative Medicine (CAM) and Imported Medications by Brazilians in Massachusetts

Fri, 05/20/2016 - 3:30pm

Background: The use of CAM products and imported pharmaceuticals has been rising in the United States. These practices are particularly common in Latino populations. This descriptive study sought to investigate the use of pharmaceuticals imported from Brazil and CAM products by the Brazilian population in Massachusetts.

Methods: A brief anonymous survey was administered to a sample of first-generation Brazilian immigrants residing in Massachusetts, both online via social media and on paper during visits to Brazilian establishments. The questionnaire was administered in Portuguese and explored participants' use of CAM products and imported pharmaceuticals, as well as disclosure of that use to their physicians.

Results: 595 survey responses were collected, and a total of 540 surveys were included in the statistical analysis. 59.1% of respondents reported having used medications imported from Brazil during their residence in the US. The most commonly used classes of imported medications were analgesics and antibiotics. 31.5% of participants reported use of CAM products, most commonly for cold-like symptoms. CAM products and imported medications were most often obtained through friends or relatives who brought them from Brazil. 63.9% of respondents did not inform their physician about their use of imported medications and/or CAM products; the most common reason for not reporting was that the doctor did not ask.

Conclusions: To improve care of Brazilian immigrants, it is essential that US physicians ask patients about the use of imported medications and CAM products. Familiarity with the most commonly used products is important for patient education regarding efficacy, toxicity, and possible drug interactions.

Physiological and Social Stress on Cognitive Performance

Fri, 05/20/2016 - 3:30pm

Humans are highly social creatures, and this provides us with a number of benefits, such as protection and support, but it also opens new avenues for stress from social sources. Basic and translational neuroendocrine research has yielded a rich set of findings and a general understanding of how acute and chronic stress can result in reduced health, earlier aging, and earlier death. Although stress can be indexed by the level of cortisol, the major stress hormone in humans, many interrelated physiological systems are involved in a stress response, including the cardiovascular system. Research toward greater understanding of stress-buffering mechanisms holds value for improved human health in the face of entrenched social stressors.

In particular, acute and chronic stress have consistently been found to impair cognitive performance. Many adults in high-stress environments also face a changing social landscape during the college years: changes in living partners and less control over noise, sleep, exercise, and nutrition. In this pilot investigation, we measured the influence of acute stress on cognitive performance and asked whether social support, a modifiable factor, would buffer the multi-system relationship between stress and cognition.

Broadly, we found (1) that higher levels of salivary cortisol were associated with a faster return to resting cortisol levels after the stressor was removed (a measure of flexible, adaptive functioning of the central HPA stress system), and may also be associated with lower cortisol in the initial response to the stressor; (2) that higher levels of cortisol were associated with impaired cognitive performance after the stress task; and (3) that those reporting high social support showed faster recovery to cardiovascular baseline, with greater social support providing some buffering of the stress response on post-stress cognitive performance.

Terpenes as "resistance-busting" anthelmintic drugs

Fri, 05/20/2016 - 3:30pm

There is an urgent need for new therapies for parasitic helminthic diseases, which affect 1.5-2 billion people worldwide, given the threat of widespread resistance to existing treatments and their incomplete efficacy.

Terpenes are plant secondary metabolites and major constituents of essential oils. Historically, the terpene thymol was used successfully to cure hookworm infections in the early 1900s. Although effective, large doses were needed, and thymol treatment had significant side effects. Because free terpenes are absorbed in the stomach, less than 10% of an oral dose reaches the intestinal site where the parasites live. To overcome these problems we have developed microparticle-encapsulated terpenes and enteric-coated terpene capsules.

We screened 20 terpenes for anthelmintic activity in vitro against adult stages of the hookworm and whipworm parasitic nematodes Ancylostoma ceylanicum and Trichuris muris. Here we will present results of this work, which show the promising potential of some terpenes as pan-nematode anthelmintics. This work has allowed us to classify terpenes into at least two groups based on their in vitro killing kinetics. We have also shown that some terpenes are effective against an albendazole-resistant Caenorhabditis elegans strain, suggesting that terpenes may play an important role in overcoming helminthic drug resistance. We will also present our work on optimizing lead terpene formulations in vitro and in vivo, in animal models of parasitic nematode infection, in order to overcome the remaining challenges and realize the potential of "resistance-busting" terpene-based anthelmintic therapies.

Emergency Medicine Providers Systematically Underestimate Their Opioid Prescribing Practices

Fri, 05/20/2016 - 3:30pm

Background: Opioid misuse is a known public health problem, nationwide and in Massachusetts. The Massachusetts Hospital Association (MHA) developed recommendations to address opioid prescribing in the ED setting, and UMass Memorial Health Care recently implemented a system-wide opioid practice guideline mirroring the MHA policy. Little is known about methods to influence behavior change among ED providers related to opioid prescribing practices. Guideline implementation provided a unique opportunity for a natural experiment related to prescribing patterns, and we hypothesized that a simultaneous experimental intervention providing clinicians with their individual prescribing data would alter their practices beyond any effect achieved solely by being subject to the new guidelines.

Methods: As part of an ongoing, prospective, randomized trial of an intervention hypothesized to influence providers' opioid prescribing, we developed a survey instrument consisting of graphical depictions of the distributions of three measures of opioid prescribing among all ED providers at four UMass-affiliated EDs (attending and resident physicians and advanced practice providers). Clinicians randomized to the intervention arm were asked to identify their perceived position on each distribution. We compared each provider's self-perception to their actual decile.

Results: Fifty-one providers were randomized to the intervention arm, and forty-eight completed the survey (94%). Providers underestimated their decile of opioid prescriptions per hundred total prescriptions by a median of one decile (p = 0.0399 for difference from zero). Attendings underestimated their decile of percentage of patients dispositioned with an opioid prescription by a median of two deciles (p = 0.0292), while residents did not exhibit a significant difference. Providers showed systematic disagreement with their actual number of prescriptions for extended-release opioid formulations (kappa = -0.18), underestimating by a median of one.
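The decile comparison above can be sketched in a few lines. The provider counts and prescribing values below are invented purely for illustration; they are not data from this study.

```python
import statistics

def decile(x, values):
    """Decile (1-10) of x within the distribution `values` (x must be in values)."""
    rank = sorted(values).index(x)          # 0-based position of x in the distribution
    return rank * 10 // len(values) + 1     # map positions onto deciles 1..10

# Hypothetical opioid prescriptions per 100 total prescriptions, one per provider.
rates = [2, 3, 4, 5, 6, 7, 8, 9, 11, 14]

# Each provider's actual decile vs. the decile they perceived for themselves;
# here everyone guesses one decile low, floored at 1.
actual    = [decile(r, rates) for r in rates]
perceived = [max(1, d - 1) for d in actual]

gaps = [a - p for a, p in zip(actual, perceived)]
median_underestimate = statistics.median(gaps)
```

With real survey data, `perceived` would come from each clinician's marked position on the distribution rather than a formula.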

Conclusions: Based upon three measures of ED opioid prescribing, providers’ self-perceptions of their practices systematically underestimated their actual prescribing, which likely has implications related to efforts to influence clinician behavior change.

Correlates of hyaluronic acid and corticosteroid injections among patients with radiographically confirmed osteoarthritis

Fri, 05/20/2016 - 3:30pm

Objective: Despite the rapid proliferation of hyaluronate (HA) and corticosteroid (CO) injections and of clinical guidelines regarding their use in osteoarthritis (OA), information on the characteristics of people receiving them is scarce. We described the use of injections among adults with radiographically confirmed knee OA and identified correlates of injection use.

Methods: We used publicly available data from the Osteoarthritis Initiative and included participants with at least one knee with radiographically confirmed OA (Kellgren-Lawrence (K-L) grade ≥ 2) at baseline. We matched 415 participants reporting HA and/or CO injections during the 6 months before one of the first 7 annual follow-up assessments to 1,841 non-injection users, randomly selecting a study visit for each non-user to match the distribution observed among injection users. Multinomial logistic regression models identified correlates of injection use, including sociodemographic and clinical/functional factors.

Results: Injections were common (16.9% in year 1, 13.7% in year 2, 16.6% in year 3, 13.5% in year 4, 15.9% in year 5, 13.5% in year 6, and 9.9% in year 7), with corticosteroid injections most common (68.4%). HA and CO injections were more commonly reported by those with higher income (e.g., adjusted odds ratio (aOR) for HA, >$50k versus <$25k: 3.63; 95% CI: 1.20-10.99) and less common among blacks (aOR for HA: 0.19; 95% CI: 0.06-0.55). Greater K-L grade (grade 4 versus 2) was associated with increased odds of HA (aOR: 4.79; 95% CI: 2.47-9.30), CO (aOR: 1.56; 95% CI: 1.04-2.34), or both (aOR: 4.94; 95% CI: 1.99-12.27).

Conclusion: Hyaluronic acid and corticosteroid injections are associated with higher socioeconomic position and with indicators of greater disease severity.

Using Interviews to Understand Patients’ Post-operative Pain Management Educational Needs Before and After Elective Total Joint Replacement Surgery

Fri, 05/20/2016 - 3:30pm

Objective: To better understand the education needs of patients electing total joint replacement (TJR) surgery in managing their pain in the post-operative period after discharge from the hospital.

Methods: An exploratory, descriptive, qualitative design was used. A convenience sample of people who reported that they had not received information about pain management prior to TJR surgery was recruited from 9 surgeon practices in 8 states to participate in telephone interviews using open-ended questions. Questions covered: recollection of the pre-operative class attended and its content; experiences with surgical pain after surgery and how it was managed; experiences with pain medicine; experience using non-medicine pain reduction methods; and suggestions for the delivery of pain management information. Interviews were recorded and transcribed, and data were categorized using content analysis techniques.

Results: Seventeen patients were interviewed. Although all remembered attending a pre-operative class prior to their joint replacement surgery, none remembered receiving information during that class about managing pain once they were discharged. All had been prescribed an opioid for post-operative pain management; however, no patients reported receiving any information about the medication other than what was on the pill bottle. Many had concerns about using opioids to control their pain, including side effects such as constipation, and the risk of addiction. The most common non-medicine method used to manage pain was ice. Participants believed that information about pain management, including both non-medicine approaches and instructions for taking opioids, would be helpful and should be delivered at multiple time points: pre-operatively, at discharge, and within the first few days after discharge.

Conclusion: With trends toward shorter hospital stays, home-based pain management is a priority. Understanding the pain management education needs of patients considering elective TJR could inform interventions for this population as well as provide insight into the needs of other patients undergoing surgery.

Effective Pain Information Pre-operatively is Associated with Improved Functional Gain after Total Joint Replacement

Fri, 05/20/2016 - 3:30pm

Objective: We evaluated the association between receipt of pre-operative pain management education and post-operative pain and function in a national prospective cohort.

Methods: We analyzed preoperative, 2-week, and 6-month postoperative data from a nationally representative cohort of 1404 primary unilateral TJR patients with dates of surgery between May 2011 and December 2014. Data included demographics, comorbid conditions, operative joint pain severity (HOOS/KOOS), musculoskeletal disease burden, physical function (SF36 PCS), and mental health (SF36 MCS). At 2 weeks post-op, patients were asked whether they had received information prior to surgery about pain management options and, if so, how helpful the information was. Additionally, patients were asked about use of non-medication methods to relieve operative joint pain. Descriptive statistics were performed.

Results: One third of patients reported not receiving information about pain management; an additional 11% received it but did not find it helpful. There were no pre-operative differences in demographics, comorbid conditions, operative joint pain severity, musculoskeletal disease burden, or SF36 PCS and MCS between those who received information and those who did not. Patients who received information about pain management options were more likely to use non-medication methods to relieve operative joint pain (p < 0.001). They reported less current pain (p = 0.02) and maximum pain (p = 0.03) in their operative joint at 2 weeks post-op. At 6 months post-op, patients who reported not receiving information about pain management had statistically lower physical function scores than those receiving information (p = 0.04). There was no difference in HOOS/KOOS pain scores at 6 months post-op.

Conclusion: More than 40% of TJR patients in this study reported that they received no, or unhelpful, information regarding post-op pain management options, highlighting a need for more consistent patient education. In this study, lack of pain management information appears to negatively impact 6-month post-operative function.

Does the Indication for Breast Surgery Impact Surgical Outcomes? A Contemporary Analysis of the ACS-NSQIP Database

Fri, 05/20/2016 - 3:30pm

Background. There are limited data about whether perioperative outcomes differ based on the indication for breast surgery. Herein we assess whether breast surgery for prophylaxis, compared with surgery for malignancy, impacts surgical outcomes.

Methods. All women who underwent simple or subcutaneous mastectomy were identified from the 2007-2012 ACS-NSQIP database. Patients were identified by ICD-9 codes and categorized into two groups: group 1 consisted of patients diagnosed with breast cancer or carcinoma in situ; group 2 consisted of patients diagnosed with a genetic predisposition to malignant neoplasm of the breast (i.e., a BRCA mutation). Demographic, preoperative, and outcome variables were compared between groups. Outcome variables were analyzed using age- and operative-time-adjusted logistic regression models.

Results. 30,803 patients were identified. Group 1 consisted of 30,644 (99.5%) patients diagnosed with malignancy; group 2 consisted of 159 (0.5%) who underwent prophylactic surgery. In univariate analyses, those undergoing prophylactic surgery were significantly younger (p < 0.01); there were no other preoperative differences between groups. When adjusted, the prophylactic group demonstrated a greater risk of deep vein thrombosis (DVT) (p = 0.03). There were no differences in mortality, superficial/deep/organ-space infections, UTI, wound dehiscence, or MI.

Conclusion. In this analysis of a national cohort of breast surgery patients, those undergoing prophylactic surgery for a genetic predisposition had a greater risk of perioperative DVT compared with those who underwent surgery for malignancy. These data may allow for improved perioperative management to prevent DVT and its devastating consequences.

Synthetic intestinal mucosal barrier using a hydrogel slab integrated microfluidic chip

Fri, 05/20/2016 - 3:30pm

The mucosal barrier lining the intestinal tract plays a key role in metabolic and immunological homeostasis. Repeated disruption of mucosal barrier integrity has been suggested as a precursor event that drives inflammatory bowel diseases and colorectal cancer. Multiple in vitro platform technologies, including trans-wells and microfabricated devices, have been developed to study mucosal barrier function, but static, vertical-axis culture settings limit the ability to simulate and observe the dynamic complexity of the gut microenvironment. Here, we introduce a biomaterials engineering approach that creates a synthetic mucosal barrier in a transverse orientation for direct observation of cellular processes. A type I collagen-hybridized polyacrylamide hydrogel supporting small-molecule transport and epithelial cell adhesion was used as a framework and covalently anchored to a glass slide via silanization chemistry. Villous microstructures ~250 µm in height were manufactured by casting the hydrogel precursor solution in a pre-designed, removable polydimethylsiloxane micropattern mold and polymerizing it with UV light. After sealing the device with another glass slide, we increased the cellular and extracellular complexity of this microfluidic chip by sequentially introducing (i) HT-29 colon epithelial cells, (ii) mucin extracts from pig intestine, (iii) bacteria, and (iv) human peripheral blood-derived mononuclear cells, co-culturing them in a single device. This modular in vitro microphysiological intestinal tissue model may serve as a translational platform to discover the biophysical etiology of mucosal barrier disruption and associated inflammatory diseases.

Estimated and self-reported workloads and lower extremity symptoms for nurses and nursing assistants

Fri, 05/20/2016 - 3:30pm

Objectives and Significance: In US nursing homes, nursing assistants (NAs) are responsible for direct care and resident handling, while nurses’ roles consist primarily of medication distribution and administrative duties. This study examines differences in observed physical exposures and self-reported knee and ankle symptoms of nurses and NAs.

Methods: Observations of clinical staff's postures and handling were made at fixed time intervals using the PATH method. An additive physical workload index (PWI) was computed to compare the lower extremity (LE) workload of NAs and nurses. The PWI combined observed frequencies of postures and handling with their associated forces on the knee and ankle, derived from the University of Michigan's 3D Static Strength Prediction Program. Additionally, surveys on health and working conditions were distributed to employees at 24 nursing homes. Knee and ankle symptoms in the past three months and physical demands were examined by clinical job.
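A minimal sketch of such an additive workload index follows. The posture categories, observation tallies, and force values are invented stand-ins for the PATH observations and the biomechanical force estimates, shown only to illustrate the frequency-times-force computation.

```python
# Hypothetical per-category knee forces (newtons); real values would come from
# a biomechanical model such as the 3D Static Strength Prediction Program.
KNEE_FORCE_N = {"standing": 200.0, "stooping": 900.0, "resident_handling": 1500.0}

def physical_workload_index(obs_counts, forces):
    """Additive PWI: sum over categories of (observed proportion x joint force)."""
    total = sum(obs_counts.values())
    return sum(count / total * forces[cat] for cat, count in obs_counts.items())

# Invented observation tallies for one job group each.
na_obs    = {"standing": 50, "stooping": 20, "resident_handling": 30}
nurse_obs = {"standing": 85, "stooping": 5,  "resident_handling": 10}

pwi_na    = physical_workload_index(na_obs, KNEE_FORCE_N)
pwi_nurse = physical_workload_index(nurse_obs, KNEE_FORCE_N)
```

Because the index weights each posture by its estimated joint load, a group that spends more observed time in high-force tasks (here, resident handling) receives a proportionally higher PWI.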

Results: Frequencies of postures and handling input into the PWI were based on observations of 275 NAs and 40 nurses. The analysis of PWI for the LE demonstrated higher physical exposures on both the knee and ankle for NAs compared to nurses, especially while NAs were performing resident handling. Among survey participants (n = 1467), NAs reported higher mean physical exertion scores than nurses and also higher frequencies of knee and ankle symptoms (p=0.0076) in the previous three months.

Conclusions: In this study, both estimated and self-reported physical workloads were higher among NAs than among nurses, and LE symptoms were also more common among NAs. Safe handling equipment helps reduce some LE exposures for NAs, but interventions for other strenuous tasks should be considered to reduce LE pain symptoms, such as introducing lighter food carts (often pushed by NAs) and limiting the number of dirty linens bagged before transport to the soiled-linen drop-off.

Health Applications of Social Network Analysis and Computational Social Science

Fri, 05/20/2016 - 3:30pm

Social network analysis has proliferated across the social and behavioral sciences, shifting our analytical focus from individuals to the patterns of social ties that connect them. This perspective has enriched our understanding of a great variety of health-related phenomena, including the spread of STDs on contact networks, the spread of health care practices on physicians’ professional networks, the dynamics of patient transfers on networks of clinics, and the spread of weight-related behaviors among adolescents at risk for obesity. The advent of the era of computational social science has augmented the contributions of this perspective, by moving beyond expensive and laborious methods of questionnaires and direct observation to incorporate new techniques of data collection and analysis. For example, these include analysis of electronic health records or other time-stamped communication traces among healthcare practitioners; streams of behavioral data from wearable sensors, location-aware devices, or electronic calendars; automated analysis of text in documents; and mapping networks of interaction by citations and collaboration in clinical research literatures. Whereas much of computational social science has offered new ways of monitoring health behavior and healthcare behavior, or for analyzing those data, a further contribution has been to directly analyze these social processes in system dynamics models, microsimulation, and agent-based models. These approaches allow for computational experiments that assist in predicting and interpreting outcomes from health interventions. This poster will highlight some of my recent and pending work in this domain, aiming to identify potential collaborators in UMCCTS for projects that involve social networks or computational social science.
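As one concrete illustration of the network perspective described above, degree centrality on a contact network identifies the individuals best positioned to transmit or receive along that network. The nodes and edges below are entirely hypothetical.

```python
from collections import defaultdict

# Invented contact network: each edge is an observed contact between two people.
edges = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "E")]

# Build an undirected adjacency structure.
adj = defaultdict(set)
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

# Degree centrality: fraction of the other n-1 nodes each node touches directly.
n = len(adj)
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

most_central = max(centrality, key=centrality.get)  # "B" touches 3 of the 4 others
```

On a real contact or collaboration network, the same computation would flag candidates for targeted interventions such as screening or information campaigns.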

A Pilot Randomized Controlled Trial of a Videoconferencing Smoking Cessation Intervention for Korean American Women: Preliminary Findings

Fri, 05/20/2016 - 3:30pm

Introduction: Korean American women prefer online or telephone smoking cessation interventions that can be accessed remotely from home. However, such interventions have been found ineffective for this group.

Methods: This pilot clinical trial tested the feasibility and acceptability of a videoconferencing smoking cessation intervention for Korean American women and compared its preliminary efficacy with a telephone-based smoking cessation intervention. Korean women in the United States were recruited nationwide and randomly assigned at a ratio of 1:1 to either a video arm or a telephone arm. Participants in both arms received eight 30-minute weekly individualized counseling sessions of a culturally adapted smoking cessation intervention and nicotine patches for 8 weeks. They were followed up at post-quit 1, 2, and 3 months.

Results: A total of 168 Korean Americans were assessed for eligibility; 77 were determined to be eligible, and 49 participated in the study. The videoconferencing intervention was acceptable and feasible for women under 50 years, but not for older women. It produced self-reported abstinence rates of 67% at post-quit 1 month and 42% at post-quit 3 months. The 3-month rate dropped to 33% when women whose abstinence could not be validated with salivary cotinine tests were counted as smoking. Abstinence rates in the telephone arm did not differ from those in the video arm.

Conclusion: Findings suggest that a videoconferencing smoking cessation intervention may be feasible and acceptable for Korean American women under 50 years. For older Korean American women, however, the intervention may not be feasible, and a telephone-based intervention seems to be just as effective when smoking cessation components are adapted at a deep structural level to Korean culture, integrating its core cultural values and addressing the psychological, social, and environmental forces affecting the behavior.

Long-lasting Effects of Perinatal Exposure to a Brominated Flame Retardant on Male Reproductive Outcomes in a Rat Model

Fri, 05/20/2016 - 3:30pm

A meta-analysis of 101 studies published between 1934 and 1996 indicates that mean sperm concentration decreased by around 50% during this period. More recent studies have found alarmingly poor semen quality in the general population of Northern Europe. Additional adverse trends include an increased incidence of testicular cancer and of congenital malformations such as cryptorchidism and hypospadias; testicular germ cell cancers increased by about 400% over a period of 50 years in industrialized countries. Decreased male reproductive health has been linked to exposure to environmental endocrine disruptors. However, the ability of xenobiotics to produce long-lasting effects, and the mechanisms that perturb the male reproductive system following developmental exposure, are not well understood. Both animal experiments and human studies show male reproductive toxicity of polybrominated diphenyl ethers (PBDEs), a group of ubiquitous, persistent, and bioaccumulative environmental xenobiotics. Here we report the results of an experiment in which pregnant Wistar rats were fed 0.2 mg/kg body weight of BDE-47 (the most ubiquitous PBDE congener) daily from the eighth day of pregnancy until weaning. Multiple endpoints of male reproductive health were assessed in offspring at postnatal week 20: testis size; sperm production, morphology, and motility; circulating testosterone; expression of selected genes in prostate (qRT-PCR); and genome-wide gene expression in testis (RNA-seq). Seventeen weeks after exposure ended, testis size was significantly smaller in exposed adult rats, and inflammatory-response genes were significantly upregulated in testis tissue, among other findings. Our findings confirm the male reproductive toxicity of PBDEs and identify the inflammatory response as a long-lasting mechanism of reproductive toxicity triggered by perinatal exposure.

Characterization of Respiratory Phenotype in Very Long-chain Acyl-CoA Dehydrogenase Deficient Mice.

Fri, 05/20/2016 - 3:30pm

Rationale: Very long-chain acyl-CoA dehydrogenase (VLCAD) deficiency is the most common inherited long-chain fatty acid oxidation disorder. The VLCAD enzyme catalyzes the first step of mitochondrial fatty acid oxidation, and loss of the enzyme results in energy deficiency as well as accumulation of long-chain fatty acids. Recently, a related enzyme, long-chain acyl-CoA dehydrogenase (LCAD), which unlike VLCAD is not highly expressed in metabolic tissues such as liver, heart, and skeletal muscle, was found to be expressed in the lung, and surfactant and lung dysfunction were observed in LCAD-deficient mice. Respiratory distress syndrome has been described in other fatty acid oxidation disorders. VLCAD is expressed in lung and likely plays an important role in lung compliance.

Methods: VLCAD-deficient mice and littermate controls were fasted for 18 hours, then exercised on a treadmill for 2 hours. Breathing was immediately assessed using whole-body plethysmography in unanesthetized, spontaneously breathing mice. After a stable baseline was achieved, mice were given a respiratory challenge with 7% hypercapnia. In a subgroup of animals, pulmonary mechanics were assessed using the Flexivent system (Scireq).

Results: Following exercise, VLCAD-deficient mice had decreased tidal volume and minute ventilation compared with their wild-type controls. However, post-exercise, VLCAD-deficient mice were able to stabilize to baseline levels similar to wild type. The VLCAD-deficient mice had a decreased response to the 7% hypercapnia respiratory challenge. Early preliminary results suggest that VLCAD-deficient animals have lower airway resistance.

Conclusions: Respiratory insufficiency was demonstrated in fasted, exercise-challenged VLCAD-deficient mice.

Neighborhood Differences in the Availability of Healthy Foods in the City of Worcester

Fri, 05/20/2016 - 3:30pm

INTRODUCTION. The neighborhood food environment is important to healthy eating. The availability and proximity of healthy foods have been shown to affect dietary quality, obesity, and overall health. We surveyed food stores throughout the City of Worcester to assess the variability of food availability across neighborhoods and inequalities in access to fresh produce, unprocessed foods, and other healthy food options by neighborhood socioeconomic status (N-SES).

METHODS. Where permitted by the store manager, the Community Nutrition Environment Evaluation Data Systems (C-NEEDS) survey was completed inside the store by trained staff. A Healthy Food Availability Index (HFAI; range 0-56) and an Unhealthy Food Availability Index (UFAI; range 0-39) were calculated for each store. Higher HFAI indicates greater availability of healthy food items, and higher UFAI indicates greater availability of unhealthy foods. Median household income and car ownership data were derived at the census tract level as measures of N-SES using 2013 US Census American Community Survey 5-year estimates.
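The index computation can be sketched as a simple checklist sum. The item lists and store inventory below are hypothetical stand-ins for the C-NEEDS instrument's actual scored items (which yield the 0-56 and 0-39 ranges).

```python
# Hypothetical checklist items; the real C-NEEDS survey scores many more.
HEALTHY_ITEMS = ("fresh_fruit", "fresh_vegetables", "whole_grain_bread", "low_fat_milk")
UNHEALTHY_ITEMS = ("sugary_drinks", "chips", "candy")

def availability_index(stocked, checklist):
    """One point for each checklist item the store stocks."""
    return sum(item in stocked for item in checklist)

# Invented inventory for a single store.
corner_store = {"chips", "candy", "sugary_drinks", "fresh_fruit"}
hfai = availability_index(corner_store, HEALTHY_ITEMS)    # stocks 1 of 4 healthy items
ufai = availability_index(corner_store, UNHEALTHY_ITEMS)  # stocks all 3 unhealthy items
```

Comparing the two scores per store, and aggregating by store type and census tract, gives the neighborhood contrasts reported in the Results.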

RESULTS. Convenience stores (mean HFAI 7.9, UFAI 21.1) had lower availability of both healthy and unhealthy foods than grocery stores (HFAI 32.4, UFAI 29.8). However, convenience stores had a higher ratio of unhealthy to healthy foods. Neighborhoods with lower median income and car ownership had a greater density of convenience stores, while neighborhoods with higher SES and car ownership had fewer convenience stores nearby. Grocery stores in higher-SES neighborhoods also had more healthy food options.

DISCUSSION. These results demonstrate that residents in lower SES neighborhoods may be disadvantaged when it comes to availability of healthy foods. These neighborhoods have higher density of convenience stores that may promote an unhealthy eating environment. Residents in these neighborhoods may wish to make healthy choices, but without access to a car may be unable or unwilling to walk to the nearest store where healthy alternatives are available.

Towards Pharmacovigilance Using Machine Learning To Identify Unknown Adverse Reactions Triggered By Drug-Drug Interaction

Fri, 05/20/2016 - 3:30pm

Adverse drug reactions (ADRs) are a major cause of morbidity and mortality worldwide. There is thus a growing need for methods that facilitate the automated detection of drug-related ADRs, especially ADRs that were not identified in clinical trials but arise later through drug-drug interactions (DDIs). The goal of this research is to discover severe, unknown adverse drug reactions caused by combinations of drugs. We propose to use association rule mining to find ADRs caused by drug combinations that are not known to occur when the drugs are taken individually. For evaluation, we will test the proposed strategies on real-world medical data extracted from FAERS, the FDA's spontaneous adverse drug reaction reporting system. The results mined by our tool will be checked manually against the published literature and then verified by domain experts for interestingness and accuracy.
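
The core comparison behind this kind of rule mining is a pair-versus-single-drug confidence check. A minimal, hypothetical sketch (drug names, reactions, and thresholds are illustrative; a real analysis would run Apriori-style association rule mining over the full FAERS report set):

```python
# Toy FAERS-style reports: (set of drugs in the report, set of reported reactions).
reports = [
    ({"drugA", "drugB"}, {"arrhythmia"}),
    ({"drugA", "drugB"}, {"arrhythmia"}),
    ({"drugA"}, {"nausea"}),
    ({"drugB"}, {"headache"}),
]

def conf(with_drugs, without_drugs, reaction):
    """Confidence of the rule with_drugs -> reaction, restricted to
    reports that contain none of the drugs in without_drugs."""
    hits = [rx for drugs, rx in reports
            if with_drugs <= drugs and not (without_drugs & drugs)]
    return sum(reaction in rx for rx in hits) / len(hits) if hits else 0.0

# A DDI signal: the pair predicts the reaction, but neither drug alone does.
pair_conf = conf({"drugA", "drugB"}, set(), "arrhythmia")   # 1.0
a_alone   = conf({"drugA"}, {"drugB"}, "arrhythmia")        # 0.0
b_alone   = conf({"drugB"}, {"drugA"}, "arrhythmia")        # 0.0
is_interaction_signal = pair_conf > 0.5 and max(a_alone, b_alone) < 0.1
```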

Reducing Phlebotomy-Induced Blood Loss in the PICU: A Quality Improvement Study

Fri, 05/20/2016 - 3:30pm

Introduction: Phlebotomy-induced blood loss contributes to the development of anemia in critically ill children. Major contributing factors include blood overdraw from indwelling catheters and the use of larger-volume collection containers. We targeted these causes of excessive blood draws to decrease the volumes of both discarded blood and samples sent to the laboratory for standard tests.

Methods: Pre- and post-intervention quality improvement study in a 10-bed pediatric intensive care unit, 2014-2015. All patients admitted to the PICU during each 2-month study period were eligible for enrollment. Pre-intervention, nurses used standard pediatric tubes (3-3.5 mL). A wash-out period followed. For the intervention, an email survey and nursing/resident education sessions were used to standardize discard volumes and introduce microtainers (500 µL) for standard hematology, chemistry, and coagulation tests. All blood draws were recorded, as were tests left incomplete due to insufficient volume or clotting.

Results: 45 patients (138 blood draws) were enrolled in the pre-intervention phase and 32 patients (142 draws) in the intervention phase. Pre-intervention, the mean total blood volume sent to the lab was 2.25 ±1.87 mL and the mean discard volume was 1.64 ±1.67 mL. Post-intervention, the mean total volume sent and the mean discard volume were significantly reduced to 1.52 ±1.50 mL and 0.89 ±0.61 mL (both p<0.05), reductions of 32.4% and 45.7% respectively. There was no change in test failures due to low volume or clotting. Samples from peripheral intravenous catheters (PIVs) comprised the majority of the cohort. Pre-intervention, the mean total blood volume from PIVs (n=116) was 2.06 ±1.51 mL and the mean discard volume was 1.74 ±1.33 mL. Post-intervention (n=72), the mean volumes decreased significantly to 1.39 ±1.49 mL and 1.15 ±0.53 mL respectively (p<0.05).

Conclusions: We demonstrated a significant reduction in phlebotomy-induced blood loss by standardizing discard volumes and using microtainers to avoid sending unnecessary blood volumes to the lab.
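
The reported percent reductions follow directly from the pre- and post-intervention means; a quick arithmetic check:

```python
def pct_reduction(pre, post):
    """Percent reduction from a pre-intervention mean to a post-intervention mean."""
    return round(100 * (pre - post) / pre, 1)

total_reduction   = pct_reduction(2.25, 1.52)  # 32.4 (total volume sent to lab)
discard_reduction = pct_reduction(1.64, 0.89)  # 45.7 (discard volume)
```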

Translating dosage compensation to trisomy 21

Fri, 05/20/2016 - 3:30pm

Down syndrome is the leading genetic cause of intellectual disability, occurring in 1 out of 700 live births. Because Down syndrome is caused by an extra copy of chromosome 21, involving over-expression of 400 genes across a whole chromosome, conventional gene-targeted therapy is effectively precluded. Our lab has long studied the natural dosage compensation mechanism of X chromosome inactivation. To “dosage compensate” X-linked genes between females and males, the X-linked XIST gene produces a large non-coding RNA that silences one of the two X chromosomes in female cells. The motivation of this study was to translate the natural mechanism of X chromosome inactivation into a chromosome therapy for Down syndrome. Using genome editing with zinc finger nucleases, we successfully inserted a large XIST transgene into chromosome 21 in Down syndrome iPS cells, resulting in chromosome-wide transcriptional silencing of the extra chromosome 21. Remarkably, deficits in proliferation and neural growth are rapidly reversed upon silencing one chromosome 21. Successful trisomy silencing in vitro surmounts the first major step toward potential development of “chromosome therapy” for Down syndrome. The human iPSC-based trisomy correction system we established opens a unique opportunity to identify therapeutic targets and to study transplantation therapies for Down syndrome.

Which is the primary factor influencing running stride parameters: age or lower limb strength?

Fri, 05/20/2016 - 3:30pm

Much remains unknown about the impact of age, and of age-related changes in muscle function, on gait parameters. The aim of this study was to examine the impact of strength on running gait parameters across the adult lifespan. We tested the hypothesis that peak isometric joint torques would explain more of the variance in peak hip, knee, and ankle sagittal-plane moments than age. Twenty-four healthy adults, ages 20-66 years, completed 5 trials on a 20-meter overground runway at a standardized velocity of 3.5 m/s (± 5%). Participants performed three maximal isometric plantar flexion and knee extension contractions lasting three seconds each. Linear regression analyses between strength, age, and joint moments were performed. At the ankle, age alone explained 14.4% of the variance in the peak ankle joint moment; adding strength to the model did not significantly increase the variance explained. At the knee, neither age nor strength alone explained a significant portion of the variance in peak knee moments; together, however, age and strength explained 27.9% of the variance. No significant associations were found between hip moments and either knee or ankle strength. These results suggest that age-related physiological changes other than maximal torque production may drive changes in gait mechanics. A more dynamic measure of muscle function, such as power or isokinetic torque at varying speeds, may have greater predictive value for gait performance.