Newsline
Assessing Safety of Aspirin in High-Risk Patients With Atherosclerotic CV Disease
THE USE OF ASPIRIN as primary prevention for atherosclerotic cardiovascular disease (ASCVD) in high-risk patients has been controversial. Aspirin is known to lower the risk of heart attack and stroke, but it also may increase the risk of major bleeding episodes. In a new study, researchers have found no evidence that the use of aspirin for higher-risk primary prevention is beneficial. The findings were published in The American Journal of Medicine.
The study authors used a database to examine all prospective randomized clinical trials (RCTs) that tested aspirin as a primary prevention method for ASCVD. A total of 12 RCTs that compared aspirin with nonaspirin (either placebo or control) for primary ASCVD prevention were identified (n=145,435).
Patients treated with aspirin had a statistically significant reduction in ASCVD events compared with no aspirin (4.7 vs 5.3 events per 1000 patient-years). Despite the beneficial effects, the authors found a statistically significant increase in major bleeding events for patients treated with aspirin compared with control patients (2.5 vs 1.8 events per 1000 patient-years). “These results trended toward an increased benefit for aspirin in higher-risk patients, but this finding did not meet statistical significance,” the researchers reported.
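For context, the rate differences quoted above can be recast as number-needed-to-treat and number-needed-to-harm figures. The sketch below is illustrative arithmetic on the reported per-1000-patient-year rates, not an analysis from the study itself; the function name and framing are ours.

```python
# Illustrative arithmetic on the event rates quoted above (per 1000 patient-years).
# Because the rates are person-time based, NNT/NNH are expressed here in
# patient-years of aspirin exposure per one event prevented (or caused).

ascvd_aspirin, ascvd_control = 4.7, 5.3   # ASCVD events per 1000 patient-years
bleed_aspirin, bleed_control = 2.5, 1.8   # major bleeds per 1000 patient-years

def per_1000_py_to_nn(rate_treated: float, rate_control: float) -> float:
    """Patient-years of exposure per one event prevented (or caused)."""
    absolute_difference = abs(rate_control - rate_treated) / 1000.0
    return 1.0 / absolute_difference

nnt = per_1000_py_to_nn(ascvd_aspirin, ascvd_control)  # ~1667 patient-years per ASCVD event avoided
nnh = per_1000_py_to_nn(bleed_aspirin, bleed_control)  # ~1429 patient-years per extra major bleed

print(f"NNT ~{nnt:.0f} patient-years; NNH ~{nnh:.0f} patient-years")
```

On these numbers the absolute harm difference (0.7 major bleeds per 1000 patient-years) slightly exceeds the absolute benefit (0.6 ASCVD events), which is consistent with the authors' reluctance to recommend aspirin in this setting.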
In studies published prior to 2010 (n=7) and during or after 2018 (n=5), aspirin use resulted in a statistically significant reduction of ASCVD events. Researchers found that aspirin had a greater treatment effect in older trials compared with newer trials. The study authors found no heterogeneity of treatment effect on major bleeding events between higher and lower doses of aspirin.

A significant increase in major bleeding events was seen with aspirin therapy.
“Despite the thought that aspirin for primary prevention may still be useful for those at high risk for ASCVD, insufficient randomized data currently exist to recommend aspirin in this group,” concluded the authors.
Antibiotic Use May Reduce Efficacy of Hormonal Oral Contraceptives
THE EFFICACY OF hormonal contraceptives may be reduced with use of antibiotics, according to a study published online in BMJ Evidence-Based Medicine.
Jeffrey K. Aronson, MBChB, DPhil, from Oxford University and Robin E. Ferner, MD, from the University of Birmingham, both in the United Kingdom, examined whether antibiotics impair the effectiveness of oral contraceptives in a database review of Yellow Card reports to the UK Medicines and Healthcare products Regulatory Agency. Data were included from spontaneous reports of suspected adverse drug reactions in people taking antibacterial drugs (74,623), enzyme-inducing medicines (32,872), or control medicines (65,578).
The researchers found that unintended pregnancies were 7 and 13 times more commonly reported with antibiotics and enzyme inducers (positive controls), respectively, compared with control medicines. Congenital abnormalities were not more common with antibiotics, but were reported 7 times more often with enzyme inducers. Diarrhea was not identified as a confounding factor.
“This evidence suggests that there is an interaction of antibacterial drugs with hormonal contraceptives, which can potentially impair the effectiveness of the contraceptives,” the authors write. “The precautionary principle dictates that women taking hormonal contraceptives should be advised to take extra contraceptive precautions when a short course of an antibacterial drug is added.”
Seasonality of Coronaviruses May Provide Insight for Future SARS-CoV-2
Coronaviruses, flu, and RSV are more common during winter.
SEASONAL coronaviruses are more prevalent during the winter months, coinciding with influenza and respiratory syncytial virus (RSV) seasons in temperate climate regions, according to a review published in The Journal of Infectious Diseases.
After the initial pandemic wave of the novel coronavirus SARS-CoV-2, the virus that causes COVID-19, it remains unclear what trajectory COVID-19 transmission will take. Similar to influenza and RSV, SARS-CoV-2 could settle into a pattern of recurring seasonal outbreaks, as has been observed with preexisting human seasonal coronaviruses.
Currently, there are 4 known seasonal coronaviruses that circulate in human populations: NL63, 229E, OC43, and HKU1. The systematic review compared the seasonality of seasonal coronaviruses with that of influenza virus and RSV and modeled monthly activity of seasonal coronaviruses using site-specific weather data.
In total, seasonality data from 40 sites in 21 countries were included. The number of positive seasonal coronavirus cases by month across years was aggregated for each site, and the annual average percentage was calculated as a measure of virus activity. Heat maps were created to display the activity of seasonal coronaviruses, influenza virus, and RSV at each site and to evaluate peak activity for each. Furthermore, meteorological data were extracted for each site to model monthly seasonal coronavirus activity.
Results suggested that human seasonal coronaviruses are prevalent in winter months, coinciding with influenza and RSV season in most temperate sites. High activity of seasonal coronaviruses was observed in winter months in most temperate sites, with the exception of China, where seasonal coronavirus activity was noted year-round. More variation in seasonal coronavirus activity was observed in tropical sites.
When China was excluded, 53.1% of the annual seasonal coronavirus cases that occurred in temperate sites occurred during influenza seasons, while 49.6% of seasonal coronavirus cases occurred during RSV season (interquartile range, 34.6-61.9 and 30.2-60.2, respectively). At tropical sites and sites in temperate China, there was less overlap observed between seasonal coronavirus activity and influenza activity (20% of seasonal coronavirus cases during the season) and RSV activity (29% of seasonal coronavirus cases during the season). Sites with low temperature and high relative humidity were found to be associated with a higher expected proportion of seasonal coronavirus cases, and dew points had a similar relationship with seasonal coronavirus activity.
“Our findings offer clues to the possible post-pandemic circulating season of SARS-CoV-2 and add to the knowledge pool necessary for post-pandemic preparedness for SARS-CoV-2,” the researchers concluded.
Feeling Dizzy Upon Standing May Point to Later Dementia
SYSTOLIC ORTHOSTATIC hypotension (OHYPO) and variability in visit-to-visit seated systolic blood pressure (BP) postural change are associated with greater dementia risk, according to a study published online in Neurology.
Laure Rouch, PharmD, PhD, from the University of California, San Francisco, and colleagues used data from 2131 older adults (mean age, 73 years) participating in the Health, Aging, and Body Composition study to evaluate whether OHYPO and variability in visit-to-visit BP postural changes are associated with incident dementia. OHYPO was measured repeatedly during a 5-year baseline period and was defined as a fall of ≥15 mm Hg in systolic BP or ≥7 mm Hg in diastolic BP after standing from a sitting position for at least one-third of visits.
The researchers found that overall, 9% of participants had systolic OHYPO, 6.2% had diastolic OHYPO, and 21.7% developed dementia. Systolic OHYPO was associated with greater dementia risk (adjusted hazard ratio [HR], 1.37) when adjusting for demographics, seated systolic BP, antihypertensive drugs, cerebrovascular disease, diabetes, depressive symptoms, smoking, alcohol, body mass index, and the presence of APOE ε4 alleles. However, there was no association seen for diastolic OHYPO. Variability in seated systolic BP postural changes was also associated with higher dementia risk (highest tertile of variability: adjusted HR, 1.35).
“Our findings raise the question of potential preventive interventions to control orthostatic systolic BP and its fluctuations,” the authors write.
New DNP/MPH Dual Degree Program at Johns Hopkins
JOHNS HOPKINS UNIVERSITY announced that students can earn a Doctor of Nursing Practice/Executive Master of Public Health (DNP/MPH) in as few as 3 years through a dual degree program that the university will launch in the summer of 2021.
The program aims to provide students with blended knowledge and skills in nursing practice and population health to prepare graduates to work in local and global health agencies, advocacy groups, and academic institutions where leadership is needed to improve health outcomes, promote health equity, and influence policy.
“When nursing and public health bring the best of their skills together, there is so much to be accomplished within advancing health equity and developing solutions to our changing national and global health needs,” Bloomberg School of Public Health Dean Ellen J. MacKenzie, PhD, ScM, said in a press release.
Students will complete an integrated DNP project as part of their coursework.
Students enrolled in the dual degree program will take courses from the Johns Hopkins School of Nursing and the Johns Hopkins Bloomberg School of Public Health in both online and in-person formats, noted a press release. Program applicants must have a master’s degree in nursing from an accredited college or university, RN licensure, and 2 years of health care experience.
“COVID-19 has amplified the critical importance of nurse leaders who develop interventions that are based in both nursing and public health,” said Dean of the Johns Hopkins School of Nursing Patricia Davidson, PhD, MEd, RN, FAAN, according to the press release. “We are excited to be able to launch the program during this time in history when the perspective of nursing is well recognized and ever essential to creating the path forward to a healthier and more population-focused future.”
Higher BMI Linked to Poor Outcomes From COVID-19
SEVERE OBESITY, specifically in men and younger patients, is a risk factor for adverse outcomes from COVID-19, particularly mortality, according to a study published in Annals of Internal Medicine. A retrospective study was conducted of members of Kaiser Permanente Southern California who were diagnosed with COVID-19 from February 13 to May 2, 2020. The researchers focused on the effect body mass index (BMI) had on mortality within 21 days of COVID-19 diagnosis (index date). A total of 6916 patients with COVID-19 were identified; 206 (3%) patients died within 21 days of their COVID-19 diagnosis, with 67% hospitalized and 43% intubated between the index date and date of death. Of the patients who survived, 15% were hospitalized and 3% were intubated.
Researchers found a J-shaped association between higher body mass index and risk for death from COVID-19, even after adjusting for obesity-related comorbidities.
Compared with normal-weight patients, those with class III obesity had a 2.68 relative risk of mortality (95% CI, 1.43-5.04). Patients with a BMI >45 kg/m2 were 4 times more likely to die compared with normal-weight patients (95% CI, 2.12-8.26).
Among patients aged ≤60 years, an increased risk for death with high BMI was found compared with the overall model. For those aged ≥61 years, BMI was associated with death to a much lesser degree and only for the highest measures. Men had a higher risk of death compared with women; women had no increased risk for death associated with BMI. Severe obesity, particularly among younger patients, “eclipses the mortality risk posed by other obesity-related conditions, such as history of myocardial infarction, diabetes, hypertension, or hyperlipidemia, which suggests a significant pathophysiologic link between excess adiposity and severe COVID-19 illness,” concluded the authors.
IN PATIENTS WITH type 2 diabetes (T2D) and/or cardiovascular disease (CVD), elevated triglyceride levels were found to be an independent risk factor associated with adverse cardiovascular outcomes, according to a study published in the American Journal of Cardiology.
The study, which used data from the Bypass Angioplasty Revascularization Investigation 2 Diabetes (BARI 2D) trial, sought to determine the prevalence and distribution of hypertriglyceridemia and examine the associations between baseline triglyceride levels and cardiovascular events overall. Participants in the original BARI 2D trial were randomized to cardiovascular and type 2 diabetes treatment strategies. The protocol included guideline-mandated treatment for hypertension, dyslipidemia, and obesity, in addition to a goal HbA1c of <7.0% regardless of randomization assignment. Lipid measures were obtained from fasting blood samples acquired prior to randomization.
The primary endpoint of the current analysis was time to cardiovascular death, myocardial infarction, or stroke. Secondary outcomes were the individual components of the primary endpoint in addition to time to coronary revascularization and all-cause death. Participants were grouped by baseline triglyceride levels; those with elevated triglyceride levels were further stratified into 3 groups: 150 to 199 mg/dL; 200 to 499 mg/dL; and ≥500 mg/dL.

Every 50 mg/dL rise in triglyceride level was linked to a 3.2% increase in cardiovascular death, myocardial infarction, or stroke.
The study population consisted of 2307 patients (97% of the BARI 2D study; median age, 62 years). Most patients were obese, had a history of T2D of just under 10 years, and were prescribed a statin. Overall, 51% of patients had triglyceride levels <150 mg/dL; 18%, 150-199 mg/dL; 28%, 200-499 mg/dL; and 3%, 500-1000 mg/dL. Patients in the highest triglyceride level group were more likely to be younger, to be current smokers, and to have lower HDL cholesterol and higher insulin levels.
Elevated baseline triglyceride levels were associated with the primary composite outcome of cardiovascular death, myocardial infarction, and stroke. In adjusted analysis, every 50 mg/dL increase in triglyceride level was associated with a 3.2% increase in cardiovascular death, myocardial infarction, or stroke and a 5.8% increase in cardiovascular death alone. In the fully adjusted analysis, every 50 mg/dL increase in triglyceride level was associated with a 3.8% increase in the primary composite outcome and a 6.4% increase in the secondary outcome.
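If the 3.2% figure is read as a multiplicative increase per 50 mg/dL increment, the effect compounds across larger triglyceride differences. The sketch below is our own illustrative arithmetic under that assumption (per-unit hazard increases are conventionally multiplicative); it is not a calculation from the study, and the function name is hypothetical.

```python
# Compound a per-increment risk increase across a larger triglyceride gap.
# Assumes the reported 3.2% per 50 mg/dL acts multiplicatively; this is a
# back-of-the-envelope sketch, not the study's own model.

def compounded_increase(per_step_pct: float, gap_mg_dl: float, step: float = 50.0) -> float:
    """Total percent increase over a triglyceride gap, compounding per `step` mg/dL."""
    steps = gap_mg_dl / step
    return ((1 + per_step_pct / 100) ** steps - 1) * 100

# A patient at 300 mg/dL vs 150 mg/dL spans three 50 mg/dL steps:
print(f"{compounded_increase(3.2, 150):.1f}%")  # about 9.9%
```

Under this reading, the roughly 10% higher risk across a 150 mg/dL gap is modest per increment but meaningful across the wide triglyceride range seen in the cohort.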
Inclusion of age, sex, region, race, ethnicity, systolic blood pressure, and smoking status covariates modestly increased the impact of triglyceride levels on risk for adverse cardiovascular outcomes. In contrast, the addition of body mass index, T2D duration, HbA1c level, LDL-C, and statin use all modestly attenuated the impact of triglyceride levels on risk for both outcomes.
“Whether lowering [triglyceride] levels leads to improved cardiovascular outcomes remains to be seen,” said the authors.
Taking Vitamin D3 Does Not Prevent Depression in Adults
LONG-TERM supplementation with vitamin D3 (cholecalciferol) did not significantly alter symptoms of late-life depression among the general population, according to study results published in JAMA.
A total of 18,353 adults aged >50 years were enrolled in the Vitamin D and Omega-3 Trial-Depression Endpoint Prevention (VITAL-DEP) clinical trial between 2011 and 2014. Participants were randomly assigned to receive either 2000 IU/d cholecalciferol (vitamin D3) and fish oil (n=9181) or a placebo (n=9172). Patients were followed through 2017 and assessed for depression using the 8-item Patient Health Questionnaire depression scale (PHQ-8). A change in PHQ-8 score of 0.5 was considered the minimally important clinical change.
The 2 treatment groups were well balanced demographically; mean participant age was 67.5±7.1 years, 49.2% of participants were women, 27% were members of racial or ethnic minority groups, and the mean baseline 25-hydroxyvitamin D level was 31.1 ng/mL. The majority of study participants (n=16,657) had no history of depression, and the remaining participants had had no treatment for depression during the previous 2 years.
Reported incidence of depression or clinically relevant depressive symptoms were similar between the treatment and placebo cohorts (609 vs 625). Reports of depression were similar between treatment and placebo cohorts (459 vs 461, respectively); reports of recurrent depression were also similar (150 vs 164, respectively), and overall risk for incident depression did not differ between cohorts. ■
Pharmacogenetic Testing: Does It Improve Therapy in Patients With MDD?
Testing for genetic variants may allow clinicians to predict how patients with major depressive disorder metabolize antidepressants.
Most antidepressants are metabolized through the CYP450 pathway.
Major depressive disorder (MDD) is a common mental disorder that affects more than 264 million people worldwide and is a leading cause of disability, including death by suicide.1 MDD is a complicated disorder that involves the interaction of social, psychological, and biological factors.1 MDD can prevent patients from living healthy, productive lives and can complicate treatment of other comorbid conditions.1
Although MDD commonly is encountered in primary care settings, its treatment has become integrated into all fields of medicine due to its high prevalence. Cognitive behavioral therapy, interpersonal psychotherapy, and antidepressants, such as selective serotonin reuptake inhibitors and tricyclic antidepressants, are the mainstays of MDD treatment.1
Prescribing an antidepressant may be simple, but that does not make it easy. Efficacy and tolerability of antidepressants vary among patients, which can make it challenging to relieve patients’ symptoms.2 Although no single gene has been definitively linked to depression,3 several genetic variants may help clinicians predict how patients with MDD will metabolize antidepressants.4 Performing genetic testing of patients with MDD and matching patients with an antidepressant class based on identification of genetic variants that convey sensitivity to particular antidepressants could improve response to drug therapy in patients with MDD.5
Current Antidepressant Management
The process of selecting an antidepressant should take into account cost, tolerability, adverse effect profiles, and patient preferences.2 When evaluating treatment options for patients with MDD, the current standard of care is to initiate an antidepressant at a starting dose and reassess effectiveness within 2 to 4 weeks, with adjustments to monitoring frequency dependent on the patient’s suicide and self-harm risk, comorbid conditions, age, and concomitant medication use.2
Several metrics are used to determine whether a selected antidepressant is working:
• Does the patient feel better?
• If not, the clinician should increase the dosage to see if the desired effect can be produced.
• If dose titration does not reduce symptoms, the clinician should select another antidepressant.
• If the patient does feel better, is he or she experiencing adverse effects; if so, how tolerable are they?
Clinicians can mitigate adverse effects by decreasing the dosage or switching to a different class of antidepressant. However, several weeks are needed after each drug change or dose alteration to truly assess response. Finding and settling on a drug that produces a response with minimal adverse effects can take months. During the trial period, patients may become frustrated with the process and stop therapy and/or may be at increased risk for suicide or self-harm.
Pharmacogenetic Testing
The study of drug metabolism in patients with MDD is a growing area of interest.3-5 A management approach incorporating pharmacogenetic testing in combination with clinical judgment may be superior to the standard trial-and-error method for finding an effective antidepressant regimen and could improve patient outcomes.5

POLL POSITION

If a person is an extensive metabolizer, they:
■ Process medications significantly slower: 6.90%
■ Process medications significantly faster: 6.90%
■ Process medications as predicted: 84.14%
■ Are unable to process medications: 2.06%

For more polls, visit ClinicalAdvisor.com/Polls.
Genome-wide association studies are used to identify single-nucleotide variations (SNVs, formerly SNPs) in genes related to a particular disease or drug metabolism.6 Several laboratory testing companies offer pharmacogenetic panels to evaluate metabolism of drugs used to treat MDD.3 The FDA also has approved direct-to-consumer genetic testing panels (eg, 23andMe), which are widely available to the public without a health care provider’s prescription.7 A concern with these latter tests is that the results are reported directly from the company to the patient; thus, the patient decides whether or not to present the information to his or her health care provider. Many pharmacogenetic testing panels also include genes that have shown correlations with the pathogenesis of MDD, despite the lack of clinical research replicating the role of these genes in the disorder.3
Drug Metabolism
Most antidepressants are metabolized through the cytochrome P450 (CYP450) pathway; therefore, most drug metabolism genes on pharmacogenetic testing panels are related to CYP450 oxidase enzyme subunits. When pharmaceutical companies develop a drug, they identify the primary and proposed secondary enzymes through which the medication is metabolized. Some antidepressants are metabolized through multiple enzymes in the CYP450 pathway. The extent to which a drug is metabolized through primary or secondary enzymes varies among individuals, further complicating understanding of drug metabolism.4 In some patients, the secondary enzymes involved in drug metabolism can compensate for underperforming primary enzymes.
Based on their genetic profile, individuals are categorized into 4 main classes: ultra-rapid, extensive, intermediate, or poor metabolizers of drugs metabolized via various CYP450 enzyme subunits.5
• Ultra-rapid metabolizers process medications quickly, decreasing the amount of active metabolite. With ultra-rapid drug metabolism, patients may never experience a clinical response to medication. Thus, these patients may require higher doses to achieve the same therapeutic effect as patients who metabolize the medication as predicted.
• Extensive metabolizers process the medication as predicted. Pharmaceutical companies provide dosing guidelines based on extensive metabolizers, so clinicians can assume that no dosing adjustments are needed.
• Intermediate metabolizers process medications more slowly than extensive metabolizers.6 Because of this, these patients may exhibit variable responses or experience greater adverse effects and may need lower starting doses of drugs