All adults with atrial fibrillation (AF) over age 75 meet guideline criteria for long-term oral anticoagulation based on the CHA2DS2-VASc score alone. In reality, only about 60% of these patients are anticoagulated. Although the risks of anticoagulation outweigh its benefits for some, for many the net clinical benefit remains high. So what’s driving this 40% gap between guidelines and real-world practice?
Prescribers hesitate to anticoagulate these patients for systemic reasons, such as the need for individualized decision-making and limited evidence, and for patient-specific reasons, such as concerns about bleeding risk, falls, and inability to adhere to therapy.
Let’s start with the systemic factors. Conventional risk stratification tools for stroke and bleeding have limited utility in older adults. This is not only because anyone over age 75 starts with a CHA2DS2-VASc score of 2 and a HAS-BLED score of 1, but also because the two scores increase in parallel as comorbidities accumulate. The ATRIA risk score does account for increasing age beyond 75 and has performed better than CHA2DS2-VASc in some studies; however, it is less widely used, in part because of its more recent development.
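Why age alone puts every patient over 75 past the treatment threshold can be seen directly from the scoring arithmetic. The sketch below follows the published CHA2DS2-VASc definition; the function and parameter names are our own, for illustration only, not from any clinical software:

```python
def cha2ds2_vasc(age, female, chf=False, htn=False, diabetes=False,
                 stroke_tia=False, vascular=False):
    """Illustrative CHA2DS2-VASc calculator (weights per the published score)."""
    score = 0
    score += 1 if chf else 0          # C: congestive heart failure
    score += 1 if htn else 0          # H: hypertension
    # A2: age >= 75 scores 2; age 65-74 scores 1
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if diabetes else 0     # D: diabetes mellitus
    score += 2 if stroke_tia else 0   # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular else 0     # V: vascular disease
    score += 1 if female else 0       # Sc: sex category (female)
    return score

# Anyone over 75 starts at 2 (3 if female) before any comorbidity is counted:
print(cha2ds2_vasc(age=76, female=False))  # → 2
print(cha2ds2_vasc(age=76, female=True))   # → 3
```

Because each comorbidity adds a point to CHA2DS2-VASc while many of the same conditions (hypertension, renal disease, prior stroke) also add points to HAS-BLED, the estimated stroke and bleeding risks rise together, which is exactly why the scores discriminate poorly in this age group.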
Decision-making is made even more difficult by the relative paucity of evidence. Older adults were underrepresented in the large trials that demonstrated the superior risk-benefit profile of warfarin over aspirin and of DOACs over warfarin. Many of the observational and retrospective studies that have attempted to fill this gap unfortunately suffer from selection bias that could favor anticoagulation, since people perceived to be healthier are more likely to be prescribed anticoagulants.
Recent observational data on the clinical benefit of anticoagulation in this age group are mixed. Chao and colleagues compared 11,064 Taiwanese patients with AF with 14,658 patients without AF, all aged 90 or older. They found a positive net clinical benefit of warfarin compared with both no treatment and antiplatelet therapy, and that DOACs conferred the same benefit as warfarin with a lower risk of intracranial hemorrhage. Shah and colleagues explored this question using decision analysis and found that the net clinical benefit of warfarin and apixaban decreased with age beyond 75 years, in part because of the competing risk of death. Warfarin reached minimal net benefit for the median patient at age 87, while apixaban reached the same threshold at age 92.
While these studies can help guide shared decision-making, they do not address the many patient-specific factors upon which these choices often hinge. Specifically, frailty and cognitive impairment are among the most commonly cited reasons for anticoagulant non-prescription. Studies have shown widely varying rates of anticoagulant prescription in frail older adults with AF, with most hospital-based studies finding under-prescription compared with non-frail patients. Cognitive impairment is another concern, as it may inhibit adherence to a daily oral regimen or increase the risk of adverse events such as falls. Paradoxically, AF is itself associated with increased risk of cognitive impairment and dementia: Gaita and colleagues found that people with AF had more silent cerebral ischemia and lower mean cognitive performance than those without AF. This suggests that anticoagulation may help prevent progressive cognitive decline, although more definitive data are needed.
So how can we improve treatment for older adults going forward? First and foremost, we need randomized clinical trials that focus on the “oldest old” and have relatively few exclusion criteria, in order to reflect real-world clinical practice. The recent ELDERCARE-AF trial is therefore notable, since it enrolled only patients aged 80 or older with AF. Among participants assigned to receive low-dose edoxaban (15 mg PO daily), rates of stroke and systemic embolism were lower than with placebo, without a significant increase in major bleeding events. Second, we need better tools to support individualized decision-making: both risk stratification criteria that incorporate geriatric impairments and decision aids that help patients and clinicians align their priorities and make fully informed decisions. Given the aging of the population in the U.S. and elsewhere, these efforts are timely and essential.
By Aaron Troy, MPH