Background Clinical decisions about management of rheumatoid arthritis (RA) happen at intermittent clinic visits. In the absence of objective measures of disease severity between visits, understanding fluctuating disease severity largely relies on patients' symptom reports, and thus on patients' recall, eloquence, stoicism and willingness to discuss symptoms.
The Remote Monitoring of Rheumatoid Arthritis (REMORA) study aims to improve monitoring of disease severity in RA. Patients, clinicians and researchers co-designed a smartphone app to enable RA patients to report daily symptoms in between clinic visits with data integrated into the electronic health record. The data presented here were collected as part of the REMORA feasibility evaluation.
Objectives To evaluate the completeness of patient-reported symptom data submitted through the REMORA smartphone app over three months.
Methods We invited 23 RA patients treated at the outpatient clinic of Salford Royal Foundation Trust (UK) to record their symptoms for three months using the REMORA smartphone app. Participants received notifications to record: daily scores for the seven items of the RA Impact of Disease score, plus morning stiffness, each on a scale from 0 to 10; weekly scores for thirteen items covering self-assessed 28-joint tender and swollen joint counts, global assessment, impact on work and activity, and flare occurrence; and monthly scores for the Health Assessment Questionnaire (HAQ) 20-item disability scale. We calculated the time each participant was in the study as the number of days between their first and last day of submitting daily scores. We then explored patterns of data entry, as well as entry completeness.
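As an illustration, the two Methods metrics (time in study and the proportion of study days with a daily entry) can be sketched as below. The function name and data layout are hypothetical and not taken from the study; counting the inclusive span as the denominator for completeness is an assumption.

```python
from datetime import date, timedelta

def entry_summary(entry_dates):
    """Summarise one participant's daily submissions.

    entry_dates: iterable of datetime.date, one per day an entry was made
    (hypothetical layout, not the study's actual data model).
    """
    first, last = min(entry_dates), max(entry_dates)
    # Time in study: days between first and last daily submission (per Methods)
    days_in_study = (last - first).days
    # Completeness over the inclusive span (assumption: both endpoints count)
    span = days_in_study + 1
    pct_days_with_entry = 100 * len(set(entry_dates)) / span
    return days_in_study, pct_days_with_entry

# Example: entries on 8 of the 10 days from 1 to 10 January
dates = [date(2016, 1, 1) + timedelta(d) for d in (0, 1, 2, 4, 5, 7, 8, 9)]
# entry_summary(dates) -> (9, 80.0)
```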
Results Twenty patients accepted the invitation to participate. Eight (40%) were male, all but one were white British, and their mean age was 56.9±11.1 years. The median number of days in the study was 82 (interquartile range [IQR], 80 to 82). While in the study, participants submitted daily scores on almost all days (median, 91% of days; IQR, 78 to 95), although four did so on <60% of study days. Across participants, almost all of 1325 daily entries were complete, with only nine (<1%) having missing values for up to two individual items. Participants submitted weekly scores for a median of 11 out of 13 weeks (IQR, 10 to 12). Of all 213 weekly entries, fifteen (7%) had missing values, but never more than two per entry. Lastly, 8/20 participants provided monthly HAQ scores only once, while a further 9/20 and 3/20 participants did this for two and all three months, respectively. No monthly entries had any missing values.
Conclusions Our feasibility study showed that smartphones have the potential to support collection of daily patient reports of symptoms with high levels of completeness over three months. The lengthier monthly question sets were less likely to be completed than the briefer daily and weekly ones. Future steps include exploring methods for adapting data entry frequency to (fluctuations in) disease severity, in order to support sustained symptom reporting over longer time periods and in a wider group of patients.
Disclosure of Interest None declared