Background Osteoporosis occurs more frequently in rheumatoid arthritis (RA) patients than in healthy individuals. However, the appropriate interval for bone mineral density (BMD) measurement in RA patients is not well established.
Objectives This study investigated the effective BMD measurement interval and the risk factors associated with the development of osteoporosis in RA patients.
Methods A retrospective study was performed on 511 RA patients aged over 40 years who had undergone BMD testing (DXA, GE LUNAR PRODIGY ADVANCE) more than once, who had normal BMD or osteopenia at the baseline BMD test, and who had no history of any fracture of the spine or femur. The subjects were categorized into four subgroups: normal BMD (T-score > -1), mild (-1 ≥ T-score > -1.5), moderate (-1.5 ≥ T-score > -2), and advanced (-2 ≥ T-score > -2.5) osteopenia. The BMD testing interval was defined as the estimated time for 10% of the RA patients to make the transition to osteoporosis without an osteoporotic fracture or the administration of any osteoporosis drug.
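The baseline subgroup thresholds above can be sketched as a simple classification rule. This is a hypothetical illustration of the abstract's cutoffs, not code from the study's analysis:

```python
def classify_baseline(t_score: float) -> str:
    """Classify a baseline T-score into the study's subgroups.

    Thresholds follow the abstract: normal (T > -1), mild
    (-1 >= T > -1.5), moderate (-1.5 >= T > -2), and advanced
    (-2 >= T > -2.5) osteopenia. T-scores of -2.5 or below denote
    osteoporosis, which was excluded at baseline in this cohort.
    """
    if t_score > -1.0:
        return "normal"
    if t_score > -1.5:
        return "mild osteopenia"
    if t_score > -2.0:
        return "moderate osteopenia"
    if t_score > -2.5:
        return "advanced osteopenia"
    return "osteoporosis (excluded at baseline)"
```

Note that a T-score of exactly -1 falls into the mild osteopenia group, since the normal range is strictly greater than -1.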
Results The observation period was 2,214 patient-years, with a mean of 4.3 years per patient. The estimated BMD testing intervals were more than 10 years for the normal BMD group, 4.3 years for mild osteopenia, 2.5 years for moderate osteopenia, and 1.5 years for advanced osteopenia.
Conclusions Our study indicated that, in RA patients with normal BMD or osteopenia, the baseline BMD T-score is the most important factor for estimating the interval within which osteoporosis is predicted to develop. On the basis of lumbar spine (L-spine) BMD, we recommend a BMD measurement interval of more than 10 years for RA patients with normal BMD, 4 years for mild, 2 years for moderate, and 1 year for advanced osteopenia.
Disclosure of Interest None declared