In this study, Gardiner & Demirel compared regression analyses of three standard perimetric summary indices to determine which index detects statistically significant deterioration earliest in eyes without severe cataract and with POAG 'and/or likelihood of developing glaucomatous damage'. Judging from the most recent MD values, most eyes must have lacked manifest glaucoma. On average, MD showed deterioration somewhat sooner (7.3 years) than VFI (8.5 years), while PSD changes appeared later (10.5 years). In what the authors call 'moderately damaged eyes, with most recent MD values between -0.505 (!) and -19.5 dB', i.e., the worse half of their material, MD and VFI did not differ significantly but 'were almost equivalent'. This suggests that in eyes with established field loss, MD and VFI behaved very similarly.
MD and PSD were rather insensitive for detection of progression events
Detecting glaucoma progression events from the significance of regression slopes of global indices, as the authors have done, was standard in the 1980s. It soon became apparent, however, that MD and PSD were rather insensitive for the detection of progression events, as reported by Chauhan, Drance & Douglas in 1990.1
These early experiences led to the development of analyses focusing on localized change, e.g., glaucoma change probability maps and Progressor. These and other types of event analysis were used in the important RCTs, e.g., AGIS, CIGTS, EMGT and, most recently, UKGTS, and performed well.
Today, linear regression of VFI and MD is used to determine rates of progression rather than progression events. A statistically significant but very shallow slope is usually without clinical importance; in contrast, a steep but still statistically non-significant slope may require immediate attention. We commend the authors for confirming these historical findings.
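As an illustrative sketch of the distinction drawn above (not the authors' analysis, and using entirely hypothetical MD series), rate-of-progression analysis amounts to fitting an ordinary least-squares line to an index over follow-up time and then weighing the slope's magnitude against its statistical significance:

```python
# Illustrative sketch only: hypothetical MD (dB) series, not real patient data.
import math

def md_slope_and_t(years, md):
    """Least-squares slope of MD on time, plus the slope's t-statistic."""
    n = len(years)
    mx, my = sum(years) / n, sum(md) / n
    sxx = sum((x - mx) ** 2 for x in years)
    slope = sum((x - mx) * (y - my) for x, y in zip(years, md)) / sxx
    intercept = my - slope * mx
    sse = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(years, md))
    se = math.sqrt(sse / (n - 2) / sxx)  # standard error of the slope
    return slope, slope / se

# Eye A: negligible but very consistent decline (about -0.05 dB/year).
slope_a, t_a = md_slope_and_t(
    range(8), [0.00, -0.04, -0.11, -0.14, -0.21, -0.24, -0.31, -0.34])

# Eye B: steep but noisy decline over only five visits (about -0.7 dB/year).
slope_b, t_b = md_slope_and_t(range(5), [0.0, -2.0, -0.5, -4.0, -2.5])

# Two-sided 5% critical t values for df = n - 2 (here 6 and 3, respectively).
print(f"Eye A: {slope_a:+.2f} dB/yr, significant: {abs(t_a) > 2.447}")
print(f"Eye B: {slope_b:+.2f} dB/yr, significant: {abs(t_b) > 3.182}")
# Eye A: -0.05 dB/yr, significant: True
# Eye B: -0.70 dB/yr, significant: False
```

In this toy example Eye A reaches statistical significance despite a clinically negligible rate, while Eye B's alarming rate does not, which is exactly why the slope's magnitude, not its significance alone, should drive clinical decisions.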