One of the most challenging aspects of detecting visual field progression is determining the optimal frequency of testing. Jansonius (129) proposed a theoretical framework for comparing two strategies for detecting progressive visual field loss in glaucoma. In the first strategy, visual field testing is performed at fixed intervals of six months. In the second strategy, testing is performed once per year as long as the fields appear unchanged, but as soon as progression is suspected, additional visual field tests are performed shortly thereafter to confirm or discard it. Assuming that two additional visual fields are needed to confirm progression, he concluded that the first strategy results in a maximum delay of 18 months until progression can be confirmed, whereas the second strategy results in a maximum delay of only 12 months.
The results of Jansonius are interesting and in line with what is generally done in clinical practice; that is, when progression is suspected, clinicians tend to request confirmatory visual fields at a shorter interval. However, the calculations of the time delay until confirmation of progression, as presented in the paper, are a simplification of reality. They assume that a clinician can unambiguously identify a visual field showing suspicious progression and that deterioration can always be confirmed with two additional tests. Clinical experience and results from clinical trials show that this is not always the case. The results of the present study would also likely be influenced by the method used to assess visual field progression, such as the Glaucoma Change Probability Analysis or the algorithms used in clinical trials such as AGIS or CIGTS. In fact, a combination of an optimal frequency of testing and an optimal strategy for detecting visual field progression in glaucoma has yet to be determined.