
1 Introduction

Pupillometry is used to measure the pupillary light reflex (PLR) as a response to a certain stimulus, usually light. Clinicians have used this method to infer a patient's consciousness and neurological status, either with the swinging flashlight test or with automated pupillometers.

Chromatic pupillometry, which uses coloured light stimuli, emerged over the last 20 years following the discovery of melanopsin, a photopigment present in intrinsically photosensitive retinal ganglion cells (ipRGCs) [1, 2]. These cells are sensitive to the absorption of short-wavelength (blue) visible light [3]. Their discovery brought renewed interest in pupillometry research and in its potential use to detect neuro-ophthalmological diseases [4,5,6].

Automated pupillometers, usually based on near-infrared cameras, allow effective quantitative measurement of the PLR. However, these systems are generally expensive, not portable, and require a trained operator. For pupillometry to serve as a screening tool, more accessible technologies are needed. Smartphones could be a technological option to make pupillometry widespread, reducing the limitations mentioned above. The use of smartphones for pupillometry started around 2013 [7], and several studies have used the device as a pupillometer since then [7,8,9]. However, there is no evidence of a smartphone-based system built for chromatic pupillometry, apart from some preliminary results presented in a study from our group [10, 11].

In the present work, a smartphone-based pupillometer was tested and validated for chromatic pupillometry, with the goal of understanding its viability and effectiveness as a replacement for the traditional equipment used in this technique. Thus, this work is part of the validation stage of a mid-range Android smartphone pupillometer app for chromatic pupillometry in healthy individuals. The acquisition protocol was also part of the research in this study, to assess the protocols needed for chromatic pupillometry and for the screening of neuro-ophthalmological diseases.

2 Methods

2.1 Study Participants

Six participants with no known visual abnormalities were selected for this study. First, each subject's dominant eye was determined, information needed for the acquisition setup explained in Sect. 2.2. Pupillometry measurements were then made on the opposite eye, under different conditions. All recorded videos and participant data were anonymized and coded.

This study was approved by the Hospital Santa Maria (Lisbon, Portugal) ethics committee, and written informed consent was obtained from all participants.

2.2 Smartphone-Based Pupillometry System

The proposed system uses a smartphone for acquisition, through an Android application developed to support video acquisition and control of the flash light. This study used the rear-facing cameras of a Nokia 7 Plus (Nokia Corporation, HMD Global, Finland) running Android 9 Pie. The application uses the rear-facing cameras to acquire video and activates the rear-facing flash to act as the stimulus. The Android Camera2 API was used to implement these requirements.

In terms of acquisition protocol, the app lets the user start a recording; the flash then activates automatically at a defined instant and for a defined duration, and the recording continues for a defined period. When finished, the video file is saved on the smartphone. The app allows some parameters to be configured: time to stimulus (pre-stimulus duration), stimulus duration, and total recording time.

While traditional pupillometers use a near-infrared camera, in this study only the smartphone cameras were used as they are. Therefore, to obtain sufficient image quality, the subject's face was illuminated with an external light acting as a background light, placed in front of the face and below eye level. In this way, the eye and face are illuminated enough to obtain good image quality and sufficient contrast between pupil and iris, but without a direct light entering the retina.

For chromatic pupillometry, the rear-facing white flash light was filtered by placing standard-grade cellophane paper, in blue and red colours, in front of it. The spectra of these filters were characterized in previous work [10], with peak wavelengths of 451.9 nm for the blue light and 611.5 nm for the red one.

A chinrest was used in all experiments to guarantee steady support of the subject's face and stabilization of the eye in front of the smartphone. Individuals were asked to fixate, with the dominant eye, on a target image placed at the same height as the flash light, at a distance of around 70 cm from the smartphone, near the region where the target was no longer visible due to smartphone occlusion. Accommodation of the non-dominant eye, which was the eye recorded, was thereby reduced, and a proper quantity of light was expected to enter the retina when the stimulus occurred.

2.3 Pupillary Data Processing

Data processing was performed on a computer after the acquisitions, using a Python algorithm and the OpenCV library. The first step was the extraction of all frames from the video, followed by cropping each frame to the eye region; a pupil detection algorithm was then executed on each frame to obtain the pupil characteristics.
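As an illustration, this first step can be sketched with OpenCV as follows; the crop coordinates are placeholders, not values from this work, and would have to be adjusted to where the eye appears in each recording:

```python
import cv2

def extract_eye_frames(video_path, crop=(100, 300, 250, 550)):
    """Read a recorded video and return the cropped eye region of every frame.

    `crop` is an illustrative (y0, y1, x0, x1) box; it must be set to the
    region of the frame where the recorded eye actually appears.
    """
    cap = cv2.VideoCapture(video_path)
    y0, y1, x0, x1 = crop
    frames = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame[y0:y1, x0:x1])
    cap.release()
    return frames
```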

After all frames are obtained, the cropped eye image in each of them is converted to grayscale, and the image is ready for the pupil detection algorithms. Two algorithms were used for pupil detection: first, Pupil Reconstructor (PuRe), developed by Santini et al. [12], was applied to the frame; if it failed to detect the pupil, a threshold-based algorithm developed by us was run instead.
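The per-frame detection can be organised as a simple fallback chain, as sketched below; `detect_with_pure` and `detect_with_threshold` are hypothetical placeholders standing in for the PuRe detector and for the threshold-based routine described further on:

```python
import cv2

def detect_pupil(eye_bgr, detect_with_pure, detect_with_threshold):
    """Convert a cropped eye frame to grayscale and try PuRe first,
    falling back to the threshold-based detector when PuRe fails.
    Both detector callables are placeholders supplied by the caller."""
    gray = cv2.cvtColor(eye_bgr, cv2.COLOR_BGR2GRAY)
    result = detect_with_pure(gray)      # e.g. fitted ellipse plus confidence
    if result is None:                   # PuRe did not find a plausible pupil
        result = detect_with_threshold(gray)
    return result
```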

PuRe is based on novel edge segment selection and conditional segment combination schemes. It was designed for eye images acquired with near-infrared cameras and works purely on edges, selecting curved edge segments that could belong to the pupil's contour. These selected segments are then conditionally combined to obtain pupil outline candidates. An ellipse is fitted to each candidate, and the candidates are evaluated through a confidence measure, considering roundness and closeness to a circular shape, until the best pupil contour is found.

Fig. 1. Acquisition protocol schema [11].

Fig. 2. Examples of eye frames with red (a) or white (b) background light.

Fig. 3. Examples of eye frames in grayscale with the pupil detected.

When PuRe fails, a second algorithm is used. This algorithm is threshold-based: the initial gray image, after a blur filter to remove noise, is binarized according to a threshold chosen manually to properly segment the pupil. Morphological operations are then applied, followed by the OpenCV findContours function. The detected contours are then filtered, rejecting contours with a large extent that are not close to a pupil shape, as well as those with a small area. Finally, the minEnclosingCircle function is used to find the best-fitting circle, which is taken as the detected pupil.
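A minimal sketch of such a threshold-based detector with OpenCV is shown below; the threshold, kernel size, and area/extent limits are illustrative values only, since the text states the threshold was chosen manually per acquisition:

```python
import cv2

def detect_pupil_threshold(gray, thresh=40, min_area=200, max_extent=0.9):
    """Threshold-based pupil detection (illustrative parameter values).

    The pupil is darker than the iris, so the image is blurred, binarized
    with a manually chosen threshold, cleaned with morphology, and the
    remaining contours are filtered by area and extent before fitting the
    minimum enclosing circle.
    """
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY_INV)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for c in contours:
        area = cv2.contourArea(c)
        if area < min_area:                  # reject contours that are too small
            continue
        x, y, w, h = cv2.boundingRect(c)
        extent = area / float(w * h)         # contour area vs. bounding-box area
        if extent > max_extent:              # reject shapes not close to a pupil
            continue
        if best is None or area > cv2.contourArea(best):
            best = c
    if best is None:
        return None
    (cx, cy), radius = cv2.minEnclosingCircle(best)   # best-fitting circle
    return (cx, cy), radius
```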

After obtaining pupil data for each frame, the pupil response signal is filtered with an exponentially weighted moving average filter to remove blinks and outliers.
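One possible reading of this step is sketched below using pandas; the text only specifies the exponentially weighted moving average, so the handling of failed detections (blinks) by interpolation and the span value are assumptions:

```python
import pandas as pd

def smooth_pupil_signal(areas, span=10):
    """Smooth the per-frame pupil area signal with an exponentially
    weighted moving average. `areas` may contain NaN where detection
    failed (e.g. during blinks); those samples are interpolated first.
    The span value is illustrative, not taken from the paper."""
    s = pd.Series(areas, dtype="float64")
    s = s.interpolate(limit_direction="both")   # fill blink gaps (assumption)
    return s.ewm(span=span).mean()
```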

Pupil area was the parameter used in this study. Pupil size in pixels was normalized by the baseline, which is the mean pupil size before the stimulus. Based on the equation mentioned in [13], pupil size was normalized according to Eq. 1, where 100% corresponds to the pupil's baseline size.

$$\text{pupil constriction} = 100 - \frac{\text{baseline size} - \text{absolute size}}{\text{baseline size}} \times 100$$
(1)
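In code, the normalization of Eq. 1 can be written as a direct transcription, with the baseline taken as the mean pupil area over the pre-stimulus frames:

```python
import numpy as np

def normalize_to_baseline(areas, n_baseline_frames):
    """Express pupil area as a percentage of baseline (Eq. 1):
    100% corresponds to the baseline size, smaller values to constriction."""
    areas = np.asarray(areas, dtype=float)
    baseline = areas[:n_baseline_frames].mean()   # mean pre-stimulus pupil size
    return 100 - (baseline - areas) / baseline * 100
```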

2.4 Chromatic Pupillometry Protocols

The acquisition protocols are influenced by the recording and stimulus durations. In this work, a protocol was defined with an initial light adaptation period, followed by 5 s of initial recording to obtain the pupil's baseline; the stimulus (with different durations) was then presented, and finally 30 s of continuous recording captured the pupil recovery. A pause was made between measurements, and each scenario was repeated three times. A schema of this protocol is represented in Fig. 1.
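For reference, the timing of one measurement can be summarised as a small parameter set; the values below follow the protocol described above, and the 1 s stimulus default is just one of the tested durations:

```python
from dataclasses import dataclass

@dataclass
class ProtocolConfig:
    """Timing of one chromatic pupillometry measurement (protocol above)."""
    baseline_recording_s: float = 5.0   # pre-stimulus recording used as baseline
    stimulus_duration_s: float = 1.0    # varied in this work: 1, 2, 3 or 10 s
    recovery_recording_s: float = 30.0  # post-stimulus recording of pupil recovery
    repetitions: int = 3                # each scenario repeated three times

    @property
    def total_recording_s(self) -> float:
        return (self.baseline_recording_s
                + self.stimulus_duration_s
                + self.recovery_recording_s)
```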

Different acquisition scenarios were tested to understand which ones would allow activation of the ipRGCs with the blue light stimuli, which in pupillometric data corresponds to a sustained and slower response to blue stimuli in comparison with red ones [14]. Stimulus duration was varied (1, 2, 3 or 10 s), and background light was also varied (white and red). For each stimulus duration, measurements were made with both blue and red stimuli.

3 Results

Six participants were included in this preliminary study, with an average age of 33 years and a male-to-female ratio of 2:4.

The smartphone application performed as expected, allowing all the desired acquisition protocols to be carried out and saving the video files on the smartphone. Video data processing was performed afterwards on a computer. Figure 2 shows example frames with both white and red background lights to illustrate the quality of the acquired images; there is a clear contrast between iris and pupil. The pupil detection algorithms performed well in most cases; two example frames with the pupil detected are shown in Fig. 3.

Fig. 4. Average PLR curves for blue and red stimuli with different durations: a) 1 s; b) 2 s; c) 3 s; d) 10 s. Gray area: time with stimulus on. a) N = 4 individuals; b), c) and d) N = 2 individuals.

Fig. 5. Average PLR curves for blue and red stimuli for the same individual with a 10 s stimulus duration: a) white background light, b) red background light. The gray area represents the time with the stimulus on.

The experiments with white background light and 1 s stimulus duration were performed with four individuals. Stimuli of 2 s, 3 s and 10 s were tested in only two subjects. The red background light was tested in only one individual, with a 10 s stimulus duration.

Graphical results for 1 s, 2 s, 3 s and 10 s stimulus durations with white background light are shown in Fig. 4. These graphs contain the average curves of the experiments for each scenario. Visually, one can observe similarities between the red and blue stimulus curves for each stimulus duration, particularly in the pupil recovery after the stimulus.

As for the red background light, it was tested in only one individual so far. Figure 5 shows this individual's PLR curves for the 10 s stimulus duration with white and red background light, respectively. In the first case, the red and blue curves are similar; with the red background light, there is a small difference between the blue and red responses starting at around 18 s in the graph (3 s post-stimulus), with a slower response for the blue stimulus.

4 Discussion

The pupillometric system developed in this work produced good results in terms of eye image quality and pupil detection, indicating that images acquired with a smartphone and background light have sufficient quality for chromatic pupillometric analysis.

The acquisition protocol was itself under study in this work, so that the activation of different photoreceptors could be accomplished using chromatic stimuli and background light. With the goal of finding a protocol that would activate the ipRGCs, which would manifest as different pupil recovery curves for blue and red stimuli, the stimulus duration was varied. The results obtained for the different stimulus durations show a similarity between the blue and red PLR curves for all durations using white background light. This similarity for both colours with background light adaptation may indicate dominance of cones in the PLR, with almost no contribution from rods or melanopsin, according to Park et al. [14]. Although these protocols did not appear to activate the ipRGCs, they could be suggestive of a protocol to test cone function in the PLR by using white background light, preferably with a 1 s stimulus for better comfort of the individual.

As the expected sustained response after blue stimulus offset was not achieved with the white background light, a red background colour was tested to see whether there was an indication of melanopsin (ipRGC) activation. The red background test shows a small, sustained response 3 s after blue stimulus offset (Fig. 5b). Although this result comes from a single individual and shows only a small difference in the pupil recovery curve, it can be an indicator that this protocol should be repeated with more individuals to clarify the viability of the red background light.

Future work should increase the number of measurements and participants for all scenarios to obtain more significant results. After protocol validation in healthy individuals, tests should be made in patients with neuro-ophthalmological diseases to assess the viability of this system and these protocols as a screening tool.

5 Conclusions

In this study, a smartphone-based pupillometer allowing chromatic stimuli was developed and validated, with the main goal of serving as a screening tool for neuro-ophthalmological diseases. It was verified that a smartphone has sufficient technology to achieve good pupillometric results. The acquisition protocol that activates the ipRGCs still needs further research; the red background light seems promising for obtaining a different response between blue and red stimuli. Further research is needed to validate this system as a screening tool for neuro-ophthalmological diseases.