
Curr. Opt. Photon. 2022; 6(2): 151-160

Published online April 25, 2022 https://doi.org/10.3807/COPP.2022.6.2.151

Copyright © Optical Society of Korea.

Optical Design of a Snapshot Nonmydriatic Fundus-imaging Spectrometer Based on the Eye Model

Xuehui Zhao1, Jun Chang1*, Wenchao Zhang1, Dajiang Wang2**, Weilin Chen1, Jiajing Cao1

1School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
2Department of Ophthalmology, The Third Medical Center of Chinese PLA General Hospital, Beijing 100143, China

Corresponding author: *optics_chang@126.com, ORCID 0000-0001-8048-1956
**glaucomawang@163.com, ORCID 0000-0002-8726-1996

Received: September 8, 2021; Revised: December 20, 2021; Accepted: January 17, 2022

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Fundus images can reflect ocular diseases and systemic diseases such as glaucoma, diabetes mellitus, and hypertension. Thus, research on fundus-detection equipment is of great importance. The fundus camera has been widely used as a kind of noninvasive detection equipment. Most existing devices can only obtain two-dimensional (2D) retinal-image information, yet the fundus of the human eye also has spectral characteristics. The fundus contains many pigments, and their different distributions in the eye lead to dissimilar tissue penetration for light waves, which can reflect the corresponding fundus structure. To obtain more abundant information and improve the detection level of the equipment, a snapshot nonmydriatic fundus-imaging spectral system, including a fundus-imaging spectrometer and an illumination system, is studied in this paper. The system uses a microlens array to realize snapshot acquisition; all information is obtained from a single exposure. The system does not require pupil dilation, so the operation is simple, which reduces its influence on the examined subject. The system works in the visible and near-infrared bands (550–800 nm), with a volume less than 400 mm × 120 mm × 75 mm and a spectral resolution better than 6 nm.

Keywords: Fundus camera, Fundus spectrum, Imaging spectroscopy, Optical design

OCIS codes: (080.2740) Geometric optical design; (120.4570) Optical design of instruments; (220.3620) Lens system design; (300.6550) Spectroscopy, visible

I. INTRODUCTION

The pupil of the human eye is the only window through which small blood vessels can be seen noninvasively, and the eye’s arteriovenous microvessels are numerous. By observing the fundus, eye diseases such as glaucoma can be detected, and the conditions of several systemic diseases, such as hypertension and diabetes, can be assessed [1, 2]. Fundus-imaging technology is not only an important basis for the diagnosis of ophthalmic diseases, but also an important means of judging the extent of other diseases in the body. Thus, its research and development are of great importance. The first fundus camera was developed by Zeiss in 1925, and research on fundus-camera equipment has developed steadily since then [3, 4]. A traditional fundus camera has the advantages of simple operation, low cost, real-time imaging, repeatability, and easy access, making it one of the most economical and effective pieces of fundus-inspection equipment [5].

Traditional devices can only collect two-dimensional images of the retina. However, the human retina is a very complex structure, with many layers behind it, such as the nerve-fiber layer, the inner plexiform layer, the pigment-epithelium layer, and the choroid. The human eye has many pigments, and different pigments absorb different wavelengths of light [6, 7]. Moreover, the different distributions of these pigments in the eye lead to dissimilar tissue penetration of light waves, which can reflect the fundus structure at the corresponding depth. Fundus multispectral imaging (MSI) uses this characteristic to present the detailed changes of the frontal layers of the retina and to realize noninvasive direct imaging of the retina and the choroid, which can help to obtain images of different tissue layers and structures of the fundus and provides a new method of fundus examination [8–10].

Many scholars have carried out research on fundus multispectral detection technology. Li et al. [11] used MSI to image different structures of the choroid in vivo, with a working wavelength covering 550–850 nm. Alterini et al. [12] proposed a multispectral fundus camera (MSFC) with VIS-NIR bands, which selected 15 spectral bands within the range of 400–1300 nm to visualize the fundus structure from the retina to the choroid. Toslak et al. [13] proposed a portable MSFC and used four narrowband light-emitting-diode (LED) light sources with different central wavelengths to observe retinal and choroidal structures selectively. Huang et al. [14] proposed an MSFC that used 12 representative wavelengths within the range of 470–845 nm to obtain fundus images in different bands, and discussed the correspondence between different wavebands and the fundus layers. In 2020, Carvalho et al. [15] described an eight-band retinal MSI system featuring a high-speed rotating filter wheel containing eight interchangeable optical filters.

In summary, traditional fundus cameras can only obtain 2D retinal images, so the information is limited and insufficient for comprehensive disease assessment. New multispectral fundus-detection equipment mostly uses different light sources, and its core technology is merely control of the working mode of the light. The spectral resolution of most of the multispectral detection equipment mentioned above is only about 30 nm, which is low; in addition, the systems are costly and require multiple exposures to take photos in different wavebands, which complicates operation.

In this paper, a wide-field-of-view snapshot-type nonmydriatic fundus-imaging spectral system (FISS) based on the human-eye model is proposed. Using nonmydriatic photography, the FISS can obtain a three-dimensional (3D) image-spectral data cube of the fundus in a single exposure. The system designed in this paper has a 60° field-of-view angle and a volume of 400 mm × 120 mm × 75 mm. The spectral resolution is better than 6 nm. The optical design of the system is completed, and its performance is analyzed. This fundus-detection device can compensate for the lack of information captured by traditional fundus cameras, improve spectral resolution, and simplify detection. The system would assist in the screening, prevention, and detection of related diseases, and in the health monitoring of patients. It could potentially be a helpful tool for telemedicine diagnosis in the future.

II. SYSTEM STRUCTURE AND PARAMETER INDEX

2.1. Establishing the Eye Model

The optical properties of the human eye were studied before the design of the system. The human eye is not a perfect optical system; it contains aberrations. In the optical design of a fundus camera, the human eye should also be taken into consideration, so that its aberrations can be corrected more accurately [16]. It is therefore necessary to establish an optical model of the human eye and to combine this eye model with the camera's optical system during the design.

Improvements were made based on Liou and Brennan’s model [17] to obtain the human-eye model needed for the design process. The structural parameters of the eye are shown in Table 1, and the model diagram of the eye, obtained using optical-design software, is shown in Fig. 1. Based on this model, the illumination optical path and the imaging spectral optical path are simulated in the following sections.

TABLE 1. Structural parameters of the human-eye model

Variable                               | Radius (mm) | Thickness (mm) | Source
Retina                                 | 11          | -              | -
Vitreum                                | -           | 16.58          | [17]
Posterior Surface of Crystalline Lens  | 6           | 3.7            | [17]
Anterior Surface of Crystalline Lens   | −10         | -              | -
Aqueous Humor                          | -           | 1.5            | -
Posterior Surface of Cornea            | −6.7        | 0.52           | [17]
Anterior Surface of Cornea             | −7.8        | -              | -
Length of Visual Axis                  | 24 mm       |                |
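For readers who want to reproduce the model in their own tools, the Table 1 surfaces can be written out as a simple surface list. The following is a minimal sketch, assuming spherical surfaces and the sign convention of the table; it is not the authors' actual lens file, which was built in optical-design software.

```python
# Minimal sketch: Table 1 surfaces as a list of dictionaries, ordered from the
# retina toward the cornea as in the table. Radii and thicknesses are in mm;
# None marks entries left blank in Table 1. Illustrative only.
eye_model_surfaces = [
    {"name": "Retina",                                "radius": 11.0,  "thickness": None},
    {"name": "Vitreum",                               "radius": None,  "thickness": 16.58},
    {"name": "Posterior surface of crystalline lens", "radius": 6.0,   "thickness": 3.7},
    {"name": "Anterior surface of crystalline lens",  "radius": -10.0, "thickness": None},
    {"name": "Aqueous humor",                         "radius": None,  "thickness": 1.5},
    {"name": "Posterior surface of cornea",           "radius": -6.7,  "thickness": 0.52},
    {"name": "Anterior surface of cornea",            "radius": -7.8,  "thickness": None},
]

# The axial length quoted in Table 1 (24 mm) should roughly match the sum of
# the listed thicknesses plus the gaps that Table 1 leaves blank.
listed_thickness = sum(s["thickness"] or 0.0 for s in eye_model_surfaces)
print(f"Sum of listed thicknesses: {listed_thickness:.2f} mm (visual axis: 24 mm)")
```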


Figure 1. Human eye model.

2.2. Structure of the Whole System

The schematic of the FISS is shown in Fig. 2. It mainly consists of an illumination path [part (II) of Fig. 2] and a snapshot imaging spectrometer [part (III) of Fig. 2]. These two parts are connected by a beam splitter and share a common front eyepiece group [part (I) of Fig. 2]. The beam splitter is a semitransparent, semireflective plate [18].

Figure 2. Schematic of the fundus-imaging spectral system (FISS), which contains the following three parts: (I) eyepiece common light path, (II) nonmydriatic lighting path, and (III) imaging spectral light path. ① Human eye model, ② pupil, ③ shared front eyepiece group, ④ beam splitter, ⑤ illumination optical system, ⑥ lighting source, ⑦ front imaging system, ⑧ microlens array, ⑨ collimating system, ⑩ grating, ⑪ rear imaging system, and ⑫ charge-coupled device.

2.3. System Technical Index

In our work, photos are taken without dilating the pupil. A near-infrared (NIR) light source is used to illuminate the fundus, and a white-light exposure is used to acquire the imaging spectral information of the fundus. Therefore, the working band covers the visible (VIS) and NIR bands. Here the working band of the system is set at 550–800 nm.

We plan a large-field-of-view fundus-detection device with a system field-of-view angle of 60°. According to [19], the field of view of a fundus camera is defined as the opening angle of the light emitted from the pupil of the human eye. A working field of view of ±30° corresponds to an angle of ±24.5° on the retinal surface of the human eye.

An industrial array detector (GS3-U3-23S6C-C; FLIR Systems Inc., OR, USA) [20] with a 1/1.2-inch format and a pixel size of 5.86 μm × 5.86 μm is used. The image-plane size 2y″ of the system should therefore be less than

$$2y'' = \frac{16\ \mathrm{mm}}{1.2} \approx 13\ \mathrm{mm}$$
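As a quick plausibility check, the sketch below converts the 1/1.2-inch format to millimeters and compares it with the active area implied by the pixel size. The 1920 × 1200 pixel count is an assumption taken from the detector's public datasheet, not from the text.

```python
# Hedged sketch: sizing the image plane against the 1/1.2-inch detector format.
# The 1920 x 1200 pixel count is assumed from the GS3-U3-23S6C-C datasheet.
pixel = 5.86e-3          # mm, pixel pitch from Table 2
n_x, n_y = 1920, 1200    # assumed pixel counts

format_diag = 16.0 / 1.2                  # mm, "1 inch = 16 mm" sensor-format convention
width, height = n_x * pixel, n_y * pixel  # mm, active-area dimensions
diag = (width**2 + height**2) ** 0.5

print(f"Format diagonal 2y'' = {format_diag:.1f} mm")
print(f"Active area: {width:.2f} mm x {height:.2f} mm (diagonal {diag:.2f} mm)")
```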

Based on the previous research and the application environment, the requirements of the FISS are listed in Table 2.

TABLE 2. Requirements of the fundus-imaging spectral system (FISS)

System Parameter                   | Value
Spectral Range                     | 550–800 nm
Field Angle                        | ±30°
Minimum Pupil Diameter             | 3 mm
Spectral Resolution                | <10 nm
Target Surface Size                | 1/1.2 inch
Pixel Size                         | 5.86 μm × 5.86 μm
Modulation Transfer Function (MTF) | >0.2 @ 100 lp/mm

III. OPTICAL DESIGN OF ILLUMINATION SYSTEM

3.1. Optical Design Process

The fundus is not a self-luminous object; it must be illuminated to be photographed by fundus-detection equipment. The reflectivity of the fundus is about 0.1–10%, and the effective average fundus reflectance is less than 0.3% when the pupil's limitation is considered. The VIS-light reflectivity of the cornea is about 2%, much higher than that of the fundus [21–23]. As a result, the light reflected from the fundus is easily overwhelmed by the light reflected from the cornea. To avoid the influence of this stray light on the imaging quality of the system, the minimum pupil diameter of the imaging system is set to 3 mm, and the light source is a ring source conjugated with the cornea, forming an annular light spot with inner and outer diameters of 3 mm and 4 mm respectively at the cornea. In this manner, the light incident upon the corneal edge will not be reflected into the later system [24, 25]. Eventually the annular beam passes through the pupil and illuminates the fundus evenly.

Here NIR and white LEDs are arranged alternately to form an annular light source with an outer diameter of 10 mm. The NIR light is used to illuminate the fundus while searching for the region to be imaged, and the white LEDs are used during image acquisition. The beam splitter is a 50:50 broadband beam splitter. The lighting adopts a nonmydriatic design; leaving the pupil undilated is simply the chosen mode of fundus detection and does not affect the illumination of the retina.

The lighting system needs to satisfy the conjugate relationship between the annular light source and the cornea: The annular source is taken as the object, and its image is formed on the corneal surface. This is similar to the design of an imaging system. The optimized lighting-system layout is shown in Fig. 3.

Figure 3. 3D layout of the illumination system.

3.2. Design Results and Performance

An optical simulation is carried out to evaluate the irradiance distribution.

A total of 500,000 rays are traced to simulate fundus illumination and to obtain the irradiance maps of the cornea and the fundus, as shown in Fig. 4. The outer diameter of the annular spot on the cornea is approximately 4 mm, and the illuminance of the fundus is even, which satisfies the requirement for fundus illumination.

Figure 4. Illumination simulation: (a) illumination on cornea, (b) illumination on fundus.

IV. DESIGN OF THE SNAPSHOT SPECTRAL FUNDUS-IMAGING SYSTEM

4.1. Principle of the Fundus-imaging Spectrometer

This paper mainly focuses on the design of a fundus-imaging spectrometer that obtains a three-dimensional data set composed of two-dimensional spatial information (x, y) and spectral information (λ) of the fundus. The data are acquired using a snapshot technique that presents a subimage array carrying the spatial and spectral three-dimensional information of the scene on a charge-coupled-device (CCD) detector. A microlens-array element is used to achieve this snapshot technology [26].

During the process, a front imaging system (⑦ in Fig. 2) is designed to form an intermediate image of the retina. A microlens array (⑧ in Fig. 2) is then placed on this primary intermediate image surface to spatially down-sample the primary image of the retina and form a subimage array. After that, a collimation system (⑨ in Fig. 2) collimates the down-sampled light-field information into parallel light [27]. The parallel light passes through the dispersion element (grating, ⑩ in Fig. 2) and the rear imaging system (⑪ in Fig. 2), and finally reaches the detector (⑫ in Fig. 2).

The physical process of the system operation can be expressed as follows:

$$I = \iint O(x, y, \lambda) \sum_{i,j}^{M,N} \mathrm{rect}\!\left(\frac{x - x_{c,i,j}}{d}, \frac{y - y_{c,i,j}}{d}\right) p(\lambda, n, \theta, \phi)\, \mathrm{d}x\, \mathrm{d}y$$

where O(x, y, λ) is the object information (containing the fundus spectral information and two-dimensional image information), d is the aperture of each microlens, $\sum_{i,j}^{M,N} \mathrm{rect}\left(\frac{x - x_{c,i,j}}{d}, \frac{y - y_{c,i,j}}{d}\right)$ represents the down-sampling operation, (x_{c,i,j}, y_{c,i,j}) are the centroid coordinates of the (i, j) subimage, (M, N) is the number of microlenses in the two dimensions, and p(λ, n, θ, φ) represents the dispersion effect of the dispersion element (θ and φ are respectively the incidence and diffraction angles at the grating).
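The down-sampling term in this expression simply partitions the intermediate image into d × d windows, one per microlens. A minimal NumPy sketch of that step (ignoring the dispersion term p, and assuming the microlens pitch is an integer number of pixels) is shown below.

```python
import numpy as np

def microlens_downsample(image: np.ndarray, pitch: int) -> np.ndarray:
    """Partition an intermediate image into (M, N) subimages of size pitch x pitch.

    Models only the rect-window sum in the imaging equation; the grating
    dispersion p(lambda, n, theta, phi) is not simulated here.
    """
    h, w = image.shape
    M, N = h // pitch, w // pitch
    cropped = image[:M * pitch, :N * pitch]
    # Reshape so that axes 0/1 index the microlens (i, j) and axes 2/3 the
    # pixels inside each rect window.
    return cropped.reshape(M, pitch, N, pitch).swapaxes(1, 2)

# Example: a 600 x 600 intermediate image sampled with a 12-pixel microlens pitch
intermediate = np.random.rand(600, 600)
subimages = microlens_downsample(intermediate, pitch=12)
print(subimages.shape)   # (50, 50, 12, 12): 50 x 50 microlenses, 12 x 12 pixels each
```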

During the design, the matching of the microlens-array system to the front imaging system should be considered. When the F-number of the front system F_f is less than that of the microlens-array system F_m, the subimages overlap; if F_f > F_m, gaps appear between the subimages. The F-numbers of these two parts should therefore be equal, or nearly so (namely F_m = F_f) [28, 29].
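A tiny helper makes this matching rule explicit. It is only a sketch of the bookkeeping, with illustrative F-numbers rather than the actual design values.

```python
def subimage_layout(f_front: float, f_mla: float, tol: float = 0.02) -> str:
    """Classify the subimage layout from the front-system and microlens F-numbers.

    F_front < F_mla -> the front cone is faster than the microlens accepts: overlap.
    F_front > F_mla -> the front cone underfills the microlens: gaps between subimages.
    """
    if abs(f_front - f_mla) <= tol * f_mla:
        return "matched (F_m ~= F_f): subimages tile the detector without overlap"
    return "subimages overlap" if f_front < f_mla else "gaps between subimages"

print(subimage_layout(4.0, 4.0))   # matched
print(subimage_layout(3.5, 4.0))   # overlap
print(subimage_layout(5.0, 4.0))   # gaps
```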

The front imaging system designed here has an image-space telecentric structure, and the image height of the front system is y′ = 5.5 mm. According to the size of the detector target surface, the final image height of the system should be less than 13 mm. Hence,

$$y'' = \frac{f_B}{f_C}\, y' \leq 13\ \mathrm{mm}$$

where f_B and f_C are the focal lengths of the rear imaging system and the collimation system respectively. Thus,

$$\frac{f_B}{f_C} \leq \frac{13\ \mathrm{mm}}{5.5\ \mathrm{mm}} \approx 2.36$$

The diameter of each microlens is set to d_m = 0.5 mm. The size of each subimage on the final image surface can then be calculated from

$$D_m = d_m \frac{F_f F_B}{F_m F_C} = d_m \times \frac{f_B}{f_C}$$

where F_f, F_B, F_m, and F_C are the F-numbers of the front imaging, rear imaging, microlens-array, and collimating systems respectively, and f_B and f_C are again the focal lengths of the rear imaging and collimating systems.

In our system, the focal length of the collimating system is f_C = 11 mm and that of the rear imaging system is f_B = 13 mm; based on Eq. (3), the focal-length ratio of the rear imaging system to the collimating system is set to 1.1, and the size of each microlens subimage is

$$D_m < 0.5\ \mathrm{mm} \times \frac{f_B}{f_C}$$

According to the pixel size, the number of pixels contained in each sub-image in one dimension is

$$P = \frac{D_m}{\text{pixel size}} < \frac{0.5\ \mathrm{mm} \times f_B}{\text{pixel size} \times f_C}$$

Thus each subimage spans nearly 72 × 72 pixels on the detector, and the theoretical spectral resolution δλ can reach

$$\delta\lambda = \frac{\lambda_{\max} - \lambda_{\min}}{P} > \frac{(\lambda_{\max} - \lambda_{\min}) \times \text{pixel size} \times f_C}{0.5\ \mathrm{mm} \times f_B} \approx 3.5\ \mathrm{nm}$$

Combined with the technical index requirements proposed here, the theoretical spectral resolution range of the system should be between 3.5 and 10 nm.
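The sizing chain above can be checked with a few lines of arithmetic. The sketch below uses the values quoted in the text (y′ = 5.5 mm, f_C = 11 mm, f_B = 13 mm, and roughly 72 pixels per subimage in one dimension); it is a plausibility check, not the authors' design calculation.

```python
# Hedged numerical check of the first-order sizing chain in Section 4.1.
y_prime = 5.5            # mm, image height of the front imaging system
y_final_max = 13.0       # mm, maximum final image height (detector limited)
f_C, f_B = 11.0, 13.0    # mm, collimator and rear-imaging focal lengths

ratio_max = y_final_max / y_prime        # upper bound on f_B / f_C
ratio = f_B / f_C                        # chosen relay ratio
print(f"f_B/f_C = {ratio:.2f} (must be <= {ratio_max:.2f})")

# Spectral sampling: with roughly P pixels per subimage in the dispersion
# direction (the text quotes ~72), the 550-800 nm band is sampled at
lam_min, lam_max = 550.0, 800.0          # nm
P = 72                                   # pixels per subimage (from the text)
print(f"delta_lambda ~= {(lam_max - lam_min) / P:.1f} nm")   # ~3.5 nm
```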

For a grating-type imaging spectrometer, multiple diffraction orders should be considered. Over a wide working band, a low diffraction order of the long-wavelength light can overlap a higher diffraction order of the short-wavelength light. To prevent this overlap, it is generally required that

$$m\lambda_{\max} < (m+1)\lambda_{\min}$$

where m is the diffraction order; then we have

$$m < \frac{\lambda_{\min}}{\lambda_{\max} - \lambda_{\min}}$$

We take the value of the diffraction order m = +1.
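For the 550–800 nm band this condition is easy to evaluate numerically; the short sketch below simply substitutes the band limits into the two inequalities above.

```python
# Order-sorting check for the 550-800 nm working band.
lam_min, lam_max = 550.0, 800.0   # nm

m_bound = lam_min / (lam_max - lam_min)
print(f"m must be < {m_bound:.1f}")               # < 2.2

for m in (1, 2, 3):
    ok = m * lam_max < (m + 1) * lam_min          # m*lam_max < (m+1)*lam_min
    status = f"no overlap with order {m + 1}" if ok else f"overlaps order {m + 1}"
    print(f"order m = {m}: {status}")
```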

4.2. Design Result and Performance of the Fundus-imaging Spectrometer

The system uses microlens-array components to realize the snapshot technique, and uses the grating as the dispersion element. Finally, we combine these parts, add the eye model, and optimize the integrated system. The final 3D layout of the fundus imaging spectrometer is obtained, as shown in Fig. 5.

Figure 5. 3D layout of the imaging spectral system.

Here we choose L_p = 100 lp/mm as the sampling frequency, and the corresponding spatial resolution δ_i at the image plane is

$$\delta_i = \frac{1}{2L_p} = 5\ \mathrm{\mu m}$$

The lateral magnification β of our imaging system is 0.34, and the spatial resolution δo at the object plane is

$$\delta_o = \frac{\delta_i}{\beta} \approx 14.7\ \mathrm{\mu m}$$

which satisfies the requirement for medical observation (a resolution finer than 15 μm, a value based on clinical experience, is needed to see the fundus structure clearly [17]).
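The two resolution numbers follow directly from the sampling frequency and the lateral magnification; a minimal check using the values stated above:

```python
# Image- and object-plane resolution from the chosen sampling frequency.
L_p = 100.0                    # lp/mm, sampling frequency
beta = 0.34                    # lateral magnification of the imaging system

delta_i_mm = 1.0 / (2.0 * L_p)             # mm, image-plane resolution element
delta_o_mm = delta_i_mm / beta             # mm, referred to the object (retina) plane

print(f"delta_i = {delta_i_mm * 1e3:.1f} um")            # 5.0 um
print(f"delta_o = {delta_o_mm * 1e3:.1f} um (< 15 um)")  # ~14.7 um
```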

Figure 6 shows the modulation transfer function (MTF) curve of the system at a sampling frequency of 100 lp/mm, where the MTF values of the whole system are all higher than 0.3 at typical wavelengths.

Figure 6. Modulation transfer function (MTF) curves at typical wavelengths of the fundus imaging spectrometer: (a) MTF curve at 800 nm, (b) MTF curve at 656 nm, and (c) MTF curve at 550 nm.

We analyze the spectral resolution using the optical-design software. Wavelengths separated by 6 nm are added near the edge bands and the central reference band, and the spot diagrams of the system are examined; here we mainly analyze the spot diagrams of the central and edge fields of view, as shown in Fig. 7. With the wavelength interval set to 6 nm, the spots in the spot diagram can be distinguished clearly. Therefore, the spectral resolution of the system is better than 6 nm, which meets our design requirement. Figure 8 is the matrix spot diagram of the whole system.

Figure 7. Partial enlargement of spot diagram (a)–(c) at the center and (d)–(f) edge fields of view, for different bands: (a) 800–788 nm, (b) 662–650 nm, (c) 562–550 nm, (d) 800–788 nm, (e) 662–650 nm, and (f) 562–550 nm.

Figure 8. Matrix spot diagram of the whole system.

V. DISCUSSION

5.1. Performance of the System

Finally, all of the subsystems are combined. The 3D layout of the overall system is shown in Fig. 9. The volume of the whole system is approximately 400 mm × 120 mm × 75 mm.

Figure 9. 3D layout of the overall system.

Several traditional desktop fundus-camera systems, such as the RetiCam-3100 (SysEye, Shanghai, China) with a field of view of 50°, can only obtain a two-dimensional retinal image of the fundus. The TRC-50DX (Topcon, Tokyo, Japan) is 340 mm × 505 mm × 715 mm in volume and requires the pupil of the examined person to be dilated; it cannot provide fundus-spectrum information either. Some new fundus-spectral-detection schemes, such as the multispectral method used by Alterini et al. [12], can only realize a spectral resolution of 60 nm, and the MSFC proposed by Huang et al. [14] has a spectral resolution of 30 nm and is costly.

New technologies and research methods concerning the fundus spectrum are introduced in this paper. The focus is on nondilated snapshot-type technology, which is clearly different from previous fundus-spectrum-detection technology (existing articles mostly study switching between different light sources, combining white light with filters, and similar approaches to achieve fundus-spectrum detection).

Compared to previous systems, the system designed here (FISS) has advantages in volume and achieves miniaturization. In addition, the field-of-view angle is larger (60°), and a nonmydriatic detection method is adopted. Most importantly, we design a fundus spectral-imaging system that enriches the detected information; compared to previous multispectral fundus-detection equipment, it represents a technological advance. Using the snapshot technique, the multispectral fundus image can be obtained with a single exposure. The spectral resolution of our system is better than 6 nm, much finer than the 30-nm spectral resolution of traditional multispectral detection equipment. The simulation results show that the design is feasible and reasonable.

5.2. Systemic Stray-light Analysis

Stray light in an optical system can seriously affect its beam quality and transmission characteristics. The stray light of the system designed here may come from several sources. One is backscattered light from the glass of the front lens group ahead of the beam splitter, which is transmitted to the imaging spectral system through the beam splitter and then received by the detector; this light interferes with the light reflected from the fundus. In addition, we may build a prototype in the future, and the inner wall of its mechanical housing will affect the illumination path, so the influence of the inner mechanical surfaces is also considered. The effect of this parasitic light is evaluated by optical simulation: a reflectivity of 0.8% is assigned to the rear surface of the front lens group, the effect of the mechanical inner wall is included, ambient light covering more than the field of view is applied, and the illumination on the image surface is analyzed. Almost no backscattered light is detected on the image plane, so the stray light has minimal influence on the system, as shown in Fig. 10.

Figure 10. Backscattered light received by detector.

The next contribution is the stray light generated by the overlap of different diffraction orders of the grating. The preceding calculation and analysis show that a diffraction order of +1 is reasonable and feasible in theory and will not cause overlap between different orders. Multiple configurations are established in the simulation for verification.

The result in Fig. 11 shows that different orders of diffracted light can be separated without interference (where green represents a diffraction order of +1 and red represents +2). Thus the influence of stray light is avoided.

Figure 11. Different orders of diffracted light.

VI. CONCLUSION

In summary, a snapshot nonmydriatic FISS based on an eye model is proposed. The human-eye model is established, the relevant theoretical analysis and calculations are carried out, the optical-system design is completed, and the design results and performance are analyzed. We overcome the limitations of the traditional desktop fundus camera, which detects only 2D retinal information, and of the MSFC, which is complex and has low spectral resolution. A snapshot nonmydriatic fundus spectral-imaging system with a larger field of view of 60° and a smaller size (400 mm × 120 mm × 75 mm) is designed in this paper. The system works in the VIS and NIR bands, and the spectral resolution is better than 6 nm, much finer than the 30-nm spectral resolution of most existing fundus multispectral systems. This new fundus spectral-imaging device can obtain multidimensional fundus information, which is of great importance in assisting the screening, prevention, and detection of related diseases.

DISCLOSURES

The authors declare no conflicts of interest.

DATA AVAILABILITY

Data underlying the results presented in this paper are not publicly available at this time, but may be obtained from the authors upon reasonable request.

ACKNOWLEDGMENT

The authors thank the ophthalmology staff of the Department of Ophthalmology, The Third Medical Center of Chinese PLA General Hospital, for their guidance on ophthalmology.

FUNDING

National Natural Science Foundation of China (NSFC) (Grant No. 61471039), Beijing Natural Science Foundation (BNSF) (Grant No. 7212092), and Capital Medical Development Research (CMDR) (Grant No. 2022-2-5041).

References

1. K. Zhao and P. Yang, Ophthalmology, 8th ed. (People’s Medical Publishing House, Beijing, China, 2010), pp. 319-323.
2. H. Zhang and N. Liu, Atlas of Ocular Fundus Diseases (China Medical Publishing House, Beijing, China, 2007), pp. 3-8.
3. G. Huang, X. Qi, T. Y. P. Chui, Z. Zhong, and S. A. Burns, “A clinical planning module for adaptive optics SLO imaging,” Optom. Vis. Sci. 89, 593-601 (2012).
4. X. Wang and Q. Xue, “Optical design of portable nonmydriatic fundus camera with large field of view,” Acta Opt. Sin. 37, 0922001 (2017).
5. N. Patton, T. M. Aslam, T. Macgillivray, I. J. Deary, B. Dhillon, R. H. Eikelboom, K. Yogesan, and I. J. Constable, “Retinal image analysis: concepts, applications and potential,” Prog. Retin. Eye Res. 25, 99-127 (2006).
6. H. Lu and H. Li, The Principle and Clinical Application for Ocular Optical Coherence Tomography (World Publishing Corporation, Xi’an, China, 2013), pp. 56-60.
7. R. Li, Fundus Fluorescein Angiography and Optical Imaging (People’s Medical Publishing House, Beijing, China, 2010), pp. 21-23.
8. S. S. Hayreh, “In vivo choroidal circulation and its watershed zones,” Eye 4, 273-289 (1990).
9. D. L. Nickla and J. Wallman, “The multifunctional choroid,” Prog. Retin. Eye Res. 29, 144-168 (2010).
10. H. Takehara, H. Sumi, Z. Wang, T. Kondo, M. Haruta, K. Sasagawa, and J. Ohta, “Multispectral near-infrared imaging technologies for nonmydriatic fundus camera,” in Proc. IEEE Biomedical Circuits and Systems Conference-BioCAS (Nara, Japan, Oct. 17-19, 2019), pp. 1-4.
11. S. Li, L. Huang, Y. Bai, Y. Cheng, J. Tian, S. Wang, Y. Sun, K. Wang, F. Wang, and Q. Zhang, “In vivo study of retinal transmission function in different sections of the choroidal structure using multispectral imaging,” Investig. Ophthalmol. Vis. Sci. 56, 3731-3742 (2015).
12. T. Alterini, F. Díaz-Doutón, F. J. Burgos-Fernández, L. González, C. Mateo, and M. Vilaseca, “Fast visible and extended near-infrared multispectral fundus camera,” J. Biomed. Opt. 24, 096007 (2019).
13. D. Toslak, T. Son, M. K. Erol, H. Kim, T.-H. Kim, R. V. P. Chan, and X. Yao, “Portable ultra-widefield fundus camera for multispectral imaging of the retina and choroid,” Biomed. Opt. Express 11, 6281-6292 (2020).
14. Z. Huang, Z. Jiang, Y. Hu, D. Zou, Y. Lu, Q. Ren, G. Liu, and Y. Lu, “Retinal choroidal vessel imaging based on multi-wavelength fundus imaging with the guidance of optical coherence tomography,” Biomed. Opt. Express 11, 5212-5224 (2020).
15. E. R. de Carvalho, R. J. M. Hoveling, C. J. F. van Noorden, R. O. Schlingemann, and M. C. G. Aalders, “Functional imaging of the ocular fundus using an 8-band retinal multispectral imaging system,” Instruments 4, 12 (2020).
16. J. Polans, B. Jaeken, R. P. Mcnabb, P. Artal, and J. A. Izatt, “Wide-field optical model of the human eye with asymmetrically tilted and decentered lens that reproduces measured ocular aberrations,” Optica 2, 124-134 (2015).
17. H.-L. Liou and N. A. Brennan, “Anatomically accurate, finite model eye for optical modeling,” J. Opt. Soc. Am. A 14, 1684-1695 (1997).
18. W. Chen, J. Chang, X. Zhao, and S. Liu, “Optical design and fabrication of a smartphone fundus camera,” Appl. Opt. 60, 1420-1427 (2021).
19. Ophthalmic instruments — Fundus cameras, ISO 10940:2009, Technical Committee ISO/TC 172, Ophthalmic optics and instruments (2009).
20. Teledyne FLIR, “GS3-U3-23S6C-C FLIR Grasshopper®3 High Performance USB 3.0 Color Camera,” https://www.flir.com/products/grasshopper3-usb3/?model=GS3-U3-23S6C-C (Accessed: Feb. 01, 2022).
21. F. C. Delori and K. P. Pflibsen, “Spectral reflectance of the human ocular fundus,” Appl. Opt. 28, 1061-1077 (1989).
22. M. Hammer and D. Schweitzer, “Quantitative reflection spectroscopy at the human ocular fundus,” Phys. Med. Biol. 47, 179-191 (2002).
23. J. Schwiegerling, Field Guide to Visual and Ophthalmic Optics (SPIE Press, USA, 2004).
24. E. Dehoog and J. Schwiegerling, “Optimal parameters for retinal illumination and imaging in fundus cameras,” Appl. Opt. 47, 6769-6777 (2008).
25. E. Dehoog and J. Schwiegerling, “Fundus camera systems: a comparative analysis,” Appl. Opt. 48, 221-228 (2009).
26. J. Cui, Y. Tang, P. Han, M. Pan, and J. Zhang, “Development of diagnostic imaging spectrometer for tumor on-line operation,” Opt. Precis. Eng. 21, 3043-3049 (2013).
27. J.-N. Liu, J.-C. Cui, L. Yin, C. Sun, J.-J. Chen, R. Zhang, and J.-L. Liu, “Analysis and design of pre-imaging system of integral field imaging spectrometer based on lenslet array,” Spectrosc. Spect. Anal. 38, 3269 (2018).
28. Z. Zhang, J. Chang, H. Ren, K. Fan, and D. Li, “Snapshot imaging spectrometer based on a microlens array,” Chin. Opt. Lett. 17, 011101 (2019).
29. D. W. Palmer, T. Coppin, K. Rana, D. G. Dansereau, M. Suheimat, M. Maynard, D. A. Atchison, J. Roberts, R. Crawford, and A. Jaiprakash, “Glare-free retinal imaging using a portable light field fundus camera,” Biomed. Opt. Express 9, 3178-3192 (2018).

Article

Article

Curr. Opt. Photon. 2022; 6(2): 151-160

Published online April 25, 2022 https://doi.org/10.3807/COPP.2022.6.2.151

Copyright © Optical Society of Korea.

Optical Design of a Snapshot Nonmydriatic Fundus-imaging Spectrometer Based on the Eye Model

Xuehui Zhao1, Jun Chang1 , Wenchao Zhang1, Dajiang Wang2 , Weilin Chen1, Jiajing Cao1

1School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China
2Department of Ophthalmology, The Third Medical Center of Chinese PLA General Hospital, Beijing 100143, China

Correspondence to:*optics_chang@126.com, ORCID 0000-0001-8048-1956
**glaucomawang@163.com, ORCID 0000-0002-8726-1996

Received: September 8, 2021; Revised: December 20, 2021; Accepted: January 17, 2022

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Fundus images can reflect ocular diseases and systemic diseases such as glaucoma, diabetes mellitus, and hypertension. Thus, research on fundus-detection equipment is of great importance. The fundus camera has been widely used as a kind of noninvasive detection equipment. Most existing devices can only obtain two-dimensional (2D) retinal-image information, yet the fundus of the human eye also has spectral characteristics. The fundus has many pigments, and their different distributions in the eye lead to dissimilar tissue penetration for light waves, which can reflect the corresponding fundus structure. To obtain more abundant information and improve the detection level of equipment, a snapshot nonmydriatic fundus imaging spectral system, including fundus-imaging spectrometer and illumination system, is studied in this paper. The system uses a microlens array to realize snapshot technology; information can be obtained from only a single exposure. The system does not need to dilate the pupil. Hence, the operation is simple, which reduces its influence on the detected object. The system works in the visible and near-infrared bands (550–800 nm), with a volume less than 400 mm × 120 mm × 75 mm and a spectral resolution better than 6 nm.

Keywords: Fundus camera, Fundus spectrum, Imaging spectroscopy, Optical design

I. INTRODUCTION

The pupil of the human eye is the only window through which small blood vessels can be seen non-invasively, and the eye’s arteriovenous microvessels are many. By observing the fundus, eye diseases such as glaucoma can be detected, and the conditions of several systemic diseases, such as hypertension and diabetes, can be understood [1, 2]. Fundus-imaging technology is not only an important basis for the diagnosis of ophthalmic diseases, but also an important means to judge the extent of other diseases in the body. Thus, its research and development are of great importance. The first fundus camera was developed by Zeiss in 1925. Since then, research on fundus camera equipment has gradually developed [3, 4]. A traditional fundus camera has the advantages of simple operation, low cost, real-time imaging, repeatability, and easy access, making it one of the most economical and effective pieces of fundus-inspection equipment [5].

Traditional devices can only collect two-dimensional images of the retina. However, the human retina is a very complex structure, with many structures behind it, such as the nerve-fiber layer, the inner-plexus layer, the pigment-epithelium layer, and the choroid, for example. The human eye has many pigments, and different pigments absorb different wavelengths of light [6, 7]. Moreover, the different distributions of these pigments in the eye leads to dissimilar tissue penetration of light waves, which can reflect the fundus structure at the corresponding depth. Fundus multispectral imaging (MSI) uses this characteristic to present the detailed changes of the frontal layers of the retina and to realize the non-invasive direct imaging of the retina and the choroid, which can help to obtain images of different tissue layers and structures of the fundus and to provide a new method of fundus examination [810].

Many scholars have carried out research on fundus multispectral detection technology. Li et al. [11] used MSI to image different structures of the choroid in vivo, with a working wavelength covering 550–850 nm. Alterini et al. [12] proposed a multispectral fundus camera (MSFC) with VIS-NIR bands, which selected 15 spectral bands in the band range of 400–1300 nm to visualize the fundus structure from the retina to the choroid. Toslak et al. [13] proposed a portable MSFC and used four narrowband light emitting diode (LED) light sources with different central wavelengths to observe retinal and choroid structures selectively. Huang et al. [14] proposed a MSFC used 12 representative wavelengths within the band range of 470–845 nm to obtain fundus images in different bands and discussed the corresponding relationship between different wavebands and the fundus membrane. In 2020, Carvalho et al. [15] described an eight-band retinal MSI system that featured a high-speed rotating filter wheel containing eight interchangeable optical filters.

In summary, traditional fundus cameras can only obtain 2D retinal images, and thus the information is limited and insufficient to assist the assessment of disease comprehensively. The new multispectral fundus-detection equipment mostly uses different light sources, and the core technology is only the control of the working mode of light. The spectral resolution of most of the multispectral detection equipment mentioned previously is only 30 nm, which is low, and the systems are costly and require multiple exposures to take photos in different wave bands, which makes the operation complicated.

In this paper, a wide-field-of-view snapshot-type nonmydriatic fundus-imaging spectral system (FISS) based on the human-eye model is proposed. Using the method of nonmydriatic photography, the FISS can obtain fundus-image-spectral three-dimensional (3D) cubic data in a single exposure. The system design in this paper has a 60° field-of-view angle and a volume of 400 mm × 120 mm × 75 mm. The spectral resolution is better than 6 nm. The optical design on the system is completed, and its effect analyzed. The innovative research of this fundus-detection device can compensate for the lack of information captured by traditional fundus cameras, and can improve spectral resolution, and simplify detection. This system would assist in the screening, prevention, and detection of related diseases, and in the health monitoring of patients. It could potentially be a helpful tool for telemedicine diagnosis in the future.

II. SYSTEM STRUCTURE AND PARAMETER INDEX

2.1. Establishing the Eye Model

The optical properties of the human eye were studied before the design of the system. The human eye is not a perfect optical system, and it contains aberrations. In the optical design of fundus camera, human eye should also be taken into consideration in the design of optical system in order to correct human eye aberration more accurately [16]. It is necessary to establish an optical model of human eye and combine the human eye model with the optical camera system for design.

Improvements were made based on Liou and Brennan’s model [17] to obtain the human-eye model needed for the design process. The structural parameters of human eyes are shown in Table 1, and the model diagram of an eye is obtained using optical design software, as shown in Fig. 1. Based on this model, the illumination optical path and the imaging spectral optical path are simulated in the following chapters.

TABLE 1. Structural parameters of the human-eye model.

VariableEye Model Parameter
Radius (mm)Thickness (mm)Source
Retina11--
Vitreum-16.58[17]
Posterior Surface of Crystalline Lens63.7[17]
Anterior Surface of Crystalline Lens−10--
Aqueous Humor-1.5-
Posterior Surface of Cornea−6.70.52[17]
Anterior Surface of Cornea−7.8--
Length of Visual Axis24 mm


Figure 1. Human eye model.

2.2. Structure of the Whole System

The schematic of the FISS is shown in Fig. 2. It mainly consists of an illumination path [part (I) of Fig. 2.] and a snapshot imaging spectrometer [part (II) of Fig. 2.]. These two parts are connected by a beam splitter and share the front set of common eyepieces. The beam splitter is semi-permeable and has semireflective plates [18].

Figure 2. Schematic of the fundus-imaging spectral system (FISS), which contains the following three parts: (I) eyepiece common light path, (II) nonmydriatic lighting path, and (III) imaging spectral light path. ① Human eye model, ② pupil, ③ shared front eyepiece group, ④ beam splitter, ⑤ illumination optical system, ⑥ lighting source, ⑦ front imaging system, ⑧ microlens array, ⑨ collimating system, ⑩ grating, ⑪ rear imaging system, and ⑫ charge-coupled device.

2.3. System Technical Index

In our work, photos are taken without dilating the pupil. The near-infrared (NIR) light source is used to illuminate the fundus, and white-light exposure is used to take photos to obtain the imaging spectral information of the fundus. Therefore, the working band covers the visible (VIS) and NIR bands. Here, the working band of the system is set at 550–800 nm.

Designing a large-field-of-view fundus-detection device with a system-field-of-view angle of 60° is planned. According to [19], the field of view of a fundus camera is defined as the opening angle of light emitted from the pupil of the human eye. A working field of view of ±30° corresponds to an angle of ±24.5° on the retinal surface of the human eye.

An industrial array detector (GS3-U3-23S6C-C; FLIR System Inc., OR, America) [20] with a size of 1/1.2 inch and a pixel size of 5.86 μm × 5.86 μm is used. Then, the image plane size 2y″ of the system should be less than

2y=16mm1.213mm

Based on the previous research and combined with the application environment, the requirements of the FISS system are listed in Table 2.

TABLE 2. Requirements of fundus-imaging spectral system (FISS) system.

System ParametersValue
Spectral Range550–800 nm
Field Angle±30°
Minimum Pupil Diameter3 mm
Spectral Resolution<10 nm
Target Surface Size1/1.2 inch
Pixel Size5.86 μm × 5.86 μm
Modulation Transfer Function (MTF)>0.2 @100 lp/mm

III. OPTICAL DESIGN OF ILLUMINATION SYSTEM

3.1. Optical Design Process

The fundus is not a self-luminous object; it needs to be illuminated to be photographed by fundus detection equipment. The reflectivity of the fundus is about 0.1–10%, and the effective average fundus reflectance is less than 0.3%, considering the pupil’s limitation. The VIS light reflectivity of the cornea is about 2%, much higher than that of the fundus [2123]. As a result, the reflected light from fundus imaging is easily overwhelmed by the reflected light from the cornea. To avoid the influence of stray light on the imaging quality of the system, the minimum pupil diameter of the imaging system is set to 3 mm, and the light source is a ring light source that is conjugated with the cornea to form an annular light spot with inner and outer diameters of 3 mm and 4 mm respectively at the cornea. In this manner, the light incident upon the corneal edge will not be reflected in the later system [24, 25]. Eventually the annular spot will pass through the pupil, illuminating the fundus evenly.

Here the NIR- and white-LED light beads are selected to alternate to form an annular light source, and the outer diameter of the ring is 10 mm. NIR light is used to illuminate the fundus of the eye while searching for the region to be imaged. White-LED light is used during image acquisition. The beam splitter is a 50:50 wide band beam splitter. The working mode of the lighting adopts a nonmydriatic pupil design. The nondilated pupil is just a method of fundus-detection and does not affect the illumination effect of the retina.

The lighting system needs to satisfy the conjugate relationship between the annular light source and the cornea. The circular light source is taken as an object, and the surface of the cornea is taken as an image of the circular light source. This is similar to the design of an imaging system. The optimized lighting-system layout is shown in Fig. 3.

Figure 3. 3D layout of the illumination system.

3.2. Design Results and Performance

An optical simulation is carried out to evaluate the effect of irradiance.

A total of 500,000 rays are traced to simulate fundus illumination and obtain the irradiance map of the cornea and the fundus, as shown in Fig. 4. The outer diameter of the circular spot on the cornea is approximately 4 mm, and the illuminance of the fundus is even, which can satisfy the requirement of fundus illumination.

Figure 4. Illumination simulation: (a) illumination on cornea, (b) illumination on fundus.

IV. DESIGN OF THE SNAPSHOT SPECTRAL FUNDUS-IMAGING SYSTEM

4.1. Principle of the Fundus-imaging Spectrometer

This paper mainly focuses on the design of a fundus imaging spectrometer to obtain a three-dimensional data set composed of two-dimensional spatial information (x, y) and spectral information (λ) of the fundus. This work proposes acquiring data using a snapshot technique that can present a subgraph array of spatial and spectral three-dimensional information of an image on the charge-coupled device (CCD) detector. A microlens-array element is intended to be used to achieve this snapshot technology [26].

During the process, designing a front imaging system (⑦ in Fig. 2) is necessary to form an intermediate image of the retina. Then, the microlens-array (⑧ in Fig. 2) element, which is placed on the primary intermediate image surface to conduct spatial down-sampling of the primary image of the retina and form a sub-image array, is used. After that, a collimation system (⑨ in Fig. 2) is designed to collimate the light-field information after down-sampling to form parallel light [27]. The parallel light passes through the dispersion element (grating ⑩ in Fig. 2) and the rear imaging system (⑪ in Fig. 2), and finally reaches the detector (⑫ in Fig. 2).

The physical process of the system operation can be expressed as follows:

I= O(x,y,λ) i,j M,Nrect(xxci,j,d,yyc i,jd)p(λ,n,θ,ϕ)dxdy

where O(x, y, λ) is the object spatial information (containing fundus spectral information and two-dimensional image information), d is the aperture of each microlens array, i,j M,Nrect(xxc i,j,d,yyc i,jd) represents the down-sampling treatment, (xxc i,j, yyc i,j) are the centroid coordinates of the (i, j) sub image, (M, N) is the number of microlens arrays in two dimensions, and p(λ, n, θ, ϕ) represents the dispersion effect of the dispersion element (θ and φ are respectively the incident angle and diffraction angle on the grating).

During the design, the matching of the microlens array system to the front imaging system should be considered. When the F-number of the front system Ff is less than that of the microlens array system Fm, the subimages will overlap. If Ff > Fm, free gaps will exist between the sub images. So it is important to note that the F-numbers between these two parts should be equal or nearly so [28, 29] (namely Fm = Ff).

The front-facing imaging system designed here is an image square telecentric structure, and the image height of the front system is y′ = 5.5 mm. According to the size of the detector target surface, the final image height of the system should be less than 13 mm. Hence,

y=fBfC·y13mm

where FB and FC are the focal lengths of the rear imaging system and collimation system respectively. Thus,

fBfC13mm5.5mm2.36

The diameter of each single microlens is set as dm = 0.5 mm. Then, the size of each subimage on the final surface can be calculated by

Dm=dmFfFbFmFc=dm×fBfC

where Ff, FB, Fm, and FC are the F-numbers of front imaging, back imaging, microlens array, and collimating system respectively.

In our system, the focal length of the collimating system is FC = 11 mm, and that of the rear imaging system is FB = 13 mm; based on Eq. (3), we set the focal-length ratio of the rear imaging system to the collimating system to be 1.1, and the size of the subimage of the microlens is

Dm<0.5mm×fBfC

According to the pixel size, the number of pixels contained in each sub-image in one dimension is

P=DmPixelsize<0.5mm×fBPixelsize×fC

Thus the number of pixels on the detector is nearly 72 × 72, and the theoretical spectral resolution δλ can reach

δλ=λmaxλminP>λmaxλmin×Pixelsize×fC0.5mm×fB3.5nm

Combined with the technical index requirements proposed here, the theoretical spectral resolution range of the system should be between 3.5 and 10 nm.

For a grating-type imaging spectrometer, multistage diffraction should be considered. For a wide range of working bands, the phenomenon of low diffraction order of the long-wave beam and high diffraction order of the short-wave beam is likely to occur. To prevent overlap, it is generally required to satisfy

mλmax<(m+1)λmin

where m is diffraction order; then we have

m<λminλmaxλmin

We take the value of the diffraction order m = +1.

4.2. Design Result and Performance of the Fundus-imaging Spectrometer

The system uses microlens-array components to realize the snapshot technique, and uses the grating as the dispersion element. Finally, we combine these parts, add the eye model, and optimize the integrated system. The final 3D layout of the fundus imaging spectrometer is obtained, as shown in Fig. 5.

Figure 5. 3D layout of the imaging spectral system.

Here we choose 100 lp/mm as the sampling frequency, and the spatial frequency δi of the system can be calculated by

δi=12Lp=5μm

The lateral magnification β of our imaging system is 0.34, and the spatial resolution δo at the object plane is

δo=δiβ14.7μm

which satisfies the need for medical observation (<15 μm, which is a value based on medical experience, to see the fundus structure clearly [17])

Figure 6 shows the modulation transfer function (MTF) curve of the system at a sampling frequency of 100 lp/mm, where the MTF values of the whole system are all higher than 0.3 at typical wavelengths.

Figure 6. Modulation transfer function (MTF) curves at typical wavelengths of the fundus imaging spectrometer: (a) MTF curve at 800 nm, (b) MTF curve at 656 nm, and (c) MTF curve at 550 nm.

We analyze the spectral resolution using optical-design simulation software. We add bands near the edge band and the central reference band with an interval of 6 nm. We observe the spot diagram of the system, and here we mainly analyze the spot diagrams of the center and edge fields of view, as shown in Fig. 7. When the wavelength interval is set at 6 nm, the spots on the spot diagram can be distinguished clearly. Therefore, the spectral resolution of the system is better than 6 nm, which meets our design requirements. Figure 8 is the matrix spot diagram of the whole system.

Figure 7. Partial enlargement of spot diagram (a)–(c) at the center and (d)–(f) edge fields of view, for different bands: (a) 800–788 nm, (b) 662–650 nm, (c) 562–550 nm, (d) 800–788 nm, (e) 662–650 nm, and (f) 562–550 nm.

Figure 8. Matrix spot diagram of the whole system.

V. DISCUSSION

5.1. Performance of the System

Finally, all of the systems are combined. The 3D layout of the overall system is shown in Fig. 9. The volume of the whole system is approximately 400 mm × 120 mm × 75 mm.

Figure 9. 3D layout of the overall system.

Several traditional fundus-camera desktop systems, such as the RetiCam-3100 (SysEye, Shanghai, China) with a field of view of 50° can only obtain a two-dimensional retinal image of the fundus. The TRC-50DX (TopCon, Tokyo, Japan) is 340 mm × 505 mm × 715 mm in volume and the pupil of the tested person needs to be dilated, which cannot provide fundus-spectrum information either. Some new fundus-spectral-detection schemes, such as the multispectral method Alterini et al. [12] used, can only realize a spectral resolution of 60 nm. Also, the MSFC proposed by Huang [14] has a spectral resolution of 30 nm and is costly.

New technologies and research methods concernig the fundus spectrum are introduced in this paper. The focus is mainly on nondilated snapshot-type technology, which is evidently different from the previous fundus-spectrum-detection technology (existing articles mostly study switching different light sources or mixing white a light with filter, and other technologies to achieve fundus spectrum detection).

Compared to previous systems, the system designed heres (FISS) has advantages in volume and achieves miniaturization. In addition, the field of view angle is larger (60°), and the detection method of a non-mydriatic pupil is adopted. Most importantly, we design a fundus spectral-imaging system that enriches the detected information. Compared to previous multispectral fundus-detection equipment, it realizes technological innovation. By using the snapshot technology, the multispectral fundus image can be obtained with a single exposure. The spectral resolution of our system is better than 6 nm, which is much smaller than the 30-nm spectral resolution of traditional multispectral detection equipment. The simulation results show that the design is feasible and reasonable.

5.2. Systemic Stray-light Analysis

The stray light of the optical system can seriously affect its beam quality and transmission characteristics. The stray light of the system designed here may come from the following several aspects: One is the backscattered light of the glass of the front lens group in front of the beam splitter, which will be transmitted to the imaging spectral system through the beam splitter and then received by the detector. This part will interfere with the reflected light from the bottom of the eye, creating stray light. On the other hand, we may produce a prototype in the future. At that time, the inner wall of the actual prototype of the mechanical shell will certainly have an impact on the lighting part. Thus the influence of the inside surface of the optical machining is also considered. The effect of parasitic light is evaluated by optical simulation. A 0.8% reflectivity is assigned to the rear surface of the front lens group, and the effect of the mechanical inner wall is fully considered. Ambient light larger than the field of view is applied, and the illumination on the image surface is analyzed. Almost no backscattered light can be detected on the image plane. Thus, the stray light of the system has minimal influence, as shown in Fig. 10.

Figure 10. Backscattered light received by detector.

The next part is the stray light generated by the overlap of diffracted light of different orders of the grating. The previous calculation and analysis clarifies that a diffraction order of +1 is reasonable and feasible in theory, and will not cause the overlapping of different diffraction orders. Multiple structures are established for experimental verification.

The result in Fig. 11 shows that different orders of diffracted light can be separated without interference (where green represents a diffraction order of +1 and red represents +2). Thus the influence of stray light is avoided.

Figure 11. Different orders of diffracted light.

VI. CONCLUSION

In summary, a snapshot nonmydriatic FISS based on an eye model is proposed. The human-eye model is established; relevant theoretical analysis and calculation are carried out; the optical system design is completed; and the design results and performance are analyzed. We overcome the limitations of a traditional desktop fundus camera being able to detect only 2D retinal information, of the MSFC’s complexity, and of low spectral resolution. A snapshot nonmydriatic fundus spectral-imaging system with a larger field of view of 60° and a smaller size (400 mm × 120 mm × 75 mm) is designed in this paper. The system works in the VIS and NIR bands, and the spectral resolution is better than 6 nm, which is much better than the 30-nm spectral resolution of most existing fundus multispectral spectroscopic systems. This new fundus spectral-imaging device can obtain multidimensional fundus information, which is of great importance in assisting the screening, prevention, and detection of related diseases.

DISCLOSURES

The authors declare no conflicts of interest.

DATA AVAILABILITY

Data underlying the results presented in this paper is not publicly available at the time of publication, which may be obtained from the authors upon reasonable request.

ACKNOWLEDGMENT

The authors acknowledge the ophthalmology staff from the department of ophthalmology of the Third Medical Center of Chinese PLA General Hospital for their guidance of our ophthalmology knowledge.

FUNDING

National Natural Science Foundation of China (NSFC) (Grant No. 61471039), Beijing Natural Science Foundation (BNSF) (Grant No. 7212092) and Capital Medical Development Research (CMDR) (Grant No. 2022-2-5041).

Figure 1. Human eye model.

Figure 2. Schematic of the fundus-imaging spectral system (FISS), which contains the following three parts: (I) eyepiece common light path, (II) nonmydriatic lighting path, and (III) imaging spectral light path. ① Human eye model, ② pupil, ③ shared front eyepiece group, ④ beam splitter, ⑤ illumination optical system, ⑥ lighting source, ⑦ front imaging system, ⑧ microlens array, ⑨ collimating system, ⑩ grating, ⑪ rear imaging system, and ⑫ charge-coupled device.

Figure 3. 3D layout of the illumination system.

Figure 4. Illumination simulation: (a) illumination on cornea, (b) illumination on fundus.

Figure 5. 3D layout of the imaging spectral system.

Figure 6. Modulation transfer function (MTF) curves at typical wavelengths of the fundus imaging spectrometer: (a) MTF curve at 800 nm, (b) MTF curve at 656 nm, and (c) MTF curve at 550 nm.

Figure 7. Partial enlargement of spot diagram (a)–(c) at the center and (d)–(f) edge fields of view, for different bands: (a) 800–788 nm, (b) 662–650 nm, (c) 562–550 nm, (d) 800–788 nm, (e) 662–650 nm, and (f) 562–550 nm.

Figure 8. Matrix spot diagram of the whole system.

Figure 9. 3D layout of the overall system.

Figure 10. Backscattered light received by detector.

Figure 11. Different orders of diffracted light.

TABLE 1 Structural parameters of the human-eye model

Variable                                  Radius (mm)   Thickness (mm)   Source
Retina                                    11            -                -
Vitreum                                   -             16.58            [17]
Posterior Surface of Crystalline Lens     6             3.7              [17]
Anterior Surface of Crystalline Lens      −10           -                -
Aqueous Humor                             -             1.5              -
Posterior Surface of Cornea               −6.7          0.52             [17]
Anterior Surface of Cornea                −7.8          -                -
Length of Visual Axis                     24 mm
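If one wants to transcribe Table 1 into an optical-design or ray-tracing script, the values map directly onto a surface list. The sketch below is one possible encoding; how the flattened cells split into radius and thickness columns reflects our reading of the table, and entries given as "-" are carried as None.

```python
# A minimal, illustrative encoding of the Table 1 eye-model surfaces,
# listed in the reversed (retina-to-cornea) order used in the table.
# None marks cells given as "-" in the table; the radius/thickness split
# is our reading of the flattened table, not a value from the paper.
eye_model = [
    # (surface,                               radius_mm, thickness_mm)
    ("Retina",                                 11.0,      None),
    ("Vitreum",                                None,      16.58),
    ("Posterior surface of crystalline lens",  6.0,       3.7),
    ("Anterior surface of crystalline lens",   -10.0,     None),
    ("Aqueous humor",                          None,      1.5),
    ("Posterior surface of cornea",            -6.7,      0.52),
    ("Anterior surface of cornea",             -7.8,      None),
]

# Sum of the thicknesses actually listed in the table.
listed = sum(t for _, _, t in eye_model if t is not None)
print(f"sum of listed thicknesses: {listed:.2f} mm (visual-axis length: 24 mm)")
# Cells marked "-" leave some gaps unspecified, so the listed thicknesses
# alone need not add up to the 24 mm visual-axis length.
```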

TABLE 2 Requirements of the fundus-imaging spectral system (FISS)

System Parameter                        Value
Spectral Range                          550–800 nm
Field Angle                             ±30°
Minimum Pupil Diameter                  3 mm
Spectral Resolution                     <10 nm
Target Surface Size                     1/1.2 inch
Pixel Size                              5.86 μm × 5.86 μm
Modulation Transfer Function (MTF)      >0.2 @ 100 lp/mm
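As a quick consistency check on Table 2, the sketch below estimates how many spectral channels are needed to span the 550–800 nm band at the required (<10 nm) and achieved (6 nm, see the Conclusion) resolutions. The channel counts are derived here for illustration only and are not design values quoted from the paper.

```python
import math

# Illustrative check: number of contiguous spectral channels of a given
# width needed to span the 550-800 nm band of Table 2.
BAND_NM = (550.0, 800.0)

def min_channels(resolution_nm: float) -> int:
    """Smallest number of channels of width resolution_nm covering the band."""
    return math.ceil((BAND_NM[1] - BAND_NM[0]) / resolution_nm)

print("channels at 10 nm resolution:", min_channels(10.0))  # 25
print("channels at  6 nm resolution:", min_channels(6.0))   # 42
```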

References

1. K. Zhao and P. Yang, Ophthalmology, 8th ed. (People's Medical Publishing House, Beijing, China, 2010), pp. 319-323.
2. H. Zhang and N. Liu, Atlas of Ocular Fundus Diseases (China Medical Publishing House, Beijing, China, 2007), pp. 3-8.
3. G. Huang, X. Qi, T. Y. P. Chui, Z. Zhong, and S. A. Burns, "A clinical planning module for adaptive optics SLO imaging," Optom. Vis. Sci. 89, 593-601 (2012).
4. X. Wang and Q. Xue, "Optical design of portable nonmydriatic fundus camera with large field of view," Acta Opt. Sin. 37, 0922001 (2017).
5. N. Patton, T. M. Aslam, T. MacGillivray, I. J. Deary, B. Dhillon, R. H. Eikelboom, K. Yogesan, and I. J. Constable, "Retinal image analysis: concepts, applications and potential," Prog. Retin. Eye Res. 25, 99-127 (2006).
6. H. Lu and H. Li, The Principle and Clinical Application for Ocular Optical Coherence Tomography (World Publishing Corporation, Xi'an, China, 2013), pp. 56-60.
7. R. Li, Fundus Fluorescein Angiography and Optical Imaging (People's Medical Publishing House, Beijing, China, 2010), pp. 21-23.
8. S. S. Hayreh, "In vivo choroidal circulation and its watershed zones," Eye 4, 273-289 (1990).
9. D. L. Nickla and J. Wallman, "The multifunctional choroid," Prog. Retin. Eye Res. 29, 144-168 (2010).
10. H. Takehara, H. Sumi, Z. Wang, T. Kondo, M. Haruta, K. Sasagawa, and J. Ohta, "Multispectral near-infrared imaging technologies for nonmydriatic fundus camera," in Proc. IEEE Biomedical Circuits and Systems Conference (BioCAS) (Nara, Japan, Oct. 17-19, 2019), pp. 1-4.
11. S. Li, L. Huang, Y. Bai, Y. Cheng, J. Tian, S. Wang, Y. Sun, K. Wang, F. Wang, and Q. Zhang, "In vivo study of retinal transmission function in different sections of the choroidal structure using multispectral imaging," Investig. Ophthalmol. Vis. Sci. 56, 3731-3742 (2015).
12. T. Alterini, F. Díaz-Doutón, F. J. Burgos-Fernández, L. González, C. Mateo, and M. Vilaseca, "Fast visible and extended near-infrared multispectral fundus camera," J. Biomed. Opt. 24, 096007 (2019).
13. D. Toslak, T. Son, M. K. Erol, H. Kim, T.-H. Kim, R. V. P. Chan, and X. Yao, "Portable ultra-widefield fundus camera for multispectral imaging of the retina and choroid," Biomed. Opt. Express 11, 6281-6292 (2020).
14. Z. Huang, Z. Jiang, Y. Hu, D. Zou, Y. Lu, Q. Ren, G. Liu, and Y. Lu, "Retinal choroidal vessel imaging based on multi-wavelength fundus imaging with the guidance of optical coherence tomography," Biomed. Opt. Express 11, 5212-5224 (2020).
15. E. R. de Carvalho, R. J. M. Hoveling, C. J. F. van Noorden, R. O. Schlingemann, and M. C. G. Aalders, "Functional imaging of the ocular fundus using an 8-band retinal multispectral imaging system," Instruments 4, 12 (2020).
16. J. Polans, B. Jaeken, R. P. McNabb, P. Artal, and J. A. Izatt, "Wide-field optical model of the human eye with asymmetrically tilted and decentered lens that reproduces measured ocular aberrations," Optica 2, 124-134 (2015).
17. H.-L. Liou and N. A. Brennan, "Anatomically accurate, finite model eye for optical modeling," J. Opt. Soc. Am. A 14, 1684-1695 (1997).
18. W. Chen, J. Chang, X. Zhao, and S. Liu, "Optical design and fabrication of a smartphone fundus camera," Appl. Opt. 60, 1420-1427 (2021).
19. Ophthalmic instruments — Fundus cameras, ISO 10940:2009, Technical Committee ISO/TC 172, Ophthalmic Optics and Instruments (2009).
20. Teledyne FLIR, "GS3-U3-23S6C-C FLIR Grasshopper®3 High Performance USB 3.0 Color Camera," https://www.flir.com/products/grasshopper3-usb3/?model=GS3-U3-23S6C-C (Accessed: Feb. 1, 2022).
21. F. C. Delori and K. P. Pflibsen, "Spectral reflectance of the human ocular fundus," Appl. Opt. 28, 1061-1077 (1989).
22. M. Hammer and D. Schweitzer, "Quantitative reflection spectroscopy at the human ocular fundus," Phys. Med. Biol. 47, 179-191 (2002).
23. J. Schwiegerling, Field Guide to Visual and Ophthalmic Optics (SPIE Press, USA, 2004).
24. E. Dehoog and J. Schwiegerling, "Optimal parameters for retinal illumination and imaging in fundus cameras," Appl. Opt. 47, 6769-6777 (2008).
25. E. Dehoog and J. Schwiegerling, "Fundus camera systems: a comparative analysis," Appl. Opt. 48, 221-228 (2009).
26. J. Cui, Y. Tang, P. Han, M. Pan, and J. Zhang, "Development of diagnostic imaging spectrometer for tumor on-line operation," Opt. Precis. Eng. 21, 3043-3049 (2013).
27. J.-N. Liu, J.-C. Cui, L. Yin, C. Sun, J.-J. Chen, R. Zhang, and J.-L. Liu, "Analysis and design of pre-imaging system of integral field imaging spectrometer based on lenslet array," Spectrosc. Spect. Anal. 38, 3269 (2018).
28. Z. Zhang, J. Chang, H. Ren, K. Fan, and D. Li, "Snapshot imaging spectrometer based on a microlens array," Chin. Opt. Lett. 17, 011101 (2019).
29. D. W. Palmer, T. Coppin, K. Rana, D. G. Dansereau, M. Suheimat, M. Maynard, D. A. Atchison, J. Roberts, R. Crawford, and A. Jaiprakash, "Glare-free retinal imaging using a portable light field fundus camera," Biomed. Opt. Express 9, 3178-3192 (2018).