Current Optics and Photonics
Curr. Opt. Photon. 2023; 7(1): 54-64
Published online February 25, 2023 https://doi.org/10.3807/COPP.2023.7.1.54
Copyright © Optical Society of Korea.
Soobin Kim, Sehwan Na, Wonwoo Choi, Hwi Kim
Department of Electronics and Information Engineering, Korea University, Sejong-Campus, Sejong 30019, Korea
Corresponding author: *hwikim@korea.ac.kr, ORCID 0000-0002-4283-8982
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
This paper proposes dynamic viewing-zone switching for a binocular holographic three-dimensional display with low interpupil crosstalk and an extended eye-motion box. The optimal pixel geometry for reducing interpupil crosstalk is designed. It is shown that the eye-motion box can be extended by exploiting signal replication in the higher-order viewing zone. Design principles and numerical simulations for verification of the binocular holographic head-up display are presented.
Keywords: Eye-motion box, Head-up display, Holographic display
OCIS codes: (090.0090) Holography; (090.2820) Heads-up displays; (110.0110) Imaging systems
In recent years, binocular holographic three-dimensional (3D) display technology has been actively researched for head-up display (HUD) applications. A conventional HUD generates 2D images at a specific depth plane, regardless of where the driver focuses while recognizing a real object or scene, which causes augmented-reality (AR) depth mismatch and raises driving-safety concerns. A holographic HUD allows the driver to focus naturally on objects and information in 3D space on the road, and can present them at longer distances [1–3]. This 3D depth-cue generation contrasts with the 2D image generation of a conventional automobile HUD. In a binocular holographic 3D display, a single-panel spatial light modulator (SLM) can offer computer-generated hologram (CGH) content in which left and right stereoscopic 3D images are multiplexed [4–7]. In practice, a binocular CGH can be delivered by an SLM capable of modulating only the amplitude or phase of the light field; however, a complex SLM capable of modulating both amplitude and phase is considered the ultimate device [8–12]. A recent requirement is that the HUD be able to display augmented reality to viewers through the front windshield of the automobile [13, 14]. In this configuration practical aberration problems occur, and it is very difficult to design the optical system. The holographic HUD emerges as a convincing candidate for a next-generation HUD, because it can dynamically compensate for the highly aberrated optical field.
Eye-motion-box (EMB) extension is an essential factor in AR 3D displays. Many approaches have been studied to widen the static EMB for an SLM of finite pixel pitch, since the EMB size is strictly determined by the pixel pitch of the SLM. A recently released holographic 3D display prototype using an amplitude SLM with 99-μm pixel pitch and 532-nm wavelength offers a horizontal EMB of 11.3 mm and a field of view (FOV) of 5° × 3° for a viewer located 2 m ahead of the display [15]. Although a wide EMB covering the viewer's left and right eyes is required for binocular display, an SLM with finite pixel dimensions cannot generate such a wide single viewing zone, so extension of the EMB is a challenge. Previous research has investigated random filter structures, such as random-hole filters (or photonic sieves) and random-phase plates [16, 17]. Applying randomness to the optical wavefront can extend the EMB isotropically, at the cost of image resolution and signal-to-noise ratio. The resolution penalty is ascribed to the space-bandwidth-product (SBP) invariance of the generated light field: the SBP constraint forces resolution degradation even as the EMB is extended. To prevent this degradation of image quality, dynamic real-time eye tracking synchronized to the holographic display system has been actively researched, and the combination of eye tracking and binocular CGH with dynamic backlighting was recently demonstrated [18–20].
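As a rough check on these numbers, the width of a single diffraction order at the observation plane can be estimated from the grating relation Δx ≈ λz/p. The following minimal sketch is an illustrative back-of-envelope calculation, not taken from the paper; it reproduces the order of magnitude of the reported EMB:

```python
# Back-of-envelope check of the viewing-window width set by SLM pixel pitch.
# The single-order width at distance z follows Delta_x ~ lambda * z / p, a
# standard grating-equation estimate (an assumption; the prototype's exact
# geometry and EMB definition may differ slightly).
wavelength = 532e-9   # m (green)
pitch      = 99e-6    # m (SLM pixel pitch)
z          = 2.0      # m (observation distance)

vw_width = wavelength * z / pitch
print(f"VW width ~ {vw_width * 1e3:.1f} mm")
# ~10.7 mm, the same order as the 11.3-mm horizontal EMB reported in [15]
```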
Nonetheless, these eye-tracking-based techniques are still highly limited. Eye tracking locates the viewer's pupil so that the image information can be delivered to it; for the driver to always observe the zeroth-order region, tracking must be combined with beam steering over a wide dynamic range. In practice, although the observer's pupil can be tracked, the beam-steering scope is very limited, and consequently the EMB is limited by the hardware. Therefore, a method that reduces the load on the beam-steering system while maintaining a wide viewing region is necessary. We note that the static EMB is tightly restricted when the eye-tracking target is confined to the zeroth order; if the target region is expanded to the higher orders, the EMB can be extended beyond the constraints of the conventional technique.
The regions of view formed by the light source and the SLM are defined as follows: The EMB is the entire area over which the observer can move freely and still observe the hologram, within the entire field distribution formed by the optical system and CGH. The region of interest (ROI) in the EMB where the hologram can be observed with uniformity within a diffraction region is defined as the viewing zone. A viewing window (VW) is a region within a single diffraction region, generated from the pixels in the viewing zone, where color matching is performed. These regions are depicted in Fig. A1. In this paper we describe the design of a higher-order-zone-switching scheme for holographic HUD applications, a new way of exploiting the higher-order VWs to extend the EMB with an efficient, small dynamic working range of the eye-tracking system. In Section II the binocular holographic display design is described and the characteristics of its optical diffraction are analyzed. Section III addresses the interpupil-crosstalk problem in the binocular holographic display. In Section IV the idea of higher-order-viewing-zone switching and the associated optimal CGH design method are elucidated, and concluding remarks follow.
The binocular holographic HUD is schematically illustrated in Fig. 1(a). The right panel of Fig. 1(b) presents the full-color field distribution in the plane of the observer's eye lens (the viewing zone). The signals for the left and right eyes are separated in the eye-lens plane. The zeroth VW is taken as the viewing zone, in which the red-green-blue (RGB) components are matched to exclude RGB chromatic aberration in the observation of holographic images. In this simulation, the complex SLM is assumed to be installed in the picture-generation unit (PGU). The left and right pupils are positioned in the color-matched viewing zone. Elsewhere, the color-separated higher-order diffraction patterns are distributed, so severely degraded holographic 3D images are observed outside the zeroth VW. The objects consist of the cockpit, road guideline, and directional arrow, located respectively at
The panels below present the left and right holographic views that manifest the holographic accommodation effects. The CGH representing the left and right views simultaneously is synthesized by wave-optic multiplexing. Let us denote the left and right initial CGHs as $W_L(x, y)$ and $W_R(x, y)$; the multiplexed CGH is then

$$W(x, y) = W_L(x, y)\,e^{j\phi_L(x, y)} + W_R(x, y)\,e^{j\phi_R(x, y)},$$

where the prism profiles $\phi_L(x, y)$ and $\phi_R(x, y)$ are linear phases steering the left and right signals toward their respective viewing windows.
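As an illustration of this multiplexing step, the sketch below combines two placeholder CGHs with prism carriers aimed at the two eye positions. The grid size, pixel pitch, observation distance, and random CGHs are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

# Minimal sketch of wave-optic multiplexing of left/right CGHs with prism
# (linear-phase) carriers. All numbers here are illustrative assumptions.
ny, nx = 1080, 1920
pitch  = 8e-6                                  # m, assumed SLM pixel pitch
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

rng = np.random.default_rng(0)
W_L = np.exp(1j * 2 * np.pi * rng.random((ny, nx)))   # placeholder left-view CGH
W_R = np.exp(1j * 2 * np.pi * rng.random((ny, nx)))   # placeholder right-view CGH

wavelength = 532e-9
d   = 2.0                                      # m, observation distance
ipd = 65e-3                                    # m, interpupillary distance
# Linear phases steering each view toward x = -ipd/2 and x = +ipd/2 in the eye plane
phi_L = 2 * np.pi * (-ipd / 2) * X / (wavelength * d)
phi_R = 2 * np.pi * (+ipd / 2) * X / (wavelength * d)

W = W_L * np.exp(1j * phi_L) + W_R * np.exp(1j * phi_R)   # complex multiplexed CGH
```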
The extent of the diffraction region is determined by the wavelength of each color. When the RGB components are separated in the observer's eye-lens plane, the target image appears separated in the retina plane; therefore, matching the RGB signals at the observation point is important. In general, if the signal is compensated with a linear phase in the zeroth-order-diffraction region, a mismatch occurs in the higher-order regions, so RGB color matching is required for observation in the higher-order-diffraction region. Because the viewing zone is very narrow, effective directional beam steering is required; however, beam steering itself can become the factor that prevents eye-motion-box extension. In practice, the dynamic range of the conventional beam-steering technique is confined to a narrow directional range, which is unsuitable for the HUD application. Section IV introduces a solution to this issue.
In addition, the higher-order diffraction patterns induce interpupil crosstalk. Figures 1(b)–1(d) show wave-optic simulation results in the retina plane of the observer for the proposed system. Optical diffraction spreads the left image field toward the right eye's viewing zone, and vice versa; thus significant interpupil crosstalk is measured. As shown in Figs. 1(c) and 1(d), the left-view (right-view) image is perceived by the right (left) eye, due to the higher-order diffraction. To prevent noise and crosstalk, it is necessary to reduce the higher-order-diffraction components. This higher-order diffraction pattern is determined by the shape of the pixel, an issue addressed in Section III.
In a holographic display using a complex SLM with a color filter, interpupil crosstalk can deteriorate the image quality, as seen in Fig. 1(b). The interpupil crosstalk can be reduced by designing an optimal pixel shape, because the far-field diffraction pattern in the eye-lens plane varies with the pixel geometry.
Figure 2 compares the far-field diffraction patterns of the left-eye CGH in the viewing-zone plane. In Figs. 2(a)–2(c), interpupil crosstalk of the binocular HUD AR image is perceived. The typical color-stripe panels shown in Figs. 2(a) and 2(b) generate a broadly spreading cross-shaped diffraction field in the viewing-zone plane, so the left CGH signal penetrates the VW for the right view, leading to significant interpupil crosstalk. This overlapped field distribution is recognizable in Figs. 2(a) and 2(b). Meanwhile, the slant angle of the super-IPS (S-IPS) pixel generates an X-shaped diffraction pattern, so the high-intensity X-shaped diffraction pattern of the left-view CGH signal does not overlap the right viewing zone exactly. The investigation of the zigzag-shaped S-IPS is presented in Figs. 2(c) and 2(d): in the right viewing zone, the S-IPS pixel of Fig. 2(d) shows negligible interpupil crosstalk, while for the S-IPS pixel of Fig. 2(c) the interpupil crosstalk still exists.
Table 1 compares the energy at the left VW for the structures of Fig. 2. The ROI is one VW of the left eye; for each pixel structure, the table lists the energy when only the left-eye CGH is displayed, the crosstalk energy, and the total energy when the CGHs for both eyes are displayed.
Table 1. Energy (in arbitrary units) of the ROI (red square area) in Fig. 2 according to each pixel structure

| Pixel Shape | Left-eye CGH Only | Crosstalk | Both-eye CGHs |
| Vertical Stripe | 4668.43 | 677.02 | 5522.58 |
| Horizontal Stripe | 4678.52 | 22.04 | 4717.83 |
| Vertical Slanted | 912.30 | 192.78 | 1146.15 |
| Horizontal Slanted | 914.44 | 1.16 | 916.10 |

ROI, region of interest.
The size of the aperture in the pixel structure is related to the energy of the propagated field: stripe-type pixels have a wider aperture than S-IPS-type pixels and therefore transmit more energy. The direction of the diffraction pattern changes with the shape of the pixel; diffraction is strong along the direction of the shorter aperture dimension. Therefore, vertically arranged pixels, which diffract strongly in the horizontal direction, cause strong interpupil crosstalk. Pixels arranged in the horizontal direction yield strong diffraction in the vertical direction, and can improve the uniformity of adjacent higher-order VWs. Horizontally arranged pixels show lower crosstalk energy, though the amount varies with pixel shape: at the retina plane in Fig. 2(b) the crosstalk appears low, but its energy is clearly present, whereas for the horizontal-slanted pixel structure of Fig. 2(d) the crosstalk energy converges to almost zero. Therefore, the pixel structure suitable for the proposed system is the horizontal-slanted structure of Fig. 2(d).
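The dependence of the far-field pattern on aperture shape can be reproduced with a simple Fraunhofer (FFT) model. The sketch below compares a tall vertical-stripe aperture with a horizontally arranged slanted band, and sums the energy in a horizontally offset ROI standing in for the opposite eye's VW; the aperture shapes, sampling, and ROI placement are illustrative assumptions, not the paper's S-IPS geometry:

```python
import numpy as np

# Sketch: far-field diffraction of two pixel-aperture shapes via 2D FFT.
N = 512
ap_stripe  = np.zeros((N, N))
ap_slanted = np.zeros((N, N))

# Tall, narrow vertical stripe (diffracts strongly in the horizontal direction)
ap_stripe[N//2 - 150:N//2 + 150, N//2 - 40:N//2 + 40] = 1.0

# Horizontal slanted band: a thin band whose vertical center shifts with x,
# loosely mimicking the slant of an S-IPS electrode.
for i in range(N//2 - 150, N//2 + 150):
    yc = N//2 + int(0.3 * (i - N//2))
    ap_slanted[yc - 10:yc + 10, i] = 1.0

def far_field(aperture):
    """Fraunhofer intensity pattern of the aperture, DC-centered."""
    F = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
    return np.abs(F) ** 2

I_stripe, I_slant = far_field(ap_stripe), far_field(ap_slanted)

# Crosstalk proxy: energy in an ROI offset horizontally from the optical axis,
# standing in for the opposite eye's viewing window.
roi = (slice(N//2 - 20, N//2 + 20), slice(N//2 + 100, N//2 + 140))
print("stripe ROI energy :", I_stripe[roi].sum())
print("slanted ROI energy:", I_slant[roi].sum())
```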
The binocular CGH display can be implemented in various forms. One design approach is a binocular holographic display with an interleaved bidirectional prism array. Figure 3(a) illustrates the schematic of the holographic 3D display with a complex SLM, a bidirectional linear prism (BLP), and a field lens. The first linear prism deflects the even columns of the display toward the left eye, while the second deflects the odd columns toward the right eye; the BLP thus forms the binocular EMB. The viewer sees the holographic 3D images through the left and right VWs. The zeroth-order VWs are constructed in the eye plane, with the distance between the left and right zeroth VWs set to 65 mm, and the higher-order-diffraction components appear around the zeroth VWs.
The VW size in a tablet-type holographic 3D display is determined by the pixel size of the SLM and the observation distance. The CGH for each eye is independently calculated and interleaved, column by column, by the BLP. In calculating each CGH, the pixel pitch in the vertical direction is maintained, while the pixel pitch in the horizontal direction is twice the original pitch, and the left and right CGHs are interleaved horizontally. We assume that an achromatic doublet lens is used with the BLP, and that the position of the eye's pupil can be measured through robust control of the eye-tracking system.
As mentioned above, hologram observation in the higher-order-diffraction zone is generally avoided. However, this study focuses on a method to expand the viewing zone by using not only the zeroth-order but also the positive and negative first-order VWs. Each directional illumination system is independently modeled as a prism for the left or right eye, and is interleaved with an interval of one pixel. The width of each BLP prism is equal to the pixel pitch in the horizontal direction, and the height is equal to the vertical length of the SLM. The BLP is represented as

$$T_{\mathrm{BLP}}(x, y) = \begin{cases} e^{j\phi_L(x, y)} & \text{for even pixel columns,} \\ e^{j\phi_R(x, y)} & \text{for odd pixel columns,} \end{cases}$$

where $\phi_L(x, y)$ and $\phi_R(x, y)$ are the linear phase profiles of the prisms directing light toward the left and right eyes, respectively.
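A minimal sketch of this column interleaving, assuming the per-eye CGHs are computed at twice the horizontal pitch and woven into even and odd columns; array sizes and names are illustrative:

```python
import numpy as np

# Sketch of BLP column interleaving: left/right CGHs are computed at twice the
# horizontal pixel pitch and woven together column by column.
ny, nx_half = 1080, 960                   # each per-eye CGH has half the columns
rng = np.random.default_rng(1)
cgh_L = np.exp(1j * 2 * np.pi * rng.random((ny, nx_half)))   # placeholder left CGH
cgh_R = np.exp(1j * 2 * np.pi * rng.random((ny, nx_half)))   # placeholder right CGH

cgh = np.empty((ny, 2 * nx_half), dtype=complex)
cgh[:, 0::2] = cgh_L                      # even columns -> left-eye prism facets
cgh[:, 1::2] = cgh_R                      # odd columns  -> right-eye prism facets
```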
The conventional viewing zone refers only to the zeroth-order VW. In the HUD application, an additional beam-steering system dynamically aligns the left and right zeroth-order VWs with the viewer's eye pupils. As the range of beam steering increases, the aberration caused by the windshield becomes more significant and the system cost rises; the difficulty lies in widening the beam-steering dynamic range from the PGU to the viewer's eye through the windshield. This problem can be solved through the proposed BLP. In Fig. 3(b), the main zeroth VW is replicated in the 3 × 3 higher-order zones, and the viewer sees the same holographic image through a higher-order zone as through the zeroth-order VW. Thus, if the eye-tracking system can sense which VW contains the viewer's eye, the beam steering can align the corresponding higher-order zone to the viewer's eye without extending the physical beam-steering range. The effective viewing zone is therefore extended to the 3 × 3 array of higher-order VWs. In the left panel of Fig. 3(d), the conventional beam-steering range is denoted by the blue dotted rectangle; the additional function of the eye-tracking system is to check which higher-order VW includes the viewer's eye.
Then the beam-steering system aligns the beam signal with the corresponding zeroth VW, which allows the viewer to see the same holographic image through the higher-order VW, as illustrated in Fig. 4. In the proposed HUD system, the eye tracking scans a wide range, while the scale of the system is reduced by limiting the beam-steering range to the zeroth order; we refer to this as the higher-order-zone-switching technique. When the eye's position is identified by the eye-tracking system, the CGH can be corrected in real time by applying an appropriate linear phase profile, so that the signal reaches the viewer's position. In the zeroth VW, the carrier wave for CGH compensation when the eye position is $(x_e, y_e)$ is the linear phase

$$C_0(x, y) = \exp\!\left[j\,\frac{2\pi}{\lambda d}\,(x_e x + y_e y)\right],$$

where $d$ is the observation distance.
In the dynamic zone-switching scheme, the carrier waves are defined to align the RGB CGH signals in the target VW. Because we utilize field replication in the higher-order-diffraction zone, the carrier wave must shift the corresponding CGH signal in the zeroth VW as well as in the higher-order target VW. Therefore, when the eye's pupil is positioned at $(x_e + mW_x,\, y_e + nW_y)$ in the $(m, n)$th-order VW, where $W_x$ and $W_y$ are the horizontal and vertical VW periods, the carrier wave is set to shift the zeroth-order signal to $(x_e, y_e)$; the replication then delivers the same signal to the target higher-order VW.
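The following sketch illustrates such a zone-switching carrier under the assumptions just stated. The wrapping of the pupil position to its in-window offset, the VW periods, and all names are illustrative, not the authors' exact formulation:

```python
import numpy as np

def switching_carrier(X, Y, pupil_xy, vw_period, wavelength, d):
    """Linear-phase carrier that steers the CGH signal to the viewer's pupil.

    Because the zeroth-VW field is replicated at the VW period in the higher
    orders, only the residual offset *within* the VW containing the pupil must
    be compensated; the order index itself is handled by the replication.
    (A sketch under stated assumptions, not a definitive implementation.)
    """
    px, py = pupil_xy
    Wx, Wy = vw_period
    # Offset of the pupil inside its own (possibly higher-order) VW,
    # wrapped into [-W/2, W/2)
    dx = (px + Wx / 2) % Wx - Wx / 2
    dy = (py + Wy / 2) % Wy - Wy / 2
    return np.exp(1j * 2 * np.pi * (dx * X + dy * Y) / (wavelength * d))

# Hypothetical usage, reusing the SLM grid X, Y from the earlier sketch and
# taking the 13.3-mm and 23.7-mm steps of Fig. 4 as assumed VW periods:
# cgh_corrected = cgh * switching_carrier(X, Y, (0.02, 0.03),
#                                         (13.3e-3, 23.7e-3), 532e-9, 2.0)
```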
In Fig. 4, the dynamic viewing-zone switching is numerically simulated. In the simulation the viewer moves into the higher-order-diffraction region in steps of 13.3 mm and 23.7 mm along the vertical direction, yet still sees the CGH as in the zeroth VW. Once the eye-tracking system identifies the position of the viewer's eye, the corresponding carrier wave is applied to the CGH in real time, switching the viewing zone to the VW that contains the pupil.
Color matching is enabled in the higher-order-diffraction region by applying a linear-phase carrier wave for each RGB component to the CGH. The set of carrier waves for the RGB wavelengths steers the three color components to the same target position in the eye-lens plane; because the diffraction-order spacing scales with wavelength, each color requires its own carrier for its replica to land on the common observation point.
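A hedged sketch of such per-color carriers follows; the wavelengths, geometry, and names are illustrative assumptions:

```python
import numpy as np

# Sketch of RGB color matching in a higher-order VW: each wavelength gets its
# own linear-phase carrier so that all three color signals overlap at the same
# target pupil position (assumed wavelengths below).
wavelengths = {"R": 638e-9, "G": 532e-9, "B": 450e-9}   # m, illustrative

def rgb_carriers(X, Y, target_xy, d, wavelengths):
    """One linear-phase carrier per color, all aimed at the same eye position."""
    tx, ty = target_xy
    return {c: np.exp(1j * 2 * np.pi * (tx * X + ty * Y) / (lam * d))
            for c, lam in wavelengths.items()}
```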
In practice, the viewer's eye may be positioned at the boundary of an RGB VW. If the eye tracking is sufficiently accurate, the CGH can still be corrected, through adjustment of the RGB carrier waves, so that the full-color hologram is observed. The EMB extension should be accompanied by true three-dimensional image-information updating as the viewer's position varies over the extended VW, not merely replicated two-dimensional image information; this is in contrast to the conventional exit-pupil-replication-based AR display system.
In conclusion, we have investigated the control of the higher-order diffraction pattern for interpupil-crosstalk damping and the EMB extension for a full-color holographic AR-HUD system. The color-matching method in the extended VW through RGB phase compensation was devised in conjunction with the eye-tracking technique, and its feasibility was validated by numerical simulation. The proposed concepts may be useful in developing not only a practical holographic HUD system, but also binocular holographic displays in general.
In general, the terms eye-motion box, viewing zone, and viewing window all refer to "regions in which the hologram can be observed in the holographic display." In this paper, each term is defined specifically to discriminate among the regions of the observer's eye-lens plane, formed by the proposed system, in which the hologram can be seen.
The authors declare no conflicts of interest.
Data underlying the results presented in this paper are not publicly available at this time, but may be obtained from the authors upon reasonable request.
Samsung Research Funding & Incubation Center of Samsung Electronics (SRFC-TB1903-05).