
Research Paper

Curr. Opt. Photon. 2023; 7(4): 387-397

Published online August 25, 2023 https://doi.org/10.3807/COPP.2023.7.4.387

Copyright © Optical Society of Korea.

Adaptive Extraction Method for Phase Foreground Region in Laser Interferometry of Gear

Xian Wang, Yichao Zhao, Chaoyang Ju, Chaoyong Zhang

School of Mechanical and Precision Instrument Engineering, Xi’an University of Technology, Xi’an 710048, China

Corresponding author: wangxian@xaut.edu.cn, ORCID 0000-0002-1187-3486

Received: May 18, 2023; Revised: June 17, 2023; Accepted: July 5, 2023

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

Tooth surface shape error is an important parameter in gear accuracy evaluation. When tooth surface shape error is measured by laser interferometry, the gear interferogram is highly distorted and the gray level distribution is not uniform. Therefore, it is important for gear interferometry to extract the foreground region from the gear interference fringe image directly and accurately. This paper presents an approach for foreground extraction in gear interference images by leveraging the sinusoidal variation characteristics shown by the interference fringes. A gray level mask with an adaptive threshold is established to capture the relevant features, while a local variance evaluation function is employed to analyze the fluctuation state of the interference image and derive a repair mask. By combining these masks, the foreground region is directly extracted. Comparative evaluations using qualitative and quantitative assessment methods are performed to compare the proposed algorithm with both reference results and traditional approaches. The experimental findings reveal a remarkable degree of matching between the algorithm and the reference results. As a result, this method shows great potential for widespread application in the foreground extraction of gear interference images.

Keywords: Adaptive threshold, Error of gear tooth flank, Foreground region extraction, Phase-shifting laser interferometry

OCIS codes: (100.0100) Image processing; (100.3175) Interferometric imaging; (100.5088) Phase unwrapping

The shape error of gear tooth surfaces plays a crucial role in evaluating gear accuracy, as it has a significant impact on transmission efficiency, noise levels, and service life of transmission systems [1]. Therefore, the precise measurement of tooth surface shape error is of paramount importance for the manufacturing of high-precision gears. Currently, the contact measurement method based on coordinates is the mainstream approach due to its simplicity and low cost. However, this point scanning method suffers from low efficiency, limited accuracy, and potential damage to the tooth surface of high-precision gears, and fails to meet the increasing demand for efficiency and accuracy in precision gear detection [2]. Consequently, non-contact optical measurement methods have emerged as the primary research direction in the field of precision gear detection, offering advantages such as high precision, efficiency, and rich measurement data [3]. Among these methods, laser phase-shifting interferometry stands out as a classic non-contact technique [4]. During interferometric measurement, the phase information reflects the topography error of the tooth surface, and thus, the accuracy of phase unwrapping directly affects the measurement precision of tooth surface shape error [5]. However, the presence of non-measured regions in the acquired tooth surface interference image not only hampers phase unwrapping efficiency but also introduces errors from the background region into the essential phase information within the foreground area due to integration effects [6]. Consequently, accurate extraction of the foreground area from the gear interference image prior to phase unwrapping is crucial because it directly influences the accuracy of the final gear shape measurement, thereby assuming great significance in the overall gear shape error assessment.

Gear interference fringe images present unique challenges, including high distortion of fringes, phase deviations caused by noise, and uneven grayscale distribution. Conventional image processing methods, such as the relative modulation algorithm (RMA) [7], fringe contrast method [8], and level set method [9], are ineffective in handling these images. Therefore, it is imperative to explore a suitable foreground area extraction method specifically tailored for gear phase-shift interferometry systems. Aiming et al. [10] proposed the threshold segmentation region method, which leverages the sinusoidal variation of grayscale values in phase-shifted interferometry and analyzes the characteristics of interference images. Pengcheng et al. [11] proposed a spin segmentation method for tooth surface domain extraction to address the challenge of spurious fringe misidentification in gear tooth surface image extraction. Additionally, Wang et al. [12] presented the Gear’s objection gray scale algorithm (GSA) and interference common filtering algorithm (ICFA), along with the approximate average grayscale method and other rapid identification methods. These approaches have significantly advanced the field of gear interferometry. However, they rely on gear non-interference image masks to indirectly extract foreground information from gear interference images [13]. Consequently, the direct extraction of gear interference images remains unattainable, and the segmentation threshold needs manual adjustment. Moreover, the introduction of additional measurement steps may compromise the consistency of extraction results. Furthermore, these methods have limitations regarding the measurement optical path, making them applicable only to non-common optical path measurement systems. 
In order to address the above challenges, this study presents a method that integrates a threshold adaptive matching mechanism based on the variation characteristics of gear phase-shift interference fringes and the fluctuation properties of local phase information. A local variance evaluation model is proposed to establish an automatic extraction method specifically designed for the foreground region of gear interference images. This method effectively enhances the efficiency of foreground region extraction and ensures the accuracy of the extraction process. Accurate extraction of the foreground area lays a solid foundation for proper implementation of the subsequent phase unwrapping process.

The gear measurement system using laser phase-shifting interferometry operates on the following principle, as illustrated in Fig. 1. The linearly polarized light emitted from the He-Ne laser is split by a polarization beam-splitting prism into two beams: measurement light and reference light. The measurement light is directed onto the surface of the gear under test by the front optical wedge; at this stage, it carries the morphological information of the target gear. The optical path is then adjusted by the rear optical wedge, which directs the measurement light toward the semi-reflective mirror, where the reference light and the measurement light intersect and interfere. The resulting gear interference image is captured by a charge-coupled device (CCD) camera, as depicted in Fig. 2. Within the measurement optical path, the micro-displacement of the mirror is controlled by a piezoelectric transducer (PZT) to realize a four-step phase shift with a step size of π/2.

Figure 1. Principle of laser interferometry for gear measurement.

Figure 2. Interference fringe image of gears.
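The four-step acquisition described above can be sketched numerically. The snippet below is a minimal illustration assuming the standard intensity model I_i = a + b·cos(φ + iπ/2); the function name and the amplitude values are illustrative assumptions, not the system's actual parameters.

```python
import numpy as np

def phase_shifted_frames(phi, a=128.0, b=100.0, steps=4):
    """Generate `steps` interferograms separated by a pi/2 phase shift.

    Assumed intensity model: I_i = a + b*cos(phi + i*pi/2).
    """
    return [a + b * np.cos(phi + i * np.pi / 2) for i in range(steps)]

# Example: a tilted wavefront (linear phase ramp) produces straight fringes.
y, x = np.mgrid[0:200, 0:200]
phi = 0.05 * x                      # hypothetical phase map
frames = phase_shifted_frames(phi)  # shifts of 0, pi/2, pi, 3*pi/2
```

Subtracting the 0 and π frames (or the π/2 and 3π/2 frames) cancels the background term a and doubles the modulation, which is the sinusoidal-variation property exploited in the next section.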

In gear interferometry, the phase shift is applied with a consistent step size, producing a discernible periodic pattern in the grayscale of the acquired phase-shifted interference fringes. Figure 3 illustrates a simulated image of phase-shifted interference fringes under ideal conditions. It is evident that the discrepancy in grayscale values of pixels at corresponding positions within the interference fringes reaches its peak between phase shifts of 0 and π, and between π/2 and 3π/2, showcasing a distinct sinusoidal variation characteristic. The method presented in this paper remains effective for schemes with other phase-shift steps, although the distinguishing effect is less pronounced than in four-step phase-shifting images.

Figure 3. Simulation image and phase characteristics of interference fringe without noise.

Figure 4 depicts simulated phase-shifting interference fringes corrupted by various types of noise, including Gaussian noise and salt-and-pepper noise [with a peak signal-to-noise ratio (PSNR) of 13.3035], as well as typical artifacts introduced by imaging equipment and common optical path phase errors. Despite these noise sources, the interference fringes still exhibit noticeable sinusoidal variation characteristics. This resilience to noise ensures the robustness and stability of the algorithm's logic under real-world conditions.

Figure 4. Simulation image and phase characteristics of interference fringe with noise.

To analyze the grayscale variations, pixels in row 150 of each group of gear interference fringe images presented in Fig. 2 were extracted. Upon observation, it was found that in the measured gear interference image, the largest discrepancy in gray values between pixels with identical coordinates in the foreground area occurs at phase shifts of 0 and π, respectively, as shown in Fig. 5. This behavior aligns with the sinusoidal variation characteristics exhibited by interference fringes. Conversely, the gray values of spurious foreground pixels in the background area demonstrate minimal changes during phase shifting, as shown in Fig. 6. Exploiting this discrepancy, the distinctive features of pixels at different phases can be used to eliminate erroneous foreground information, establish connectivity domains, and enhance the contrast between the foreground and background regions.

Figure 5. Characteristics of wave surface variation in foreground region.

Figure 6. Characteristics of wave surface variation in background region.

The algorithmic processing flow, illustrated in Fig. 7, is divided into two stages: construction of the grayscale mask M1 and of the repair mask M2. The target image undergoes preprocessing, differential operation, gray-set allocation, threshold extraction, neighborhood local variance analysis, and other procedures to achieve direct extraction of the foreground area from the final gear interference image.

Figure 7. Foreground extraction flow chart.

First, the captured image undergoes median-filter preprocessing to suppress noise while preserving edge information and avoiding blurring of image details. The interference differential image under different phase shifts is then calculated using the following equations:

Gray_i(x, y) = |φ_{i+1}(x, y) sin(iπ/2) + φ_{i+3}(x, y) sin((i+2)π/2)|,   i = 1    (1)

Gray_i(x, y) = |φ_i(x, y) sin((i−1)π/2) + φ_{i+2}(x, y) sin((i+1)π/2)|,   i ≠ 1    (2)

where Gray_i(x, y) represents the grayscale value of the target pixel in the ith processed (differential) image, and φ_i(x, y) denotes the grayscale value of the target pixel in the ith phase-shifting interference image.
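Under a π/2 step, Eqs. (1) and (2) reduce to pairwise frame differences, since the sine factors evaluate to ±1 or 0. The sketch below is one reading of the differential step (the exact pairing of frames is our interpretation of the equations, not the authors' published code; median filtering is assumed to have been applied beforehand).

```python
import numpy as np

def differential_image(frames):
    """Differential image from four pi/2-shifted, prefiltered frames.

    Uses the pairings with maximal grayscale discrepancy: frames at
    shifts 0 and pi, and at pi/2 and 3*pi/2.
    """
    f = [np.asarray(fr, dtype=float) for fr in frames]
    d1 = np.abs(f[0] - f[2])   # shifts 0 and pi
    d2 = np.abs(f[1] - f[3])   # shifts pi/2 and 3*pi/2
    return np.maximum(d1, d2)  # keep the strongest modulation per pixel
```

In the background region all four frames are nearly identical, so the differential image is close to zero there, which is what makes the subsequent threshold search meaningful.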

At this stage, the processed image is obtained for further extraction of the foreground area. A threshold value is then determined to extract the foreground-area information from the gear interference image. As depicted in Fig. 8, interval positioning requires reassigning grayscale values to the image pixels. The lowest 5% of pixel grayscale values are assigned to the interval 0 to 0.1, while the highest 5% are assigned to the interval 0.9 to 1. The maximum gray value among the pixels within each interval is determined, and the remaining pixel grayscale values are redistributed over the 0-to-1 range and quantified into 10 smaller intervals [14]. This process filters out pixels with excessively large or small grayscale values, preventing errors that could affect subsequent threshold selection.

Figure 8. Gray value allocation strategy.

A distribution histogram is generated from the maximum gray value within each interval. A differential process is then applied to the maximum gray values of adjacent intervals. Finally, a left-to-right search identifies the first interval in the histogram that shows significant growth; the criterion for significant change is that the amount of change be at least 0.5% of the maximum gray value in the image. In Fig. 8(b), the red area marks the first interval that meets this criterion after reallocation. Equation (3) is then used to calculate the threshold value, denoted as E(X), from the gray values of the adjacent intervals.

E(X) = (Gray_L + Gray_R) / 2    (3)

The threshold E(X) is the mean of the maximum gray values of the adjacent left interval (Gray_L) and right interval (Gray_R) after reassignment. This threshold value is used to obtain the grayscale mask M1.
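The interval reallocation and threshold search can be sketched as follows. The 5% tail clipping, the ten intervals, the 0.5% growth criterion, and the Eq. (3) averaging follow the text; the quantile-based implementation of the clipping and all helper names are our assumptions.

```python
import numpy as np

def adaptive_threshold(gray, n_bins=10, clip=0.05, growth=0.005):
    """Adaptive threshold search over reallocated grayscale intervals."""
    g = np.asarray(gray, dtype=float).ravel()
    # Clip the lowest/highest 5% of gray values into the end intervals.
    lo, hi = np.quantile(g, [clip, 1.0 - clip])
    norm = (np.clip(g, lo, hi) - lo) / max(hi - lo, 1e-12)  # redistribute to [0, 1]
    idx_bin = np.minimum((norm * n_bins).astype(int), n_bins - 1)
    # Maximum original gray value within each of the 10 intervals.
    maxima = np.array([np.max(g[idx_bin == k], initial=0.0)
                       for k in range(n_bins)])
    diffs = np.diff(maxima)
    # First interval whose growth is at least 0.5% of the image maximum.
    first = int(np.argmax(diffs >= growth * g.max()))
    # Eq. (3): mean of the adjacent left/right interval maxima.
    return 0.5 * (maxima[first] + maxima[first + 1])
```

On a strongly bimodal differential image, the returned value lands between the background and foreground gray levels, which is the behavior the grayscale mask M1 relies on.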

M1 reveals the presence of defect areas and redundant connectivity domains in the gray mask of the gear tooth surface interference fringes that do not conform to the expected pattern. As a result, direct extraction of the foreground area using the gray mask alone is not possible. To address this issue, a repair mask, M2, is established to mitigate image defects, promote connectivity, and enhance the stability and accuracy of image segmentation. The aim is to achieve the final extraction of the foreground area by incorporating the repair mask with M1.

To construct the repair mask M2, the phase information and fluctuation state of the differentially processed image pixels are analyzed. For this purpose, a neighborhood window of size m × n is established; the optimal detection result was verified to occur with a 3 × 3 window. Let φ(x, y) be the grayscale value of the target pixel, and let φ_i(x, y), i = 1, 2, ..., 8, denote the grayscale values of its eight neighboring pixels.

In the differential image, pixels belonging to the foreground area show significant fluctuations, indicating variations in the underlying features, whereas pixels in the background area remain stable and change minimally with the phase shift. This disparity in fluctuation patterns distinguishes the foreground and background regions. After the neighborhood window is established, the local variance of the pixels within the m × n window is calculated as follows:

σ² = [(φ(x, y) − E(X))² + Σ_{i=1}^{m×n−1} (φ_i(x, y) − E(X))²] / (m × n)    (4)

In Eq. (4), φ_i(x, y) represents the grayscale value of the ith neighborhood pixel of point (x, y), and E(X) denotes the threshold.

When the local variance computed over the target pixel's neighborhood window falls within the interval σ² ∈ [0.8, 1.0] of the histogram before reassignment in Fig. 8, the pixel is determined to belong to an effective foreground region for repair. Conversely, if the local variance falls outside this interval, the pixel is classified as background. This thresholding process identifies and differentiates foreground and background regions based on their respective local variance values.

Gray(x, y) = 255,  0.8 ≤ σ²(x, y) ≤ 1.0
Gray(x, y) = 0,    σ²(x, y) < 0.8

After traversing the differential image, the repair mask M2 is obtained. These two masks are then applied successively to extract the foreground area in the differential images. A fundamental flowchart for the extraction of the tooth surface’s foreground area is illustrated in Fig. 9.

Figure 9. Process of foreground extraction. (a) Interferogram, (b) differential image, (c) gray mask, (d) repair mask, and (e) final mask.
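Combining the two masks can be sketched as below. Taking the union, so that M2 repairs defect holes in M1, is our assumption for "applied successively"; an intersection followed by morphological cleanup would be an equally plausible reading of the text.

```python
import numpy as np

def combine_masks(m1, m2):
    """Final foreground mask: union of gray mask M1 and repair mask M2.

    Union is an assumption: M2 is described as repairing defects and
    promoting connectivity in M1, so its foreground pixels are added.
    """
    return ((np.asarray(m1) > 0) | (np.asarray(m2) > 0)).astype(np.uint8) * 255
```

Any residual false foreground or background areas would still need the post-processing step the paper mentions before the final mask of Fig. 9(e) is obtained.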

In this section, to validate the effectiveness of the proposed algorithm, a gear laser phase-shift interferometry system was built, as depicted in Fig. 10(a). The system employs as its light source a He-Ne laser with a wavelength of 632.8 nm, a polarization degree of 500:1, and power stability within ±5%. The driving power supply is a PZT servo-controller E53.C with an output voltage range of 0 to 120 V. The gear under measurement is depicted in Fig. 10(b), and its parameters are presented in Table 1.

TABLE 1 Parameters of gear to be measured

Measured Gear | Modulus | Number of Teeth | Tooth Width (mm) | Helical Angle (°) | Pressure Angle (°)
Helical Gear  | 3       | 60              | 15               | 20                | 20
Spur Gear     | 2.5     | 50              | 20               | -                 | 20


Figure 10. Laser interferometry for gear measurement. (a) Measurement platform and (b) measured gear tooth flank.

Three sets of gear interference fringe images were acquired using the aforementioned phase-shifting interferometry system, as depicted in Fig. 11. Figures 11(a) and 11(b) represent helical gear interference fringe images, while Fig. 11(c) represents spur gear interference fringe images.

Figure 11. Gear interference image. (a) Group A, (b) group B, and (c) group C.

Initially, based on the grayscale distribution pattern of the gear tooth surface fringes, the foreground area of the interference fringes in the aforementioned images was manually segmented and extracted as the reference result for each set. The resulting segmented mask images are displayed in Fig. 12; Figures 12(a)–12(c) correspond to the masks of groups A, B, and C, respectively. These reference results are used for comparison and accuracy analysis in the subsequent algorithm extraction processes.

Figure 12. Reference mask. (a) Group A, (b) group B, and (c) group C.

For each group, differential processing was applied to the four-step phase-shift measurement, resulting in differentially extracted images that contained information regarding the phase variation of the tooth surface, as shown in Fig. 13.

Figure 13. Results of differential. (a) Group A, (b) group B, and (c) group C.

The foreground areas were extracted from the above three sets of differentially processed results, and the adaptive threshold algorithm described earlier was employed to redistribute the grayscale intervals. The red areas highlighted in Figs. 14–16 illustrate the grayscale distribution with the respective threshold values. The threshold values for the final gear interference images of groups A, B, and C are 27, 42, and 12, respectively.

Figure 14. Threshold calculation of group A. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

Figure 15. Threshold calculation of group B. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

Figure 16. Threshold calculation of group C. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

The first segmentation extraction is performed using the threshold obtained through the algorithm’s automatic search. This process results in the coarsely extracted grayscale mask of the gear interference fringe image, denoted as M1. The obtained results are illustrated in Fig. 17.

Figure 17. Gray mask. (a) Group A, (b) group B, and (c) group C.

Subsequently, by analyzing the local variance of the target pixel within the neighborhood window, the foreground and background information are distinguished, leading to the generation of the repair mask, denoted as M2. Figure 18 displays the obtained M2 mask. By combining the two acquired masks, M1 and M2, the foreground area of the tooth surface stripe in the three sets of interference images is extracted. Subsequent post-processing steps address false foreground and background areas, resulting in the extraction mask and tooth surface foreground area of the gear tooth surface. Figure 19 presents the final results.

Figure 18. Repair mask results. (a) Group A, (b) group B, and (c) group C.

Figure 19. Foreground extraction results. (a) Group A, (b) group B, and (c) group C.

First, the similarity of the images is compared using various methods, as depicted in Figs. 20–22. The selected comparison methods are the RMA, GSA, ICFA [12], and the algorithm proposed in this paper.

Figure 20. Comparison of extraction results of foreground region of group A.

Figure 21. Comparison of extraction results of foreground region of group B.

Figure 22. Comparison of extraction results of foreground region of group C.

Among the selected methods, GSA (green line) and ICFA (yellow line) are widely employed for the measurement of gear interference. On the other hand, RMA (purple line) is a commonly used approach for recognizing the foreground area in an interferogram [4]. In comparison, the algorithm proposed in this article shows a higher degree of similarity between the extracted image edges (red line) and the reference image edge (blue line). In particular, it demonstrates better alignment between the edges of the tooth top A region and the tooth root B region, which are known to be challenging areas to process in groups B and C.

To further analyze the foreground extraction results, the segmentation results of all pixels in each image group are compared. Tables 2–4 present a comparison between the reference segmentation results of the three groups, the segmentation results obtained using the comparative algorithms, and the segmentation results achieved using the algorithm proposed in this paper. Groups A and B each consist of 408,000 image pixels, while group C comprises 745,608 image pixels. The similarity of images in group A reached 97.84%, in group B 96.21%, and in group C 99.33%. The overall matching accuracy of the images ranges from approximately 97.50% to 99.00%. The proposed algorithm demonstrates an improvement in accuracy ranging from 1.50% to 3.50% compared to the comparative algorithms.
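The pixel-level matching accuracy used in the comparison above can be computed as the fraction of identically labeled pixels between an extracted mask and the reference mask; a minimal sketch (the binarization threshold of zero is an assumption):

```python
import numpy as np

def matching_accuracy(mask, ref):
    """Percentage of pixels labeled identically in two binary masks."""
    mask = np.asarray(mask) > 0   # treat any nonzero value as foreground
    ref = np.asarray(ref) > 0
    return 100.0 * float(np.mean(mask == ref))
```

Multiplying this fraction by the total pixel count recovers the matched-pixel counts reported alongside the percentages.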

TABLE 2 Comparison of accuracy between proposed method and Gear’s objection gray scale algorithm (GSA)

Rate Images | Reference | GSA Image Primes | GSA Accuracy (%) | This Article Image Primes | This Article Accuracy (%)
A | 408,000 | 387,681 | 95.02 | 399,222 | 97.84
B | 408,000 | 384,975 | 94.35 | 392,564 | 96.21
C | 745,608 | 724,625 | 97.19 | 740,684 | 99.33


TABLE 3 Comparison of accuracy between proposed method and interference common filtering algorithm (ICFA)

Rate Images | Reference | ICFA Image Primes | ICFA Accuracy (%) | This Article Image Primes | This Article Accuracy (%)
A | 408,000 | 378,065 | 92.66 | 399,222 | 97.84
B | 408,000 | 372,489 | 91.29 | 392,564 | 96.21
C | 745,608 | 720,031 | 96.56 | 740,684 | 99.33


TABLE 4 Comparison of accuracy between proposed method and relative modulation algorithm (RMA)

Rate Images | Reference | RMA Image Primes | RMA Accuracy (%) | This Article Image Primes | This Article Accuracy (%)
A | 408,000 | 392,904 | 96.30 | 399,222 | 97.84
B | 408,000 | 385,302 | 94.43 | 392,564 | 96.21
C | 745,608 | 731,216 | 98.06 | 740,684 | 99.33


Table 5 presents a comparison of the execution efficiency between the proposed algorithm and the alternative algorithms. While maintaining result accuracy, the proposed algorithm achieves a significant reduction in running time by 89.5% to 92.5% due to its independence from the tooth surface image. In contrast, the comparison algorithms require threshold adjustments, resulting in similar runtimes, with the fastest among them reported here.

TABLE 5 Time cost evaluation

Rate Images | GSA Algorithm (s) | ICF Algorithm (s) | RMA Algorithm (s) | This Article Method (s) | Maximum Efficiency (%)
A | 30.19 | 29.85 | 27.93 | 2.25 | 92.55
B | 31.22 | 32.01 | 28.54 | 2.49 | 92.02
C | 39.25 | 39.07 | 36.60 | 4.06 | 89.61


To verify the accuracy of the algorithm's extraction results, a supervised evaluation algorithm [15] was employed to analyze each group of images under detection. The evaluation encompassed image similarity, the probabilistic rand index (PRI) [16], variation of information (VOI) [17, 18], and the global consistency error (GCE) index [19, 20]. PRI is computed by tallying the number of pixel pairs with identical labels and the number with differing labels between the segmented image and the ground truth, divided by the total number of pixel pairs. Its range is [0, 1], with higher values indicating greater image matching. VOI measures the change in pixel information between the reference image and the algorithm's segmentation via entropy; a smaller value indicates more accurate segmentation results. GCE, ranging over [0, 1], quantifies segmentation quality, with smaller values indicating better performance.
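For binary masks compared against a single reference, the PRI described above reduces to the Rand index, which can be computed from a contingency table; a sketch (the reduction to one ground truth is our simplification of the full PRI, which averages over multiple references):

```python
import numpy as np

def probabilistic_rand_index(seg, ref):
    """Rand index between two label images via their contingency table.

    Counts pixel pairs grouped together in both segmentations plus pairs
    separated in both, divided by the total number of pixel pairs.
    """
    seg, ref = np.asarray(seg).ravel(), np.asarray(ref).ravel()
    n = seg.size
    cont = np.array([[np.sum((seg == s) & (ref == r))
                      for r in np.unique(ref)]
                     for s in np.unique(seg)], dtype=float)
    together_both = (cont * (cont - 1) / 2).sum()
    rows, cols = cont.sum(axis=1), cont.sum(axis=0)
    together_seg = (rows * (rows - 1) / 2).sum()
    together_ref = (cols * (cols - 1) / 2).sum()
    total = n * (n - 1) / 2
    # Agreements = together in both + separated in both.
    return (total + 2 * together_both - together_seg - together_ref) / total
```

Identical masks give 1.0; values near the table entries (0.90–0.99) indicate strong agreement with the reference segmentation.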

By analyzing these evaluation indicators, it is observed that PRI improves by approximately 2.5% to 3.5%, GCE is reduced by around 15% to 20%, and VOI decreases by about 13% to 16%. Compared to the results obtained from the comparative algorithms, the algorithm proposed in this paper shows favorable improvements in all evaluated aspects, as demonstrated in Tables 6–8.

TABLE 6 Probabilistic rand index (PRI) index extraction results

Rate Images | GSA | ICFA | RMA | This Article Method
A | 0.9454 | 0.9116 | 0.9581 | 0.9832
B | 0.9401 | 0.9083 | 0.9484 | 0.9657
C | 0.9392 | 0.9161 | 0.9586 | 0.9891


TABLE 7 Variation of information (VOI) index extraction results

Rate Images | GSA | ICFA | RMA | This Article Method
A | 1.1379 | 1.2715 | 1.0527 | 0.9876
B | 1.1845 | 1.1952 | 1.1439 | 1.0322
C | 1.0998 | 1.1344 | 0.9844 | 0.9213


TABLE 8 Global consistency error (GCE) index extraction results

Rate Images | GSA | ICFA | RMA | This Article Method
A | 0.2076 | 0.2509 | 0.1932 | 0.1533
B | 0.2199 | 0.2347 | 0.2006 | 0.1862
C | 0.1530 | 0.1738 | 0.1382 | 0.1147


Finally, the phase information within the foreground region is extracted based on the different segmentation results, allowing for a comparison of the phase error between each algorithm and the reference result. This evaluation aims to assess the impact of the foreground extraction algorithm for interference images on the final phase accuracy. The phase extraction results of each algorithm at a specific line are depicted in Fig. 23.

Figure 23. Helical gear line phase data comparison.

Through the analysis of the phase extraction results obtained from each group of measurement targets, it is evident that the phase difference is prominently observed in the edge region of the foreground image extraction. Significantly, the proposed algorithm outperforms the comparison algorithm in accurately obtaining phase data. In the context of helical gear measurements, the proposed algorithm effectively reduces the measurement error at the edge by 11 μm, facilitating the realization of high-precision interferometry for gear analysis and assessment.

In this research paper, a method is presented for the direct adaptive extraction of the foreground area on the tooth surface in gear interferometry. This method eliminates the reliance on conventional extraction methods that depend on non-interference images of the tooth surface. Multiple sets of interferometric measurements were conducted on gears with diverse parameters, and the measurement areas from these sets of interference fringe images were extracted. The extraction results obtained using different methods were both qualitatively and quantitatively compared. The findings demonstrate that the proposed method significantly enhances algorithm efficiency while maintaining the accuracy of the extracted area. This ensures precise measurements and validates the effectiveness and reliability of the proposed algorithm.

Data underlying the results presented in this paper are not publicly available at the time of publication, but may be obtained from the authors upon reasonable request.

National Natural Science Foundation of China (NSFC) grant numbers 52205067, 61805195, and 52004213; Natural Science Basic Research Program of Shaanxi grant number 2022JQ-403; China Postdoctoral Science Foundation grant number 2020M683683XB.

  1. X. Wang, S. Fang, X. Zhu, J. Ji, P. Yang, M. Komori, and A. Kubo, “Nonlinear diffusion and structure tensor based segmentation of valid measurement region from interference fringe patterns on gear systems,” Curr. Opt. Photonics 1, 587-597 (2017).
  2. Z. Shi, B. Yu, X. Song, and X. Wang, “Development of gear measurement technology during last 20 years,” Chin. Mech. Eng. 33, 1009-1024 (2022).
  3. Y. Dai, X. Sheng, and P. Yang, “Comparison and analysis of effective measurement area segmentation methods in tooth surface interference fringe pattern,” Machin. Electron. 39, 3-7 (2021).
  4. C. Zuo and Q. Chen, “Computational optical imaging: An overview,” Infrar. Laser Eng. 51, 20220110 (2022).
  5. L. Meng, S. Fang, P. Yang, L. Wang, M. Komori, and A. Kubo, “Image-inpainting and quality-guided phase unwrapping algorithm,” Appl. Opt. 51, 2457-2462 (2012).
  6. X. Wang, S. Fang, X. Zhu, K. Kou, Y. Liu, and M. Jiao, “Phase unwrapping based on adaptive image in-painting of fringe patterns in measuring gear tooth flanks by laser interferometry,” Opt. Express 28, 17881-17897 (2020).
  7. H. A. Vrooman and A. A. Maas, “Image processing algorithms for the analysis of phase-shifted speckle interference patterns,” Appl. Opt. 30, 1636-1641 (1991).
  8. T. Ino and T. Yatagai, “Oblique incidence interferometry for gear-tooth surface profiling,” Proc. SPIE 1720, 464-469 (1992).
  9. X. W. Chang, D. W. Li, and Y. H. Xu, “A study of image segmentation based on level set method,” in Proc. 2013 International Conference on Advanced Computer Science and Electronics Information-ICACSEI 2013 (Beijing, China, Jul. 25-26, 2013), pp. 360-363.
  10. G. Aiming, C. Lei, and C. Jinbang, “Digitalisation processing technique for interference pattern with obstruct,” Acta Optica Sinica 20, 775 (2000).
  11. Y. Pengcheng, F. Suping, W. Leijie, M. Lei, K. Masaharu, and K. Aizoh, “Correction method for segmenting valid measuring region of interference fringe patterns,” Opt. Eng. 50, 095602 (2011).
  12. L. Wang, S. Fang, P. Yang, and L. Meng, “Comparison of three methods for identifying fringe regions of interference fringe patterns in measuring gear tooth flanks by laser interferometry,” Optik 126, 5668-5671 (2015).
  13. X. Wang, X. Zhu, K. Kou, J. Liu, Y. Liu, and B. Qian, “Fringe direction weighted autofocusing algorithm for gear tooth flank form deviation measurement based on an interferogram,” Appl. Opt. 60, 11066-11074 (2021).
  14. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software (Wiley, USA, 1998).
  15. Y. Cao, H. Liu, and X. Jia, “Overview of image quality assessment method based on deep learning,” Comput. Eng. Appl. 57, 27-36 (2021).
  16. Z. Zhu, Y. Liu, and Y. Li, “Image segmentation of non-destructive test based on image patch and cluster information quantity,” Laser Optoelectron. Prog. 58, 1210009 (2021).
  17. L. Ge and Y. Zhao, “A method of automatically searching lever ring and calculating the misclosure,” Sci. Surv. Map. 37, 209-212 (2012).
  18. J. Liu, D. Yang, and F. Hu, “Multiscale object detection in remote sensing images combined with multi-receptive-field features and relation-connected attention,” Remote Sens. 14, 427 (2022).
  19. S. G. A. Usha and S. Vasuki, “Significance of texture features in the segmentation of remotely sensed images,” Optik 249, 168241 (2022).
  20. X. Wang, H. Liu, and Y. Niu, “Binocular stereo matching by combining multiscale local and deep features,” Acta Optica Sinica 40, 0245001 (2020).


Keywords: Adaptive threshold, Error of gear tooth flank, Foreground region extraction, Phase-shifting laser interferometry

I. INTRODUCTION

The shape error of gear tooth surfaces plays a crucial role in evaluating gear accuracy, as it has a significant impact on the transmission efficiency, noise level, and service life of transmission systems [1]. Therefore, the precise measurement of tooth surface shape error is of paramount importance for the manufacturing of high-precision gears. Currently, the coordinate-based contact measurement method is the mainstream approach due to its simplicity and low cost. However, this point-scanning method suffers from low efficiency, limited accuracy, and potential damage to the tooth surface of high-precision gears, and fails to meet the increasing demand for efficiency and accuracy in precision gear detection [2]. Consequently, non-contact optical measurement methods have emerged as the primary research direction in the field of precision gear detection, offering advantages such as high precision, efficiency, and rich measurement data [3]. Among these methods, laser phase-shifting interferometry stands out as a classic non-contact technique [4]. During interferometric measurement, the phase information reflects the topography error of the tooth surface, and thus the accuracy of phase unwrapping directly affects the measurement precision of tooth surface shape error [5]. However, the presence of non-measured regions in the acquired tooth surface interference image not only hampers phase unwrapping efficiency but also, through integration effects, introduces errors from the background region into the essential phase information within the foreground area [6]. Consequently, accurate extraction of the foreground area from the gear interference image prior to phase unwrapping is crucial, as it directly influences the accuracy of the final gear shape measurement.

Gear interference fringe images present unique challenges, including high distortion of fringes, phase deviations caused by noise, and uneven grayscale distribution. Conventional image processing methods, such as the relative modulation algorithm (RMA) [7], fringe contrast method [8], and level set method [9], are ineffective in handling these images. Therefore, it is imperative to explore a suitable foreground area extraction method specifically tailored for gear phase-shift interferometry systems. Aiming et al. [10] proposed the threshold segmentation region method, which leverages the sinusoidal variation of grayscale values in phase-shifted interferometry and analyzes the characteristics of interference images. Pengcheng et al. [11] proposed a spin segmentation method for tooth surface domain extraction to address the challenge of spurious fringe misidentification in gear tooth surface image extraction. Additionally, Wang et al. [12] presented the Gear’s objection gray scale algorithm (GSA) and interference common filtering algorithm (ICFA), along with the approximate average grayscale method and other rapid identification methods. These approaches have significantly advanced the field of gear interferometry. However, they rely on gear non-interference image masks to indirectly extract foreground information from gear interference images [13]. Consequently, the direct extraction of gear interference images remains unattainable, and the segmentation threshold needs manual adjustment. Moreover, the introduction of additional measurement steps may compromise the consistency of extraction results. Furthermore, these methods have limitations regarding the measurement optical path, making them applicable only to non-common optical path measurement systems. 
To address the above challenges, this study presents a method that integrates a threshold adaptive matching mechanism based on the variation characteristics of gear phase-shift interference fringes and the fluctuation properties of local phase information. A local variance evaluation model is proposed to establish an automatic extraction method specifically designed for the foreground region of gear interference images. This method improves the efficiency of foreground region extraction while ensuring its accuracy. By achieving accurate extraction of the foreground area, a solid foundation is laid for proper implementation of the subsequent phase unwrapping process.

II. GEAR LASER PHASE SHIFT INTERFEROMETRY PRINCIPLE

The gear measurement system using laser phase shift interferometry operates based on the following principle, as illustrated in Fig. 1. Initially, the linearly polarized light emitted from the He-Ne laser is split into two beams, measurement light and reference light, by a polarization beam-splitting prism. The measurement light is then directed toward the surface of the gear to be measured by the front optical wedge. At this stage, the measurement light carries the morphological information of the target gear. Subsequently, the optical path is adjusted using the rear optical wedge, directing the measurement light toward the half-reflecting, half-transmitting mirror. The reference light and the measurement light intersect and interfere at this point, and the resulting gear interference image is captured by a charge-coupled device (CCD) camera, as depicted in Fig. 2. Within the measurement optical path, the micro-displacement of the mirror is controlled by a piezoelectric transducer (PZT) to achieve a four-step phase shift, with each phase shift step size being π/2.
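As background for the phase-shifting principle above, the wrapped phase can be recovered from four frames stepped by π/2 using the standard four-step formula. The sketch below is a minimal illustration with a synthetic phase map; the frame ordering and symbol names are our assumptions, not taken from the measurement software described in the paper.

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    """Standard four-step phase-shifting formula.

    I1..I4 are intensity frames at phase shifts 0, pi/2, pi, 3*pi/2;
    returns the wrapped phase in (-pi, pi].
    """
    # I4 - I2 = 2*B*sin(phi), I1 - I3 = 2*B*cos(phi)
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic check: generate frames from a known phase map and recover it
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 100)
A, B = 120.0, 100.0  # assumed background level and fringe modulation
frames = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
rec = wrapped_phase(*frames)
print(np.allclose(rec, phi))  # True: recovered phase matches the input
```

The modulation amplitude B cancels in the ratio, which is why the formula is insensitive to uniform illumination changes.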

Figure 1. Principle of laser interferometry for gear measurement.

Figure 2. Interference fringe image of gears.

III. PRINCIPLES OF PHASE FOREGROUND REGION EXTRACTION ALGORITHM

In gear interferometry, the phase shift is systematically conducted with a consistent step size, resulting in a discernible periodic pattern in the grayscale of the acquired phase-shifted interference fringes. Figure 3 illustrates a simulated image of phase-shifted interference fringes under ideal conditions. It is evident that the discrepancy in grayscale values of pixels at corresponding positions within the interference fringes reaches its peak between the phase shifts 0 and π, and between π/2 and 3π/2, showcasing a distinct sinusoidal variation characteristic. The method presented in this paper remains effective for schemes with other phase-shift step counts, although the distinction between foreground and background is less pronounced than in four-step phase-shifting images.
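The claim above can be checked numerically. This sketch sweeps the pixel phase of a synthetic fringe (modulation amplitude is an assumed value) and shows that frames π apart attain a larger peak gray-level difference than frames π/2 apart:

```python
import numpy as np

phi = np.linspace(0, 2 * np.pi, 10001)  # sweep of possible pixel phases
B = 100.0                               # assumed fringe modulation amplitude
# Four-step frames; the constant background cancels in differences
I = [B * np.cos(phi + k * np.pi / 2) for k in range(4)]

peak_pi = np.max(np.abs(I[0] - I[2]))    # frames pi apart: peak is 2*B
peak_half = np.max(np.abs(I[0] - I[1]))  # frames pi/2 apart: peak is sqrt(2)*B
print(peak_pi, peak_half)  # ~200.0 vs ~141.4
```

This is why the 0/π (and π/2 / 3π/2) frame pairs give the strongest foreground/background contrast for the differential processing that follows.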

Figure 3. Simulation image and phase characteristics of interference fringe without noise.

Figure 4 depicts simulated phase-shifting interference fringes accompanied by various types of noise, including Gaussian noise and salt-and-pepper noise [with a peak signal-to-noise ratio (PSNR) of 13.3035], as well as other typical artifacts introduced by imaging equipment and common-path phase errors. Despite these noise sources, the interference fringes continue to show noticeable sinusoidal variation characteristics. This resilience to noise ensures the robustness and stability of the algorithm's logic when applied to real-world conditions.

Figure 4. Simulation image and phase characteristics of interference fringe with noise.

To analyze the grayscale variations, pixels in row 150 of each group of gear interference fringe images presented in Fig. 2 were extracted. Upon observation, it was found that in the measured gear interference image, the largest discrepancy in gray values between pixels with identical coordinates in the foreground area occurs at phase shifts of 0 and π, respectively, as shown in Fig. 5. This behavior aligns with the sinusoidal variation characteristics exhibited by interference fringes. Conversely, the gray values of spurious foreground pixels in the background area demonstrate minimal changes during phase shifting, as shown in Fig. 6. Exploiting this discrepancy, the distinctive features of pixels at different phases can be used to eliminate erroneous foreground information, establish connectivity domains, and enhance the contrast between the foreground and background regions.

Figure 5. Characteristics of wave surface variation in foreground region.

Figure 6. Characteristics of wave surface variation in background region.

The algorithmic processing flow, as illustrated in Fig. 7, is divided into two stages: Grayscale mask M1 and repair mask M2. The target image undergoes several steps, including preprocessing, differential operation, gray set allocation, threshold extraction, neighborhood local variance analysis, and other procedures to achieve the direct extraction of the foreground area from the final gear interference image.

Figure 7. Foreground extraction flow chart.

First, the acquired image undergoes median-filter preprocessing to suppress noise while preserving edge information and avoiding blurring of image details. Subsequently, the interference differential image under different phase shifts is calculated using the following equations:

$$\mathrm{Gray}_i(x,y)=\varphi_{i+1}(x,y)\sin\left(\frac{i\pi}{2}\right)+\varphi_{i+3}(x,y)\sin\left(\frac{(i+2)\pi}{2}\right),\quad i=1 \qquad (1)$$

$$\mathrm{Gray}_i(x,y)=\varphi_{i}(x,y)\sin\left(\frac{(i-1)\pi}{2}\right)+\varphi_{i+2}(x,y)\sin\left(\frac{(i+1)\pi}{2}\right),\quad i\neq 1 \qquad (2)$$

where Grayi(x, y) represents the grayscale value of the target pixel in the ith processed image, φi(x, y) denotes the grayscale value of the target pixel in the ith phase-shifting interference image, and the summation of Grayi(x, y) over all coordinates represents the sum of all pixel grayscale values in the processed image.

At this stage, the processed image is obtained for further extraction of the foreground area. Subsequently, a threshold value is determined to extract the foreground-area information from the gear interference image. As depicted in Fig. 8, interval positioning requires reassigning grayscale values to the image pixels. The lowest 5% of pixel grayscale values are assigned to the interval 0 to 0.1, while the highest 5% are assigned to the interval 0.9 to 1. The maximum gray value among the pixels within each interval is determined, and the remaining pixel grayscale values are redistributed within the 0 to 1 range and subsequently quantized into 10 smaller intervals [14]. The purpose of this process is to filter out pixels with excessively large or small grayscale values, thus preventing errors that would affect subsequent threshold selection.
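One possible reading of this reallocation strategy is sketched below; the percentile handling, the representative values chosen for the clipped tails, and the function name are our assumptions rather than the paper's exact implementation.

```python
import numpy as np

def reallocate_gray(img, low_pct=5, high_pct=95, n_bins=10):
    """Clip the lowest/highest 5% of gray values and rescale the rest to [0, 1].

    Pixels below the 5th percentile map into [0, 0.1], pixels above the
    95th percentile into [0.9, 1.0], and the remainder is spread linearly
    over (0.1, 0.9), then quantized into n_bins equal intervals.
    """
    lo, hi = np.percentile(img, [low_pct, high_pct])
    out = np.empty_like(img, dtype=float)
    low = img <= lo
    high = img >= hi
    mid = ~(low | high)
    out[low] = 0.05   # representative value inside [0, 0.1] (our choice)
    out[high] = 0.95  # representative value inside [0.9, 1.0] (our choice)
    out[mid] = 0.1 + 0.8 * (img[mid] - lo) / max(hi - lo, 1e-12)
    # Quantize to n_bins equal intervals over [0, 1]
    return np.floor(out * n_bins) / n_bins

img = np.arange(100, dtype=float).reshape(10, 10)
r = reallocate_gray(img)
print(r.min(), r.max())
```

Clipping the 5% tails before building the histogram keeps isolated hot or dead pixels from skewing the interval maxima used in the threshold search.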

Figure 8. Gray value allocation strategy.

A distribution histogram is generated by obtaining the maximum gray value within each interval. Subsequently, a differential process is applied to the maximum gray values of adjacent intervals. Finally, a search is performed from left to right to identify the first interval in the histogram that shows significant growth. The criterion for significant change is that the amount of change should be at least 0.5% of the maximum gray value in the image. In Fig. 8(b), the red area marks the first interval pair that meets this criterion after reallocation. Equation (3) is then used to calculate the threshold value, denoted as E(X), from the gray values of the adjacent intervals.

$$E(X)=\frac{\mathrm{Gray}_L+\mathrm{Gray}_R}{2} \qquad (3)$$

The threshold E(X) is the mean of the reassigned maximum gray values of the adjacent left interval (GrayL) and right interval (GrayR). This threshold value is used to obtain the grayscale mask M1.
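The first-significant-growth search combined with Eq. (3) can be sketched as follows; the per-interval maxima used here are hypothetical values, while the 0.5% criterion follows the text.

```python
import numpy as np

def adaptive_threshold(interval_max, img_max, growth_pct=0.005):
    """Find the first adjacent-interval pair whose gray-level maximum grows
    by at least growth_pct * img_max, and return their mean (Eq. 3)."""
    diffs = np.diff(interval_max)
    for i, d in enumerate(diffs):
        if d >= growth_pct * img_max:
            gray_l, gray_r = interval_max[i], interval_max[i + 1]
            return (gray_l + gray_r) / 2.0
    return None  # no significant growth found

# Hypothetical per-interval maxima for a 0-255 image
interval_max = [3, 4, 5, 40, 120, 180, 200, 220, 240, 255]
print(adaptive_threshold(interval_max, img_max=255))  # (5 + 40) / 2 = 22.5
```

The left-to-right scan finds the boundary between the flat background plateau and the first foreground-dominated interval, which is what makes the threshold adaptive to each image's gray distribution.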

M1 reveals the presence of defect areas and redundant connectivity domains in the gray mask of the gear tooth surface interference fringes that do not conform to the expected pattern. As a result, direct extraction of the foreground area using the gray mask alone is not possible. To address this issue, a repair mask, M2, is established to mitigate image defects, promote connectivity, and enhance the stability and accuracy of image segmentation. The aim is to achieve the final extraction of the foreground area by incorporating the repair mask with M1.

To construct the repair mask M2, the phase information and fluctuation state of the differentially processed image pixels are analyzed. For this purpose, a neighborhood window of size m × n is established; experiments verified that a 3 × 3 window yields the best detection result. The grayscale value of the target pixel is denoted φ(x, y), and the grayscale values of its eight neighboring pixels are denoted φi(x, y), i = 1, 2, ..., 8.

In the differential image, pixels belonging to the foreground area show significant fluctuations, indicating variations in the underlying features. On the other hand, pixels in the background area remain stable and show minimal changes with the phase shift. This disparity in fluctuation patterns distinguishes the foreground and background regions in the image. After the neighborhood window is established, the local variance of the pixels within the m × n neighborhood window is calculated as follows:

$$\sigma^2=\frac{\left[\varphi(x,y)-E(X)\right]^2+\sum_{i=1}^{m\times n-1}\left[\varphi_i(x,y)-E(X)\right]^2}{m\times n} \qquad (4)$$

In Eq. (4), φi(x, y) represents the grayscale values of the pixels neighboring point (x, y), and E(X) denotes the threshold.

When the local variance calculated over the target pixel's neighborhood window falls within the [0.8, 1.0] interval of the histogram before reassignment in Fig. 8, the pixel is determined to belong to an effective foreground region for repair. Conversely, if the local variance value is outside this interval, it is classified as a background area. This thresholding process identifies and differentiates foreground and background regions based on their respective local variance values.

$$\mathrm{Gray}(x,y)=\begin{cases}255, & 0.8\le\sigma^{2}(x,y)\le 1.0\\ 0, & \sigma^{2}(x,y)<0.8\end{cases} \qquad (5)$$

After traversing the differential image, the repair mask M2 is obtained. These two masks are then applied successively to extract the foreground area in the differential images. A fundamental flowchart for the extraction of the tooth surface’s foreground area is illustrated in Fig. 9.

Figure 9. Process of foreground extraction. (a) Interferogram, (b) differential image, (c) gray mask, (d) repair mask, and (e) final mask.

IV. EXPERIMENTAL VERIFICATION

In this section, to validate the effectiveness of the proposed algorithm, a gear laser phase shift interferometry system is built, as depicted in Fig. 10(a). The system employs a He-Ne laser with a wavelength of 632.8 nm, a polarization degree of 500:1, and power stability within ±5% as the light source. The driving power supply used is a PZT servo-controller E53.C that provides a series output voltage range of 0 to 120 V. The gear under measurement is depicted in Fig. 10(b), and its corresponding parameters are presented in Table 1.

TABLE 1. Parameters of gear to be measured.

Measured Gear | Modulus | Number of Teeth | Tooth Width (mm) | Helical Angle (°) | Pressure Angle (°)
Helical Gear  | 3       | 60              | 15               | 20                | 20
Spur Gear     | 2.5     | 50              | 20               | -                 | 20


Figure 10. Laser interferometry for gear measurement. (a) Measurement platform and (b) measured gear tooth flank.

Three sets of gear interference fringe images were acquired using the aforementioned phase-shifting interferometry system, as depicted in Fig. 11. Figures 11(a) and 11(b) represent helical gear interference fringe images, while Fig. 11(c) represents spur gear interference fringe images.

Figure 11. Gear interference image. (a) Group A, (b) group B, and (c) group C.

Initially, based on the grayscale distribution pattern of the gear tooth surface fringes, the foreground area of the interference fringes in the aforementioned images was manually segmented and extracted as the reference result for this particular set. The resulting segmented mask images are displayed in Fig. 12. Figures 12(a)-12(c) correspond to the masks of groups A, B, and C, respectively. These reference results will be used for comparison and accuracy analysis in subsequent algorithm extraction processes.

Figure 12. Reference mask. (a) Group A, (b) group B, and (c) group C.

For each group, differential processing was applied to the four-step phase-shift measurement, resulting in differentially extracted images that contained information regarding the phase variation of the tooth surface, as shown in Fig. 13.

Figure 13. Results of differential. (a) Group A, (b) group B, and (c) group C.

The foreground areas were extracted from the above three sets of differentially processed results, and the adaptive threshold algorithm described earlier was employed to redistribute the grayscale intervals. The highlighted red areas in Figs. 14-16 illustrate the grayscale distribution with the respective threshold values. The threshold values for the final gear interference images of groups A, B, and C are 27, 42, and 12, respectively.

Figure 14. Threshold calculation of group A. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

Figure 15. Threshold calculation of group B. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

Figure 16. Threshold calculation of Group C. (a) Gray distribution before redistribution and (b) gray distribution after redistribution.

The first segmentation extraction is performed using the threshold obtained through the algorithm’s automatic search. This process results in the coarsely extracted grayscale mask of the gear interference fringe image, denoted as M1. The obtained results are illustrated in Fig. 17.

Figure 17. Gray mask. (a) Group A, (b) group B, and (c) group C.

Subsequently, by analyzing the local variance of the target pixel within the neighborhood window, the foreground and background information are distinguished, leading to the generation of the repair mask, denoted as M2. Figure 18 displays the obtained M2 mask. By combining the two acquired masks, M1 and M2, the foreground area of the tooth surface stripe in the three sets of interference images is extracted. Subsequent post-processing steps address false foreground and background areas, resulting in the extraction mask and tooth surface foreground area of the gear tooth surface. Figure 19 presents the final results.

Figure 18. Repair mask results. (a) Group A, (b) group B, and (c) group C.

Figure 19. Foreground extraction results. (a) Group A, (b) group B, and (c) group C.

First, the similarity of the images is compared using various methods, as depicted in Figs. 20-22. The selected comparison methods include the RMA, GSA, ICFA [12], and the algorithm proposed in this paper.

Figure 20. Comparison of extraction results of foreground region of group A.

Figure 21. Comparison of extraction results of foreground region of group B.

Figure 22. Comparison of extraction results of foreground region of group C.

Among the selected methods, GSA (green line) and ICFA (yellow line) are widely employed for the measurement of gear interference. On the other hand, RMA (purple line) is a commonly used approach for recognizing the foreground area in an interferogram [4]. In comparison, the algorithm proposed in this article shows a higher degree of similarity between the extracted image edges (red line) and the reference image edge (blue line). In particular, it demonstrates better alignment between the edges of the tooth top A region and the tooth root B region, which are known to be challenging areas to process in groups B and C.

To further analyze the foreground extraction results, the segmentation results of all pixels in each image group are compared. Tables 2-4 present a comparison between the reference segmentation results of the three groups, the segmentation results obtained using the comparative algorithms, and the segmentation results achieved using the algorithm proposed in this paper. Group A and group B each consist of 408,000 image pixels, while group C comprises 745,608 image pixels. The similarity reached 97.84% for group A, 96.21% for group B, and 99.33% for group C. The proposed algorithm demonstrates an improvement in accuracy ranging from 1.50% to 3.50% over the comparative algorithms.
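The per-pixel accuracy reported in Tables 2-4 amounts to the fraction of pixels whose label matches the reference mask. A minimal sketch with toy masks (not the paper's data) follows:

```python
import numpy as np

def mask_accuracy(pred, ref):
    """Fraction of pixels identically labeled in two binary masks."""
    return float(np.mean(pred == ref))

# Toy 2x3 masks: 1 = foreground, 0 = background
ref = np.array([[1, 1, 0],
                [1, 0, 0]])
pred = np.array([[1, 0, 0],
                 [1, 0, 0]])
print(mask_accuracy(pred, ref))  # 5 of 6 pixels match
```

Scaling the matching pixel counts in the tables by the total pixel counts reproduces the listed accuracy percentages.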

TABLE 2. Comparison of accuracy between proposed method and Gear’s objection gray scale algorithm (GSA).

Rate Images | Reference (pixels) | GSA Algorithm: Pixels | GSA Algorithm: Accuracy (%) | This Article: Pixels | This Article: Accuracy (%)
A | 408,000 | 387,681 | 95.02 | 399,222 | 97.84
B | 408,000 | 384,975 | 94.35 | 392,564 | 96.21
C | 745,608 | 724,625 | 97.19 | 740,684 | 99.33


TABLE 3. Comparison of accuracy between proposed method and interference common filtering algorithm (ICFA).

Rate Images | Reference (pixels) | ICFA: Pixels | ICFA: Accuracy (%) | This Article: Pixels | This Article: Accuracy (%)
A | 408,000 | 378,065 | 92.66 | 399,222 | 97.84
B | 408,000 | 372,489 | 91.29 | 392,564 | 96.21
C | 745,608 | 720,031 | 96.56 | 740,684 | 99.33


TABLE 4. Comparison of accuracy between proposed method and relative modulation algorithm (RMA).

Rate Images | Reference (pixels) | RMA Algorithm: Pixels | RMA Algorithm: Accuracy (%) | This Article: Pixels | This Article: Accuracy (%)
A | 408,000 | 392,904 | 96.30 | 399,222 | 97.84
B | 408,000 | 385,302 | 94.43 | 392,564 | 96.21
C | 745,608 | 731,216 | 98.06 | 740,684 | 99.33


Table 5 presents a comparison of the execution efficiency between the proposed algorithm and the alternative algorithms. While maintaining result accuracy, the proposed algorithm achieves a significant reduction in running time by 89.5% to 92.5% due to its independence from the tooth surface image. In contrast, the comparison algorithms require threshold adjustments, resulting in similar runtimes, with the fastest among them reported here.

TABLE 5. Time cost evaluation.

Rate Images | GSA Algorithm (s) | ICFA (s) | RMA Algorithm (s) | This Article Method (s) | Maximum Efficiency (%)
A | 30.19 | 29.85 | 27.93 | 2.25 | 92.55
B | 31.22 | 32.01 | 28.54 | 2.49 | 92.02
C | 39.25 | 39.07 | 36.60 | 4.06 | 89.61


To verify the accuracy of the algorithm's extraction results, a supervised evaluation algorithm [15] was employed to analyze each group of images under detection. The evaluation encompassed image similarity, the probabilistic rand index (PRI) [16], variation of information (VOI) [17, 18], and the global consistency error (GCE) index [19, 20]. PRI is computed by tallying the pixel pairs labeled consistently (identically in both, or differently in both) in the segmented image and the ground truth, divided by the total number of pixel pairs. Its range is [0, 1], with higher values indicating better matching. VOI assesses, via entropy, the magnitude and trend of pixel-information changes between the reference image and the algorithm's result; a smaller value indicates more accurate segmentation. GCE, ranging over [0, 1], quantifies segmentation quality, with smaller values indicating better performance.
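For illustration, PRI can be computed directly from its pairwise definition. The sketch below uses toy label vectors and a brute-force O(n²) loop, suitable only for small inputs:

```python
import numpy as np
from itertools import combinations

def probabilistic_rand_index(seg, ref):
    """PRI for two label images: fraction of pixel pairs on which the two
    segmentations agree (same-same or different-different labels)."""
    s, r = np.ravel(seg), np.ravel(ref)
    pairs = list(combinations(range(s.size), 2))
    agree = sum(1 for i, j in pairs if (s[i] == s[j]) == (r[i] == r[j]))
    return agree / len(pairs)

# Toy label vectors: 4 pixels, 6 pixel pairs, 3 agreeing pairs
seg = np.array([0, 0, 1, 1])
ref = np.array([0, 0, 1, 0])
print(probabilistic_rand_index(seg, ref))  # 3 / 6 = 0.5
```

Production implementations use contingency-table formulas rather than the explicit pairwise loop, but the value computed is the same.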

By analyzing these evaluation indicators, it is observed that PRI improves by approximately 2.5% to 3.5%. GCE demonstrates a reduction of around 15% to 20%, while VOI decreases by about 13% to 16%. Compared to the results obtained from the comparative algorithms, the algorithm proposed in this paper shows favorable improvements in all evaluated aspects, as demonstrated in Tables 6-8.

TABLE 6. Probabilistic rand index (PRI) index extraction results.

Rate Images | GSA | ICFA | RMA | This Article Method
A | 0.9454 | 0.9116 | 0.9581 | 0.9832
B | 0.9401 | 0.9083 | 0.9484 | 0.9657
C | 0.9392 | 0.9161 | 0.9586 | 0.9891


TABLE 7. Variation of information (VOI) index extraction results.

Rate Images | GSA | ICFA | RMA | This Article Method
A | 1.1379 | 1.2715 | 1.0527 | 0.9876
B | 1.1845 | 1.1952 | 1.1439 | 1.0322
C | 1.0998 | 1.1344 | 0.9844 | 0.9213


TABLE 8. Global consistency error (GCE) index extraction results.

Rate Images | GSA | ICFA | RMA | This Article Method
A | 0.2076 | 0.2509 | 0.1932 | 0.1533
B | 0.2199 | 0.2347 | 0.2006 | 0.1862
C | 0.1530 | 0.1738 | 0.1382 | 0.1147


Finally, the phase information within the foreground region is extracted based on the different segmentation results, allowing for a comparison of the phase error between each algorithm and the reference result. This evaluation aims to assess the impact of the foreground extraction algorithm for interference images on the final phase accuracy. The phase extraction results of each algorithm at a specific line are depicted in Fig. 23.

Figure 23. Helical gear line phase data comparison.

Through the analysis of the phase extraction results obtained from each group of measurement targets, it is evident that the phase difference is prominently observed in the edge region of the foreground image extraction. Significantly, the proposed algorithm outperforms the comparison algorithm in accurately obtaining phase data. In the context of helical gear measurements, the proposed algorithm effectively reduces the measurement error at the edge by 11 μm, facilitating the realization of high-precision interferometry for gear analysis and assessment.

V. CONCLUSION

In this research paper, a method is presented for the direct adaptive extraction of the foreground area on the tooth surface in gear interferometry. This method eliminates the reliance on conventional extraction methods that depend on non-interference images of the tooth surface. Multiple sets of interferometric measurements were conducted on gears with diverse parameters, and the measurement areas from these sets of interference fringe images were extracted. The extraction results obtained using different methods were both qualitatively and quantitatively compared. The findings demonstrate that the proposed method significantly enhances algorithm efficiency while maintaining the accuracy of the extracted area. This ensures precise measurements and validates the effectiveness and reliability of the proposed algorithm.

DISCLOSURES

The authors declare no conflicts of interest.

DATA AVAILABILITY

Data underlying the results presented in this paper are not publicly available at the time of publication, but may be obtained from the authors upon reasonable request.

FUNDING

NSFC grant number 52205067, 61805195, 52004213; Natural Science Basic Research Program of Shaanxi grant number 2022JQ-403; China Postdoctoral Science Foundation grant number 2020M683683XB.


TABLE 1 Parameters of the gears to be measured

Measured Gear   Modulus   Number of Teeth   Tooth Width (mm)   Helical Angle (°)   Pressure Angle (°)
Helical Gear    3         60                15                 20                  20
Spur Gear       2.5       50                20                 -                   20

TABLE 2 Comparison of accuracy between the proposed method and the gray scale algorithm (GSA)

Images   Reference Pixels   GSA Pixels   GSA Accuracy (%)   Proposed Pixels   Proposed Accuracy (%)
A        408,000            387,681      95.02              399,222           97.84
B        408,000            384,975      94.35              392,564           96.21
C        745,608            724,625      97.19              740,684           99.33
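The accuracy columns in Tables 2-4 are consistent with a simple pixel-count reading: the number of foreground pixels an algorithm recovers, expressed as a percentage of the reference foreground pixel count (e.g. 387,681 / 408,000 ≈ 95.02% for group A under GSA). A minimal sketch of that computation, using hypothetical toy masks rather than the paper's data:

```python
import numpy as np

def pixel_accuracy(extracted_mask, reference_mask):
    """Assumed reading of Tables 2-4: foreground pixels recovered by the
    algorithm, as a percentage of the reference foreground pixel count."""
    reference_count = np.count_nonzero(reference_mask)
    recovered_count = np.count_nonzero(extracted_mask & reference_mask)
    return 100.0 * recovered_count / reference_count

# Toy 4x4 binary masks (hypothetical, for illustration only)
reference = np.array([[1, 1, 0, 0],
                      [1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [0, 0, 0, 0]], dtype=bool)
extracted = np.array([[1, 1, 0, 0],
                      [1, 0, 0, 0],
                      [1, 1, 1, 0],
                      [0, 0, 0, 0]], dtype=bool)
print(round(pixel_accuracy(extracted, reference), 2))  # 6 of 7 pixels -> 85.71
```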

TABLE 3 Comparison of accuracy between the proposed method and the interference common filtering algorithm (ICFA)

Images   Reference Pixels   ICFA Pixels   ICFA Accuracy (%)   Proposed Pixels   Proposed Accuracy (%)
A        408,000            378,065       92.66               399,222           97.84
B        408,000            372,489       91.29               392,564           96.21
C        745,608            720,031       96.56               740,684           99.33

TABLE 4 Comparison of accuracy between the proposed method and the relative modulation algorithm (RMA)

Images   Reference Pixels   RMA Pixels   RMA Accuracy (%)   Proposed Pixels   Proposed Accuracy (%)
A        408,000            392,904      96.30              399,222           97.84
B        408,000            385,302      94.43              392,564           96.21
C        745,608            731,216      98.06              740,684           99.33

TABLE 5 Time cost evaluation

Images   GSA (s)   ICFA (s)   RMA (s)   Proposed (s)   Maximum Efficiency (%)
A        30.19     29.85      27.93     2.25           92.55
B        31.22     32.01      28.54     2.49           92.02
C        39.25     39.07      36.60     4.06           89.61
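For group A, the "Maximum Efficiency" column of Table 5 matches the fraction of runtime saved relative to the GSA baseline; the other rows suggest the baseline choice varies slightly, so the formula below is an assumed reading, not the authors' stated definition:

```python
def efficiency_gain(t_proposed, t_baseline):
    """Assumed reading of Table 5's 'Maximum Efficiency' column:
    percentage of runtime saved relative to a baseline algorithm."""
    return (1.0 - t_proposed / t_baseline) * 100.0

# Group A from Table 5: proposed method 2.25 s vs. GSA 30.19 s
print(round(efficiency_gain(2.25, 30.19), 2))  # 92.55, matching the table
```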

TABLE 6 Probabilistic Rand index (PRI) extraction results

Images   GSA      ICFA     RMA      Proposed
A        0.9454   0.9116   0.9581   0.9832
B        0.9401   0.9083   0.9484   0.9657
C        0.9392   0.9161   0.9586   0.9891

TABLE 7 Variation of information (VOI) extraction results

Images   GSA      ICFA     RMA      Proposed
A        1.1379   1.2715   1.0527   0.9876
B        1.1845   1.1952   1.1439   1.0322
C        1.0998   1.1344   0.9844   0.9213

TABLE 8 Global consistency error (GCE) extraction results

Images   GSA      ICFA     RMA      Proposed
A        0.2076   0.2509   0.1932   0.1533
B        0.2199   0.2347   0.2006   0.1862
C        0.1530   0.1738   0.1382   0.1147
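PRI, VOI, and GCE in Tables 6-8 are standard segmentation-evaluation indices (higher PRI, lower VOI, and lower GCE indicate better agreement with the reference). For a pair of binary masks they reduce to closed-form expressions over a 2x2 label contingency table; the numpy sketch below follows the textbook definitions and is not the authors' implementation:

```python
import numpy as np

def _contingency(a, b):
    """2x2 table of pixel counts for each (label in a, label in b) pair."""
    a = np.asarray(a, dtype=bool).ravel()
    b = np.asarray(b, dtype=bool).ravel()
    return np.array([[np.sum(~a & ~b), np.sum(~a & b)],
                     [np.sum(a & ~b),  np.sum(a & b)]], dtype=float)

def rand_index(a, b):
    """Rand index; with a single reference mask this is the quantity
    the PRI reduces to for two-label segmentations."""
    n = _contingency(a, b)
    total = n.sum()
    pairs = total * (total - 1) / 2.0
    disagree = (np.sum(n.sum(1) ** 2) + np.sum(n.sum(0) ** 2)) / 2.0 - np.sum(n ** 2)
    return 1.0 - disagree / pairs

def variation_of_information(a, b):
    """VOI = 2 H(A, B) - H(A) - H(B), in bits; 0 for identical masks."""
    p = _contingency(a, b)
    p /= p.sum()
    def entropy(q):
        q = q[q > 0]
        return -np.sum(q * np.log2(q))
    return 2.0 * entropy(p.ravel()) - entropy(p.sum(1)) - entropy(p.sum(0))

def gce(a, b):
    """Global consistency error: mean local refinement error, taking the
    smaller of the two refinement directions; 0 for identical masks."""
    n = _contingency(a, b)
    row = n.sum(1, keepdims=True)
    col = n.sum(0, keepdims=True)
    e_ab = np.sum(n * (row - n) / np.maximum(row, 1.0))
    e_ba = np.sum(n * (col - n) / np.maximum(col, 1.0))
    return min(e_ab, e_ba) / n.sum()

# Sanity check on a toy mask compared with itself (hypothetical data)
mask = np.array([[1, 1, 0], [1, 0, 0]], dtype=bool)
print(rand_index(mask, mask), variation_of_information(mask, mask), gce(mask, mask))
# identical masks -> 1.0, 0.0, 0.0
```

These are per-image scores; the values in Tables 6-8 would then be read directly from each (algorithm result, reference) mask pair.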

References

  1. X. Wang, S. Fang, X. Zhu, J. Ji, P. Yang, M. Komori, and A. Kubo, "Nonlinear diffusion and structure tensor based segmentation of valid measurement region from interference fringe patterns on gear systems," Curr. Opt. Photonics 1, 587-597 (2017).
  2. Z. Shi, B. Yu, X. Song, and X. Wang, "Development of gear measurement technology during last 20 years," Chin. Mech. Eng. 33, 1009-1024 (2022).
  3. Y. Dai, X. Sheng, and P. Yang, "Comparison and analysis of effective measurement area segmentation methods in tooth surface interference fringe pattern," Machin. Electron. 39, 3-7 (2021).
  4. C. Zuo and Q. Chen, "Computational optical imaging: An overview," Infrared Laser Eng. 51, 20220110 (2022).
  5. L. Meng, S. Fang, P. Yang, L. Wang, M. Komori, and A. Kubo, "Image-inpainting and quality-guided phase unwrapping algorithm," Appl. Opt. 51, 2457-2462 (2012).
  6. X. Wang, S. Fang, X. Zhu, K. Kou, Y. Liu, and M. Jiao, "Phase unwrapping based on adaptive image in-painting of fringe patterns in measuring gear tooth flanks by laser interferometry," Opt. Express 28, 17881-17897 (2020).
  7. H. A. Vrooman and A. A. Maas, "Image processing algorithms for the analysis of phase-shifted speckle interference patterns," Appl. Opt. 30, 1636-1641 (1991).
  8. T. Ino and T. Yatagai, "Oblique incidence interferometry for gear-tooth surface profiling," Proc. SPIE 1720, 464-469 (1992).
  9. X. W. Chang, D. W. Li, and Y. H. Xu, "A study of image segmentation based on level set method," in Proc. 2013 International Conference on Advanced Computer Science and Electronics Information (ICACSEI 2013) (Beijing, China, Jul. 25-26, 2013), pp. 360-363.
  10. G. Aiming, C. Lei, and C. Jinbang, "Digitalisation processing technique for interference pattern with obstruct," Acta Optica Sinica 20, 775 (2000).
  11. Y. Pengcheng, F. Suping, W. Leijie, M. Lei, K. Masaharu, and K. Aizoh, "Correction method for segmenting valid measuring region of interference fringe patterns," Opt. Eng. 50, 095602 (2011).
  12. L. Wang, S. Fang, P. Yang, and L. Meng, "Comparison of three methods for identifying fringe regions of interference fringe patterns in measuring gear tooth flanks by laser interferometry," Optik 126, 5668-5671 (2015).
  13. X. Wang, X. Zhu, K. Kou, J. Liu, Y. Liu, and B. Qian, "Fringe direction weighted autofocusing algorithm for gear tooth flank form deviation measurement based on an interferogram," Appl. Opt. 60, 11066-11074 (2021).
  14. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software (Wiley, USA, 1998).
  15. Y. Cao, H. Liu, and X. Jia, "Overview of image quality assessment method based on deep learning," Comput. Eng. Appl. 57, 27-36 (2021).
  16. Z. Zhu, Y. Liu, and Y. Li, "Image segmentation of non-destructive test based on image patch and cluster information quantity," Laser Optoelectron. Prog. 58, 1210009 (2021).
  17. L. Ge and Y. Zhao, "A method of automatically searching lever ring and calculating the misclosure," Sci. Surv. Map. 37, 209-212 (2012).
  18. J. Liu, D. Yang, and F. Hu, "Multiscale object detection in remote sensing images combined with multi-receptive-field features and relation-connected attention," Remote Sens. 14, 427 (2022).
  19. S. G. A. Usha and S. Vasuki, "Significance of texture features in the segmentation of remotely sensed images," Optik 249, 168241 (2022).
  20. X. Wang, H. Liu, and Y. Niu, "Binocular stereo matching by combining multiscale local and deep features," Acta Optica Sinica 40, 0245001 (2020).