Current Optics and Photonics
Curr. Opt. Photon. 2023; 7(4): 387-397
Published online August 25, 2023 https://doi.org/10.3807/COPP.2023.7.4.387
Copyright © Optical Society of Korea.
Xian Wang, Yichao Zhao, Chaoyang Ju, and Chaoyong Zhang
School of Mechanical and Precision Instrument Engineering, Xi’an University of Technology, Xi’an 710048, China
Corresponding author: wangxian@xaut.edu.cn, ORCID 0000-0002-1187-3486
This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/4.0/) which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
Tooth surface shape error is an important parameter in gear accuracy evaluation. When tooth surface shape error is measured by laser interferometry, the gear interferogram is highly distorted and the gray level distribution is not uniform. Therefore, it is important for gear interferometry to extract the foreground region from the gear interference fringe image directly and accurately. This paper presents an approach for foreground extraction in gear interference images by leveraging the sinusoidal variation characteristics shown by the interference fringes. A gray level mask with an adaptive threshold is established to capture the relevant features, while a local variance evaluation function is employed to analyze the fluctuation state of the interference image and derive a repair mask. By combining these masks, the foreground region is directly extracted. Comparative evaluations using qualitative and quantitative assessment methods are performed to compare the proposed algorithm with both reference results and traditional approaches. The experimental findings reveal a remarkable degree of matching between the algorithm and the reference results. As a result, this method shows great potential for widespread application in the foreground extraction of gear interference images.
Keywords: Adaptive threshold, Error of gear tooth flank, Foreground region extraction, Phase-shifting laser interferometry
OCIS codes: (100.0100) Image processing; (100.3175) Interferometric imaging; (100.5088) Phase unwrapping
The shape error of gear tooth surfaces plays a crucial role in evaluating gear accuracy, as it has a significant impact on transmission efficiency, noise levels, and service life of transmission systems [1]. Therefore, the precise measurement of tooth surface shape error is of paramount importance for the manufacturing of high-precision gears. Currently, the contact measurement method based on coordinates is the mainstream approach due to its simplicity and low cost. However, this point scanning method suffers from low efficiency, limited accuracy, and potential damage to the tooth surface of high-precision gears, and fails to meet the increasing demand for efficiency and accuracy in precision gear detection [2]. Consequently, non-contact optical measurement methods have emerged as the primary research direction in the field of precision gear detection, offering advantages such as high precision, efficiency, and rich measurement data [3]. Among these methods, laser phase-shifting interferometry stands out as a classic non-contact technique [4]. During interferometric measurement, the phase information reflects the topography error of the tooth surface, and thus, the accuracy of phase unwrapping directly affects the measurement precision of tooth surface shape error [5]. However, the presence of non-measured regions in the acquired tooth surface interference image not only hampers phase unwrapping efficiency but also introduces errors from the background region into the essential phase information within the foreground area due to integration effects [6]. Consequently, accurate extraction of the foreground area from the gear interference image prior to phase unwrapping is crucial because it directly influences the accuracy of the final gear shape measurement, thereby assuming great significance in the overall gear shape error assessment.
Gear interference fringe images present unique challenges, including highly distorted fringes, noise-induced phase deviations, and uneven grayscale distribution. Conventional image processing methods, such as the relative modulation algorithm (RMA) [7], the fringe contrast method [8], and the level set method [9], are ineffective for these images. It is therefore necessary to develop a foreground-area extraction method tailored to gear phase-shifting interferometry systems. To this end, this paper proposes an adaptive method that exploits the sinusoidal grayscale variation of the phase-shifted fringes: a grayscale mask obtained with an adaptive threshold is combined with a repair mask derived from a local-variance analysis, so that the foreground region of the gear interferogram is extracted directly.
The gear measurement system based on laser phase-shifting interferometry operates on the following principle, as illustrated in Fig. 1. The linearly polarized light emitted by the He-Ne laser is split into two beams, a measurement beam and a reference beam, by a polarizing beam-splitter prism. The measurement beam is directed onto the surface of the gear under test by the front optical wedge, where it picks up the morphological information of the target tooth surface. The optical path is then adjusted by the rear optical wedge, which directs the measurement beam toward the half-reflecting mirror (beam splitter). There the reference beam and the measurement beam overlap and interfere, and the resulting gear interference image is captured by a charge-coupled device (CCD) camera, as depicted in Fig. 2. Within the measurement optical path, the micro-displacement of the mirror is controlled by a piezoelectric transducer (PZT) to realize a four-step phase shift with a step size of π/2.
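For reference, with a step size of π/2 the four acquired interferograms follow the standard four-step phase-shifting model (a textbook relation stated here for completeness; the paper's own intensity and phase-retrieval equations are not reproduced in this excerpt):

I_k(x, y) = A(x, y) + B(x, y) cos[φ(x, y) + kπ/2],  k = 0, 1, 2, 3,

where A is the background intensity, B the modulation amplitude, and φ the phase carrying the tooth-surface topography. Under this model the wrapped phase can be recovered as φ(x, y) = arctan[(I_3 − I_1)/(I_0 − I_2)], which is why accurate identification of the foreground (fringe-bearing) region directly affects the subsequent phase unwrapping.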
In gear interferometry, the phase shift is applied with a constant step size, so the grayscale of the acquired phase-shifted interference fringes follows a discernible periodic pattern. Figure 3 shows a simulated image of phase-shifted interference fringes under ideal conditions. It is evident that the gray value of each foreground pixel varies sinusoidally with the phase shift, and that the difference in gray values is largest between the frames acquired at phase shifts of 0 and π.
Figure 4 shows simulated phase-shifting interference fringes corrupted by several types of noise, including Gaussian noise and salt-and-pepper noise [with a peak signal-to-noise ratio (PSNR) of 13.3035], together with typical artifacts introduced by the imaging equipment and common optical-path phase errors. Despite these noise sources, the interference fringes still exhibit clear sinusoidal variation. This resilience to noise underpins the robustness and stability of the algorithm when applied under real-world conditions.
To analyze the grayscale variations, the pixels in row 150 of each group of gear interference fringe images in Fig. 2 were extracted. In the measured gear interference images, the largest discrepancy in gray value between pixels with identical coordinates in the foreground area occurs between the phase shifts of 0 and π, whereas the gray values of background pixels remain nearly constant across the four frames.
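This behavior can be reproduced with a small simulation. The following sketch (not taken from the paper; the fringe parameters, foreground shape, and noise level are all illustrative assumptions) generates four-step phase-shifted fringes and confirms that the frame pair at phase shifts 0 and π gives the largest gray-level difference inside the fringe region, while the background barely changes:

```python
import numpy as np

# Simulated four-step phase-shifted fringes: I_k = A + B*cos(phi + k*pi/2), k = 0..3.
# All parameters below are illustrative assumptions, not values from the paper.
h, w = 300, 400
y, x = np.mgrid[0:h, 0:w]
phi = 2 * np.pi * x / 40.0                 # assumed wrapped phase (tilted test surface)
A, B = 120.0, 100.0                        # background intensity and modulation amplitude

mask = np.zeros((h, w), dtype=bool)        # assumed foreground: a centered rectangle
mask[50:250, 60:340] = True

frames = []
for k in range(4):
    fringe = A + B * np.cos(phi + k * np.pi / 2)
    fringe = np.where(mask, fringe, A)     # background pixels carry no fringe modulation
    frames.append(fringe + np.random.normal(0.0, 3.0, (h, w)))  # additive sensor noise

# I_0 - I_2 = 2*B*cos(phi) inside the fringe region and ~0 in the background,
# so the 0-vs-pi difference separates foreground from background most strongly.
diff = np.abs(frames[0] - frames[2])
print(diff[mask].mean(), diff[~mask].mean())   # foreground mean >> background mean
```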
The algorithmic processing flow, illustrated in Fig. 7, is divided into two stages: generation of the grayscale mask M1 and generation of the repair mask M2. The target image undergoes preprocessing, a differential operation, grayscale interval allocation, threshold extraction, and neighborhood local-variance analysis to achieve direct extraction of the foreground area from the final gear interference image.
First, the acquired images are preprocessed with a median filter, which effectively suppresses noise while preserving edge information and avoiding blurring of image details. The interference differential image between the two phase shifts with the largest gray-level difference is then calculated as

ΔGray(x, y) = |Gray_0(x, y) − Gray_π(x, y)|,

where Gray_0(x, y) and Gray_π(x, y) denote the gray values of the pixel at coordinates (x, y) in the interference images acquired at phase shifts of 0 and π, respectively.
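As a concrete illustration, the preprocessing and differencing step can be sketched as follows. The pairing of the 0 and π frames follows the reasoning above, while the 3 × 3 median-filter window is an assumption, since the excerpt does not state the filter size:

```python
import numpy as np
from scipy.ndimage import median_filter

def differential_image(frame_0, frame_pi, kernel=3):
    """Median-filter two phase-shifted frames and return their absolute difference.

    frame_0, frame_pi : 2-D arrays, interferograms acquired at phase shifts 0 and pi.
    kernel            : median-filter window size (3x3 assumed; not given in the paper).
    """
    f0 = median_filter(frame_0.astype(float), size=kernel)
    f1 = median_filter(frame_pi.astype(float), size=kernel)
    return np.abs(f0 - f1)

# Hypothetical usage with two acquired frames I_0 and I_pi:
# diff = differential_image(I_0, I_pi)
```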
At this stage, the processed image is ready for extraction of the foreground area, and a threshold value must be determined to separate the foreground information in the gear interference image. As depicted in Fig. 8, interval positioning requires reassigning the grayscale values of the image pixels. The lowest 5% of pixel gray values are assigned to the interval 0 to 0.1, and the highest 5% are assigned to the interval 0.9 to 1. The remaining gray values are redistributed over the range 0 to 1 and quantized into 10 smaller intervals, and the maximum gray value within each interval is recorded [14]. The purpose of this step is to exclude pixels with excessively large or small gray values, preventing them from distorting the subsequent threshold selection.
A distribution histogram is then generated from the maximum gray value within each interval, and a differential operation is applied to the maxima of adjacent intervals. Searching from left to right, the first interval showing significant growth is identified; the criterion for significance is that the change must be at least 0.5% of the maximum gray value in the image. In Fig. 8(b), the red area marks the first interval that meets this requirement after reallocation. Equation (3) is then used to calculate the threshold value T from this interval.
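The interval reassignment and threshold search can be sketched as below. Because Eq. (3) is not reproduced in this excerpt, the final mapping from the located interval back to a threshold on the original gray scale is an assumption made purely for illustration; the 5% tail clipping, the 10 intervals, and the 0.5% significance criterion follow the text:

```python
import numpy as np

def adaptive_threshold(diff_img, n_bins=10, clip=0.05, change_frac=0.005):
    """Histogram-interval threshold search (sketch of the adaptive scheme in the text).

    The threshold is taken as the lower edge of the first interval whose maximum gray
    value grows significantly, mapped back to the original gray scale; this mapping is
    an assumed stand-in for Eq. (3)."""
    g = diff_img.ravel().astype(float)
    lo, hi = np.quantile(g, clip), np.quantile(g, 1.0 - clip)   # drop the 5% tails
    norm = (np.clip(g, lo, hi) - lo) / max(hi - lo, 1e-12)      # redistribute to [0, 1]

    edges = np.linspace(0.0, 1.0, n_bins + 1)                   # 10 smaller intervals
    interval_max = np.zeros(n_bins)
    for i in range(n_bins):
        upper = edges[i + 1] if i < n_bins - 1 else 1.0 + 1e-12
        sel = (norm >= edges[i]) & (norm < upper)
        if sel.any():
            interval_max[i] = norm[sel].max()

    growth = np.diff(interval_max)                  # change between adjacent interval maxima
    significant = np.nonzero(growth >= change_frac)[0]   # >= 0.5% of the (normalized) maximum
    idx = significant[0] + 1 if significant.size else 1  # first significantly growing interval
    return lo + edges[idx] * (hi - lo)              # threshold T on the original gray scale
```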
The threshold value T is then applied to the differential image: pixels whose gray value exceeds T are classified as foreground, which yields the coarse grayscale mask M1.
To repair the mask M1, a repair mask M2 is constructed from the local fluctuation behavior of the differential image: a neighborhood window is centered on each target pixel, and the local variance within the window is evaluated.
In the differential image, pixels belonging to the foreground area exhibit significant fluctuations, reflecting variations in the underlying fringes, whereas pixels in the background area remain stable and change little with the phase shift. This disparity in fluctuation behavior distinguishes the foreground from the background. After the neighborhood window is established, the local variance of the pixels in the n × n window is calculated as

σ²(x, y) = (1/n²) Σ_{(i, j) ∈ W(x, y)} [g(i, j) − ḡ(x, y)]².    (4)

In Eq. (4), W(x, y) is the n × n window centered on the target pixel (x, y), g(i, j) is the gray value of a pixel within the window, and ḡ(x, y) is the mean gray value over the window.
When the calculated local variance of the target pixel's neighborhood window falls within the fluctuation range characteristic of the foreground, the target pixel is labeled as foreground in the repair mask; otherwise, it is labeled as background.
After the differential image has been traversed, the repair mask M2 is obtained, and the final foreground region is extracted by combining the grayscale mask M1 with the repair mask M2.
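A compact sketch of the repair stage is given below. The window size and the variance bounds are placeholders, since the exact values used with Eq. (4) are not reproduced in this excerpt, and the union used to combine M1 and M2 is likewise an assumption about the combination rule:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def repair_mask(diff_img, win=5, var_lo=10.0, var_hi=None):
    """Label a pixel as foreground when the local variance of its win x win neighborhood
    indicates fringe-like fluctuation. Window size and bounds are illustrative placeholders."""
    f = diff_img.astype(float)
    mean = uniform_filter(f, size=win)
    mean_sq = uniform_filter(f * f, size=win)
    local_var = mean_sq - mean * mean          # E[g^2] - E[g]^2 over the window, as in Eq. (4)
    m2 = local_var >= var_lo                   # enough fluctuation to be treated as foreground
    if var_hi is not None:
        m2 &= local_var <= var_hi              # optional upper bound on the accepted range
    return m2

def extract_foreground(m1, m2):
    """Combine the coarse grayscale mask M1 with the repair mask M2 (union assumed)."""
    return m1 | m2
```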
In this section, to validate the effectiveness of the proposed algorithm, a gear laser phase-shifting interferometry system was built, as depicted in Fig. 10(a). The system employs as its light source a He-Ne laser with a wavelength of 632.8 nm, a polarization ratio of 500:1, and power stability within ±5%. The driving power supply is a PZT servo-controller E53.C with an output voltage range of 0 to 120 V. The gears under measurement are depicted in Fig. 10(b), and their parameters are presented in Table 1.
TABLE 1. Parameters of the gears to be measured.
Measured Gear | Modulus | Number of Teeth | Tooth Width (mm) | Helical Angle (°) | Pressure Angle (°) |
---|---|---|---|---|---|
Helical Gear | 3 | 60 | 15 | 20 | 20 |
Spur Gear | 2.5 | 50 | 20 | - | 20 |
Three sets of gear interference fringe images were acquired using the aforementioned phase-shifting interferometry system, as depicted in Fig. 11. Figures 11(a) and 11(b) represent helical gear interference fringe images, while Fig. 11(c) represents spur gear interference fringe images.
Initially, based on the grayscale distribution pattern of the gear tooth surface fringes, the foreground area of the interference fringes in the aforementioned images was manually segmented and extracted as the reference result for this particular set. The resulting segmented mask images are displayed in Fig. 12. Figures 12(a)–12(c) correspond to the masks of groups A, B, and C, respectively. These reference results will be used for comparison and accuracy analysis in subsequent algorithm extraction processes.
For each group, differential processing was applied to the four-step phase-shift measurement, resulting in differentially extracted images that contained information regarding the phase variation of the tooth surface, as shown in Fig. 13.
The foreground areas were extracted from the above three sets of differentially processed results, and the adaptive-threshold algorithm described earlier was employed to redistribute the grayscale intervals. The red-highlighted regions in Figs. 14–16 show the located intervals in the grayscale distributions together with the respective threshold values. The threshold values for the final gear interference images of groups A, B, and C are 27, 42, and 12, respectively.
The first segmentation extraction is performed using the threshold obtained through the algorithm's automatic search. This process yields the coarsely extracted grayscale mask of the gear interference fringe image, denoted as M1.
Subsequently, by analyzing the local variance of each target pixel within its neighborhood window, the foreground and background information are distinguished, producing the repair mask, denoted as M2. Combining M1 with M2 gives the final extracted foreground regions.
First, the similarity of the extraction results is compared across methods, as depicted in Figs. 20–22. The selected comparison methods are the relative modulation algorithm (RMA), the gear object grayscale algorithm (GSA), the interference common filtering algorithm (ICFA) [12], and the algorithm proposed in this paper.
Among the selected methods, GSA (green line) and ICFA (yellow line) are widely employed in gear interferometric measurement, while RMA (purple line) is a commonly used approach for recognizing the foreground area of an interferogram [4]. In comparison, the algorithm proposed in this article shows a higher degree of similarity between the extracted image edges (red line) and the reference image edges (blue line). In particular, it aligns better with the edges of the tooth tip region A and the tooth root region B, which are known to be difficult areas to process in groups B and C.
To further analyze the foreground extraction results, the segmentation results of all pixels in each image group are compared. Tables 2–4 compare the reference segmentation results of the three groups with the results obtained using the comparative algorithms and the proposed algorithm. Groups A and B each contain 408,000 image pixels, while group C contains 745,608 image pixels. The matching accuracy of the proposed algorithm reaches 97.84% for group A, 96.21% for group B, and 99.33% for group C, an improvement of approximately 1.50% to 3.50% over the comparative algorithms.
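For reference, the accuracy in Tables 2–4 is the fraction of pixels whose foreground/background label agrees with the manually segmented reference mask (e.g. 399,222 / 408,000 ≈ 97.84% for group A). A minimal sketch of this computation, not the authors' evaluation code:

```python
import numpy as np

def matching_accuracy(pred_mask, ref_mask):
    """Fraction of pixels whose foreground/background label matches the reference mask."""
    pred = np.asarray(pred_mask, dtype=bool)
    ref = np.asarray(ref_mask, dtype=bool)
    return np.count_nonzero(pred == ref) / ref.size

# Hypothetical usage for group A:
# acc_A = matching_accuracy(extracted_mask_A, reference_mask_A)   # e.g. 0.9784 -> 97.84%
```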
TABLE 2. Comparison of accuracy between the proposed method and the gear object grayscale algorithm (GSA).
Image Group | Reference (Total Pixels) | GSA: Matched Pixels | GSA: Accuracy (%) | Proposed: Matched Pixels | Proposed: Accuracy (%) |
---|---|---|---|---|---|
A | 408,000 | 387,681 | 95.02 | 399,222 | 97.84 |
B | 408,000 | 384,975 | 94.35 | 392,564 | 96.21 |
C | 745,608 | 724,625 | 97.19 | 740,684 | 99.33 |
TABLE 3. Comparison of accuracy between the proposed method and the interference common filtering algorithm (ICFA).
Image Group | Reference (Total Pixels) | ICFA: Matched Pixels | ICFA: Accuracy (%) | Proposed: Matched Pixels | Proposed: Accuracy (%) |
---|---|---|---|---|---|
A | 408,000 | 378,065 | 92.66 | 399,222 | 97.84 |
B | 408,000 | 372,489 | 91.29 | 392,564 | 96.21 |
C | 745,608 | 720,031 | 96.56 | 740,684 | 99.33 |
TABLE 4. Comparison of accuracy between the proposed method and the relative modulation algorithm (RMA).
Image Group | Reference (Total Pixels) | RMA: Matched Pixels | RMA: Accuracy (%) | Proposed: Matched Pixels | Proposed: Accuracy (%) |
---|---|---|---|---|---|
A | 408,000 | 392,904 | 96.30 | 399,222 | 97.84 |
B | 408,000 | 385,302 | 94.43 | 392,564 | 96.21 |
C | 745,608 | 731,216 | 98.06 | 740,684 | 99.33 |
Table 5 compares the execution efficiency of the proposed algorithm with that of the alternative algorithms. While maintaining extraction accuracy, the proposed algorithm reduces the running time by 89.5% to 92.5%, because it does not depend on a separate non-interference image of the tooth surface. In contrast, the comparison algorithms require threshold adjustments and have similar runtimes; their fastest runs are reported here.
TABLE 5. Time cost evaluation.
Image Group | GSA (s) | ICFA (s) | RMA (s) | Proposed Method (s) | Maximum Time Reduction (%) |
---|---|---|---|---|---|
A | 30.19 | 29.85 | 27.93 | 2.25 | 92.55 |
B | 31.22 | 32.01 | 28.54 | 2.49 | 92.02 |
C | 39.25 | 39.07 | 36.60 | 4.06 | 89.61 |
To verify the accuracy of the algorithm's extraction results, a supervised evaluation method [15] was employed to analyze each group of images. The evaluation covered image similarity, the probabilistic Rand index (PRI) [16], the variation of information (VOI) [17, 18], and the global consistency error (GCE) [19, 20]. PRI is computed by counting the pixel pairs whose labels agree between the segmented image and the ground truth (pairs assigned to the same segment in both, or to different segments in both) and dividing by the total number of pixel pairs; its range is [0, 1], and higher values indicate better agreement. VOI measures, through entropy, the amount of information lost and gained between the reference segmentation and the algorithm's segmentation; smaller values indicate more accurate segmentation. GCE, which also ranges over [0, 1], quantifies segmentation consistency, with smaller values indicating better performance.
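For binary foreground masks, all three indices can be computed from the 2 × 2 contingency table between the algorithm's mask and the reference mask. The following numpy sketch follows the standard definitions of the cited indices and is not the authors' evaluation code:

```python
import numpy as np

def _contingency(seg, ref):
    """2 x 2 contingency table between two binary (foreground/background) segmentations."""
    seg = np.asarray(seg, dtype=bool).ravel()
    ref = np.asarray(ref, dtype=bool).ravel()
    return np.array([[np.count_nonzero(~seg & ~ref), np.count_nonzero(~seg & ref)],
                     [np.count_nonzero(seg & ~ref),  np.count_nonzero(seg & ref)]], dtype=float)

def probabilistic_rand_index(seg, ref):
    """With a single reference segmentation, PRI reduces to the Rand index: the fraction of
    pixel pairs whose same/different-segment relation agrees between the two segmentations."""
    n_ij = _contingency(seg, ref)
    pairs = lambda x: x * (x - 1) / 2.0
    same_both = pairs(n_ij).sum()
    same_seg, same_ref = pairs(n_ij.sum(axis=1)).sum(), pairs(n_ij.sum(axis=0)).sum()
    total = pairs(n_ij.sum())
    return (total - same_seg - same_ref + 2.0 * same_both) / total

def variation_of_information(seg, ref):
    """VOI = H(seg) + H(ref) - 2 * I(seg; ref), computed here in bits."""
    p = _contingency(seg, ref)
    p = p / p.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    entropy = lambda q: -np.sum(q[q > 0] * np.log2(q[q > 0]))
    mutual_info = entropy(px) + entropy(py) - entropy(p.ravel())
    return entropy(px) + entropy(py) - 2.0 * mutual_info

def global_consistency_error(seg, ref):
    """GCE: the smaller of the two directed refinement errors, averaged over all pixels."""
    n_ij = _contingency(seg, ref)
    n = n_ij.sum()
    row = n_ij.sum(axis=1, keepdims=True)   # segment sizes in seg
    col = n_ij.sum(axis=0, keepdims=True)   # segment sizes in ref
    e_seg = (n_ij * (row - n_ij) / np.maximum(row, 1.0)).sum()
    e_ref = (n_ij * (col - n_ij) / np.maximum(col, 1.0)).sum()
    return min(e_seg, e_ref) / n
```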
By analyzing these evaluation indicators, it is observed that PRI improves by approximately 2.5% to 3.5%. GCE demonstrates a reduction of around 15% to 20%, while VOI decreases by about 13% to 16%. Compared to the results obtained from the comparative algorithms, the algorithm proposed in this paper shows favorable improvements in all evaluated aspects. This is demonstrated in Tables 6–8.
TABLE 6. Probabilistic Rand index (PRI) extraction results.
Image Group | GSA | ICFA | RMA | Proposed Method |
---|---|---|---|---|
A | 0.9454 | 0.9116 | 0.9581 | 0.9832 |
B | 0.9401 | 0.9083 | 0.9484 | 0.9657 |
C | 0.9392 | 0.9161 | 0.9586 | 0.9891 |
TABLE 7. Variation of information (VOI) extraction results.
Image Group | GSA | ICFA | RMA | Proposed Method |
---|---|---|---|---|
A | 1.1379 | 1.2715 | 1.0527 | 0.9876 |
B | 1.1845 | 1.1952 | 1.1439 | 1.0322 |
C | 1.0998 | 1.1344 | 0.9844 | 0.9213 |
TABLE 8. Global consistency error (GCE) extraction results.
Image Group | GSA | ICFA | RMA | Proposed Method |
---|---|---|---|---|
A | 0.2076 | 0.2509 | 0.1932 | 0.1533 |
B | 0.2199 | 0.2347 | 0.2006 | 0.1862 |
C | 0.1530 | 0.1738 | 0.1382 | 0.1147 |
Finally, the phase information within the foreground region is extracted based on the different segmentation results, allowing for a comparison of the phase error between each algorithm and the reference result. This evaluation aims to assess the impact of the foreground extraction algorithm for interference images on the final phase accuracy. The phase extraction results of each algorithm at a specific line are depicted in Fig. 23.
Analysis of the phase extraction results for each group of measurement targets shows that the phase differences are concentrated in the edge region of the extracted foreground. Notably, the proposed algorithm outperforms the comparison algorithms in accurately recovering the phase data. For the helical gear measurements, the proposed algorithm reduces the measurement error at the edge by 11 μm, facilitating high-precision interferometric gear analysis and assessment.
In this research paper, a method is presented for the direct adaptive extraction of the foreground area on the tooth surface in gear interferometry. This method eliminates the reliance on conventional extraction methods that depend on non-interference images of the tooth surface. Multiple sets of interferometric measurements were conducted on gears with diverse parameters, and the measurement areas from these sets of interference fringe images were extracted. The extraction results obtained using different methods were both qualitatively and quantitatively compared. The findings demonstrate that the proposed method significantly enhances algorithm efficiency while maintaining the accuracy of the extracted area. This ensures precise measurements and validates the effectiveness and reliability of the proposed algorithm.
The authors declare no conflicts of interest.
Data underlying the results presented in this paper are not publicly available at the time of publication, but may be obtained from the authors upon reasonable request.
This work was supported by the National Natural Science Foundation of China (NSFC) (Grant Nos. 52205067, 61805195, and 52004213); the Natural Science Basic Research Program of Shaanxi (Grant No. 2022JQ-403); and the China Postdoctoral Science Foundation (Grant No. 2020M683683XB).