1 Introduction

With the continued development of modern manufacturing, the application of laser processing technology must expand to meet the growing demand for three-dimensional (3D) processing. A galvanometer scanner can quickly and accurately change the laser focus position and is therefore widely used in laser drilling [1, 2], welding [3,4,5,6], cutting [7,8,9], microstructure processing [10, 11], and texture marking [12, 13]. A dynamic focusing unit [14] equips the galvanometer scanner with 3D machining capabilities. Unlike in two-dimensional (2D) planar laser processing, one of the difficulties of 3D laser processing is to continually keep the laser focus spot at the specified position on the 3D workpiece. Research has shown that the degree of focus of the laser spot on the workpiece is critical for machining quality [15,16,17].

To ensure that the laser focus spot stays accurately focused on the 3D workpiece during 3D laser processing, Wang et al. [18] divided the model into sub-areas, sub-layers, and sub-blocks to keep the laser spot focused within each local area. Diaci et al. [19] proposed a rapid and flexible 3D laser processing method for curved surfaces, which uses a side-axis industrial camera and a low-power laser to measure the 3D information of the workpiece based on the laser triangulation method. However, their way of directly changing the height of the 2D patterns leads to pronounced distortion. Matija et al. [20] installed a 2D galvanometer scanner and a side-axis industrial camera at the end of a robotic arm for laser machining. The object surface was measured by the laser triangulation method, and the machining trajectory was extracted from the measured 3D information. Qi et al. [21] installed a binocular camera near the 2D galvanometer scanner for online 3D contour identification of the object for laser processing. López et al. [22] employed a laser profilometer to conduct real-time 3D contour scanning of a rock for accurate 3D laser surface cleaning.

At present, transferring patterns onto a free-form surface generally requires first drawing a 3D model and generating 3D processing programs in commercial software; directly changing the height of a 2D pattern for 3D processing causes intolerable distortion, and convenient online 3D laser processing methods are not yet available. For 3D laser processing, texture mapping of 2D patterns is a very important and practical function that transfers a 2D pattern onto a 3D surface [23, 24]. However, the currently available in situ 3D laser machining is mostly geared toward specific machining tasks, and in situ laser machining combined with 3D projection algorithms has received little attention. To project 2D patterns onto a 3D model in 3D laser machining, previous research adopted model parameterization and texture mapping technologies [25, 26]. Model parameterization in laser machining generally refers to flattening a 3D model onto the 2D plane. For a 3D model in the stereolithography (STL) format, which is composed of triangles, the goal of parameterization is to minimize the sum of the angle or area distortions of all triangles after flattening. Parameterization with the smallest angle distortion is called conformal parameterization, while parameterization with the smallest area distortion is called authalic parameterization [27, 28]. Texture mapping refers to mapping points from the parameterized 2D plane back onto the 3D model.

Guo et al. [23] parameterized the entire STL model using the conformal parameterization method and realized 2D pattern mapping on a 3D workpiece. Luo et al. [29] projected the 2D machining path onto a 3D surface through angle-based flattening (ABF) and edge-based flattening (EBF) parameterization methods. At present, most 3D laser processing relies on commercial 3D software or laser processing software for 2D pattern projection. Little work has addressed parameterization and texture mapping specifically for 3D laser processing, let alone their combination with in situ machining. Moreover, the existing research on parameterization and texture mapping in laser processing is mostly based on the STL model. However, the shape and size of the triangles in an STL mesh are uneven, so it is difficult to guarantee the model quality required for accurate parameterization: if an internal angle is too large or too small, or if the triangle sizes are very uneven, the distortion will be large or the parameterization will fail outright. For this reason, Xiao et al. [30] proposed an STL model optimization method specifically for parameterization; they reconstructed the model with a uniform triangle size by resampling and obtained better parameterization results.

The current paper proposes a novel in situ 3D laser machining system that combines the 3D projection algorithm with in situ measurement and 3D modeling, thus forming a complete “scanning-modeling-projection-machining” integrated processing system for pattern machining on the free-form surface. The main contributions of this study include (1) the realization of in situ measurement to obtain the 3D point cloud of a workpiece through self-scanning of the 3D galvanometer scanner; (2) reconstruction of the 3D model with uniform shape and size of the triangle using a high-performance Delaunay triangulation algorithm; (3) development of in situ 3D laser machining software, usage of Least-Squares Conformal Mapping (LSCM) and As-Rigid-As-Possible (ARAP) algorithms for parameterization, and analysis and comparison of algorithms through experimental cases; and (4) proposal of the local parameterization method to improve the accuracy and speed of the parameterization and proposal of the bitmap vectorization method to reduce the required calculation volume and improve both the speed and accuracy of texture mapping. The in situ 3D laser machining system proposed in this study can create an online quality-controlled model through self-scanning and improve both the speed and accuracy of model parameterization and texture mapping through local parameterization and bitmap vectorization methods. Therefore, it has strong practical application value.

2 Methods

The principle and process of the in situ 3D laser machining system are shown in Fig. 1. These include (1) obtaining the original measured images; (2) generating the 3D point cloud of the workpiece using the line-structured light method; (3) simplifying the original point cloud; (4) reconstructing the 3D model; (5) model parameterization; (6) texture mapping of the 2D patterns; and (7) generating 3D machining instructions.

Fig. 1
figure 1

Principle and process of the in situ 3D laser machining system

2.1 In situ 3D measurement

The galvanometer scanner projects the line-structured light using the indicator light of the laser, and the side-axis camera captures the corresponding images for in situ 3D measurement. The details of the measurement have been published in our previous research [31]. The model of the measurement principle is shown in Fig. 1. The coordinate systems include the world coordinate system Ow-xwywzw, the galvanometer coordinate system Og-xgygzg, the camera coordinate system Oc-xcyczc, and the image coordinate system O-uv. The camera parameters were calibrated with Zhang's method [32]. To simplify the model, the galvanometer coordinate system coincides with the world coordinate system. This study used an f-theta field lens. In Fig. 1, point a lies at the equivalent height of the galvanometer mirror, and points b and c are the two endpoints of the laser line projected by the galvanometer scanner. These three known non-collinear points can be converted to the camera coordinate system according to the external parameters of the camera, and the light plane equation in the camera coordinate system can then be determined, as shown in Eq. (1), where A, B, C, and D represent the coefficients of the light plane equation.

$$ A\cdotp {X}_c+B\cdotp {Y}_c+C\cdotp {Z}_c+D=0 $$
(1)
$$ \left\{\begin{array}{c}u={f}_x\frac{X_c}{Z_c}+{c}_x\\ {}v={f}_y\frac{Y_c}{Z_c}+{c}_y\end{array}\right. $$
(2)

The typical pinhole camera model is given by Eq. (2), where fx and fy represent the focal length expressed in units of horizontal and vertical pixels, and cx and cy represent the principal point coordinates. Equation (3) follows from Eqs. (1) and (2). Thus, a specific pixel coordinate (u, v) can be converted to the camera coordinate system (Xc, Yc, Zc) according to Eq. (3) and then to the world coordinate system (Xw, Yw, Zw) according to Eq. (4), where R and T represent the rotation matrix and translation vector of the camera's external parameters.

$$ \left(\begin{array}{ccc}\begin{array}{c}{f}_x\\ {}0\\ {}A\end{array}& \begin{array}{c}0\\ {}{f}_y\\ {}B\end{array}& \begin{array}{c}{c}_x-u\\ {}{c}_y-v\\ {}C\end{array}\end{array}\right)\left(\begin{array}{c}{X}_c\\ {}{Y}_c\\ {}{Z}_c\end{array}\right)=\left(\begin{array}{c}0\\ {}0\\ {}-D\end{array}\right) $$
(3)
$$ \left(\begin{array}{c}{X}_w\\ {}{Y}_w\\ {}{Z}_w\end{array}\right)={R}^{-1}\left(\left(\begin{array}{c}{X}_c\\ {}{Y}_c\\ {}{Z}_c\end{array}\right)-T\right) $$
(4)

Here, the specific pixel coordinates are the optical center of the line-structured light. This study used the gray centroid method [33] to extract the optical center. This method ensures high extraction accuracy, even if the gray distribution of the laser line is not uniform.
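
For clarity, the following minimal Python sketch illustrates this conversion, assuming the intrinsics (fx, fy, cx, cy), the light-plane coefficients (A, B, C, D), and the extrinsics (R, T) are already known from calibration; the function names and the simple per-column gray-centroid helper are illustrative, not the exact implementation used in this work.

```python
import numpy as np

def pixel_to_world(u, v, fx, fy, cx, cy, plane, R, T):
    """Convert an optical-center pixel (u, v) to world coordinates.

    Eq. (3): intersect the camera ray through (u, v) with the light plane
    A*Xc + B*Yc + C*Zc + D = 0; Eq. (4): transform the camera-frame point to the world frame.
    plane = (A, B, C, D); R is the 3x3 rotation matrix; T is the translation vector.
    """
    A, B, C, D = plane
    M = np.array([[fx, 0.0, cx - u],
                  [0.0, fy, cy - v],
                  [A,   B,   C]])
    rhs = np.array([0.0, 0.0, -D])
    Xc = np.linalg.solve(M, rhs)            # (Xc, Yc, Zc) in the camera frame
    return np.linalg.inv(R) @ (Xc - T)      # Eq. (4): camera frame -> world frame

def gray_centroid(intensities):
    """Sub-pixel optical-center position along one image column (gray centroid method)."""
    idx = np.arange(len(intensities), dtype=float)
    w = np.asarray(intensities, dtype=float)
    return float(np.sum(idx * w) / np.sum(w))
```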

2.2 3D reconstruction

To reconstruct the 3D point cloud that was obtained from the in situ measurement into a 3D model, it is necessary to filter and simplify the original point cloud. This is needed to meet the accuracy and calculation speed requirements of the 3D reconstruction and subsequent model parameterization. Because the in situ 3D measurement obtains the point cloud of the upper surface of the workpiece, this study used a 2D uniform grid method to simplify the point cloud. This method achieves fast calculation speed, and the simplified point cloud is evenly distributed, which is conducive to the construction of a triangle mesh with uniform shape and size. This method uses a rectangular bounding box to divide the original point cloud into multiple uniform and small square grids in the XY plane. The coordinates of the original points in each square grid are averaged and used as the coordinates of a simplified point. The side length of the small square grid is determined according to the accuracy requirements.
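
A minimal sketch of the 2D uniform grid simplification is given below, assuming the point cloud is stored as an (N, 3) NumPy array; the cell-key construction is an illustrative implementation detail, not taken from the original software.

```python
import numpy as np

def simplify_uniform_grid(points, cell_size):
    """Average all points falling in each square XY grid cell into one simplified point.

    points: (N, 3) array of measured points; cell_size: grid side length in mm.
    """
    xy_min = points[:, :2].min(axis=0)
    idx = np.floor((points[:, :2] - xy_min) / cell_size).astype(np.int64)
    keys = idx[:, 0] * 1_000_000 + idx[:, 1]      # one key per cell (assumes < 1e6 cells per row)
    order = np.argsort(keys)
    keys, points = keys[order], points[order]
    boundaries = np.flatnonzero(np.diff(keys)) + 1
    return np.array([g.mean(axis=0) for g in np.split(points, boundaries)])
```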

After the point cloud was simplified, a high-performance Delaunay triangulation algorithm [34] was used to connect the discrete points into triangles, thus constructing a triangle mesh with a topological structure. In this study, whenever the length of any side of a triangle exceeds a set value, the triangle is deleted; this avoids adding noise points to the model and ensures that holes in the model are retained. The set value is determined according to the accuracy requirements and the side length of the small square grid.
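
The following sketch illustrates the edge-length filter, using SciPy's generic Delaunay triangulation as a stand-in for the high-performance algorithm of Ref. [34]; the names and the filtering criterion (any edge longer than max_edge) simply follow the description above.

```python
import numpy as np
from scipy.spatial import Delaunay

def triangulate_with_edge_filter(points, max_edge):
    """Triangulate the simplified points in the XY plane and discard long-edged triangles."""
    tri = Delaunay(points[:, :2])                 # 2D triangulation of the top surface
    keep = []
    for f in tri.simplices:                       # (3,) vertex indices of one triangle
        p = points[f]
        edge_lengths = np.linalg.norm(p - np.roll(p, -1, axis=0), axis=1)
        if edge_lengths.max() <= max_edge:        # remove noise, preserve holes
            keep.append(f)
    return np.asarray(keep)
```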

2.3 Model parameterization

To transfer the 2D pattern onto the 3D surface, the 3D model obtained from in situ measurement and reconstruction must first be flattened onto a 2D plane by a model parameterization method. This study used the LSCM and ARAP algorithms for model parameterization. The LSCM algorithm is a free-boundary conformal parameterization method that minimizes angle distortion. The ARAP algorithm further optimizes the LSCM result to minimize area distortion.

Because the 2D-to-2D transformation relationship can be easily calculated, the representation of the 3D triangle mesh should be reduced to 2D space. To do so, a local orthonormal coordinate system was established for each triangle of the 3D model. Suppose there is a space triangle T = {p1, p2, p3}; let a = p2 − p1 and b = p3 − p1, and define x = a/|a|, n = a × b/|a × b|, and y = n × x/|n × x|. Then x and y form a local orthonormal basis of T. In this local coordinate system, p1 = (0, 0), p2 = (a·x, a·y), and p3 = (b·x, b·y), so the representation of each triangle is reduced to 2D.
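
A short sketch of this local frame construction (illustrative names, NumPy assumed):

```python
import numpy as np

def triangle_local_coords(p1, p2, p3):
    """Return the 2D coordinates of a 3D triangle in its local orthonormal frame."""
    a, b = p2 - p1, p3 - p1
    x = a / np.linalg.norm(a)                     # first in-plane axis
    n = np.cross(a, b)
    n = n / np.linalg.norm(n)                     # unit normal
    y = np.cross(n, x)                            # second in-plane axis (already unit length)
    return np.array([[0.0, 0.0],
                     [a @ x, a @ y],
                     [b @ x, b @ y]])
```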

2.3.1 LSCM algorithm

The core of the LSCM algorithm is to construct the left matrix A and the right vector B of a linear system, as shown in Eqs. (5) and (6), and finally to solve the parameterization results. The detailed derivation process was published previously [35].

$$ C(x)={\left\Vert Ax-B\right\Vert}^2 $$
(5)
$$ A=\left(\begin{array}{cc}{M}_f^1& -{M}_f^2\\ {}{M}_f^2& {M}_f^1\end{array}\right),B=-\left(\begin{array}{cc}{M}_p^1& -{M}_p^2\\ {}{M}_p^2& {M}_p^1\end{array}\right)\left(\begin{array}{c}{U}_p^1\\ {}{U}_p^2\end{array}\right) $$
(6)

Here, C(x) is an energy function that reflects the degree of deformation. Since C(x) is positively related to the area of each triangle, shrinking the triangles also decreases the energy function; without constraints, collapsing all triangles to a point would therefore yield a degenerate (trivial) solution. To avoid this, at least two vertices must be fixed in advance as constraints; the coordinates of these fixed points are Up. Generally, two points on the surface boundary that are far from each other are selected as the fixed points. In Eq. (6), Mf is an F × (V − 2) matrix and Mp is an F × 2 matrix, where F is the number of triangles and V is the number of vertices; the superscripts 1 and 2 denote the x and y parts of the coordinates, respectively. The matrix M = (Mf Mp) = (mij) is a sparse F × V matrix whose coefficients are calculated by Eqs. (7) and (8).

$$ m_{ij}=\begin{cases}\dfrac{W_{j,T_i}^k}{\sqrt{d_{T_i}}} & \text{if vertex } j \text{ belongs to triangle } T_i,\ \ j=1,2,3,\ \ k=1,2\\ 0 & \text{otherwise}\end{cases} $$
(7)
$$ \left\{\begin{array}{c}{W}_{1,{T}_i}^1={x}_3-{x}_2,\kern0.5em {W}_{1,{T}_i}^2={y}_3-{y}_2\ \\ {}{W}_{2,{T}_i}^1={x}_1-{x}_3,\kern0.5em {W}_{2,{T}_i}^2={y}_1-{y}_3\ \\ {}{W}_{3,{T}_i}^1={x}_2-{x}_1,\kern0.5em {W}_{3,{T}_i}^2={y}_2-{y}_1\ \end{array}\right. $$
(8)

where dTi represents the area of triangle Ti, and x and y are the local 2D coordinates of the triangle vertices defined above. In Eq. (5), the matrix A has full rank when the number of fixed points is at least 2; the least-squares problem therefore has a unique minimizer, which is the parameterization result.
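
The following compact sketch assembles the sparse matrix of Eqs. (7) and (8) and solves the least-squares problem of Eqs. (5) and (6); it uses SciPy's generic sparse least-squares solver rather than the solver used in this work, and all names are illustrative.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def lscm(vertices, faces, pinned, pinned_uv):
    """Sketch of LSCM (Eqs. (5)-(8)).

    vertices: (V, 3) array; faces: (F, 3) vertex indices;
    pinned: indices of >= 2 fixed vertices; pinned_uv: their (len(pinned), 2) coordinates.
    Returns a (V, 2) array of parameterized coordinates.
    """
    pinned = np.asarray(pinned)
    pinned_uv = np.asarray(pinned_uv, dtype=float)
    V, F = len(vertices), len(faces)
    rows, cols, re, im = [], [], [], []
    for t, (i0, i1, i2) in enumerate(faces):
        p1, p2, p3 = vertices[i0], vertices[i1], vertices[i2]
        a, b = p2 - p1, p3 - p1
        x_ax = a / np.linalg.norm(a)
        n = np.cross(a, b)
        area = 0.5 * np.linalg.norm(n)                  # d_Ti in Eq. (7)
        y_ax = np.cross(n / np.linalg.norm(n), x_ax)
        # Local 2D coordinates of the three vertices (Section 2.3)
        (x1, y1), (x2, y2), (x3, y3) = (0.0, 0.0), (a @ x_ax, a @ y_ax), (b @ x_ax, b @ y_ax)
        W = np.array([[x3 - x2, y3 - y2],               # Eq. (8)
                      [x1 - x3, y1 - y3],
                      [x2 - x1, y2 - y1]]) / np.sqrt(area)
        for j, vid in enumerate((i0, i1, i2)):
            rows.append(t); cols.append(vid)
            re.append(W[j, 0]); im.append(W[j, 1])
    M1 = sp.csr_matrix((re, (rows, cols)), shape=(F, V))   # real part of M, Eq. (7)
    M2 = sp.csr_matrix((im, (rows, cols)), shape=(F, V))   # imaginary part of M
    free = np.setdiff1d(np.arange(V), pinned)
    A = sp.bmat([[M1[:, free], -M2[:, free]],
                 [M2[:, free],  M1[:, free]]]).tocsr()      # Eq. (6), left matrix
    Up = np.concatenate([pinned_uv[:, 0], pinned_uv[:, 1]])
    B = -sp.bmat([[M1[:, pinned], -M2[:, pinned]],
                  [M2[:, pinned],  M1[:, pinned]]]).tocsr() @ Up
    x = lsqr(A, B)[0]                                       # minimize ||Ax - B||^2, Eq. (5)
    uv = np.zeros((V, 2))
    uv[free, 0], uv[free, 1] = x[:len(free)], x[len(free):]
    uv[pinned] = pinned_uv
    return uv
```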

The size and orientation of the parameterization result can be controlled by the coordinates of the two fixed points. To keep the overall size of the triangle mesh unchanged by the parameterization, and hence the size of the 2D patterns unchanged by the texture mapping, this paper proposes a two-step parameterization method. First, the ratios of all triangle edge lengths before and after an initial parameterization are sorted, and the mean of the middle one-half of the sorted ratios is taken as the adjustment factor. Second, this adjustment factor is used to adjust the distance between the two fixed points, and the parameterization is run again. This ensures that the overall size of the triangle mesh and of the 2D patterns remains unchanged through parameterization and texture mapping.
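
A minimal sketch of the adjustment-factor computation (illustrative names; exactly how the factor rescales the distance between the pinned vertices depends on how the ratio is defined, so that step is left as a comment):

```python
import numpy as np

def adjustment_factor(vertices3d, uv, faces):
    """Adjustment factor of the two-step parameterization (Section 2.3.1).

    The 2D/3D length ratio of every triangle edge is computed, the ratios are sorted,
    and the middle one-half of them is averaged to reject badly distorted triangles.
    """
    ratios = []
    for i0, i1, i2 in faces:
        for a, b in ((i0, i1), (i1, i2), (i2, i0)):
            l3d = np.linalg.norm(vertices3d[a] - vertices3d[b])
            l2d = np.linalg.norm(uv[a] - uv[b])
            ratios.append(l2d / l3d)
    ratios = np.sort(ratios)
    n = len(ratios)
    return float(ratios[n // 4: 3 * n // 4].mean())      # mean of the middle one-half

# The distance between the two pinned vertices is then rescaled by this factor
# and the parameterization is run a second time.
```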

2.3.2 ARAP algorithm

The result of LSCM parameterization can be further optimized by ARAP parameterization. The idea underlying the ARAP algorithm is to connect all triangles on the 2D plane through rotation transformations. ARAP is a local-global algorithm [36] that is solved iteratively. In the local stage, the Jacobian matrix Jt(u) of the mapping between the 3D space triangle \( {x}_t=\left\{{x}_t^0,{x}_t^1,{x}_t^2\right\} \) and the 2D plane triangle \( {u}_t=\left\{{u}_t^0,{u}_t^1,{u}_t^2\right\} \) is calculated first; the transformation matrix Lt is then obtained by singular value decomposition (SVD) of Jt(u) with the singular values set to 1, as shown in Eq. (9). In the global stage, Lt is held fixed. Equation (10) gives the energy function, and the parameterization result is obtained by solving the linear system of Eq. (11), which follows from setting the gradient of the energy function to zero. These two stages are repeated until the energy function essentially stops changing.

$$ \left\{\begin{array}{c}{J}_t(u)=\sum \limits_{i=0}^2\cot \left({\theta}_t^i\right)\left({u}_t^i-{u}_t^{i+1}\right){\left({x}_t^i-{x}_t^{i+1}\right)}^T\\ {}{J}_t(u)=U\Sigma {V}^T\kern6.5em \\ {}{L}_t=U{V}^T\kern6.5em \end{array}\right. $$
(9)
$$ E\left(u,L\right)=\frac{1}{2}\sum \limits_{\left(i,j\right)\in he}\cot \left({\theta}_{\mathrm{ij}}\right){\left\Vert \left({u}_i-{u}_j\right)-{L}_{t\left(i,j\right)}\left({x}_i-{x}_j\right)\right\Vert}^2 $$
(10)
$$ \sum \limits_{j\in N(i)}\left[\cot \left({\theta}_{\mathrm{ij}}\right)+\cot \left({\theta}_{\mathrm{ji}}\right)\right]\left({u}_i-{u}_j\right)=\sum \limits_{j\in N(i)}\left[\cot \left({\theta}_{\mathrm{ij}}\right){L}_{t\left(i,j\right)}+\cot \left({\theta}_{\mathrm{ji}}\right){L}_{t\left(j,i\right)}\right]\left({x}_i-{x}_j\right) $$
(11)

In Eq. (9), uti and xti represent vertex coordinates, and θti represents the angle opposite the edge (xti, xti+1). In Eq. (10), the energy depends on the parameterization coordinates u and on the set L of per-triangle transformations. Here, the half-edge structure of the triangle mesh is used: ui and xi are vertex coordinates, he is the set of half-edges, t(i, j) is the triangle containing edge (i, j), θij is the angle opposite edge (i, j), N(i) is the set of all vertices adjacent to xi, Lt(i, j) is the transformation matrix of t(i, j), and ||·|| denotes the Frobenius norm.

Unlike the LSCM algorithm, the ARAP algorithm does not require fixed points to constrain the parameterization result. Because ARAP solves a non-linear problem, it requires an initial parameterization and proceeds iteratively. When the change of the energy function between two iterations falls below a threshold, the parameterization is considered finished. In this study, the threshold was set to 0.001; it can be adjusted to control how far the result moves from the LSCM initialization toward the fully converged ARAP solution.
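
The local stage of Eq. (9) can be sketched as follows, assuming the per-triangle local 2D coordinates and cotangent weights have been precomputed; a full implementation would also enforce det(Lt) > 0 to exclude reflections, a detail not discussed above.

```python
import numpy as np

def arap_local_step(x_local, uv, faces, cot):
    """Local stage of ARAP (Eq. (9)): best-fitting rotation L_t for every triangle.

    x_local: (F, 3, 2) local 2D coordinates of each triangle's 3D vertices;
    uv: (V, 2) current parameterization; faces: (F, 3) vertex indices;
    cot[t, i]: cotangent of the angle opposite the edge (x_t^i, x_t^{i+1}).
    """
    L = []
    for t, face in enumerate(faces):
        u = uv[face]                  # (3, 2) plane coordinates
        x = x_local[t]                # (3, 2) local coordinates of the space triangle
        J = np.zeros((2, 2))
        for i in range(3):
            du = u[i] - u[(i + 1) % 3]
            dx = x[i] - x[(i + 1) % 3]
            J += cot[t, i] * np.outer(du, dx)   # Jacobian of the mapping, Eq. (9)
        U, _, Vt = np.linalg.svd(J)
        # Singular values set to 1; a robust version would flip the sign of the
        # last column of U when det(U @ Vt) < 0, so L_t is a rotation, not a reflection.
        L.append(U @ Vt)
    return L
```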

2.4 Texture mapping

After model parameterization, the 2D patterns to be processed are mapped onto the 3D model surface through texture mapping to obtain the 3D machining paths. First, the 2D patterns are projected onto the parameterized plane, and the triangle containing each point of the 2D patterns is quickly identified through a K-nearest-neighbor search. Then, the barycentric coordinate method is used to establish the mapping between points in the plane mesh and points in the surface mesh.

Let q be a point in the 2D plane triangle \( {u}_t=\left\{{u}_t^0,{u}_t^1,{u}_t^2\right\} \) and p be the corresponding point in the 3D space triangle \( {x}_t=\left\{{x}_t^0,{x}_t^1,{x}_t^2\right\} \). When the vertex coordinates of ut and xt are known, the coordinates of p can be calculated from the coordinates of q according to Eqs. (12) and (13):

$$ p={D}_0{x}_t^0+{D}_1{x}_t^1+{D}_2{x}_t^2 $$
(12)
$$ {D}_0=\frac{d_{T_i\left(q,\kern0.5em {u}_t^1,{u}_t^2\right)}}{d_{T_i\left({u}_t^0,{u}_t^1,{u}_t^2\right)}} $$
(13)

where D0, D1, and D2 represent the area ratios (D1 and D2 are defined analogously to D0 in Eq. (13)), \( {d}_{T_i} \) represents the area of the corresponding triangle, and D0 + D1 + D2 = 1.
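
A minimal sketch of this barycentric mapping (Eqs. (12) and (13)), with illustrative names:

```python
import numpy as np

def map_point_to_surface(q, u_t, x_t):
    """Map a point q inside the plane triangle u_t to the space triangle x_t (Eqs. (12)-(13))."""
    def area(a, b, c):
        # Unsigned area of a 2D triangle
        return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    d = area(u_t[0], u_t[1], u_t[2])
    D0 = area(q, u_t[1], u_t[2]) / d
    D1 = area(u_t[0], q, u_t[2]) / d
    D2 = area(u_t[0], u_t[1], q) / d        # D0 + D1 + D2 = 1 when q lies inside the triangle
    return D0 * x_t[0] + D1 * x_t[1] + D2 * x_t[2]   # Eq. (12)
```

Because the weights are computed entirely in the parameterized plane and reused for the 3D vertices, the mapping preserves the position of q relative to its containing triangle.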

3 Experiments and discussion

3.1 Equipment condition

A 3D galvanometer scanner (SCANLAB intelliSCANse 14 and varioSCANde20i) equipped with a 255-mm focal length f-theta lens was used in this study. A 5-megapixel monochrome industrial camera (MER-531-20GM-P) was affixed to the side of the 3D galvanometer scanner; its field of view was about 100 mm × 100 mm at a working distance of 150 mm. The laser source was a 1064-nm fiber laser (RFL-P30Q) with a 680-nm indicator light. The 3D galvanometer scanner and camera were mounted on a four-axis laser machining machine, as shown in Fig. 2.

Fig. 2
figure 2

Four-axis in situ laser machining machine

In this study, different colors are used to distinguish the models in each step. Point cloud models are rendered with a height-based color map. The green model represents the 3D model to be parameterized, the red model represents the 2D model after parameterization, the gray model represents the original 3D model after segmentation in the local parameterization method, and the black patterns represent the 2D or 3D processing patterns.

3.2 In situ measurement and 3D reconstruction

To facilitate observation and measurement, arc and triangle parts were used for the in situ 3D laser machining experiments. During in situ measurement, the galvanometer scanner projected a laser line every 0.25 mm, the camera exposure time was 50 ms, and the scanning speed of the galvanometer scanner was 5000 mm/s; after each picture was captured, the galvanometer scanner moved the laser line to the next position. The original 3D point clouds obtained by in situ measurement are shown in Fig. 3a and b. A 2D uniform grid (with a side length of 0.5 mm) was then used to simplify the point clouds, and the results are shown in Fig. 3c and d. The number of points decreased from 170,074 to 8086 for the arc part and from 169,832 to 8031 for the triangle part. The simplification rate was about 95% and can be adjusted through the side length of the 2D uniform grid according to model complexity and accuracy requirements. The high-performance Delaunay triangulation algorithm was used to reconstruct the simplified point clouds and generate 3D models in the form of triangle meshes, as shown in Fig. 3e and f. The local enlarged view in the red box shows that the size and shape of the triangles are uniform, which is conducive to accurate parameterization.

Fig. 3
figure 3

In situ measurement and reconstruction results of arc and triangle parts. Original point cloud of (a) arc and (b) triangle parts; simplified point cloud of (c) arc and (d) triangle parts; 3D reconstruction results of (e) arc and (f) triangle parts; the red box shows a partially enlarged view

3.3 Entire model parameterization and texture mapping

The LSCM and ARAP algorithms were used to parameterize the models shown in Fig. 3e and f. Figure 4 shows the results of parameterization and texture mapping. The parameterization results of the LSCM algorithm show obvious distortion (Fig. 4a and b), because the theoretical shape of the arc and triangle models after parameterization should be approximately rectangular. The parameterization results of the ARAP algorithm are closer to the theoretical results (Fig. 4c and d). A chessboard pattern was projected onto the parameterized plane and then mapped back onto the 3D model. The texture mapping results based on LSCM parameterization also show obvious distortion, and the surrounding edges of the chessboard are not parallel; those based on ARAP parameterization show no obvious distortion, and the surrounding edges are parallel. This is because the ARAP algorithm better guarantees minimum area distortion. Therefore, when the entire model is parameterized, the ARAP algorithm achieves better parameterization and texture mapping results than the LSCM algorithm.

Fig. 4
figure 4

Entire parameterization and texture mapping results based on the LSCM and ARAP algorithms. LSCM parameterization and texture mapping results of (a) arc and (b) triangle models. ARAP parameterization and texture mapping results of (c) arc and (d) triangle models. Red models are parameterization results, and green models show top views of 3D models

3.4 Local parameterization and accuracy test

Since parameterization is a global optimization process, the shape of every triangle is considered and the distortion is distributed evenly over the model. When the model is large or complex, the calculation is therefore slow and inefficient, and the parameterization accuracy is difficult to guarantee because specific local structures affect the parameterization results. To improve the accuracy and speed of parameterization, this study proposes a local parameterization method. First, the 2D patterns are moved above the part of the 3D model to be processed, and the 3D model is divided by a rectangular bounding box large enough to surround all 2D patterns (Fig. 5a and b). The triangles inside the bounding box form a new 3D surface model, shown as the green model in Fig. 5; only this green model is parameterized, while the parts not involved in the projection are left out. This greatly reduces the amount of data involved in the parameterization, avoids interference from the rest of the model, and improves the parameterization accuracy, as sketched below.
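
A minimal sketch of the bounding-box segmentation is given below; whether a triangle is kept when only some of its vertices fall inside the box is not specified above, so the all-vertices-inside rule used here is an assumption.

```python
import numpy as np

def crop_mesh_to_bbox(vertices, faces, xmin, xmax, ymin, ymax):
    """Extract the sub-mesh whose triangles lie inside the XY bounding box of the 2D patterns."""
    inside = ((vertices[:, 0] >= xmin) & (vertices[:, 0] <= xmax) &
              (vertices[:, 1] >= ymin) & (vertices[:, 1] <= ymax))
    keep = inside[faces].all(axis=1)        # keep a triangle only if all three vertices are inside
    new_faces = faces[keep]
    used = np.unique(new_faces)
    remap = -np.ones(len(vertices), dtype=np.int64)
    remap[used] = np.arange(len(used))      # re-index vertices of the cropped sub-mesh
    return vertices[used], remap[new_faces]
```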

Fig. 5
figure 5

Local parameterization, texture mapping, and actual machining results. The machining position projection and surface segmentation results of (a) arc and (b) triangle models; the local parameterization results of (c) arc and (d) triangle models; texture mapping results of (e) arc and (f) triangle models based on ARAP parameterization; actual in situ machining results of (g) arc and (h) triangle models based on ARAP parameterization

Figure 5a and b show that the complexity and data volume of the green model are significantly reduced compared with the original model. The local parameterization results show significantly less distortion than the entire parameterization results in Section 3.3. The parameterization results based on the LSCM and ARAP algorithms are basically identical (Fig. 5c and d). The texture mapping results shown in Fig. 5e and f and the actual machining results in Fig. 5g and h show that the machining position and size of the pattern are consistent with the design, and there is no obvious distortion.

The accuracy of the machining results was assessed with a flatbed photocopier scanner and an optical microscope. The overall side lengths of the processed patterns were measured using DigitalMicrograph software (l1 in Fig. 6), and the side lengths of a single small square were measured with the optical microscope (l2 and l3 in Fig. 6). The ideal values are obtained by multiplying the pixel pitch by the number of pixel pitches spanned. Here, l1 spans 70 × 11 − 2 = 768 pixel pitches and the pixel pitch is 0.035 mm; therefore, the ideal value of l1 is 26.880 mm. l2 and l3 span 70 − 2 = 68 pixel pitches (minus 2 because there is a blank pixel space between adjacent small squares); therefore, the ideal value of l2 and l3 is 2.380 mm. The measurement results are shown in Table 1. The relative error of l1 is about 1%, and the errors based on the ARAP algorithm are slightly smaller than those based on the LSCM algorithm. The final machining accuracy results from the superposition of multiple factors, such as the calibration accuracy of the galvanometer scanner, the processing speed, and the laser delay parameters. The measured machining accuracy basically meets the requirements of macro-scale 3D laser machining.

Fig. 6
figure 6

Accuracy test of actual machining on (a) triangle and (b) arc parts, where l1 is the overall side length and l2 and l3 are the side lengths of the small square

Table 1 Results of the accuracy test

3.5 Bitmap vectorization

This study developed in situ machining software (Fig. 7), which imports the point cloud files generated by in situ measurement, automatically generates the triangle mesh model, and projects 2D bitmap and vector format files onto the surface of the 3D model. Because bitmap images generally contain many pixels, performing the texture mapping calculation for every pixel would be computationally expensive, and it would be difficult to guarantee the accuracy of each pixel. To reduce the calculation volume and improve the accuracy of texture mapping, this paper proposes a bitmap vectorization method: runs of continuous black pixels are converted into line segments along the X or Y direction of the bitmap, and the length of each segment is determined by the pixel pitch and the number of pixels in the run. In this way, the texture mapping calculation is performed only on the two endpoints of each line segment and on the intersection points between the segment and the mesh triangles. This greatly reduces the calculation volume of texture mapping and unifies the texture mapping procedure for bitmap and vector format files.
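
A minimal sketch of the row-wise run-length conversion is given below (the column-wise case is analogous); the grayscale threshold and all names are illustrative assumptions.

```python
import numpy as np

def bitmap_to_segments(image, pixel_pitch, threshold=128):
    """Convert each maximal run of black pixels in a bitmap row into one line segment.

    image: 2D grayscale array; pixel_pitch: physical size of one pixel in mm.
    Returns segments as (x_start, y, x_end, y) tuples in millimetres.
    """
    segments = []
    black = np.asarray(image) < threshold               # True where the pixel counts as "black"
    for row in range(black.shape[0]):
        cols = np.flatnonzero(black[row])
        if cols.size == 0:
            continue
        breaks = np.flatnonzero(np.diff(cols) > 1) + 1   # split into maximal consecutive runs
        for run in np.split(cols, breaks):
            y = row * pixel_pitch
            segments.append((run[0] * pixel_pitch, y, run[-1] * pixel_pitch, y))
    return segments
```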

Fig. 7
figure 7

Interface of the in situ machining software

An experimental case study was conducted on a resin 3D-printed part with a wavy curved surface; the measurement results are shown in Fig. 8a. The school logo (in bitmap format) and hollowed-out text (in vector format) were imported into the software. The range of local parameterization was a rectangular bounding box surrounding all 2D patterns. The software interface in Fig. 7 shows the texture mapping results after ARAP parameterization, and the partial enlarged view in Fig. 8b shows that the school logo is stored and displayed in the form of line segments. The actual machining results are shown in Fig. 8c: the patterns were processed accurately and completely according to the designed position and size. Because of the high damage threshold of the resin, the trace produced by the fiber laser is somewhat shallow. A demonstration video of this case is provided as Online Resource 1.

Fig. 8
figure 8

In situ machining of bitmap and vector format patterns on a wave-surface part. a Original and simplified point cloud of the wave surface obtained from in situ measurement; b school logo in bitmap format and text in vector format are projected onto the local parameterization results, and the green box shows a partially enlarged view; (c) actual in situ machining results

3.6 Parameterization with complex boundary shape

In the above cases, the local parameterization method produced a green model with a simple rectangular boundary. In this case study, a spoon was used as the experimental object; because the spoon has a handle and a head, its model boundary consists of complex convex and concave shapes. The in situ measurement results are shown in Fig. 9a. After the 3D model was reconstructed, the local parameterization method was applied, forming green and red models that also have convex and concave boundaries (Fig. 9b and c). A dandelion pattern was then projected onto the handle and head of the spoon. The LSCM parameterization caused large distortion: the spoon handle was stretched to a larger size, the spoon head was compressed to a smaller size, and the dandelion pattern became incomplete after texture mapping (Fig. 9b), indicating that the parameterization result was incorrect. The LSCM algorithm uses only two fixed points to constrain the overall deformation and limits only the angle distortion; when both concave and convex boundaries exist, the uneven boundary shape makes it difficult to keep the area distortion uniform. Because the ARAP algorithm guarantees minimum area distortion together with a certain degree of angle preservation, it yields better parameterization results for models with complex boundary shapes (Fig. 9c). In Fig. 10, texture mapping and in situ machining were conducted based on the ARAP parameterization result, and the actual machining results are consistent with the design.

Fig. 9
figure 9

In situ measurement, parameterization, and texture mapping results of a spoon. a In situ measurement results; b results of local parameterization and texture mapping based on the LSCM algorithm; and c results of local parameterization and texture mapping based on the ARAP algorithm

Fig. 10
figure 10

Designed and actual machining results of patterns on a spoon. a Texture mapping results of a dandelion pattern; b actual in situ machining results

4 Conclusion

The in situ 3D laser machining system proposed in this study formed a complete “scanning-modeling-projection-machining” integrated processing system for pattern machining on a free-form surface. The galvanometer self-scanning method was used for in situ measurement, the 2D uniform grid method was used to simplify the point cloud, and a high-efficiency Delaunay triangulation algorithm was employed for 3D reconstruction to generate a 3D model with uniform triangle shape and size. Both LSCM and ARAP algorithms were used for model parameterization. A local parameterization method was proposed, which decreased the volume of model data involved in the parameterization calculation, to improve the accuracy and speed of parameterization. A bitmap vectorization method was proposed to decrease the volume of texture mapping calculation and to improve the texture mapping speed and accuracy. An in situ machining software was developed, and both algorithms were verified by actual in situ machining experiments. The LSCM algorithm achieves a fast calculation speed but generates a large distortion if the model is complex. The ARAP algorithm has a slightly slower calculation speed because of the required iteration, but it can guarantee parameterization accuracy.

The advantages of the developed method are as follows: (1) compared with existing in situ processing methods, the proposed machining system has 3D projection ability and can transfer a 2D pattern onto a 3D surface; (2) compared with existing parameterization methods in laser processing, it uses a self-built model, avoiding the difficulty of guaranteeing STL model quality; (3) the in situ method eliminates the steps of assembly and clamping of parts; and (4) the local parameterization and bitmap vectorization methods improve both the speed and accuracy of parameterization and texture mapping.