
WO2025121204A1 - Inspection device (Dispositif d'inspection) - Google Patents

Inspection device (Dispositif d'inspection)

Info

Publication number
WO2025121204A1
Authority
WO
WIPO (PCT)
Prior art keywords
alignment
target
coordinates
origin
measurement value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/041788
Other languages
English (en)
Japanese (ja)
Inventor
歩 高島
洋樹 杉原
敏行 陣田
達弥 岡田
遥 藤重
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toray Engineering Co Ltd
Original Assignee
Toray Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2023205564A (published as JP2025090367A)
Priority claimed from JP2023205561A (published as JP2025090364A)
Application filed by Toray Engineering Co Ltd filed Critical Toray Engineering Co Ltd
Publication of WO2025121204A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras

Definitions

  • This disclosure relates to an inspection device.
  • an inspection device that inspects a circuit board based on an image of the circuit board captured by a camera.
  • One type of inspection device involves placing a substrate on a stage on which multiple chips are arranged, capturing an image of the substrate placed on the stage with a camera, and measuring the coordinates of each chip on the substrate based on the captured image.
  • the measured coordinate values of each chip on the board obtained based on the captured image may deviate from the true values due to various error factors.
  • This disclosure has been made in light of these points, and its purpose is to provide an inspection device that can obtain accurate coordinates of a chip mounted on a substrate.
  • the inspection device comprises a stage on which a target substrate having a plurality of chips arranged side by side is placed, a camera that images the target substrate placed on the stage to obtain a captured image divided into a plurality of pixels, and a controller that calculates target measurement values of the target coordinates of a chip relative to a target origin provided on the target substrate, based on the distance, in the direction in which the pixels are arranged, between an image reference position in the image and the chip shown in the image, and on a relative movement setting value between the stage and the camera; the controller corrects the target measurement values of the target coordinates by a projective transformation matrix to calculate target correction values of the target coordinates.
  • an alignment substrate corresponding to the target substrate is provided with a plurality of alignment marks corresponding to the plurality of chips and with an alignment origin corresponding to the target origin; the controller calculates alignment measurement values of the alignment coordinates of each alignment mark relative to the alignment origin based on the distance, in the pixel arrangement direction, between the image reference position and the alignment mark shown in the captured image with the alignment origin aligned with the image reference position; the controller acquires known alignment actual measurement values of the alignment coordinates and calculates the parameters of the projective transformation matrix based on the alignment measurement values and the alignment actual measurement values.
  • the target measurement values of the target coordinates of the chip on the target substrate obtained based on the captured image may deviate from the true values due to various error factors.
  • the inspection device disclosed herein uses a projective transformation matrix to convert the target measurement value into a target correction value, bringing it closer to the true value.
  • the alignment mark of the alignment substrate is used with the alignment origin aligned with the image reference position.
  • the parameters of the projective transformation matrix are then calculated based on the alignment measurement values and known actual alignment measurement values. This makes it possible to obtain a highly accurate projective transformation matrix.
  • a plurality of the chips are arranged on the target substrate in a first direction and a second direction intersecting the first direction
  • the controller calculates a first target measurement value of a first target coordinate of the chip in the first direction relative to the target origin based on the horizontal distance, in the horizontal direction in which the pixels are aligned, between the image reference position and the chip shown in the captured image, and on a first relative movement setting value between the stage and the camera in the first direction, and calculates a second target measurement value of a second target coordinate of the chip in the second direction relative to the target origin based on the vertical distance, in the vertical direction in which the pixels are aligned, between the image reference position and the chip shown in the captured image, and on a second relative movement setting value between the stage and the camera in the second direction
  • the controller corrects the first target measurement value of the first target coordinate by the projective transformation matrix to calculate a first target correction value, and likewise corrects the second target measurement value of the second target coordinate by the projective transformation matrix to calculate a second target correction value
  • the controller uses the projective transformation matrix to correct shifts in the target measurement values caused by camera distortion to obtain the target correction values.
  • the target measurement value can be converted into a target correction value using the projective transformation matrix, bringing it closer to the true value.
  • the inspection device further comprises a stage on which a target substrate having a plurality of chips arranged side by side is placed, a camera which captures an image of the target substrate placed on the stage, and a controller which calculates measurement values of target coordinates of the chip based on the captured image, and the controller corrects the measurement values of the target coordinates and calculates correction values of the target coordinates using a machine learning model in which the measurement values of the target coordinates and the actual measurement values of the target coordinates are used as training data.
  • the measurement values relating to the target coordinates of the chip on the target substrate obtained based on the captured image may deviate from the true values due to various error factors.
  • the inspection device disclosed herein uses a machine learning model to convert the measured values into corrected values, bringing them closer to the true values.
  • the measured values and actual measured values are used as training data in the machine learning model. This makes it possible to obtain a highly accurate machine learning model.
  • the captured image is divided into a plurality of pixels
  • the controller calculates the measurement values related to the target coordinates of the chip relative to a target origin provided on the target substrate based on the distance between an image reference position in the captured image and the chip shown in the captured image in the direction in which the pixels are aligned, and based on the relative movement setting value between the stage and the camera.
  • an alignment substrate corresponding to the target substrate is provided with a plurality of alignment marks corresponding to a plurality of the chips, and an alignment origin corresponding to the target origin is provided on the alignment substrate
  • the controller calculates alignment measurement values of the alignment coordinates of the alignment mark based on the distance in the pixel arrangement direction between the image reference position and the alignment mark shown in the captured image with the alignment origin aligned with the image reference position, and the controller obtains alignment actual measurement values related to the alignment coordinates
  • the machine learning model uses the alignment measurement values of the alignment coordinates as training data in place of the measurement values of the target coordinates, and uses the alignment actual measurement values of the alignment coordinates as training data in place of the actual measurement values of the target coordinates.
  • This configuration makes it easy to obtain the data necessary to train a machine learning model.
  • This disclosure provides an inspection device that can obtain accurate coordinates of a chip mounted on a substrate.
  • FIG. 1 shows an inspection device.
  • FIG. 2 shows a target substrate.
  • FIG. 3 shows an image captured by the camera.
  • FIG. 4 shows the measurement of the target coordinates of the chip relative to the target origin.
  • FIG. 5 illustrates the error factors.
  • FIG. 6 shows the alignment substrate.
  • FIG. 7 shows alignment measurements relating to alignment marks.
  • FIG. 8 shows the distortion aberration of a camera lens.
  • FIG. 9 shows a machine learning model.
  • FIG. 1 shows the inspection device 1.
  • the inspection device 1 includes a rail 10, a stage 20, a camera 30, and a controller 40.
  • the rail 10 includes a first rail 11 and a second rail 12.
  • the first rail 11 and the second rail 12 extend in a horizontal direction.
  • the first rail 11 extends in a front-rear direction x as a first direction among the horizontal directions.
  • the second rail 12 extends in a left-right direction y as a second direction among the horizontal directions.
  • the front-rear direction x and the left-right direction y intersect with each other (specifically, orthogonal to each other).
  • the first rail 11 and the second rail 12 intersect at a reference position.
  • the stage 20 is formed, for example, in the shape of a plate with the vertical direction z as its thickness direction.
  • the vertical direction z is perpendicular to the horizontal directions and coincides with the plumb (gravity) direction.
  • the stage 20 extends in the horizontal direction.
  • the stage 20 is placed on the upper surface of the rail 10. More specifically, the stage 20 is placed on the upper surface of the first rail 11 and the upper surface of the second rail 12.
  • the stage 20 moves along the rail 10 by an actuator (not shown). More specifically, the stage 20 moves in the forward/backward direction x along the first rail 11.
  • the stage 20 moves in the left/right direction y along the second rail 12.
  • the camera 30 is positioned above the stage 20.
  • the camera 30 and the stage 20 are spaced apart from each other in the vertical direction z.
  • the imaging section of the camera 30 faces the upper surface of the stage 20.
  • the imaging axis of the camera 30 is perpendicular to the upper surface (horizontal plane) of the stage 20.
  • the camera 30 is fixed by a bracket or the like (not shown). The camera 30 does not move along with the stage 20. Details of the camera 30 will be described later.
  • the controller 40 is built into the main body of the inspection device 1.
  • the controller 40 includes, for example, a microcomputer mounted on a control board and a memory device that stores software for operating the microcomputer.
  • the controller 40 controls the actuator to move the stage 20 along the rail 10.
  • the controller 40 performs the calculation processing described below.
  • the target substrate 50 is a substrate to be inspected by the inspection device 1.
  • the target substrate 50 is, for example, a semiconductor substrate.
  • the target substrate 50 is, for example, a rectangular plate.
  • the target substrate 50 is placed on the upper surface of the stage 20. As the stage 20 moves along the rails 10, the target substrate 50 moves in the forward/backward direction x and the left/right direction y.
  • a number of chips 51 are arranged side by side on the target substrate 50.
  • the chips 51 are arranged in a matrix on the target substrate 50 in the front-to-back direction x and the left-to-right direction y.
  • the chips 51 are arranged at equal intervals in the front-to-back direction x.
  • the chips 51 are arranged at equal intervals in the left-to-right direction y.
  • Chip 51 is, for example, an integrated circuit. For ease of understanding, chip 51 is shown in FIG. 2 as a simple rectangle.
  • a target origin 52 is provided on the target substrate 50.
  • the target origin 52 is configured, for example, at a corner of the target substrate 50.
  • the coordinate in the forward/backward direction x of any chip 51 relative to (based on) the target origin 52 is the first target coordinate X.
  • the distance in the forward/backward direction x from the target origin 52 to the center of any chip 51 is the first target true value Xr.
  • the coordinate in the left/right direction y of any chip 51 relative to (based on) the target origin 52 is the second target coordinate Y.
  • the distance in the left/right direction y from the target origin 52 to the center of any chip 51 is the second target true value Yr.
  • the user does not know the first target true value Xr and the second target true value Yr.
  • the first target coordinate X and the second target coordinate Y are illustrated for only one chip 51, but in reality every chip 51 has its own first target coordinate X and second target coordinate Y.
  • FIG. 3 shows a captured image 70 obtained by capturing an image of a chip 51 on a target substrate 50 with the camera 30.
  • the camera 30 does not move along with the stage 20.
  • the target substrate 50 which is placed on the upper surface of the stage 20 and moves in the forward/backward direction x and the left/right direction y, passes below the camera 30.
  • the camera 30 captures an image of the chip 51 on the target substrate 50 placed on the stage 20 to obtain the captured image 70.
  • the field of view of the captured image 70 is, for example, rectangular.
  • the captured image 70 is partitioned into a number of pixels 71.
  • the pixels 71 are arranged in a matrix in the horizontal direction t and the vertical direction v.
  • the pixels 71 are arranged at equal intervals in the horizontal direction t.
  • the pixels 71 are arranged at equal intervals in the vertical direction v. If there are no error factors, which will be described later, the horizontal direction t in which the pixels 71 are arranged will coincide with the front-to-back direction x, and the vertical direction v in which the pixels 71 are arranged will coincide with the left-to-right direction y.
  • the captured image 70 has an image reference position 72.
  • the image reference position 72 is, for example, at the center of the captured image 70.
  • Horizontal distance T is the distance in the horizontal direction t in which the pixels 71 are arranged between image reference position 72 in captured image 70 and chip 51 of target substrate 50 shown in captured image 70.
  • Vertical distance V is the distance in the vertical direction v in which the pixels 71 are arranged between image reference position 72 in captured image 70 and chip 51 of target substrate 50 shown in captured image 70.
  • (Target measurement value) FIG. 4 shows the measurement of the first target coordinate X and the second target coordinate Y of a chip 51 provided on the target substrate 50.
  • the target substrate 50 placed on the stage 20 moves in the front-back direction x and the left-right direction y.
  • the camera 30 does not move along with the stage 20.
  • the camera 30 captures an image of the chip 51 of the target substrate 50 placed on the stage 20 and moving in the front-back direction x and the left-right direction y to obtain a captured image 70.
  • the controller 40 measures the first target coordinate X and the second target coordinate Y of the chip 51 of the target substrate 50 by the following method.
  • the controller 40 receives data of the captured image 70 obtained by the camera 30.
  • the controller 40 calculates the horizontal distance T between the image reference position 72 and the chip 51 based on the captured image 70.
  • the horizontal distance T is calculated based on the number of pixels 71 aligned in the horizontal direction t between the image reference position 72 and the chip 51, and the magnification of the captured image 70 in the horizontal direction t.
  • the controller 40 acquires a first relative movement setting value Lx between the stage 20 and the camera 30 in the forward/backward direction x.
  • the first movement setting value Lx is a setting value for the amount of relative movement between the stage 20 and the camera 30 in the forward/backward direction x.
  • the first movement setting value Lx is equal to the setting value for the amount of movement of the stage 20 in the forward/backward direction x.
  • the first movement setting value Lx is an input value that the user inputs to the controller 40 to move the stage 20 in the forward/backward direction x.
  • the target origin 52 of the target substrate 50 is aligned so as to overlap with the image reference position 72 of the captured image 70. If there are no error factors, which will be described later, the first movement setting value Lx will match the actual amount of movement of the target origin 52 in the forward/backward direction x relative to the image reference position 72.
  • the first target coordinate X is the coordinate in the forward/backward direction x of the chip 51 relative to (based on) the target origin 52.
  • the controller 40 calculates the first target measurement value Xm of the first target coordinate X based on the horizontal distance T and the first movement setting value Lx.
  • the controller 40 receives data of the captured image 70 obtained by the camera 30.
  • the controller 40 calculates the vertical distance V between the image reference position 72 and the chip 51 based on the captured image 70.
  • the vertical distance V is calculated based on the number of pixels 71 aligned in the vertical direction v between the image reference position 72 and the chip 51, and the magnification of the captured image 70 in the vertical direction v.
  • the controller 40 acquires a second relative movement setting value Ly between the stage 20 and the camera 30 in the left-right direction y.
  • the second movement setting value Ly is a setting value for the relative amount of movement in the left-right direction y between the stage 20 and the camera 30.
  • the second movement setting value Ly is equal to the setting value for the amount of movement of the stage 20 in the left-right direction y.
  • the second movement setting value Ly is an input value that the user inputs to the controller 40 to move the stage 20 in the left-right direction y.
  • the target origin 52 of the target substrate 50 is aligned so as to overlap with the image reference position 72 of the captured image 70. If there are no error factors, which will be described later, the second movement setting value Ly matches the actual amount of movement of the target origin 52 in the left-right direction y relative to the image reference position 72.
  • the second target coordinate Y is the coordinate in the left/right direction y of the chip 51 relative to (based on) the target origin 52.
  • the controller 40 calculates the second target measurement value Ym of the second target coordinate Y based on the vertical distance V and the second movement setting value Ly.
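As a concrete illustration of the two calculations above, the following sketch combines a relative movement setting value with a pixel distance read from the captured image. The text does not state the exact formula, so the additive combination (Xm = Lx + T, Ym = Ly + V) and all names and units below are assumptions.

    # Hypothetical sketch of the measurement described above; the additive
    # combination of movement setting value and pixel distance is assumed.

    def pixel_distance(n_pixels: int, magnification: float) -> float:
        """Physical distance for a pixel count, using the image
        magnification (physical length per pixel)."""
        return n_pixels * magnification

    def target_measurement(move_setting: float, n_pixels: int,
                           magnification: float) -> float:
        """Measurement value of one target coordinate: the relative movement
        setting value between stage and camera plus the offset of the chip
        from the image reference position read from the captured image."""
        return move_setting + pixel_distance(n_pixels, magnification)

    # Example: Lx = 12.500 mm, chip center 34 pixels from the image
    # reference position at 0.002 mm per pixel.
    Xm = target_measurement(12.500, 34, 0.002)   # 12.568 mm
    Ym = target_measurement(8.000, -17, 0.002)   # 7.966 mm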
  • (Error factors) FIG. 5 illustrates the error factors. If there were no error factors, the first target measurement value Xm would match the first target true value Xr, which is the true value of the first target coordinate X, and the second target measurement value Ym would match the second target true value Yr, which is the true value of the second target coordinate Y.
  • in practice, the first target measurement value Xm may deviate from the first target true value Xr (Xm ≠ Xr), and the second target measurement value Ym may deviate from the second target true value Yr (Ym ≠ Yr).
  • error factors include: assembly errors between the stage 20 and the camera 30; the imaging axis of the camera 30 not being perpendicular to the horizontal plane of the stage 20; deviations in the position and angle of the stage 20; deviations in the position and angle of the camera 30; deviations between the movement setting values Lx, Ly input by the user and the actual movement amounts of the stage 20; vibration of the stage 20; vibration of the camera 30; measurement errors (distortion aberration) due to distortion of the lens of the camera 30; and assembly errors between the lens of the camera 30 and the image sensor of the camera 30.
  • the first target measurement value Xm and the second target measurement value Ym contain errors, so they deviate from the first target true value Xr and the second target true value Yr.
  • the first target measurement value Xm and the second target measurement value Ym need to be corrected in some way.
  • the controller 40 corrects the first target measurement value Xm of the first target coordinate X using the projective transformation matrix R to calculate a first target correction value Xc of the first target coordinate X.
  • likewise, the controller 40 corrects the second target measurement value Ym of the second target coordinate Y using the projective transformation matrix R to calculate a second target correction value Yc of the second target coordinate Y.
  • the projective transformation matrix R is also called a homography transformation matrix, and is expressed by the following equation [Mathematical Expression 1].
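The expression itself appears in this text only by reference. A standard homography consistent with the nine parameter names introduced next would take the following form (a reconstruction, not the patent's verbatim formula):

    R = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & s \end{pmatrix},
    \qquad
    \begin{pmatrix} w X_c \\ w Y_c \\ w \end{pmatrix}
      = R \begin{pmatrix} X_m \\ Y_m \\ 1 \end{pmatrix}

so that, after dividing out the scale factor w,

    X_c = \frac{a X_m + b Y_m + c}{g X_m + h Y_m + s},
    \qquad
    Y_c = \frac{d X_m + e Y_m + f}{g X_m + h Y_m + s}.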
  • the nine parameters a, b, c, d, e, f, g, h, and s in the projective transformation matrix R must be set by the user. Parameters a through h and s are obtained by using the alignment substrate 60, which will be described later. Projective transformation matrix R is stored, for example, in the memory of the controller 40.
  • (Alignment substrate) FIG. 6 shows the alignment substrate 60.
  • the alignment substrate 60 corresponds to the target substrate 50.
  • the alignment substrate 60 is a substrate for calibration.
  • the alignment substrate 60 is placed on the stage 20, similar to the target substrate 50.
  • the alignment substrate 60 has the same specifications as the target substrate 50.
  • the shape and dimensions of the alignment substrate 60 are the same as those of the target substrate 50.
  • the alignment substrate 60 has a plurality of alignment marks 61.
  • the alignment marks 61 on the alignment substrate 60 correspond to the chips 51 on the target substrate 50.
  • the shapes and positions of the alignment marks 61 on the alignment substrate 60 are the same as the shapes and positions of the chips 51 on the target substrate 50.
  • the pitch and number of the alignment marks 61 are the same as the pitch and number of the chips 51.
  • the multiple alignment marks 61 are arranged in a matrix on the alignment substrate 60 in the front-to-back direction x and the left-to-right direction y.
  • the multiple alignment marks 61 are arranged at equal intervals in the front-to-back direction x.
  • the multiple alignment marks 61 are arranged at equal intervals in the left-to-right direction y.
  • the alignment marks 61 are, for example, marks that resemble integrated circuits.
  • the alignment substrate 60 has an alignment origin 62.
  • the alignment origin 62 on the alignment substrate 60 corresponds to the target origin 52 on the target substrate 50.
  • the position of the alignment origin 62 on the alignment substrate 60 is the same as the position of the target origin 52 on the target substrate 50.
  • the alignment origin 62 is configured, for example, at a corner of the alignment substrate 60.
  • the coordinate in the forward/backward direction x of any alignment mark 61 relative to (based on) the alignment origin 62 is the first alignment coordinate X'.
  • the distance in the forward/backward direction x from the alignment origin 62 to the center of any alignment mark 61 is the first alignment actual measurement value Xk'.
  • the coordinate in the left/right direction y of any alignment mark 61 relative to (based on) the alignment origin 62 is the second alignment coordinate Y'.
  • the distance in the left/right direction y from the alignment origin 62 to the center of any alignment mark 61 is the second alignment actual measurement value Yk'.
  • the first alignment coordinate X' corresponds to the first target coordinate X.
  • the second alignment coordinate Y' corresponds to the second target coordinate Y.
  • the first alignment actual measurement value Xk' corresponds to the first target true value Xr.
  • the second alignment actual measurement value Yk' corresponds to the second target true value Yr.
  • the user knows the first alignment actual measurement value Xk' and the second alignment actual measurement value Yk' in advance.
  • the first alignment actual measurement value Xk' and the second alignment actual measurement value Yk' have been measured in advance, for example, by a sensor or the like.
  • the first alignment actual measurement value Xk' is known.
  • the second alignment actual measurement value Yk' is known.
  • FIG. 7 shows a first alignment measurement value Xm' and a second alignment measurement value Ym' related to an alignment mark 61 on the alignment substrate 60.
  • the alignment origin 62 of the alignment substrate 60 is aligned so as to overlap with the image reference position 72 of the captured image 70. At this time, the alignment substrate 60 is stopped by a stopper or the like (not shown), so that the alignment origin 62 is positioned at the image reference position 72.
  • the controller 40 receives data of the captured image 70 obtained by the camera 30.
  • the controller 40 calculates the horizontal distance T between the image reference position 72 and the alignment mark 61 based on the captured image 70.
  • the horizontal distance T is the distance in the horizontal direction t in which the pixels 71 are arranged between the image reference position 72 in the captured image 70 and the alignment mark 61 of the alignment substrate 60 shown in the captured image 70.
  • the horizontal distance T is calculated based on the number of pixels 71 arranged in the horizontal direction t between the image reference position 72 and the alignment mark 61, and the magnification of the captured image 70 in the horizontal direction t.
  • the alignment origin 62 of the alignment substrate 60 is aligned with the image reference position 72.
  • the first alignment coordinate X' is the coordinate of the alignment mark 61 in the forward/backward direction x relative to the alignment origin 62 (based on the alignment origin 62).
  • the controller 40 calculates the first alignment measurement value Xm' of the first alignment coordinate X' based on the horizontal distance T when the alignment origin 62 is aligned with the image reference position 72.
  • the first alignment measurement value Xm' may not match the first alignment actual measurement value Xk' (Xm' ≠ Xk').
  • the data of the captured image 70 obtained by the camera 30 is input to the controller 40.
  • the controller 40 calculates the vertical distance V between the image reference position 72 and the alignment mark 61 based on the captured image 70.
  • the vertical distance V is the distance in the vertical direction v in which the pixels 71 are lined up between the image reference position 72 in the captured image 70 and the alignment mark 61 of the alignment substrate 60 shown in the captured image 70.
  • the vertical distance V is calculated based on the number of pixels 71 lined up in the vertical direction v between the image reference position 72 and the alignment mark 61, and the magnification of the captured image 70 in the vertical direction v.
  • the second alignment coordinate Y' is the coordinate of the alignment mark 61 in the left-right direction y relative to the alignment origin 62 (based on the alignment origin 62).
  • the controller 40 calculates the second alignment measurement value Ym' of the second alignment coordinate Y' based on the vertical distance V when the alignment origin 62 is aligned with the image reference position 72.
  • the second alignment measurement value Ym' may not match the second alignment actual measurement value Yk' (Ym' ≠ Yk').
  • the parameters a to h and s of the projective transformation matrix R are derived by the following method.
  • the controller 40 acquires a known first alignment actual measurement value Xk' associated with the first alignment coordinate X'.
  • the first alignment actual measurement value Xk' is stored, for example, in the memory of the controller 40.
  • the controller 40 acquires a known second alignment actual measurement value Yk' associated with the second alignment coordinate Y'.
  • the second alignment actual measurement value Yk' is stored, for example, in the memory of the controller 40.
  • the controller 40 calculates the parameters a through h and s of the projective transformation matrix R based on the first alignment measurement value Xm', the first alignment actual measurement value Xk', the second alignment measurement value Ym', and the second alignment actual measurement value Yk'.
  • using the alignment substrate 60, equation [Mathematical Expression 1] is rewritten as equation [Equation 2], in which the alignment values take the place of the target values.
  • the right side of the equation [Equation 2] contains the first alignment correction value Xc' and the second alignment correction value Yc'.
  • the first alignment correction value Xc' corresponds to the first target correction value Xc.
  • the second alignment correction value Yc' corresponds to the second target correction value Yc.
  • the first alignment correction value Xc' and the second alignment correction value Yc' contain the unknown parameters a through h and s, and therefore cannot yet be expressed as specific numerical values.
  • the parameters a through h and s of the projective transformation matrix R must be set so that the first alignment correction value Xc' (first target correction value Xc) approaches the first alignment actual measurement value Xk' (first target true value Xr) and the second alignment correction value Yc' (second target correction value Yc) approaches the second alignment actual measurement value Yk' (second target true value Yr).
  • the parameters a through h and s of the projective transformation matrix R are found, for example, by the least squares method. Specifically, the values of a through h and s are found that minimize the value N given by equation [Equation 3].
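Equations [Equation 2] and [Equation 3] are likewise only referenced in this text. Under the same reconstruction as above, [Equation 2] applies the matrix to the alignment measurement values, and [Equation 3] sums the squared deviations over the n alignment marks used:

    X'_c = \frac{a X'_m + b Y'_m + c}{g X'_m + h Y'_m + s},
    \qquad
    Y'_c = \frac{d X'_m + e Y'_m + f}{g X'_m + h Y'_m + s}

    N = \sum_{i=1}^{n} \left[ \left( X'_{k,i} - X'_{c,i} \right)^2
        + \left( Y'_{k,i} - Y'_{c,i} \right)^2 \right]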
  • since the projective transformation matrix R has nine unknown parameters a through h and s, it is sufficient to formulate and solve equations under nine conditions (for example, using nine alignment marks 61) such that the value N in equation [Equation 3] is minimized.
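One common way to estimate such parameters is the direct linear transform (DLT), which minimizes an algebraic least-squares error rather than N itself and can serve as a starting point for a refinement of N. The sketch below is offered in that spirit; it is not the patent's procedure, and all names and the synthetic grid are hypothetical.

    import numpy as np

    def fit_homography(meas: np.ndarray, actual: np.ndarray) -> np.ndarray:
        """Estimate the 3x3 projective transformation mapping alignment
        measurement values (Xm', Ym') to alignment actual measurement
        values (Xk', Yk'). meas and actual are (n, 2) arrays, n >= 4."""
        rows = []
        for (xm, ym), (xk, yk) in zip(meas, actual):
            # xk = (a*xm + b*ym + c) / (g*xm + h*ym + s), linear in a..h, s
            rows.append([xm, ym, 1.0, 0.0, 0.0, 0.0, -xk * xm, -xk * ym, -xk])
            rows.append([0.0, 0.0, 0.0, xm, ym, 1.0, -yk * xm, -yk * ym, -yk])
        # The parameter vector minimizing ||A p|| with ||p|| = 1 is the right
        # singular vector belonging to the smallest singular value.
        _, _, vt = np.linalg.svd(np.asarray(rows))
        R = vt[-1].reshape(3, 3)
        return R / R[2, 2]                      # normalize so that s = 1

    def apply_homography(R: np.ndarray, xm: float, ym: float):
        """Correct one measurement value (Xm, Ym) into (Xc, Yc)."""
        xc, yc, w = R @ np.array([xm, ym, 1.0])
        return xc / w, yc / w

    # Example with a hypothetical 3x3 grid of alignment marks.
    actual = np.array([[10.0 * (i % 3), 10.0 * (i // 3)] for i in range(9)])
    meas = actual * 1.002 + 0.03    # measurements with a small systematic error
    R = fit_homography(meas, actual)
    Xc, Yc = apply_homography(R, 12.568, 7.966)

With nine or more mark correspondences the linear system is overdetermined and the SVD yields its least-squares solution.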
  • the target measurement values Xm, Ym of the target coordinates X, Y of the chip 51 on the target substrate 50 obtained based on the captured image 70 may deviate from the target true values Xr, Yr, which are true values, due to various error factors.
  • the projective transformation matrix R is used to convert the target measurement values Xm, Ym into target correction values Xc, Yc, and bring them closer to the target true values Xr, Yr, which are the true values.
  • the alignment mark 61 of the alignment substrate 60 is used with the alignment origin 62 aligned with the image reference position 72. Then, the parameters a through h, and s of the projective transformation matrix R are calculated based on the alignment measurement values Xm', Ym' and the alignment actual measurement values Xk', Yk'. This makes it possible to obtain a highly accurate projective transformation matrix R.
  • as a result, an inspection device 1 is provided that can obtain accurate target coordinates X, Y of a chip 51 mounted on a target substrate 50.
  • a more accurate projective transformation matrix R can be obtained by using the alignment actual measurement values Xk', Yk' as the correct values, which is advantageous in bringing the target correction values Xc, Yc closer to the target true values Xr, Yr.
  • the inspection device 1 can obtain accurate first target coordinates X and second target coordinates Y of the chip 51 mounted on the target substrate 50 even when the stage 20 and the camera 30 move relatively in the forward/backward direction x and left/right direction y.
  • the controller 40 may use the projective transformation matrix R to obtain the target correction values Xc, Yc by correcting the deviation of the target measurement values Xm, Ym caused by the distortion of the camera 30.
  • the distortion of the camera 30 may include not only physical distortion but also optical distortion. Examples of the optical distortion of the camera 30 include a lens focus error, a shading phenomenon, and a lens distortion aberration.
  • the values of the parameters a through h and s of the projective transformation matrix R used to correct the distortion of the camera 30 may differ from the values of the parameters a through h and s of the projective transformation matrix R used to correct the positional deviation of the stage 20.
  • the projective transformation matrix R for correcting the distortion of the camera 30 and the projective transformation matrix R for correcting the positional deviation of the stage 20 may be used separately, or a single projective transformation matrix R that combines the functions of both may be used.
  • FIG. 8 shows the distortion of the lens 31 of the camera 30.
  • Distortion occurs in the radial direction of the lens 31 and in the tangential direction perpendicular to the radial direction.
  • Radial distortion occurs on the projection plane Q when a ray of light emitted from the subject P passes through the principal point of the lens 31 at a certain angle of incidence, because the exit angle does not match the angle of incidence.
  • Tangential distortion occurs due to a central position shift or tilt between the multiple lenses that make up the lens 31. Distortion can be expressed, for example, by equations [Equation 4] to [Equation 6].
  • the coordinates are (Xu, Yu) for an ideal lens 31 with no distortion aberration, and (Xd, Yd) for an actual lens 31 with distortion aberration.
  • r is the distance from the image center to the coordinates Xu, Yu. K1 to K5 are coefficients that represent the distortion of the lens 31.
  • K3 and K4 represent tangential distortion aberration, but in practice this can often be ignored, so equations [Equation 4] and [Equation 5] may be simplified to equations [Equation 7] and [Equation 8].
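Equations [Equation 4] through [Equation 8] are also only referenced in this text. The standard Brown distortion model, arranged to match the description above (K1, K2, K5 radial; K3, K4 tangential; this assignment is an assumption based on that description), would read:

    X_d = X_u (1 + K_1 r^2 + K_2 r^4 + K_5 r^6) + 2 K_3 X_u Y_u + K_4 (r^2 + 2 X_u^2)

    Y_d = Y_u (1 + K_1 r^2 + K_2 r^4 + K_5 r^6) + K_3 (r^2 + 2 Y_u^2) + 2 K_4 X_u Y_u

    r^2 = X_u^2 + Y_u^2

Dropping the tangential terms gives the simplified forms corresponding to [Equation 7] and [Equation 8]:

    X_d = X_u (1 + K_1 r^2 + K_2 r^4 + K_5 r^6),
    \qquad
    Y_d = Y_u (1 + K_1 r^2 + K_2 r^4 + K_5 r^6).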
  • the coordinates Xd, Yd of the chip 51 correspond to the target measurement values Xm, Ym.
  • the controller 40 may use a projective transformation matrix R as shown in equation [Equation 9] to correct the deviation of the coordinates Xd, Yd of the chip 51 (corresponding to the target measurement values Xm, Ym) caused by the distortion of the lens 31 of the camera 30 and obtain the target correction values Xc, Yc.
  • the nine parameters a to h and s may be the same as those in the first embodiment, or may be calculated separately using the method described below.
  • the projective transformation matrix R can be used to convert the target measurement values Xm, Ym (for example, the coordinates Xd, Yd of the chip 51 when an actual lens 31 with distortion aberration is used) into target correction values Xc, Yc, which can be brought closer to the true values.
  • the controller 40 corrects the target measurement values Xm, Ym related to the target coordinates X, Y using the projection transformation matrix R to calculate the target correction values Xc, Yc related to the target coordinates X, Y.
  • the controller 40 uses a machine learning model M instead of a projective transformation matrix R.
  • FIG. 9 shows the configuration of the machine learning model M.
  • the machine learning model M is stored in a controller 40 (e.g., a personal computer, etc.).
  • the controller 40 executes the machine learning model M as follows.
  • the controller 40 corrects the target measurement values Xm, Ym related to the target coordinates X, Y using the machine learning model M, and calculates the target correction values Xc, Yc related to the target coordinates X, Y.
  • the target measurement values Xm, Ym are input to the machine learning model M, and the target correction values Xc, Yc are calculated and output by the machine learning model M.
  • the machine learning model M is a supervised learning model. Specific examples of the machine learning model M include ridge regression, gradient boosting decision trees (GBDT), and the multilayer perceptron (MLP).
  • the machine learning model M learns as follows.
  • the machine learning model M uses as training data the known target measurement values Xm, Ym relating to the target coordinates X, Y and the known target actual measurement values Xk, Yk relating to the target coordinates X, Y.
  • the known target measurement values Xm, Ym are obtained by collecting data on the target measurement values Xm, Ym measured by the method described in the first embodiment. It is preferable to collect a large amount of data on the known target measurement values Xm, Ym.
  • the known target measurement values Xm, Ym deviate from the target true values Xr, Yr, which are the true values of the target coordinates X, Y, due to various error factors.
  • the known target actual measurement values Xk, Yk coincide with the target true values Xr, Yr, which are the true values of the target coordinates X, Y.
  • the known target actual measurement values Xk, Yk are, so to speak, correct values. It is preferable to collect a large amount of data on the known target actual measurement values Xk, Yk.
  • the known target actual measurement values Xk, Yk are obtained, for example, by directly measuring them with a sensor, etc.
  • the known target measurement values Xm, Ym are provided to the input layer of the machine learning model M as training data, and the known target actual measurement values Xk, Yk are provided to the output layer of the machine learning model M as training data.
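A minimal sketch of such a supervised correction model, using ridge regression (one of the examples named above); the synthetic stand-in data and all names are hypothetical.

    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(0)

    # Stand-in training data: the actual values (Xk, Yk) play the role of the
    # directly measured "correct" coordinates, and the measurement values
    # (Xm, Ym) are derived from them with a small systematic error, mimicking
    # the error factors described earlier.
    actual = rng.uniform(0.0, 100.0, size=(200, 2))          # (Xk, Yk)
    meas = actual @ np.array([[1.001, 0.002],
                              [-0.001, 0.999]]) + 0.05       # (Xm, Ym)

    # Ridge regression supports multi-output targets directly: inputs are the
    # measurement values, targets are the actual measurement values.
    model = Ridge(alpha=1.0)
    model.fit(meas, actual)

    # Inference: correct an unseen measurement (Xmi, Ymi) into (Xci, Yci).
    Xci, Yci = model.predict([[12.568, 8.034]])[0]

As described below, alignment measurement values and alignment actual measurement values collected from the alignment substrate 60 can stand in for the training arrays.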
  • the target measurement values Xm, Ym of the target coordinates X, Y of the chip 51 on the target substrate 50 obtained based on the captured image 70 may deviate from the target true values Xr, Yr, which are true values, due to various error factors.
  • the machine learning model M converts the target measurement values Xm, Ym into target correction values Xc, Yc, bringing them closer to the target true values Xr, Yr.
  • the known target measurement values Xm, Ym and the known target actual measurement values Xk, Yk are used as training data in the machine learning model M. This makes it possible to obtain a machine learning model M with high accuracy.
  • as a result, an inspection device 1 is provided that can obtain accurate target coordinates X, Y of a chip 51 mounted on a target substrate 50.
  • target correction values can then be easily calculated even for previously unseen target measurement values Xmi, Ymi relating to the target coordinates X, Y of the chip 51 relative to the target origin 52 provided on the target substrate 50.
  • the alignment measurement values Xm', Ym' of the alignment coordinates X', Y' correspond to the target measurement values Xm, Ym of the target coordinates X, Y.
  • the machine learning model M may use the known alignment measurement values Xm', Ym' of the alignment coordinates X', Y' as training data in place of the known target measurement values Xm, Ym of the target coordinates X, Y.
  • the alignment actual measurement values Xk', Yk' of the alignment coordinates X', Y' coincide with the target true values Xr, Yr of the target coordinates X, Y.
  • the alignment actual measurement values Xk', Yk' of the alignment coordinates X', Y' correspond to the known target actual measurement values Xk, Yk of the target coordinates X, Y.
  • the machine learning model M may therefore use the known alignment actual measurement values Xk', Yk' of the alignment coordinates X', Y' as training data in place of the known target actual measurement values Xk, Yk of the target coordinates X, Y.
  • the data required for training the machine learning model M can be easily obtained.
  • the controller 40 may be provided outside the main body of the inspection device 1, rather than being built into the main body of the inspection device 1.
  • the stage 20 may be fixed and the camera 30 may be movable; alternatively, both may be movable.
  • the target substrate 50 and the alignment substrate 60 are not limited to a rectangular shape and may be, for example, circular.
  • the target substrate 50 and the alignment substrate 60 do not have to be semiconductor substrates.
  • the target origin 52 and the alignment origin 62 may be provided, for example, in the center of the target substrate 50 and the alignment substrate 60, rather than at a corner.
  • the parameters of the projective transformation matrix R may be calculated using, for example, the Levenberg-Marquardt method or the least median of squares method, rather than the least squares method; a sketch using the Levenberg-Marquardt method follows this list.
  • the projective transformation matrix R may be stored in an external server rather than in the memory of the controller 40.
  • the controller 40 accesses the projective transformation matrix R in the external server to input data to the projective transformation matrix R or receive data output from the projective transformation matrix R.
  • the machine learning model M may be stored in an external server rather than being stored in the memory of the controller 40.
  • the controller 40 accesses the machine learning model M in the external server to input data to the machine learning model M or receive data output from the machine learning model M.
  • This disclosure is extremely useful and has high industrial applicability because it can be applied to inspection equipment.
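As referenced in the list above, the following sketch fits the parameters by minimizing N directly with the Levenberg-Marquardt method. It fixes s = 1 to remove the overall scale freedom (a common normalization, though the patent lists s among the parameters); the synthetic data and all names are hypothetical.

    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    actual = rng.uniform(0.0, 100.0, size=(9, 2))            # (Xk', Yk')
    meas = actual * 1.001 + rng.normal(0.0, 0.02, (9, 2))    # (Xm', Ym')

    def residuals(p, meas, actual):
        """Stacked residuals (Xk' - Xc', Yk' - Yc') over all marks."""
        a, b, c, d, e, f, g, h = p
        xm, ym = meas[:, 0], meas[:, 1]
        w = g * xm + h * ym + 1.0                # s fixed to 1
        xc = (a * xm + b * ym + c) / w
        yc = (d * xm + e * ym + f) / w
        return np.concatenate([actual[:, 0] - xc, actual[:, 1] - yc])

    # Start from the identity transform and let Levenberg-Marquardt minimize N.
    p0 = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
    fit = least_squares(residuals, p0, args=(meas, actual), method="lm")
    a, b, c, d, e, f, g, h = fit.x
    R = np.array([[a, b, c], [d, e, f], [g, h, 1.0]])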

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides an inspection device capable of obtaining accurate coordinates of a chip arranged on a substrate. Specifically, the inspection device comprises: a stage on which a target substrate is placed; a camera that images the target substrate to obtain a captured image; and a controller that calculates a target measurement value of the target coordinates of a chip relative to a target origin of the target substrate, based on the distance between an image reference position and the chip in the direction in which pixels are arranged and on a relative movement setting value between the stage and the camera. The controller corrects the target measurement value using a projective transformation matrix to calculate a target correction value. The controller calculates an alignment measurement value of alignment coordinates relative to an alignment origin of an alignment substrate based on the distance between the image reference position and an alignment mark in the direction in which the pixels are arranged, in a state in which the alignment origin is aligned with the image reference position. The controller calculates a parameter of the projective transformation matrix based on the alignment measurement value and a known alignment actual measurement value.
PCT/JP2024/041788 2023-12-05 2024-11-26 Inspection device (Dispositif d'inspection) Pending WO2025121204A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2023-205564 2023-12-05
JP2023-205561 2023-12-05
JP2023205564A JP2025090367A (ja) 2023-12-05 2023-12-05 Inspection device
JP2023205561A JP2025090364A (ja) 2023-12-05 2023-12-05 Inspection device

Publications (1)

Publication Number Publication Date
WO2025121204A1 true WO2025121204A1 (fr) 2025-06-12

Family

ID=95979830

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/041788 Pending WO2025121204A1 (fr) 2023-12-05 2024-11-26 Inspection device (Dispositif d'inspection)

Country Status (1)

Country Link
WO (1) WO2025121204A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006049755A (ja) * 2004-08-09 2006-02-16 Omron Corp 回転中心算出方法およびこの方法を用いたワーク位置決め装置
JP2008251797A (ja) * 2007-03-30 2008-10-16 Fujifilm Corp 基準位置計測装置及び方法、並びに描画装置
JP2014203365A (ja) * 2013-04-08 2014-10-27 オムロン株式会社 制御システムおよび制御方法
JP2018151452A (ja) * 2017-03-10 2018-09-27 株式会社半導体エネルギー研究所 半導体装置、表示システム及び電子機器
JP2020135637A (ja) * 2019-02-22 2020-08-31 日本電信電話株式会社 姿勢推定装置、学習装置、方法、及びプログラム
JP2022033027A (ja) * 2020-08-11 2022-02-25 アプライド マテリアルズ イスラエル リミテッド ウエハ分析のための較正データを生成する方法およびシステム


Similar Documents

Publication Publication Date Title
US20130194569A1 (en) Substrate inspection method
US10531072B2 (en) Calibration device and method for calibrating a dental camera
  • JP5140761B2 (ja) Method for calibrating a measurement system, computer program, electronic control device, and measurement system
  • JP6967140B2 (ja) Galvanometer correction system and method
US9250071B2 (en) Measurement apparatus and correction method of the same
US20100177192A1 (en) Three-dimensional measuring device
  • JP2016060610A (ja) Elevator hoistway dimension measurement device, elevator hoistway dimension measurement control device, and elevator hoistway dimension measurement method
US20150085108A1 (en) Lasergrammetry system and methods
US10578986B2 (en) Dual-layer alignment device and method
US9423242B2 (en) Board-warping measuring apparatus and board-warping measuring method thereof
  • WO2018168757A1 (fr) Image processing device, system, image processing method, article manufacturing method, and program
  • CN110706292A (zh) Machine-vision-based self-calibration method for two-dimensional worktable errors
  • WO2025121204A1 (fr) Inspection device (Dispositif d'inspection)
  • KR101503021B1 (ko) Measuring device and correction method thereof
  • JP2025090364A (ja) Inspection device
  • JP2025090367A (ja) Inspection device
  • JP2009139285A (ja) Solder ball inspection device, inspection method thereof, and shape inspection device
  • TW202544440A (zh) Inspection device
  • CN114370866B (zh) System and method for measuring the principal point and principal distance of a star sensor
  • JP6507063B2 (ja) Image inspection device
  • CN117190854A (zh) Measurement system and compensation method for position errors of a two-dimensional motion platform
  • KR20130023305A (ko) Measuring device and correction method thereof
Ricolfe-Viala et al. Optimal conditions for camera calibration using a planar template
  • JP6091092B2 (ja) Image processing device and image processing method
  • JP2018044863A (ja) Measurement device, measurement method, system, and article manufacturing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24900498

Country of ref document: EP

Kind code of ref document: A1