TWI845450B - 3d object outline data establishment system based on robotic arm and method thereof - Google Patents
- Publication number
- TWI845450B (Application TW112145558A)
- Authority
- TW
- Taiwan
- Prior art keywords
- arm
- motion matrix
- flange
- image
- robot arm
- Prior art date
Landscapes
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Manipulator (AREA)
Abstract
Description
A data establishment system and method, and in particular a system and method for establishing three-dimensional object contour data by using a robotic arm to move an observed object to different observation positions.
With the development of artificial intelligence, AI techniques have been widely applied across a variety of fields, including the contour recognition of three-dimensional objects. Using AI to recognize the contour of a three-dimensional object first requires training data; only after training can the AI accurately recognize the contour of a three-dimensional object.
Existing AI training data is established by placing three-dimensional objects at different positions in a fixed scene. Because the objects are mostly placed by hand, building the training data consumes considerable manual placement time, which is unfavorable to establishing AI training data.
In summary, the prior art has long suffered from the problem that establishing AI training data is too time-consuming, so an improved technical means is needed to solve this problem.
In view of the prior-art problem that establishing AI training data is too time-consuming, the present invention discloses a system and method for establishing three-dimensional object contour data based on a robotic arm, wherein:
The robotic-arm-based three-dimensional object contour data establishment system of the first embodiment disclosed by the present invention comprises: an imaging device, a robotic arm, an observed object, a control device, and an object contour calculation device.
The imaging device has a fixed imaging range; the robotic arm has a flange end; and the observed object is fixed to the flange end.
The control device establishes a connection with the robotic arm to control the movement and rotation of the flange end of the robotic arm, so that the observed object fixed to the flange end moves and rotates to different sampling points, each of which lies within the imaging range, and calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm.
The object contour calculation device establishes connections with the control device and the imaging device; receives the image parameters of the imaging device from the imaging device; establishes the object-flange motion matrix between the observed object and the flange end; receives the spatial coordinates of the imaging device; receives the arm spatial coordinates and the flange-arm motion matrix from the control device; calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates; multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to obtain the object-image motion matrix; and uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to compute the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby computing the contour image of the observed object on the two-dimensional plane.
The robotic-arm-based method for establishing three-dimensional object contour data of the first embodiment disclosed by the present invention comprises the following steps:
First, the imaging device has a fixed imaging range; next, the robotic arm has a flange end; next, the observed object is fixed to the flange end; next, the control device establishes a connection with the robotic arm, and the control device controls the movement and rotation of the flange end of the robotic arm, so that the observed object fixed to the flange end moves and rotates to different sampling points, each of which lies within the imaging range; next, the control device calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm; next, the object contour calculation device establishes connections with the control device and the imaging device; next, the object contour calculation device receives the image parameters of the imaging device from the imaging device; next, the object contour calculation device establishes the object-flange motion matrix between the observed object and the flange end; next, the object contour calculation device receives the spatial coordinates of the imaging device; next, the object contour calculation device receives the arm spatial coordinates and the flange-arm motion matrix from the control device; next, the object contour calculation device calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates; next, the object contour calculation device multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to calculate the object-image motion matrix; finally, the object contour calculation device uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane.
The robotic-arm-based three-dimensional object contour data establishment system of the second embodiment disclosed by the present invention comprises: an observed object, a robotic arm, an imaging device, a control device, and an object contour calculation device.
The observed object is fixedly placed; the robotic arm has a flange end; and the imaging device is fixed to the flange end and has an imaging range.
The control device establishes a connection with the robotic arm to control the movement and rotation of the flange end of the robotic arm, so that the imaging device fixed to the flange end moves and rotates to different sampling points, with the observed object lying within the imaging range at each sampling point, and calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm.
The object contour calculation device establishes connections with the control device and the imaging device; receives the image parameters of the imaging device from the imaging device; establishes the image-flange motion matrix between the imaging device and the flange end; receives the object spatial coordinates of the observed object; receives the arm spatial coordinates and the flange-arm motion matrix from the control device; calculates the arm-object motion matrix from the arm spatial coordinates and the object spatial coordinates; multiplies the image-flange motion matrix, the flange-arm motion matrix, and the arm-object motion matrix to obtain the image-object motion matrix; and uses the image-object motion matrix together with the internal parameters of the imaging device and triangulation to compute the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby computing the contour image of the observed object on the two-dimensional plane.
The robotic-arm-based method for establishing three-dimensional object contour data of the second embodiment disclosed by the present invention comprises the following steps:
First, the observed object is fixedly placed; next, the robotic arm has a flange end; next, the imaging device is fixed to the flange end and has an imaging range; next, the control device establishes a connection with the robotic arm to control the movement and rotation of the flange end of the robotic arm, so that the imaging device fixed to the flange end moves and rotates to different sampling points, with the observed object lying within the imaging range at each sampling point; next, the control device calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm; next, the object contour calculation device establishes connections with the control device and the imaging device; next, the object contour calculation device receives the image parameters of the imaging device from the imaging device; next, the object contour calculation device establishes the image-flange motion matrix between the imaging device and the flange end; next, the object contour calculation device receives the object spatial coordinates of the observed object; next, the object contour calculation device receives the arm spatial coordinates and the flange-arm motion matrix from the control device; next, the object contour calculation device calculates the arm-object motion matrix from the arm spatial coordinates and the object spatial coordinates; next, the object contour calculation device multiplies the image-flange motion matrix, the flange-arm motion matrix, and the arm-object motion matrix to calculate the image-object motion matrix; finally, the object contour calculation device uses the image-object motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane.
As described above, in the system and method disclosed by the present invention, the object contour calculation device establishes the object-flange motion matrix between the observed object and the flange end, receives the image parameters of the imaging device from the imaging device, receives the arm spatial coordinates and the flange-arm motion matrix from the control device, calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates, multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to obtain the object-image motion matrix, and uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to compute the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby computing the contour image of the observed object on the two-dimensional plane.
Through the above technical means, the present invention achieves the technical effect of providing convenient establishment of three-dimensional object contour data.
The embodiments of the present invention are described in detail below with reference to the drawings and examples, so that the process by which the present invention applies technical means to solve the technical problem and achieve the technical effect can be fully understood and practiced accordingly.
The first embodiment of the robotic-arm-based three-dimensional object contour data establishment system disclosed by the present invention is described first. Please refer to Figure 1, which is a system block diagram of the first embodiment of the robotic-arm-based three-dimensional object contour data establishment system of the present invention.
The robotic-arm-based three-dimensional object contour data establishment system of the first embodiment disclosed by the present invention comprises: an imaging device 10, a robotic arm 20, an observed object 30, a control device 40, and an object contour calculation device 50.
Please refer to Figure 2, which is a schematic diagram of the imaging range and the flange-end motion in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
The imaging device 10 has a fixed imaging range 11, and the robotic arm 20 has a flange end 21. The robotic arm 20 used in the present invention is a six-axis serial robotic arm; this is merely an example and does not limit the scope of application of the present invention. The observed object 30 can be fixed to the flange end 21 of the robotic arm 20 by screwing, clamping, magnetic attraction, or the like; again, this is merely an example and does not limit the scope of application of the present invention.
The control device 40 establishes a connection with the robotic arm 20 through a wired transmission method or a wireless transmission method. The wired transmission method is, for example, a cable network or an optical fiber network; the wireless transmission method is, for example, Wi-Fi or a mobile communication network (e.g., 3G, 4G, 5G). These are merely examples and do not limit the scope of application of the present invention.
The control device 40 controls the position and angle of the flange end 21 of the robotic arm 20. That is, the control device 40 generates a control command for each axis of the robotic arm 20 to control the rotation speed and rotation angle of that axis, and thereby controls the movement and rotation of the flange end 21, so that the observed object 30 fixed to the flange end 21 moves and rotates to different sampling points, each of which lies within the imaging range 11. In other words, the observed object 30, fixed to the flange end 21 of the robotic arm 20, assumes different poses as the robotic arm 20 moves and rotates.
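As an illustration of this sampling loop, the sketch below cycles the flange through a list of candidate poses and records the flange pose and an image at each sampling point. The `RobotController`-style interface (`move_to_pose`, `read_flange_pose`), the `camera.capture` call, and the pose values are hypothetical placeholders rather than anything specified in the patent; they only stand in for whatever controller and camera APIs a given vendor provides.

```python
# A minimal, hypothetical sampling loop (the controller and camera APIs shown
# here are assumptions; real vendors expose their own motion/state interfaces).

sample_poses = [
    # [x, y, z, r, p, w] -- position in mm, roll/pitch/yaw in degrees (illustrative values)
    [350.0,   0.0, 420.0,   0.0, 90.0,  0.0],
    [350.0,  80.0, 400.0,  10.0, 85.0, -5.0],
    [320.0, -60.0, 440.0, -10.0, 95.0,  5.0],
]

def collect_samples(controller, camera):
    """Move the flange to each sampling point and record its pose plus an image."""
    records = []
    for pose in sample_poses:
        controller.move_to_pose(pose)                 # hypothetical motion command
        flange_pose = controller.read_flange_pose()   # returns [x, y, z, r, p, w]
        image = camera.capture()                      # hypothetical image grab
        records.append((flange_pose, image))
    return records
```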
The control device 40 then calculates the flange-arm motion matrix ${}^{B}T_{E}$ from the flange end 21 of the robotic arm 20 after movement and rotation to the arm spatial coordinates of the robotic arm 20 (B denotes the robotic arm 20 and E denotes the flange end 21). The arm spatial coordinates of the robotic arm 20 are the coordinate point at which the robotic arm 20 is fixed to the ground.
It is worth noting that the flange-arm motion matrix ${}^{B}T_{E}$ is a 4x4 homogeneous transformation matrix derived from the 6-degree-of-freedom pose [x, y, z, r, p, w] of the robotic arm 20, and the control device 40 generally obtains it by reading the controller parameters of the robotic arm 20. Its entries are written in terms of the abbreviations ca = cos(a), sa = sin(a), cb = cos(b), sb = sin(b), cc = cos(c), sc = sin(c), with a = r·π/180, b = p·π/180, c = w·π/180.
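As an illustration, under the common yaw-pitch-roll (Z-Y-X) Euler convention, the homogeneous transform assembled from [x, y, z, r, p, w] with the abbreviations above takes the following form; the specific rotation order is an assumption of this sketch, since controllers differ in their Euler-angle conventions, and only the abbreviations themselves are taken from the text.

$$
{}^{B}T_{E}=
\begin{bmatrix}
cb\,cc & sa\,sb\,cc - ca\,sc & ca\,sb\,cc + sa\,sc & x\\
cb\,sc & sa\,sb\,sc + ca\,cc & ca\,sb\,sc - sa\,cc & y\\
-sb & sa\,cb & ca\,cb & z\\
0 & 0 & 0 & 1
\end{bmatrix}
$$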
The object contour calculation device 50 establishes connections with the imaging device 10 and the control device 40 through a wired transmission method or a wireless transmission method. The wired transmission method is, for example, a cable network or an optical fiber network; the wireless transmission method is, for example, Wi-Fi or a mobile communication network (e.g., 3G, 4G, 5G). These are merely examples and do not limit the scope of application of the present invention.
The object contour calculation device 50 receives the internal parameters of the imaging device 10 from the imaging device 10; the internal parameters of the imaging device 10 are, for example, focal length, aperture, and sensitivity. These are merely examples and do not limit the scope of application of the present invention.
The object contour calculation device 50 receives the spatial coordinates of the imaging device 10 through a user interface, and then receives the arm spatial coordinates and the flange-arm motion matrix from the control device 40, so that it can calculate the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates of the imaging device 10.
It is worth noting that the arm-image motion matrix ${}^{C}T_{B}$ (C denotes the imaging device 10) is obtained through hand-to-eye calibration, that is, by solving a hand-eye equation of the form AX = XB over the sampled poses. The transformation ${}^{C}T_{O}$ from the observed object 30 to the imaging device 10 used in this calibration is obtained with the 2D-3D corresponding-point PnP pose estimation method and is computed in the following form:

$$
{}^{C}T_{O}=\begin{bmatrix} R & t \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix},
$$

where R is a 3x3 rotation matrix and t is a 3x1 displacement matrix.
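To make the PnP step concrete, the sketch below estimates ${}^{C}T_{O}$ from known 3D model points and their detected 2D image locations using OpenCV's `solvePnP`. This is only one possible implementation; the correspondence detection, the intrinsic matrix `K`, and the zero-distortion default are assumptions of the sketch, not details given in the patent.

```python
import numpy as np
import cv2

def estimate_T_CO(object_points_3d, image_points_2d, K, dist_coeffs=None):
    """Estimate the object-to-camera transform C_T_O from 2D-3D correspondences.

    object_points_3d: (N, 3) points expressed in the object frame.
    image_points_2d:  (N, 2) matching pixel coordinates.
    K:                3x3 camera intrinsic matrix.
    """
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)          # assume an undistorted camera
    ok, rvec, tvec = cv2.solvePnP(
        object_points_3d.astype(np.float64),
        image_points_2d.astype(np.float64),
        K, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)             # rotation vector -> 3x3 rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T                               # 4x4 homogeneous C_T_O
```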
The object contour calculation device 50 establishes the object-flange motion matrix ${}^{E}T_{O}$ between the observed object 30 and the flange end 21 of the robotic arm 20 (O denotes the observed object 30) through the following formula, which follows from the chain ${}^{C}T_{O} = {}^{C}T_{B}\,{}^{B}T_{E}\,{}^{E}T_{O}$:

$$
{}^{E}T_{O} \;=\; \bigl({}^{C}T_{B}\;{}^{B}T_{E}\bigr)^{-1}\,{}^{C}T_{O}
\;=\;
\begin{bmatrix} R & t \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix},
$$

where R is a 3x3 rotation matrix, t is a 3x1 displacement matrix, the object pose ${}^{C}T_{O}$ is obtained with 6-degree-of-freedom pose annotation software, and ${}^{C}T_{B}$ is the arm-image motion matrix. It is worth noting that the object-flange motion matrix between the observed object 30 and the flange end 21 of the robotic arm 20 is a fixed matrix and only needs to be calculated once.
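A minimal numpy sketch of this one-time computation, assuming the hand-eye result ${}^{C}T_{B}$, the robot pose ${}^{B}T_{E}$, and the annotated or PnP-derived pose ${}^{C}T_{O}$ are already available as 4x4 arrays (the function name and argument layout are illustrative, not from the patent):

```python
import numpy as np

def compute_T_EO(T_CB, T_BE, T_CO):
    """One-time object-to-flange transform from C_T_O = C_T_B @ B_T_E @ E_T_O."""
    return np.linalg.inv(T_CB @ T_BE) @ T_CO
```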
The object contour calculation device 50 calculates the object-image motion matrix ${}^{C}T_{O}$ through the following formula:

$$
{}^{C}T_{O} \;=\; {}^{C}T_{B}\;{}^{B}T_{E}\;{}^{E}T_{O}
$$

That is, the object contour calculation device 50 multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to calculate the object-image motion matrix.
The object contour calculation device 50 then uses the object-image motion matrix together with the internal parameters of the imaging device 10 and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object 30 on a two-dimensional plane, thereby calculating the contour image of the observed object 30 on the two-dimensional plane.
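To make the projection step concrete, the sketch below composes the transform chain and projects the object's 3D points through a pinhole model with intrinsic matrix `K`, then rasterizes their convex hull as a 2D contour mask. Treating the "triangulation" step as a standard pinhole projection, using a convex hull for the outline, and the function/parameter names are assumptions of this sketch rather than details given in the patent.

```python
import numpy as np
import cv2

def project_contour(points_obj, T_EO, T_BE, T_CB, K, image_size):
    """Project object-frame 3D points into the image and draw their outline.

    points_obj: (N, 3) 3D points of the observed object in its own frame.
    T_EO, T_BE, T_CB: 4x4 object->flange, flange->base and base->camera transforms.
    K: 3x3 camera intrinsic matrix.  image_size: (height, width) of the output mask.
    """
    T_CO = T_CB @ T_BE @ T_EO                          # object -> camera
    pts_h = np.hstack([points_obj, np.ones((len(points_obj), 1))])
    cam = (T_CO @ pts_h.T)[:3]                         # 3xN points in the camera frame
    uv = K @ cam                                       # pinhole projection
    uv = (uv[:2] / uv[2]).T.astype(np.int32)           # divide by depth -> pixel coords

    mask = np.zeros(image_size, dtype=np.uint8)
    hull = cv2.convexHull(uv.reshape(-1, 1, 2))        # outer outline of the projections
    cv2.drawContours(mask, [hull], -1, 255, thickness=1)
    return uv, mask
```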
Please refer to Figure 3A and Figure 3B. Figure 3A is a schematic diagram of the three-dimensional observed object at the first sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention; Figure 3B is a schematic diagram of the two-dimensional contour of the observed object at the first sampling point in the same embodiment.
Through the above process, the object contour calculation device 50 converts the observed object 30 at the first sampling point in Figure 3A into a two-dimensional plane contour image 31; the two-dimensional plane contour image 31 of the observed object 30 at the first sampling point is shown in Figure 3B.
Please refer to Figure 4A and Figure 4B. Figure 4A is a schematic diagram of the three-dimensional observed object at the second sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention; Figure 4B is a schematic diagram of the two-dimensional contour of the observed object at the second sampling point in the same embodiment.
Through the above process, the object contour calculation device 50 converts the observed object 30 at the second sampling point in Figure 4A into a two-dimensional plane contour image 31; the two-dimensional plane contour image 31 of the observed object 30 at the second sampling point is shown in Figure 4B.
Next, the operation of the present invention is described. Please refer to Figure 5A and Figure 5B, which are flow charts of the first embodiment of the robotic-arm-based method for establishing three-dimensional object contour data of the present invention.
The robotic-arm-based method for establishing three-dimensional object contour data of the first embodiment disclosed by the present invention comprises the following steps:
First, the imaging device has a fixed imaging range (step 601); next, the robotic arm has a flange end (step 602); next, the observed object is fixed to the flange end (step 603); next, the control device establishes a connection with the robotic arm, and the control device controls the movement and rotation of the flange end of the robotic arm, so that the observed object fixed to the flange end moves and rotates to different sampling points, each of which lies within the imaging range (step 604); next, the control device calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm (step 605); next, the object contour calculation device establishes a connection with the control device (step 606); next, the object contour calculation device establishes the internal parameters of the imaging device (step 607); next, the object contour calculation device establishes the object-flange motion matrix between the observed object and the flange end (step 608); next, the object contour calculation device receives the spatial coordinates of the imaging device (step 609); next, the object contour calculation device receives the arm spatial coordinates and the flange-arm motion matrix from the control device (step 610); next, the object contour calculation device calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates (step 611); next, the object contour calculation device multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to calculate the object-image motion matrix (step 612); finally, the object contour calculation device uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane (step 613).
The second embodiment of the robotic-arm-based three-dimensional object contour data establishment system disclosed by the present invention is described below. Please refer to Figure 6, which is a schematic diagram of the imaging range and the flange-end motion in the second embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention. The difference between the second embodiment and the first embodiment is that the observed object 30 is fixedly placed, while the imaging device 10 can be fixed to the flange end 21 of the robotic arm 20 by screwing, clamping, magnetic attraction, or the like; this is merely an example and does not limit the scope of application of the present invention.
The control device 40 controls the position and angle of the flange end 21 of the robotic arm 20, so that the imaging device 10 fixed to the flange end 21 moves and rotates to different sampling points, and after the imaging device 10 has moved and rotated to the different sampling points, the observed object 30 still lies within the imaging range 11 of the imaging device 10. That is, the imaging device 10, fixed to the flange end 21 of the robotic arm 20, captures the observed object 30 from different viewing angles as the robotic arm 20 moves and rotates, thereby obtaining images containing different faces of the observed object 30. The control device 40 then calculates the arm-flange motion matrix from the flange end 21 of the robotic arm 20 after movement and rotation to the arm spatial coordinates of the robotic arm 20.
It is worth noting that the arm-flange motion matrix is a 4x4 transformation matrix derived from the 6-degree-of-freedom pose [x, y, z, r, p, w] of the robotic arm 20, and the control device 40 generally obtains it by reading the controller parameters of the robotic arm 20, in the same 4x4 homogeneous form as the flange-arm motion matrix described for the first embodiment.
The object contour calculation device 50 receives the object spatial coordinates of the observed object 30 through the user interface, and then receives the arm spatial coordinates and the arm-flange motion matrix from the control device 40, so that it can calculate the object-arm motion matrix from the arm spatial coordinates and the object spatial coordinates.
In the second embodiment, because the observed object 30 is fixedly placed, the object-arm motion matrix between the observed object 30 and the arm spatial coordinates of the robotic arm 20 does not change regardless of how the imaging device 10 moves and rotates. The object contour calculation device 50 establishes the flange-image motion matrix between the imaging device 10 and the flange end 21 through hand-eye calibration.
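For this eye-in-hand configuration (camera rigidly mounted on the flange), OpenCV's `calibrateHandEye` can recover the camera-flange relation from paired robot poses and object/board poses. The sketch below is a minimal illustration under the assumption that the rotation and translation lists have already been collected at the sampling points; it is one possible implementation, not the calibration procedure specified by the patent.

```python
import numpy as np
import cv2

def hand_eye_calibration(R_flange2base, t_flange2base, R_object2cam, t_object2cam):
    """Eye-in-hand calibration: camera rigidly mounted on the flange.

    Inputs are lists of 3x3 rotations / 3x1 translations gathered at each
    sampling point: flange pose in the base frame (from the controller) and
    object/board pose in the camera frame (from PnP).
    """
    R_cam2flange, t_cam2flange = cv2.calibrateHandEye(
        R_flange2base, t_flange2base, R_object2cam, t_object2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    T_EC = np.eye(4)                      # camera frame expressed in the flange frame
    T_EC[:3, :3] = R_cam2flange
    T_EC[:3, 3] = t_cam2flange.ravel()
    return np.linalg.inv(T_EC)            # flange -> camera, as used in the text
```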
The object contour calculation device 50 multiplies the flange-image motion matrix, the arm-flange motion matrix, and the object-arm motion matrix to calculate the image-object motion matrix.
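One consistent way to write this product, inferred from the surrounding description with base frame B, flange E, camera C, and object O rather than quoted from the patent, is:

$$
{}^{C}T_{O} \;=\; {}^{C}T_{E}\;\bigl({}^{B}T_{E}\bigr)^{-1}\;{}^{B}T_{O}
$$

where ${}^{C}T_{E}$ is the flange-image motion matrix obtained from hand-eye calibration, ${}^{B}T_{E}$ is the flange pose read from the controller (its inverse carries the base frame into the flange frame), and ${}^{B}T_{O}$ is the object-arm motion matrix of the fixedly placed object. The result again maps object coordinates into the camera frame, so the same projection step as in the first embodiment can be reused.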
Next, the operation of the present invention is described. Please refer to Figure 7A and Figure 7B, which are flow charts of the second embodiment of the robotic-arm-based method for establishing three-dimensional object contour data of the present invention.
The robotic-arm-based method for establishing three-dimensional object contour data of the second embodiment disclosed by the present invention comprises the following steps:
First, the observed object is fixedly placed (step 701); next, the robotic arm has a flange end (step 702); next, the imaging device is fixed to the flange end and has an imaging range (step 703); next, the control device establishes a connection with the robotic arm to control the movement and rotation of the flange end of the robotic arm, so that the imaging device fixed to the flange end moves and rotates to different sampling points, with the observed object lying within the imaging range at each sampling point (step 704); next, the control device calculates the arm-flange motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm (step 705); next, the object contour calculation device establishes connections with the control device and the imaging device (step 706); next, the object contour calculation device receives the image parameters of the imaging device from the imaging device (step 707); next, the object contour calculation device establishes the flange-image motion matrix between the imaging device and the flange end (step 708); next, the object contour calculation device receives the object spatial coordinates of the observed object (step 709); next, the object contour calculation device receives the arm spatial coordinates and the arm-flange motion matrix from the control device (step 710); next, the object contour calculation device calculates the object-arm motion matrix from the arm spatial coordinates and the object spatial coordinates (step 711); next, the object contour calculation device multiplies the flange-image motion matrix, the arm-flange motion matrix, and the object-arm motion matrix to calculate the image-object motion matrix (step 712); finally, the object contour calculation device uses the image-object motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane (step 713).
In summary, in the present invention the object contour calculation device establishes the object-flange motion matrix between the observed object and the flange end, receives the image parameters of the imaging device from the imaging device, receives the arm spatial coordinates and the flange-arm motion matrix from the control device, calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates, multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to obtain the object-image motion matrix, and uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to compute the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby computing the contour image of the observed object on the two-dimensional plane.
This technical means solves the prior-art problem that establishing AI training data is too time-consuming, thereby achieving the technical effect of providing convenient establishment of three-dimensional object contour data.
Although the embodiments of the present invention are disclosed as above, the above content is not intended to directly limit the scope of patent protection of the present invention. Any person having ordinary skill in the art to which the present invention pertains may make slight changes in the form and details of implementation without departing from the spirit and scope disclosed by the present invention. The scope of patent protection of the present invention shall still be defined by the appended claims.
10: Imaging device
11: Imaging range
20: Robotic arm
21: Flange end
30: Observed object
31: Contour image
40: Control device
50: Object contour calculation device
Step 601: The imaging device has a fixed imaging range
Step 602: The robotic arm has a flange end
Step 603: The observed object is fixed to the flange end
Step 604: The control device establishes a connection with the robotic arm; the control device controls the movement and rotation of the flange end of the robotic arm, so that the observed object fixed to the flange end moves and rotates to different sampling points, each of which lies within the imaging range
Step 605: The control device calculates the flange-arm motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm
Step 606: The object contour calculation device establishes a connection with the control device
Step 607: The object contour calculation device establishes the internal parameters of the imaging device
Step 608: The object contour calculation device establishes the object-flange motion matrix between the observed object and the flange end
Step 609: The object contour calculation device receives the spatial coordinates of the imaging device
Step 610: The object contour calculation device receives the arm spatial coordinates and the flange-arm motion matrix from the control device
Step 611: The object contour calculation device calculates the arm-image motion matrix from the arm spatial coordinates and the spatial coordinates
Step 612: The object contour calculation device multiplies the object-flange motion matrix, the flange-arm motion matrix, and the arm-image motion matrix to calculate the object-image motion matrix
Step 613: The object contour calculation device uses the object-image motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane
Step 701: The observed object is fixedly placed
Step 702: The robotic arm has a flange end
Step 703: The imaging device is fixed to the flange end and has an imaging range
Step 704: The control device establishes a connection with the robotic arm to control the movement and rotation of the flange end of the robotic arm, so that the imaging device fixed to the flange end moves and rotates to different sampling points, with the observed object lying within the imaging range at each sampling point
Step 705: The control device calculates the arm-flange motion matrix from the flange end after movement and rotation to the arm spatial coordinates of the robotic arm
Step 706: The object contour calculation device establishes connections with the control device and the imaging device
Step 707: The object contour calculation device receives the image parameters of the imaging device from the imaging device
Step 708: The object contour calculation device establishes the flange-image motion matrix between the imaging device and the flange end
Step 709: The object contour calculation device receives the object spatial coordinates of the observed object
Step 710: The object contour calculation device receives the arm spatial coordinates and the arm-flange motion matrix from the control device
Step 711: The object contour calculation device calculates the object-arm motion matrix from the arm spatial coordinates and the object spatial coordinates
Step 712: The object contour calculation device multiplies the flange-image motion matrix, the arm-flange motion matrix, and the object-arm motion matrix to calculate the image-object motion matrix
Step 713: The object contour calculation device uses the image-object motion matrix together with the internal parameters of the imaging device and triangulation to calculate the projected coordinate point of each three-dimensional coordinate point of the observed object on a two-dimensional plane, thereby calculating the contour image of the observed object on the two-dimensional plane
Figure 1 is a system block diagram of the first embodiment of the robotic-arm-based three-dimensional object contour data establishment system of the present invention.
Figure 2 is a schematic diagram of the imaging range and the flange-end motion in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figure 3A is a schematic diagram of the three-dimensional observed object at the first sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figure 3B is a schematic diagram of the two-dimensional contour of the observed object at the first sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figure 4A is a schematic diagram of the three-dimensional observed object at the second sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figure 4B is a schematic diagram of the two-dimensional contour of the observed object at the second sampling point in the first embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figures 5A and 5B are flow charts of the first embodiment of the robotic-arm-based method for establishing three-dimensional object contour data of the present invention.
Figure 6 is a schematic diagram of the imaging range and the flange-end motion in the second embodiment of the robotic-arm-based three-dimensional object contour data establishment of the present invention.
Figures 7A and 7B are flow charts of the second embodiment of the robotic-arm-based method for establishing three-dimensional object contour data of the present invention.
10: Imaging device
20: Robotic arm
40: Control device
50: Object contour calculation device
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW112145558A (TWI845450B) | 2023-11-24 | 2023-11-24 | 3d object outline data establishment system based on robotic arm and method thereof |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| TWI845450B true TWI845450B (en) | 2024-06-11 |
| TW202522404A TW202522404A (en) | 2025-06-01 |
Family
ID=92541584
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW112145558A (TWI845450B) | 3d object outline data establishment system based on robotic arm and method thereof | 2023-11-24 | 2023-11-24 |
Country Status (1)
| Country | Link |
|---|---|
| TW (1) | TWI845450B (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI472711B (en) * | 2012-10-30 | 2015-02-11 | Ind Tech Res Inst | Method and device for measuring 3-d article without contacting |
| TWI493153B (en) * | 2014-04-08 | 2015-07-21 | Ind Tech Res Inst | Non-contact measurement device and method for object space information and the method thereof for computing the path from capturing the image |
| TWI566204B (en) * | 2014-10-28 | 2017-01-11 | 惠普發展公司有限責任合夥企業 | Three dimensional object recognition |
| US11045227B2 (en) * | 2007-12-18 | 2021-06-29 | Howmedica Osteonics Corporation | System and method for image segmentation, bone model generation and modification, and surgical planning |
2023
- 2023-11-24: Application TW112145558A filed in Taiwan (TW); patent TWI845450B, status: active
Also Published As
| Publication number | Publication date |
|---|---|
| TW202522404A (en) | 2025-06-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN104463880B (en) | A kind of RGB D image acquiring methods | |
| CN104835117B (en) | Generating method of spherical panorama based on overlapping method | |
| CN106308946B (en) | A kind of augmented reality devices and methods therefor applied to stereotactic surgery robot | |
| CN111199560B (en) | Video monitoring positioning method and video monitoring system | |
| CN107471218B (en) | A hand-eye coordination method for a dual-arm robot based on polyocular vision | |
| CN108648237B (en) | A Vision-Based Spatial Localization Method | |
| US20180066934A1 (en) | Three-dimensional measurement apparatus, processing method, and non-transitory computer-readable storage medium | |
| CN108436909A (en) | A kind of hand and eye calibrating method of camera and robot based on ROS | |
| CN106898022A (en) | A kind of hand-held quick three-dimensional scanning system and method | |
| WO2019146201A1 (en) | Information processing device, information processing method, and information processing system | |
| CN104647390B (en) | For the multiple-camera associating active tracing order calibration method of mechanical arm remote operating | |
| CN109360243B (en) | Calibration method of multi-degree-of-freedom movable vision system | |
| CN114536399A (en) | Error detection method based on multiple pose identifications and robot system | |
| CN113920191B (en) | 6D data set construction method based on depth camera | |
| WO2018209592A1 (en) | Movement control method for robot, robot and controller | |
| CN106940894A (en) | A kind of hand-eye system self-calibrating method based on active vision | |
| CN104739514A (en) | Automatic tracking and positioning method for surgical instrument in large visual field | |
| CN115972208A (en) | Object-following control method, mirror-holding robot, and computer-readable medium | |
| JP4825971B2 (en) | Distance calculation device, distance calculation method, structure analysis device, and structure analysis method. | |
| CN102385692B (en) | Human face deflection image acquiring system and method | |
| CN110445982A (en) | A kind of tracking image pickup method based on six degree of freedom equipment | |
| TWI845450B (en) | 3d object outline data establishment system based on robotic arm and method thereof | |
| CN116423495A (en) | PNP-based 3D positioning tracking method and system | |
| CN111179341A (en) | A registration method of augmented reality device and mobile robot | |
| CN114549641B (en) | A human hand-robot contact state detection system and method |