WO2013176525A1 - Camera registration method conferring augmented reality on a navigation system for a surgical operation - Google Patents
- Publication number
- WO2013176525A1 (PCT/KR2013/004594)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- spatial coordinates
- camera
- optical
- processor
- pattern board
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- The present invention relates to a camera registration method for augmented reality in a surgical navigation system and, more particularly, to a camera registration method for augmented reality in a surgical navigation system that can correct the spatial coordinates between the optical center of a camera and a marker mounted on the camera.
- Augmented reality (AR) is a technology that makes computer graphics (CG) coexist with the real world so that the graphics appear to exist in the real world.
- Augmented reality, a concept that complements the real world with a virtual one, uses a virtual environment made of computer graphics, but the real environment plays the leading role. In other words, a 3D virtual image is overlaid on the live image viewed by the user, providing additional information needed in the real environment.
- To address this, a navigation system or the like is used during surgery. Whereas conventional surgery relies on the surgeon's experience, navigation-assisted surgery is more accurate because it adds a computer-assisted verification process.
- AR augmented reality
- CT computed tomography
- MRI magnetic resonance imaging
- More accurate augmented reality can be implemented only when the coordinates of the optical center of the camera are corrected.
- FIG. 1 is a view for explaining a conventional method for correcting the coordinates of the optical center of the camera.
- Referring to FIG. 1, in the conventional method a worker manually holds a tool with a marker 140 attached against the pattern board 130, and the optical tracker 110 of the navigation system detects the coordinates O_pp of the marker 140.
- A processor (not shown) then detects the coordinates O_c of the optical center of the camera 120 using the coordinates O_cm of the marker 140 attached to the camera 120 and the coordinates O_pp of the marker 140 attached to the tool, thereby correcting the position and orientation of the optical center of the camera 120. Here, the tool with the marker 140 attached is photographed several times at different positions on the pattern board 130 to detect the coordinates of the marker 140 each time.
- Because this conventional method for detecting the optical center of the camera 120 detects the coordinates O_pp of the marker 140 by manually holding the tool against the pattern board 130, the error range of the augmented reality becomes very large. That is, the direction of the coordinate system cannot be recorded accurately each time the tool is placed on the pattern board 130 by hand, so the error of the augmented reality is very large.
- A camera registration method for augmented reality of a surgical navigation system includes a first step of, each time the spatial coordinates of first and second markers attached to the camera and the pattern board, respectively, and tracked by an optical tracker are changed a plurality of times, detecting and storing those spatial coordinates through a processor, and a second step of correcting, through the processor, the spatial coordinates of the optical center of the camera and storing them.
- Here, the second marker may be attached to a portion of the pattern board where the pattern is formed.
- Alternatively, the second marker may be attached to a portion of the pattern board where the pattern is not formed.
- A chessboard-type pattern or a circular-dot pattern may be formed on the pattern board.
- The first step may include: tracking the first and second markers with the optical tracker and detecting, through the processor, the spatial coordinates of the first and second markers from the spatial coordinates of the optical tracker; calculating the spatial coordinates of the pattern board origin from the spatial coordinates of the optical center of the camera using the image of the pattern board acquired by the camera; and, while changing the spatial coordinates of the first and second markers a plurality of times, executing the spatial-coordinate detection and calculation steps each time so that the spatial coordinates of the first and second markers from the spatial coordinates of the optical tracker, and the spatial coordinates of the pattern board origin from the spatial coordinates of the optical center of the camera, are detected.
- The step of calculating the spatial coordinates of the pattern board origin may include photographing the pattern board through the camera, transmitting the acquired image to the processor, and calculating, in the processor, the spatial coordinates of the pattern board origin from the spatial coordinates of the optical center of the camera through camera calibration using the acquired image.
- In the first step, the spatial coordinates of the first and second markers tracked by the optical tracker may be changed by moving the position of at least one of the optical tracker, the pattern board, and the camera at least four times.
- In the second step, the processor may calculate and correct the spatial coordinates of the optical center of the camera from the coordinates of the first marker by using the stored spatial coordinates of the first and second markers from the spatial coordinates of the optical tracker and the stored spatial coordinates of the pattern board origin from the spatial coordinates of the optical center of the camera.
- Alternatively, in the first step the spatial coordinates of the first and second markers tracked by the optical tracker may be changed by moving the position of at least one of the optical tracker, the pattern board, and the camera at least once.
- According to such a camera registration method for augmented reality of a surgical navigation system, after a first marker is attached to the camera and a second marker is attached to the pattern board, the spatial coordinates of the first marker or the second marker tracked by the optical tracker are changed at least once or at least four times to calculate the coordinates of the camera's optical center, so that the spatial coordinates of the camera's optical center can be corrected from the spatial coordinates of the second marker.
- FIG. 1 is a view for explaining a conventional method for correcting the coordinates of the optical center of a camera.
- FIG. 2 is a conceptual diagram illustrating a camera registration method for augmented reality of a surgical navigation system according to a first embodiment of the present invention.
- FIG. 3 is another exemplary view of a pattern board.
- FIG. 4 is a flowchart illustrating a camera registration method for augmented reality of a surgical navigation system according to a first embodiment of the present invention.
- FIG. 5 is a flowchart for explaining an operation S120.
- FIG. 6 is a flowchart for explaining an operation S122.
- FIG. 8 is a conceptual diagram illustrating a camera registration method for augmented reality of a surgical navigation system according to a second embodiment of the present invention.
- FIG. 9 is a flowchart illustrating a camera registration method for augmented reality of a surgical navigation system according to a second embodiment of the present invention.
- FIG. 10 is a flowchart for explaining an operation S220.
- FIG. 11 is a flowchart for explaining an operation S230.
- The terms "first" and "second" may be used to describe various components, but the components should not be limited by these terms; the terms are used only to distinguish one component from another.
- For example, a first component may be referred to as a second component and, similarly, a second component may be referred to as a first component.
- FIG. 2 is a conceptual diagram illustrating a camera registration method for augmented reality of a surgical navigation system according to a first embodiment of the present invention, FIG. 3 is another exemplary view of a pattern board, and FIG. 4 is a flowchart illustrating a camera registration method for augmented reality of a surgical navigation system according to the first embodiment of the present invention.
- In the camera registration method for augmented reality of the surgical navigation system according to the present embodiment, first and second markers 140 and 150 are attached to the camera 120 and the pattern board 130, respectively, and the spatial coordinates of the optical center of the camera 120 are calculated by changing, a plurality of times, the spatial coordinates of the first marker 140 or the second marker 150 as seen from the spatial coordinates of the optical tracker 110 that tracks them. Since the spatial coordinates of the optical center of the camera 120 can then be corrected from the spatial coordinates of the first marker 140, the error is smaller than in the conventional camera registration method for augmented reality, enabling more accurate augmented reality.
- Here, a chessboard-type pattern may be formed on the pattern board 130.
- The second marker 150 may be attached to a portion of the pattern board 130 where the chessboard pattern is formed.
- Alternatively, the second marker 150 may be attached to a portion of the pattern board 130 where the chessboard pattern is not formed. That is, as long as the light from the second marker 150 can be detected by the optical tracker 110, the second marker 150 may be attached anywhere on the pattern board 130 regardless of position.
- Besides the chessboard pattern, the pattern board 130 may carry a circular-dot pattern 131, as shown in FIG. 3.
- A polygonal pattern, such as a triangular or square pattern, is also applicable.
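Whatever the pattern type, its role in the calibration step described below is to supply feature points whose positions on the board are known in advance. A minimal illustrative sketch of generating such reference points for a chessboard-style board (the function name, grid size, and square size are assumptions, not taken from the patent):

```python
import numpy as np

def board_object_points(rows, cols, square_size):
    """Known (x, y, 0) positions of the board's inner corners, in the
    pattern board's own coordinate frame with the origin at one corner.
    Camera calibration matches these against the detected image corners."""
    pts = np.zeros((rows * cols, 3))
    grid = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2)  # x varies fastest
    pts[:, :2] = grid * square_size
    return pts

# e.g. a board with 9x6 inner corners and 25 mm squares
corners = board_object_points(rows=6, cols=9, square_size=0.025)
```

In a practical system a detector locates the corresponding corners in each camera image; the pattern merely has to make those correspondences unambiguous, which is why chessboard, circular-dot, and polygonal patterns are all acceptable.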
- First, while the spatial coordinates of the first marker 140 or the second marker 150 tracked by the optical tracker 110 are changed a plurality of times, the processor detects, each time, the spatial coordinates of the first and second markers 140 and 150 from the spatial coordinates of the optical tracker 110, calculates the spatial coordinates of the origin of the pattern board 130 from the spatial coordinates of the optical center of the camera 120, and stores them in the processor (S120).
- Here, the spatial coordinates of the first and second markers 140 and 150 tracked by the optical tracker 110 can be changed by moving the position of at least one of the optical tracker 110, the pattern board 130, and the camera 120; it is preferable to move the position of the optical tracker 110 or the camera 120.
- In addition, the spatial coordinates of the first and second markers 140 and 150 tracked by the optical tracker 110 may be changed at least four times. The reason will be described later in the detailed description of step S120.
- After the spatial coordinates of the first and second markers 140 and 150 are detected from the spatial coordinates of the optical tracker 110 and, at the same time, the spatial coordinates of the origin of the pattern board 130 are calculated from the spatial coordinates of the optical center of the camera 120 and stored in the processor, the processor corrects the spatial coordinates of the optical center of the camera 120 by using the stored spatial coordinates of the first and second markers 140 and 150 from the spatial coordinates of the optical tracker 110 and the stored spatial coordinates of the origin of the pattern board 130 from the spatial coordinates of the optical center of the camera 120, and stores the corrected coordinates in the processor (S130).
- FIG. 5 is a flowchart for explaining an operation S120 and FIG. 6 is a flowchart for describing an operation S122.
- In step S120, the optical tracker 110 first tracks the first and second markers 140 and 150, and the processor first detects the spatial coordinates of the first and second markers 140 and 150 from the spatial coordinates of the optical tracker 110 and stores them in the processor (S121).
- Then, using the image of the pattern board 130 acquired by the camera 120, the processor first calculates the spatial coordinates of the origin of the pattern board 130 from the spatial coordinates of the optical center of the camera 120 and stores them in the processor (S122).
- Next, the spatial coordinates of the first and second markers 140 and 150 as seen from the spatial coordinates of the optical tracker 110 are changed a plurality of times, that is, at least four times, and steps S121 and S122 are executed each time, so that the spatial coordinates of the first and second markers 140 and 150 from at least four spatial coordinates of the optical tracker 110, and the spatial coordinates of the origin of the pattern board 130 from at least four optical-center spatial coordinates of the camera 120, are detected and sequentially stored in the processor (S123).
- Accordingly, the processor stores the spatial coordinates of the first and second markers 140 and 150 from the spatial coordinates of the optical tracker 110 first detected in step S121, the spatial coordinates of the origin of the pattern board 130 from the spatial coordinates of the optical center of the camera 120 first calculated in step S122, and the spatial coordinates of the first and second markers 140 and 150 from the at least four spatial coordinates of the optical tracker 110 and the spatial coordinates of the origin of the pattern board 130 from the at least four optical-center spatial coordinates of the camera 120 detected in step S123.
- That is, the processor stores the spatial coordinates of the first and second markers 140 and 150 from at least five spatial coordinates of the optical tracker 110, and the spatial coordinates of the origin of the pattern board 130 from at least five optical-center spatial coordinates of the camera 120.
- In step S122, an image of the pattern board 130 is first photographed through the camera 120 (S1220).
- Then, in the processor, the spatial coordinates of the origin of the pattern board 130 are calculated from the spatial coordinates of the optical center of the camera 120 through calibration of the camera 120 using the image acquired by the camera 120 (S1222).
- Here, the spatial coordinates of the origin of the pattern board 130 can be calculated from the spatial coordinates of the optical center of the camera 120 through camera calibration by the commonly used method of Zhang.
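Zhang's method first estimates the homography H between the planar board and its image and then, using the camera's intrinsic matrix, decomposes H = s·K·[r1 r2 t] into the rotation and translation of the board origin relative to the optical center. The following numpy sketch illustrates that decomposition on synthetic, noise-free data; K, the pose, and the point grid are illustrative assumptions, and a real system would use an established calibration library and model lens distortion:

```python
import numpy as np

def homography_dlt(board_xy, img_uv):
    """Direct linear transform mapping planar board points (x, y) to pixels (u, v)."""
    A = []
    for (x, y), (u, v) in zip(board_xy, img_uv):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)  # null vector of A, defined up to scale

def board_pose_from_homography(H, K):
    """Zhang-style decomposition H = s*K*[r1 r2 t]: pose of the pattern-board
    origin expressed in the coordinates of the camera's optical center."""
    B = np.linalg.inv(K) @ H
    if B[2, 2] < 0:              # the board must lie in front of the camera
        B = -B
    s = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = s * B[:, 0], s * B[:, 1], s * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)  # re-orthogonalize against numerical drift
    return U @ Vt, t

# Synthetic check: project known board corners with a known pose, then recover it.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
a = 0.1
R_true = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
t_true = np.array([0.05, -0.02, 1.2])
board_xy = [(0.03 * i, 0.03 * j) for i in range(6) for j in range(5)]
img_uv = []
for x, y in board_xy:
    u, v, w = K @ (R_true @ np.array([x, y, 0.0]) + t_true)
    img_uv.append((u / w, v / w))
R_est, t_est = board_pose_from_homography(homography_dlt(board_xy, img_uv), K)
```

The recovered (R_est, t_est) is what the text calls "the spatial coordinates of the origin of the pattern board from the spatial coordinates of the optical center of the camera".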
- In step S130, first, the spatial coordinates of the optical center of the camera 120 from the first marker 140 and the spatial coordinates of the origin of the pattern board 130 from the second marker 150 are calculated using the spatial coordinates of the first and second markers 140 and 150 detected from the spatial coordinates of the optical tracker 110 and stored in the processor (S131).
- In more detail, the processor calculates the spatial coordinates of the optical center of the camera 120 from the first marker 140 and the spatial coordinates of the origin of the pattern board 130 from the second marker 150 by using the spatial coordinates of the first and second markers 140 and 150 from the one spatial coordinate of the optical tracker 110 stored in step S121, the spatial coordinates of the origin of the pattern board 130 from the spatial coordinates of the optical center of the camera 120 stored in step S122, and the corresponding spatial coordinates from the at least four spatial coordinates of the optical tracker 110 stored in step S123.
- That is, the processor calculates the spatial coordinates of the optical center of the camera 120 from the first marker 140 and the spatial coordinates of the origin of the pattern board 130 from the second marker 150 by using the stored spatial coordinates of the first and second markers 140 and 150 from the at least five spatial coordinates of the optical tracker 110 and the stored spatial coordinates of the origin of the pattern board 130 from the at least five optical-center spatial coordinates of the camera 120.
- Then, the processor corrects the spatial coordinates of the optical center of the camera 120 by using the spatial coordinates of the optical center of the camera 120 from the first marker 140 and the spatial coordinates of the origin of the pattern board 130 from the second marker 150 (S132).
- Then, the spatial coordinates of the optical center of the camera 120 corrected by the processor are stored in the processor (S133).
- In more detail, the spatial coordinates of the optical center of the camera 120 can be obtained along two paths: from the optical tracker 110 through the first marker 140 (first path), and from the optical tracker 110 sequentially through the second marker 150 and the origin of the pattern board 130 (second path).
- In step S132, using the spatial coordinates of the optical center of the camera 120 obtained through the first path and through the second path, the processor calculates and corrects the spatial coordinates of the optical center of the camera 120 from the coordinates of the first marker 140.
- Hereinafter, the reason why the spatial coordinates of the origin of the pattern board 130 are needed in addition to the spatial coordinates of the optical center will be described, that is, the reason why the spatial coordinates of the first and second markers 140 and 150 tracked by the optical tracker 110 are changed at least four times in step S120.
- When the origin of the optical tracker 110 in world coordinates is denoted O_p, the origin of the first marker 140 attached to the camera 120 is denoted O_cm, the origin of the second marker 150 attached to the pattern board 130 is denoted O_pm, and the origin of the pattern board 130 is denoted O_pp, the spatial coordinates T_p->cm of the first marker 140 translated from the spatial coordinates of the optical tracker 110 may be expressed as Equation 1 below.
- Here, T_a represents the translational relation between the spatial coordinates of the optical tracker 110 and the spatial coordinates of the first marker 140; since Equation 1 shows this correlation of distances between spatial coordinates, its description will be omitted for the following equations.
- Next, the spatial coordinates T_cm->c of the optical center of the camera 120 translated from the spatial coordinates of the first marker 140 may be expressed as Equation 2 below.
- That is, the spatial coordinates of the optical center of the camera 120 translated from the spatial coordinates of the optical tracker 110 through the origin of the first marker 140, in other words the spatial coordinates (T_p->cm T_cm->c) of the optical center of the camera 120 translated from the spatial coordinates of the optical tracker 110 through the first path, may be expressed as Equation 3 below.
- Meanwhile, the spatial coordinates T_p->pm of the second marker 150 attached to the pattern board 130, translated from the spatial coordinates of the optical tracker 110, may be expressed as Equation 4 below.
- The spatial coordinates T_pm->pp of the origin of the pattern board 130 translated from the spatial coordinates of the second marker 150 may be expressed as Equation 5 below.
- The spatial coordinates T_pp->c of the optical center of the camera 120 translated from the spatial coordinates of the origin of the pattern board 130 may be expressed as Equation 6 below.
- That is, the spatial coordinates of the optical center of the camera 120 translated from the spatial coordinates of the optical tracker 110 through the origin of the second marker 150 and the origin of the pattern board 130, in other words the spatial coordinates (T_pp->c T_pm->pp T_p->pm) of the optical center of the camera 120 translated from the spatial coordinates of the optical tracker 110 through the second path, may be expressed as Equation 7.
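In homogeneous 4x4 form, Equations 3 and 7 describe the same physical point, the camera's optical center, reached along two different chains, so the two compositions must agree; that loop-closure constraint is what the correction step exploits. A numpy sketch with illustrative names (the left-to-right composition order assumed here may differ from the patent's notation):

```python
import numpy as np

rng = np.random.default_rng(0)

def rt(R, t):
    """Homogeneous 4x4 rigid transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def random_pose():
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random proper rotation
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]
    return rt(Q, rng.normal(size=3))

T_p_cm = random_pose()   # tracker origin O_p -> first marker O_cm (measured)
T_cm_c = random_pose()   # first marker O_cm -> optical center (fixed unknown)
T_p_pm = random_pose()   # tracker origin O_p -> second marker O_pm (measured)
T_pm_pp = random_pose()  # second marker O_pm -> board origin O_pp (fixed unknown)

# First path (Equation 3): tracker -> camera marker -> optical center.
T_first = T_p_cm @ T_cm_c
# T_pp_c is what Zhang's calibration measures; in a consistent geometry it is
# the transform that closes the loop from the board origin to the optical center.
T_pp_c = np.linalg.inv(T_p_pm @ T_pm_pp) @ T_first
# Second path (Equation 7): tracker -> board marker -> board origin -> optical center.
T_second = T_p_pm @ T_pm_pp @ T_pp_c
```

The tracker measures T_p_cm and T_p_pm anew each time the markers move and the camera supplies T_pp_c, while T_cm_c and T_pm_pp stay fixed; those two fixed transforms are what the method solves for.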
- Here, R_1 R_2 may be replaced using Equation 8 to give Equation 10.
- Substituting R_a R_b R_3^-1 in place of R_1 R_2 into Equation 9 gives Equation 11 below.
- This may in turn be expressed as Equation 12.
- Here, R_b, T_A, T_b, R_A, T_2, and T_K may be expressed as in Equations 13 to 18, respectively.
- Accordingly, Equation 12 may be expressed as Equation 19.
- Here, the unknown parameters to be calculated are the fifteen values R_11_b, R_12_b, R_13_b, R_21_b, R_22_b, R_23_b, R_31_b, R_32_b, R_33_b, T_1-b, T_2-b, T_3-b, T_1-2, T_2-2, and T_3-2, and three equations are generated per configuration. Since at least five configurations are therefore needed to obtain fifteen equations, the system can be solved by moving at least one of the optical tracker 110, the camera 120, and the pattern board 130 at least four times.
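Equating the two paths at each of the at least five configurations yields constraints of the form A_i X = Z B_i, where A_i is the measured board-marker-to-camera-marker transform, B_i the board-origin-to-optical-center transform from camera calibration, X the sought camera-marker-to-optical-center transform, and Z the sought board-marker-to-board-origin transform. This is the standard robot-world/hand-eye form; the sketch below solves it linearly on synthetic data (the Kronecker-product formulation is a common technique substituted here for the patent's fifteen-unknown scalar system, equivalent in spirit but not necessarily in detail):

```python
import numpy as np

rng = np.random.default_rng(42)

def random_pose():
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random proper rotation
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = Q, rng.normal(size=3)
    return T

def nearest_rotation(M):
    U, _, Wt = np.linalg.svd(M)
    return U @ Wt

def solve_ax_zb(As, Bs):
    """Solve A_i X = Z B_i for the fixed 4x4 transforms X and Z.

    Rotations: stack (I3 kron R_Ai) vec(R_X) - (R_Bi^T kron I3) vec(R_Z) = 0
    and take the SVD null vector.  Translations: solve the linear system
    R_Ai t_X - t_Z = R_Z t_Bi - t_Ai by least squares."""
    I3 = np.eye(3)
    M = np.vstack([np.hstack([np.kron(I3, A[:3, :3]), -np.kron(B[:3, :3].T, I3)])
                   for A, B in zip(As, Bs)])
    v = np.linalg.svd(M)[2][-1]
    RX = v[:9].reshape(3, 3, order="F")
    RZ = v[9:].reshape(3, 3, order="F")
    if np.linalg.det(RX) < 0:            # fix the overall sign of the null vector
        RX, RZ = -RX, -RZ
    RX, RZ = nearest_rotation(RX), nearest_rotation(RZ)  # remove the scale
    C = np.vstack([np.hstack([A[:3, :3], -I3]) for A in As])
    d = np.hstack([RZ @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    sol = np.linalg.lstsq(C, d, rcond=None)[0]
    X, Z = np.eye(4), np.eye(4)
    X[:3, :3], X[:3, 3] = RX, sol[:3]
    Z[:3, :3], Z[:3, 3] = RZ, sol[3:]
    return X, Z

# Five configurations: the initial pose plus at least four changes, as in step S120.
X_true, Z_true = random_pose(), random_pose()
As = [random_pose() for _ in range(5)]
Bs = [np.linalg.inv(Z_true) @ A @ X_true for A in As]
X_est, Z_est = solve_ax_zb(As, Bs)
```

With noise-free synthetic measurements the recovered X and Z match the ground truth; with real tracker and calibration noise, more configurations and a nonlinear refinement would typically follow this linear initialization.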
- That is, the spatial coordinates of the first and second markers 140 and 150 tracked by the optical tracker 110 are changed at least four times so that the processor stores the spatial coordinates of the first and second markers 140 and 150 from at least five spatial coordinates of the optical tracker 110 and the spatial coordinates of the origin of the pattern board 130 from at least five optical-center spatial coordinates of the camera 120; the spatial coordinates of the optical center of the camera 120 can then be corrected by calculating, through the processor, the spatial coordinates of the optical center of the camera 120 from the first marker 140 and the spatial coordinates of the origin of the pattern board 130 from the second marker 150.
- As described above, in the camera registration method for augmented reality of the surgical navigation system according to the present embodiment, after the second marker 150 is attached to the pattern board 130, the coordinates of the optical center of the camera 120 are calculated by changing the spatial coordinates of the first and second markers 140 and 150 tracked by the optical tracker 110 at least four times, so that the spatial coordinates of the optical center of the camera 120 can be corrected from the spatial coordinates of the second marker 150.
- In other words, in this camera registration method for augmented reality of a surgical navigation system, the second marker 150 is not held against the pattern board 130 by hand but is fixed to the pattern board 130, and the coordinates of the optical center of the camera 120 can be calculated and corrected in this state, so a single person can perform the work. At the same time, since the spatial coordinates of the second marker 150 fixed to the pattern board 130 do not vary, cumulative errors due to the spatial coordinates of the second marker 150 do not occur, making it possible to implement more accurate augmented reality.
- FIG. 8 is a conceptual diagram illustrating a camera registration method for augmented reality of a surgical navigation system according to a second embodiment of the present invention, FIG. 9 is a flowchart illustrating a camera registration method for augmented reality of a surgical navigation system according to the second embodiment of the present invention, and FIG. 10 is a flowchart for explaining an operation S220.
- Since the camera registration method for augmented reality of the surgical navigation system according to the present embodiment is substantially the same as the camera registration method according to the first embodiment except for steps S220 and S230, description of everything other than steps S220 and S230 will be omitted.
- In step S220, the optical tracker 210 first tracks the first and second markers 240 and 250, and the processor first detects the spatial coordinates of the first and second markers 240 and 250 from the spatial coordinates of the optical tracker 210 and stores them in the processor (S221).
- Then, using the image of the pattern board 230 acquired by the camera 220, the processor first calculates the spatial coordinates of the origin of the pattern board 230 from the spatial coordinates of the optical center of the camera 220 and stores them in the processor (S222).
- Next, the spatial coordinates of the first and second markers 240 and 250 as seen from the spatial coordinates of the optical tracker 210 are changed a plurality of times, that is, at least once, and steps S221 and S222 are executed each time, so that the spatial coordinates of the first and second markers 240 and 250 from at least one spatial coordinate of the optical tracker 210, and the spatial coordinates of the origin of the pattern board 230 from at least one optical-center spatial coordinate of the camera 220, are detected and sequentially stored in the processor (S223).
- Accordingly, the processor stores the spatial coordinates of the first and second markers 240 and 250 from the spatial coordinates of the optical tracker 210 first detected in step S221, the spatial coordinates of the origin of the pattern board 230 from the spatial coordinates of the optical center of the camera 220 first calculated in step S222, and those detected in step S223.
- That is, the processor stores the spatial coordinates of the first and second markers 240 and 250 from at least two spatial coordinates of the optical tracker 210, and the spatial coordinates of the origin of the pattern board 230 from at least two optical-center spatial coordinates of the camera 220.
- FIG. 11 is a flowchart for explaining an operation S230.
- In step S230, first, the spatial coordinates of the optical center of the camera 220 from the first marker 240 and the spatial coordinates of the origin of the pattern board 230 from the second marker 250 are calculated using the spatial coordinates of the first and second markers 240 and 250 detected from the spatial coordinates of the optical tracker 210 and stored in the processor (S231).
- In more detail, the processor calculates the spatial coordinates of the optical center of the camera 220 from the first marker 240 and the spatial coordinates of the origin of the pattern board 230 from the second marker 250 by using the spatial coordinates of the first and second markers 240 and 250 from the one spatial coordinate of the optical tracker 210 stored in step S221, the spatial coordinates of the origin of the pattern board 230 from the spatial coordinates of the optical center of the camera 220 stored in step S222, and the corresponding spatial coordinates from the at least one spatial coordinate of the optical tracker 210 stored in step S223.
- That is, the processor calculates the spatial coordinates of the optical center of the camera 220 from the first marker 240 and the spatial coordinates of the origin of the pattern board 230 from the second marker 250 by using the stored spatial coordinates of the first and second markers 240 and 250 from the at least two spatial coordinates of the optical tracker 210 and the stored spatial coordinates of the origin of the pattern board 230 from the at least two optical-center spatial coordinates of the camera 220.
- Then, the processor corrects the spatial coordinates of the optical center of the camera 220 by using the spatial coordinates of the optical center of the camera 220 from the first marker 240 and the spatial coordinates of the origin of the pattern board 230 from the second marker 250 (S232).
- Then, the spatial coordinates of the optical center of the camera 220 corrected by the processor are stored in the processor (S233).
- In more detail, the spatial coordinates of the optical center of the camera 220 can be obtained along two paths: from the optical tracker 210 through the first marker 240 (first path), and from the optical tracker 210 sequentially through the second marker 250 and the origin of the pattern board 230 (second path).
- In step S232, using the spatial coordinates of the optical center of the camera 220 obtained through the first path and through the second path, the processor calculates and corrects the spatial coordinates of the optical center of the camera 220 from the coordinates of the first marker 240.
- the reason why the spatial coordinates of the origin of the pattern board 230 is needed from the spatial coordinates is described. That is, the reason why the spatial coordinates of the first and second markers 240 and 250 tracked by the optical tracker 210 are changed at least once in step S220.
- when the origin of the optical tracker 210 in world space coordinates is referred to as O_p, the origin of the first marker 240 attached to the camera 220 as O_cm, the origin of the second marker 250 attached to the pattern board 230 as O_pm, and the origin of the pattern board 230 as O_pp, the spatial coordinates T_p->cm of the first marker 240 attached to the camera 220, transformed from the spatial coordinates of the optical tracker 210, may be expressed by Equation 20.
- here, T denotes the correlation of the distance between the spatial coordinates of the optical tracker 210 and the spatial coordinates of the first marker 240; its detailed description is omitted in the following equations.
- the spatial coordinates T_cm->c of the optical center of the camera 220, transformed from the spatial coordinates of the first marker 240, may be expressed by Equation 21.
- accordingly, the spatial coordinates T_p->cm T_cm->c of the optical center of the camera 220, transformed from the spatial coordinates of the optical tracker 210 through the first path, that is, through the origin of the first marker 240, may be expressed by Equation 22.
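The first-path composition above can be sketched with 4x4 homogeneous transforms; all matrix values below are illustrative assumptions, not the patent's Equation 22:

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed example poses: first marker in tracker coordinates (T_p->cm)
# and camera optical center in first-marker coordinates (T_cm->c).
T_p_cm = transform(np.eye(3), np.array([100.0, 0.0, 0.0]))
T_cm_c = transform(np.eye(3), np.array([0.0, 20.0, 5.0]))

# First path: optical tracker -> first marker -> camera optical center.
T_p_c = T_p_cm @ T_cm_c

# Position of the optical center in tracker space.
print(T_p_c[:3, 3].tolist())  # -> [100.0, 20.0, 5.0]
```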
- similarly, the spatial coordinates T_p->pm of the second marker 250 attached to the pattern board 230, transformed from the spatial coordinates of the optical tracker 210, may be expressed by Equation 23.
- the spatial coordinates T_pm->pp of the origin of the pattern board 230, transformed from the spatial coordinates of the second marker 250, may be expressed by Equation 24.
- the spatial coordinates T_pp->c of the optical center of the camera 220, transformed from the spatial coordinates of the origin of the pattern board 230, may be expressed by Equation 25.
- accordingly, the spatial coordinates T_pp->c T_pm->pp T_p->pm of the optical center of the camera 220, transformed from the spatial coordinates of the optical tracker 210 through the second path, that is, through the origin of the second marker 250 and the origin of the pattern board 230, are expressed by Equation 26.
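The equality of the two paths at the camera's optical center is what makes the correction possible; the following sketch verifies that agreement on synthetic poses (all names and values are assumptions, not the patent's equations):

```python
import numpy as np

def transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(deg):
    """Rotation about the z axis by the given angle in degrees."""
    a = np.radians(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed ground-truth pose of the camera optical center in tracker space.
T_p_c = transform(rot_z(30), np.array([100.0, 20.0, 5.0]))

# First path: tracker -> first marker -> optical center.
T_p_cm = transform(rot_z(10), np.array([90.0, 10.0, 0.0]))
T_cm_c = np.linalg.inv(T_p_cm) @ T_p_c            # marker -> optical center

# Second path: tracker -> second marker -> pattern-board origin -> optical center.
T_p_pm = transform(rot_z(-20), np.array([50.0, 200.0, 0.0]))
T_pm_pp = transform(np.eye(3), np.array([5.0, 5.0, 0.0]))
T_pp_c = np.linalg.inv(T_p_pm @ T_pm_pp) @ T_p_c  # board origin -> optical center

path1 = T_p_cm @ T_cm_c
path2 = T_p_pm @ T_pm_pp @ T_pp_c

# Both compositions recover the same optical-center pose.
print(np.allclose(path1, path2))  # -> True
```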
- equating the first path and the second path results in Equation 27 and Equation 28.
- by Equation 27, R_2 R_3 may be replaced as in Equation 29.
- this is expressed as Equation 30.
- R_D R_b may be represented as Equation 31.
- R_2 R_3 may be represented as Equation 32.
- Equation 30 may be expressed as Equation 33.
- in Equation 33, the unknown parameters to be calculated are the 18 components R_11_b, R_12_b, R_13_b, R_21_b, R_22_b, R_23_b, R_31_b, R_32_b, R_33_b, r_11-2, r_12-2, r_13-2, r_21-2, r_22-2, r_23-2, r_31-2, r_32-2, r_33-2, while one configuration of the optical tracker 210, the camera 220, and the pattern board 230 generates 9 equations; the equations can therefore be solved by changing the configuration at least once, that is, by moving at least one of the camera 220 and the pattern board 230 at least one time.
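One way a linear system of this kind can be solved is sketched below: the 9 equations contributed by each configuration are stacked into a homogeneous system whose null vector, found by SVD, holds the two unknown rotation blocks. The constraint form A_i X = Y B_i, the stand-in rotation names, and the use of three configurations (which makes the null space generically one-dimensional) are assumptions of the sketch, not the patent's Equation 33:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation(rng):
    """Random rotation matrix via QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q *= np.sign(np.diag(r))          # make the decomposition unique
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]            # ensure det = +1
    return q

# Hidden ground-truth rotations standing in for the two unknown blocks.
X_true = random_rotation(rng)
Y_true = random_rotation(rng)

# Each configuration i contributes a constraint A_i @ X = Y @ B_i
# (9 scalar equations), as if measured by the tracker.
configs = []
for _ in range(3):                    # three poses of camera / pattern board
    A = random_rotation(rng)
    B = Y_true.T @ A @ X_true         # consistent synthetic "measurement"
    configs.append((A, B))

# Vectorize with column-stacking: vec(A X) = (I kron A) vec(X) and
# vec(Y B) = (B^T kron I) vec(Y); stack into M @ [vec(X); vec(Y)] = 0.
I3 = np.eye(3)
M = np.vstack([np.hstack([np.kron(I3, A), -np.kron(B.T, I3)])
               for A, B in configs])

# The solution is the null vector of M (smallest right singular vector).
v = np.linalg.svd(M)[2][-1]
X_est = v[:9].reshape(3, 3, order="F")
X_est *= np.sqrt(3) / np.linalg.norm(X_est)   # rotations have Frobenius norm sqrt(3)
if np.linalg.det(X_est) < 0:
    X_est = -X_est                            # fix the overall sign ambiguity

print(np.allclose(X_est, X_true, atol=1e-6))  # -> True
```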
- accordingly, the spatial coordinates of the first and second markers 240 and 250 tracked by the optical tracker 210 are changed at least once, so that, from at least two sets of spatial coordinates of the first and second markers 240 and 250 and of the optical center of the camera 220, the processor can calculate the spatial coordinates of the optical center of the camera 220 from the first marker 240 and the spatial coordinates of the origin of the pattern board 230 from the second marker 250, and thereby correct the spatial coordinates of the optical center of the camera 220.
- as described above, in the registration method of the camera 220 for augmented reality of the surgical navigation system, the second marker 250 is attached to the pattern board 230 and tracked by the optical tracker 210, and the spatial coordinates of the first and second markers 240 and 250 are changed at least twice while the coordinates of the optical center of the camera 220 are calculated, so that the spatial coordinates of the optical center of the camera 220 can be corrected from the spatial coordinates of the second marker 250.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Theoretical Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Biomedical Technology (AREA)
- General Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Length Measuring Devices By Optical Means (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/402,370 US9773312B2 (en) | 2012-05-25 | 2013-05-27 | Method of registrating a camera of a surgical navigation system for an augmented reality |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR20120055913 | 2012-05-25 | ||
| KR10-2012-0055913 | 2012-05-25 | ||
| KR20130059435A KR101427730B1 (ko) | 2012-05-25 | 2013-05-27 | 수술용 내비게이션 시스템의 증강현실을 위한 카메라 레지스트레이션 방법 |
| KR10-2013-0059435 | 2013-05-27 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013176525A1 true WO2013176525A1 (fr) | 2013-11-28 |
Family
ID=49624132
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2013/004594 Ceased WO2013176525A1 (fr) | 2012-05-25 | 2013-05-27 | Procédé de repérage de caméra conférant une réalité augmentée à un système de navigation pour une intervention chirurgicale |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2013176525A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20070050878A (ko) * | 2004-05-28 | 2007-05-16 | 내셔널 유니버시티 오브 싱가포르 | 양방향 시스템 및 그에 관한 방법 |
| JP2010287174A (ja) * | 2009-06-15 | 2010-12-24 | Dainippon Printing Co Ltd | 家具シミュレーション方法、装置、プログラム、記録媒体 |
| KR20110006360A (ko) * | 2009-07-14 | 2011-01-20 | 한국생산기술연구원 | 3차원 입체 컬러 영상 획득을 위한 영상 정합 방법 및 장치 |
| JP2011224266A (ja) * | 2010-04-22 | 2011-11-10 | Tokyo Univ Of Agriculture & Technology | 超音波診断システム及び超音波診断・治療システム |
2013
- 2013-05-27 WO PCT/KR2013/004594 patent/WO2013176525A1/fr not_active Ceased
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10788672B2 (en) | 2016-03-01 | 2020-09-29 | Mirus Llc | Augmented visualization during surgery |
| US11275249B2 (en) | 2016-03-01 | 2022-03-15 | Mirus Llc | Augmented visualization during surgery |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2016043560A1 (fr) | Système de suivi optique, et procédé de correspondance de coordonnées pour système de suivi optique | |
| KR101427730B1 (ko) | 수술용 내비게이션 시스템의 증강현실을 위한 카메라 레지스트레이션 방법 | |
| WO2019132427A1 (fr) | Appareil de projection de cible laser et son procédé de commande, et système d'induction de chirurgie au laser comprenant un appareil de projection de cible laser | |
| WO2019132614A1 (fr) | Procédé et appareil de segmentation d'image chirurgicale | |
| WO2015183049A1 (fr) | Système de poursuite optique et procédé de calcul de posture de partie marqueur dans un système de poursuite optique | |
| EP2943835A1 (fr) | Afficheur facial réalisant un étalonnage du regard, et procédé de commande associé | |
| JP2004004043A (ja) | カメラ補正装置 | |
| WO2021153973A1 (fr) | Dispositif de fourniture d'informations de chirurgie robotique de remplacement d'articulation et son procédé de fourniture | |
| WO2019135437A1 (fr) | Robot de guidage et son procédé de fonctionnement | |
| WO2010058927A2 (fr) | Dispositif pour photographier un visage | |
| WO2017073924A1 (fr) | Système et procédé pour déterminer l'emplacement d'un appareil de travail sous-marin à l'aide de la ligne de soudure d'une structure sous-marine | |
| JP7359633B2 (ja) | ロボットシステム | |
| WO2018214147A1 (fr) | Procédé et système d'étalonnage de robot, robot et support de stockage | |
| WO2011084012A2 (fr) | Procédé d'estimation et de correction de localisation par satellite d'un robot mobile utilisant des points de repère magnétiques | |
| WO2024096691A1 (fr) | Procédé et dispositif d'estimation de coordonnées gps de multiples objets cibles et de suivi d'objets cibles sur la base d'informations d'image de caméra concernant un véhicule aérien sans pilote | |
| WO2011065697A2 (fr) | Dispositif robotique permettant de mesurer des ondes de pouls à l'aide d'un télémètre à laser et procédé de mesure d'ondes de pouls utilisant ce dispositif | |
| WO2020075954A1 (fr) | Système et procédé de positionnement utilisant une combinaison de résultats de reconnaissance d'emplacement basée sur un capteur multimodal | |
| WO2013176525A1 (fr) | Procédé de repérage de caméra conférant une réalité augmentée à un système de navigation pour une intervention chirurgicale | |
| WO2021215843A1 (fr) | Procédé de détection de marqueur d'image buccale, et dispositif d'adaptation d'image buccale et procédé utilisant celui-ci | |
| WO2018182053A1 (fr) | Procédé et appareil d'acquisition d'informations sur la forme d'un objet | |
| WO2019074201A1 (fr) | Dispositif d'imagerie à rayons x, détecteur de rayons x et système d'imagerie à rayons x | |
| WO2011034343A2 (fr) | Procédé de mesure de la quantité physique d'un objet à l'aide d'une source lumineuse unique et d'une unité de capteur à surface plate et système de golf virtuel utilisant ledit procédé | |
| WO2020235784A1 (fr) | Procédé et dispositif de détection de nerf | |
| WO2025053337A1 (fr) | Procédé et système de planification de chirurgie automatique basée sur une image 2d | |
| WO2022014805A1 (fr) | Procédé de construction de plate-forme mobile d'espace virtuel par le biais d'une estimation de coordonnées 3d par covariance croisée |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13793892; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 14402370; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 13793892; Country of ref document: EP; Kind code of ref document: A1 |