WO2014100950A1 - Three-dimensional imaging system and handheld scanning device for three-dimensional imaging - Google Patents
- Publication number
- WO2014100950A1 (PCT/CN2012/087331)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scanning device
- handheld scanning
- handheld
- data processing
- placement information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0082—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
- A61B5/0088—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes for oral or dental tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0062—Arrangements for scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0073—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by tomography, i.e. reconstruction of 3D images from 2D projections
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/06—Devices, other than using radiation, for detecting or locating foreign bodies ; Determining position of diagnostic devices within or on the body of the patient
- A61B5/065—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
- A61B5/067—Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe using accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/04—Constructional details of apparatus
- A61B2560/0431—Portable apparatus, e.g. comprising a handle or case
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1077—Measuring of profiles
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/107—Measuring physical dimensions, e.g. size of the entire body or parts thereof
- A61B5/1079—Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
Definitions
- THREE-DIMENSIONAL IMAGING SYSTEM AND HANDHELD SCANNING DEVICE FOR THREE-DIMENSIONAL IMAGING
- This invention relates to three-dimensional (3D) imaging of physical objects. More specifically, it relates to 3D imaging using a handheld camera device.
- 3D imaging technology has been playing an increasingly important role for a wide range of applications, including the production of movies and video games, computer-aided industrial design, orthotics and prosthetics, reverse engineering and prototyping, quality control and inspection, documentation of cultural artifacts, and the like.
- Electronic handheld 3D camera products configured with imaging optics have been developed. Due to their portability, such camera devices can be easily handled by the user to analyze a real-world object or environment as needed.
- A 3D camera product is designed to collect data on the shape of the object, and possibly its appearance, which are then recorded as data points in three-dimensional space. Once a point cloud of geometric samples on the surface of the subject has been obtained, these points can be used to extrapolate the shape of the subject (a process called reconstruction), for example by converting them into a triangulated mesh and then into a computer-aided design model. In most situations, multiple scans from many different directions are required to produce a complete model of the object. These scans have to be brought into a common reference system, a process usually called alignment or registration, and then merged to create a complete model.
- a three-dimensional (3D) imaging system comprising: a data processing unit; a handheld scanning device for acquiring a plurality of two-dimensional (2D) image sets of an object from a plurality of perspectives, one 2D image set corresponding to one perspective; wherein the handheld scanning device is equipped with an assistant sensing unit which is configured to provide placement information of the handheld scanning device, and wherein said placement information is used by the data processing unit to control the acquisition of 2D image sets and the stitching of the 3D views converted from each 2D image set so as to obtain a global surface of the object.
- said assistant sensing unit comprises at least one sensor adapted to provide signals related to at least one degree of freedom of the handheld scanning device when each 2D image set is acquired; and a sensor controller adapted to generate said placement information from the signals.
- the placement information of the handheld scanning device is generated as the position and orientation of the image sensing unit provided inside the handheld scanning device when each 2D image set is acquired.
- said at least one sensor is selected from the group consisting of accelerometer, gyroscope, magnetometer and GPS.
- the 2D image sets are acquired by the image sensing unit with structured light projected onto the object by the illuminating unit provided inside the handheld scanning device.
- the structured light is of sinusoidal pattern or multi-line pattern.
- the data processing unit is configured to use said placement information to determine the arrangement of one 3D view relative to another for the stitching, wherein the two 3D views are successively acquired.
- when an iterative algorithm is used by the data processing unit to refine the arrangement of a new 3D view relative to an already stitched 3D view, the data processing unit is configured to use said placement information to determine the initial arrangement of said new 3D view.
- the 3D views are stitched sequentially or simultaneously.
- the data processing unit is configured to determine whether the handheld device is stable based on said placement information and trigger the acquisition of 2D image sets when the handheld device is determined as stable.
- the data processing unit is configured to detect the unstable motions of the handheld device based on said placement information and discard the 2D image sets acquired during those unstable motions.
- the handheld scanning device is an intra-oral camera.
- the object is the jaw, tooth or gum of a patient, an implant or a preparation.
- the image sensing unit is a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) unit.
- a handheld scanning device for 3D imaging, wherein the handheld scanning device is configured to acquire a plurality of two-dimensional (2D) image sets of an object from a plurality of perspectives, one 2D image set corresponding to one perspective, and wherein the handheld scanning device is equipped with an assistant sensing unit which is configured to provide placement information of the handheld scanning device, said placement information being used to control the acquisition of 2D image sets and the stitching of 3D views converted from each 2D image set so as to obtain a global surface of the object.
- This invention introduces the use of additional sensors in the handheld 3D camera product to provide orientation and/or position information of the image acquisition device relative to the object.
- the 3D imaging system according to the present invention may significantly increase the efficiency and reliability of generating a stereo vision of the object surface through the matching and merging of images taken from random views. This way, more accurate results can be obtained with less computational effort.
- Fig.1 illustrates the functional block diagram of the 3D imaging system according to one embodiment of the invention.
- Fig.2 illustrates the relationship between the world coordinate system and the handpiece coordinate system, and the six parameters describing the 3D transform.
- Fig.3 provides schematic illustrations of various placement sensors and the relationship between the sensor output and placement information.
- Fig.4 illustrates an example workflow of stitching between two views.
- Fig.5 illustrates comparison of the merging results with and without the device placement information according to one embodiment of the invention.
- the methods and arrangements of the present disclosure may be illustrated by being used in dental implant imaging.
- the teaching of the present disclosure can be applied to 3D imaging products with a similar architecture, where the generation of the object surface in three dimensions is based on a sequence of images acquired at arbitrary orientations relative to the object.
- Fig.1 illustrates the functional block diagram of a 3D imaging system according to one embodiment of the invention.
- the 3D imaging system provided in the present invention may be used to analyze a real-world object or environment to collect data on its shape and possibly its appearance.
- the 3D imaging system may comprise a handheld scanning device 10 for acquiring a plurality of two-dimensional (2D) image sets of an object from a plurality of perspectives.
- This kind of handheld scanning device, also termed a handpiece herein for simplicity, could be configured into any shape, as long as it provides the user with sufficient convenience in moving and operating it.
- the various 2D image sets could be acquired using structured light projected onto the object by the illuminating unit 103 provided inside the handheld scanning device. Projecting a narrow band of light onto a three-dimensionally shaped surface produces a line of illumination that appears distorted from perspectives other than that of the projector, and this distortion can be used for an exact geometric reconstruction of the surface shape.
- the reflected light would be received by the image sensing unit 102, which for example could be implemented as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.
- an image is projected through a lens onto the capacitor array within the image sensing unit 102, causing each capacitor to accumulate an electric charge proportional to the light intensity at that location. A two-dimensional array, as used in video and still cameras, captures a two-dimensional picture corresponding to the scene projected onto the focal plane of the sensing unit.
- the structured light could be of sinusoidal pattern or multi-line pattern. Seen from different viewpoints, the pattern appears geometrically distorted due to the surface shape of the object.
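- To make the sinusoidal-pattern idea concrete, the following is a minimal sketch (an illustration only; the patent does not prescribe a decoding algorithm) of classic four-step phase shifting: four fringe images captured with known phase offsets are combined to recover the wrapped phase at each pixel, whose distortion encodes the surface shape.

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Recover the wrapped phase from four sinusoidal fringe images.

    The inputs are 2D intensity arrays captured with the projected
    pattern shifted by 0, 90, 180 and 270 degrees, i.e.
    I_n = A + B*cos(phi + n*pi/2). The four-step formula cancels the
    ambient term A and the albedo term B.
    """
    i0, i1, i2, i3 = (np.asarray(i, dtype=np.float64) for i in (i0, i1, i2, i3))
    # atan2 yields phi in (-pi, pi]; phase unwrapping and triangulation
    # against the projector geometry would follow in a full pipeline.
    return np.arctan2(i3 - i1, i0 - i2)
```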
- Each image set may be acquired at an arbitrary orientation relative to the object and contains a fraction of the surface information to be constructed.
- the various captured 2D image sets will then be transferred to the data processing unit 20 for further conversion.
- the data processing unit 20 may be implemented as, for example, any suitable processing platform, computing platform, computing device, processing device, computing system, processing system, computer, processor, or the like, and may be implemented using any suitable combination of hardware and/or software. It is to be understood that though the data processing unit 20 is illustrated or described as separate from the handheld scanning device, this is not necessary. In another embodiment of the present invention, for example, this processing unit could be integrated into the handheld scanning device. In yet another embodiment of the present invention, part of the functions as described with respect to the data processing unit 20 could be carried out by an additional processor within the handheld part.
- each 2D image set is converted into a 3D data structure in the data processing unit 20, forming a 3D view from one particular perspective, while the data processing unit 20 is further configured to match and merge these 3D views in one common reference coordinate system so as to provide a global surface for the object of interest.
- the reference coordinate system could be a Cartesian coordinate system with origin and axis directions fixed relative to the space in which the object, the user and the imaging system are located. The orientation of this coordinate system could be defined relative to the first set of images being captured.
- each 3D view is acquired by placing the handheld device at a different position and orientation.
- the arrangement of each 3D view relative to each other can be determined by using reference features on the surface being scanned.
- the assistant sensing unit 101 within the handheld scanning device 10 is configured to record placement information of the device 10 when each 2D image set is acquired. This information about the perspective of each 3D view may relieve the data processing unit 20 of a huge amount of computational burden and speed up result generation.
- the sensor controller could be the ITG-3200 from InvenSense, in which case the assistant sensing unit 101 would be the gyroscope component on that integrated circuit.
- the image sensing unit 102 could be composed of imaging optics (lenses) and a VITA 1300 from ON Semiconductor, which is a 1.3-megapixel, 150 fps global-shutter CMOS image sensor.
- the illuminating unit 103 could be composed of Red, Green and Blue LEDs combined with dielectric mirrors and concentrated with a micro-lens array onto a DMD array, followed by imaging optics (lenses) for imaging on the teeth surface.
- the assistant sensing unit comprises at least one sensor adapted to provide signals related to at least one degree of freedom of the handheld scanning device 10 when each 2D image set is acquired; and a sensor controller adapted to generate the placement information from the signals.
- the sensors incorporated in the assistant sensing unit could be a piece of hardware embedded in the handheld scanning device, which delivers a signal amplitude, such as a voltage or intensity, when a set of 2D images is acquired. This signal can be converted by the sensor controller into placement information. In other embodiments of the present invention, the signals from the sensors could also be communicated, for conversion together with the processing of the image data, to the data processing unit or to any other processing module within the whole system.
- placement of the handheld scanning device 10 is fully determined by 6 parameters: 3 rotations and 3 translations.
- One sensor might not provide all the placement information.
- Several sensors might be combined, where one or more position sensors are used to acquire the position data in three translational degrees of freedom, and one or more orientation sensors acquire the angular data in three rotational degrees of freedom. Thus, sufficient data (even redundant in some cases) would be obtained to determine the placement of the handheld device in a 3D space.
- Fig.2 shows a typical definition of the 6 parameters which describe the position of the handpiece relative to the world coordinate system.
- the world coordinate system is an arbitrary reference system, which doesn't undergo acceleration during the experiment (it is also called an inertial coordinate system or Lorentz coordinate system). It is not unique and the world coordinate system labeled X, Y, Z has been arbitrarily chosen.
- the 6 parameters which control the position of the handpiece relative to the world coordinate system X, Y, Z are the three coordinates (x, y, z) defining the system translation plus the roll, pitch and yaw defining the system rotation.
- the handpiece in the "reference position" gives a placement example when all 6 parameters are set to 0. This location is defined at the factory during system calibration and on system startup.
- the handpiece in the "actual position” gives an arbitrary placement example when the user holds the system.
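- As a concrete illustration (a sketch, not code from the patent; the roll-pitch-yaw composition order chosen here is one common convention and an assumption), the six parameters of Fig.2 can be assembled into a single 4x4 homogeneous transform mapping handpiece coordinates to world coordinates:

```python
import numpy as np

def placement_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 rigid transform from the six placement parameters.

    Angles are in radians; the rotation is composed as
    R = Rz(yaw) @ Ry(pitch) @ Rx(roll), a common aerospace convention.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    t = np.eye(4)
    t[:3, :3] = rz @ ry @ rx
    t[:3, 3] = [x, y, z]
    return t

# Reference position of Fig.2: all six parameters zero -> identity.
assert np.allclose(placement_to_matrix(0, 0, 0, 0, 0, 0), np.eye(4))
```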
- a three-axis micro-electromechanical accelerometer may be utilized as the position sensor. By integrating acceleration twice over time, it is possible to obtain relative 3D position information during the acquisition sequence.
- a three-axis gyroscope may be employed as the orientation sensor; it measures orientation relative to an internal element spinning fast enough to give it inertia. System orientation can also be obtained using a magnetometer, which measures the direction of the earth's magnetic field, a constant vector during the acquisition sequence.
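- To make the double-integration idea concrete, here is a schematic sketch (an illustration only; a practical implementation would also need gravity compensation, bias estimation and drift correction, all omitted here):

```python
import numpy as np

def integrate_imu(accel, gyro, dt):
    """Dead-reckon relative placement from raw IMU samples.

    accel: (N, 3) accelerations assumed already rotated into the world
           frame with gravity removed (in practice the gyroscope or
           magnetometer orientation estimate supplies that rotation).
    gyro:  (N, 3) angular rates (rad/s) about the x, y, z axes.
    Sensor bias makes the position drift quadratically with time, which
    is why this data seeds, rather than replaces, image registration.
    """
    velocity = np.cumsum(accel * dt, axis=0)    # first integration
    position = np.cumsum(velocity * dt, axis=0) # second integration
    angles = np.cumsum(gyro * dt, axis=0)       # small-angle orientation
    return position, angles
```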
- a GPS system could be incorporated to provide position information.
- radar images are captured from airplanes which use the GPS navigation system to locate the plane and track it. This location could be similarly used when assembling all 3D views into a surface map.
- the sensor controller inside the assistant sensing unit may extract the position data and orientation data from the sensed position signal and orientation signals respectively, and derive coordinates from geometric transformations of the position data and orientation data.
- the object being scanned does not move during the image acquisition; in addition, the system could handle a tolerance of 10 degrees for object movement around its axis of symmetry.
- the sensors embedded for obtaining the placement information of the handheld scanning device could be arranged close to the image sensing unit 102. It is easily understood that the placement of the handheld scanning device when the images are captured is most precisely determined at the image sensing unit 102, where the images are actually sensed.
- the position of the image sensing unit 102 relative to the assistant sensing unit 101 could be predetermined at manufacture and stored. When calculating the position and orientation with respect to where the images are taken based on the signals from the assistant sensing unit, it would be advantageous to take said position information about the image sensing unit 102 into consideration, which helps to generate more accurate results. Without this placement information, surface matching may require searching a large space of 6 degrees of freedom, which costs computing time and might be inaccurate.
- the displacement information of the scanning device when the 2D images are captured may then be stored in association with each corresponding 3D view for use in their matching and merging.
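- A minimal sketch of how such a stored factory offset could be applied (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

# t_sensor: placement of the assistant sensing unit in world coordinates
# (a 4x4 matrix, e.g. from placement_to_matrix above).
# t_offset: fixed transform from the assistant sensing unit to the image
# sensing unit, measured at the factory and stored in the device.
def camera_placement(t_sensor, t_offset):
    """Placement of the image sensing unit, where the images are formed."""
    return t_sensor @ t_offset
```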
- Fig.4 provides an example of a workflow for stitching two views together with the placement information provided by the assistant sensing unit.
- the generation of the candidate relative transforms T1, T2, T3 as illustrated in Fig.4, i.e., different combination results of the two 3D views (i.e., v1 and v2 in Fig.4), may use standard computer vision techniques as follows: (i) computation of keypoint locations, (ii) computation of a feature descriptor at each keypoint location, (iii) creation of a list of corresponding features between both views, (iv) computation of transforms using the list of corresponding features.
- a keypoint is a position defining a region of interest as detected by an algorithm.
- a feature descriptor is a vector describing the neighborhood of the keypoint.
- the feature descriptor is a compact, local representation of the object which is ideally unique and invariant to any 3D transform (in terms of translation, rotation, scaling and the like) for unambiguous matching.
- Examples of algorithms to compute keypoints and feature descriptors in 3D are the Normal Aligned Radial Feature (NARF), Point Feature Histogram (PFH) and Fast PFH (FPFH) detectors.
- the computation of feature correspondences between views is usually based on a search for the N closest features in the other view, for each feature in the first, the second or both views together, where N denotes a non-zero natural number.
- the computation of transforms from the list of correspondences is frequently based on RANdom SAmple Consensus (RANSAC), an iterative random search for the largest group of correspondences describing the same transformation.
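- As one possible implementation of steps (i) to (iv) above (an illustrative sketch built on the open-source Open3D library; the patent names NARF, PFH and FPFH but does not prescribe a toolchain, and all numeric parameters below are assumptions):

```python
import open3d as o3d

def candidate_transform(source, target, voxel=0.5):
    """Estimate one candidate transform between two 3D views.

    source, target: open3d.geometry.PointCloud objects (two 3D views).
    voxel: sampling resolution in the units of the point clouds.
    """
    def preprocess(pcd):
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
        # Steps (i)+(ii): FPFH yields a 33-dimensional descriptor per point.
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
        return down, fpfh

    src, src_fpfh = preprocess(source)
    tgt, tgt_fpfh = preprocess(target)
    # Steps (iii)+(iv): match descriptors, then search transforms with RANSAC.
    result = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, src_fpfh, tgt_fpfh, mutual_filter=True,
        max_correspondence_distance=1.5 * voxel,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        ransac_n=3,
        checkers=[o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(1.5 * voxel)],
        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    return result.transformation
```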
- because RANSAC detects the transforms satisfying the largest number of correspondences, it will frequently select transforms which merely place both surfaces on top of each other, regardless of the local curvature. This happens because several incorrect correspondences are generated from non-unique feature descriptors (i.e. incorrect pairings of features).
- the correct transform usually yields only a small overlap between the views, thus limiting the number of feature correspondences which comply with it. Incorrect transforms should be removed or ranked low quickly to avoid spending computational resources on them. Therefore, to distinguish between the generated transforms, the use of external data from position sensors is helpful.
- This transform V can be compared to candidate transforms from the RANSAC algorithm to reject or reduce the rank of incompatible transforms, as described below.
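- A minimal sketch of such a compatibility test (the threshold values are illustrative assumptions): each candidate transform is compared against the sensor-derived transform V through the angle of the residual rotation and the length of the residual translation.

```python
import numpy as np

def compatible(t_candidate, t_sensor, max_angle_deg=15.0, max_shift=5.0):
    """Reject candidate transforms that contradict the placement sensors.

    Both arguments are 4x4 rigid transforms. The residual rotation angle
    is extracted from the trace of the relative rotation matrix.
    """
    delta = np.linalg.inv(t_sensor) @ t_candidate
    cos_angle = (np.trace(delta[:3, :3]) - 1.0) / 2.0
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    shift = np.linalg.norm(delta[:3, 3])
    return angle <= max_angle_deg and shift <= max_shift
```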
- the transform evaluation from Fig.4 usually involves the Iterative Closest Point (ICP) algorithm, which is sensitive to the initial candidate transform. ICP iteratively performs the following operations until a convergence criterion is met: (i) selection of a subset of points, (ii) computation of point correspondences on the other 3D view, (iii) correspondence weighting and outlier rejection, (iv) assignation of an error metric, (v) minimization of the error for that set of correspondences until a second convergence criterion is met. Because of its iterative nature, the ICP algorithm is the most time-consuming part of the stitching process, especially if the initial candidate is wrong, i.e. more iterations will be performed than for a correct initial candidate.
- Matrix V is completely determined for the rigid transform and can directly be used as a candidate transform.
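- Continuing the Open3D-based sketch above (an illustrative toolchain, not the patent's own implementation), the matrix V could seed the ICP refinement directly:

```python
import open3d as o3d

def refine(source, target, v_init, max_distance=1.0):
    """Refine a candidate transform with point-to-plane ICP.

    v_init: 4x4 initial transform, e.g. the sensor-derived matrix V.
    The target cloud must have normals estimated beforehand. A good
    seed means fewer iterations and less risk of converging to a
    wrong local minimum.
    """
    return o3d.pipelines.registration.registration_icp(
        source, target, max_distance, v_init,
        o3d.pipelines.registration.TransformationEstimationPointToPlane(),
        o3d.pipelines.registration.ICPConvergenceCriteria(max_iteration=50)
    ).transformation
```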
- the plurality of 3D views could be stitched sequentially or simultaneously.
- the data processing unit 20 may be configured to use said placement information to determine the arrangement of one 3D view relative to another where the two 3D views are successively acquired.
- Sequential stitching means that the newly acquired view is only matched against the previously acquired views (not against future views).
- a simultaneous stitching would be done after all views have been acquired, to increase the likelihood of a view being successfully stitched. Each view would be tested against the entire dataset. Simultaneous stitching could be expected to have a higher stitching success rate than the sequential approach, but it will be less interactive.
- an iterative algorithm, such as the iterative closest point (ICP) algorithm, may be used by the data processing unit to refine the arrangement of a new 3D view relative to an already stitched 3D view.
- the data processing unit 20 may be configured to use said placement information to determine the initial arrangement of said new 3D view for those iterative algorithms, without introducing undesired errors at the very beginning.
- the data processing unit may be further configured to determine whether the handheld device is stable based on said placement information and trigger the acquisition of 2D image sets automatically when the handheld device is determined as stable.
- many handheld scanning devices commercially available may be configured with a "capture" button for the customer to start the operation.
- the user may not actually be ready when pushing the start button, leaving the device in an unstable state.
- it may even be intended to automatically trigger the capture.
- the assistant sensing unit may start to collect the placement information of the handheld device at a certain interval in response to an internal command. The information provided by the assistant sensing unit could be very helpful in determining when the system is stable and in automatically triggering the capture accordingly.
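- One way such a stability test could work is sketched below (window length and thresholds are illustrative assumptions): capture is triggered once the recent angular-rate and acceleration magnitudes stay below noise-level thresholds; the same test, inverted, can flag unstable motion during a capture so that the corresponding data can be discarded, as discussed next.

```python
import numpy as np
from collections import deque

class StabilityTrigger:
    """Fire a capture once the handpiece has been still for `window` samples."""

    def __init__(self, window=25, gyro_thresh=0.02, accel_thresh=0.05):
        self.gyro = deque(maxlen=window)
        self.accel = deque(maxlen=window)
        self.gyro_thresh = gyro_thresh    # rad/s
        self.accel_thresh = accel_thresh  # m/s^2, gravity removed

    def update(self, gyro_sample, accel_sample):
        """Feed one IMU sample; return True when capture should start."""
        self.gyro.append(np.linalg.norm(gyro_sample))
        self.accel.append(np.linalg.norm(accel_sample))
        full = len(self.gyro) == self.gyro.maxlen
        return full and max(self.gyro) < self.gyro_thresh \
                    and max(self.accel) < self.accel_thresh
```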
- the data processing unit may be configured to detect the unstable motions of the handheld device based on said placement information and discard the 2D image sets acquired during those unstable motions.
- the operator might move his hand during the capture of a set of 2D images from one perspective, which may have a negative impact on the generation of a 3D view.
- Based on the information provided by the assistant sensing unit, the data processing unit can detect whether jitter or a larger movement occurred during one capture and automatically discard the corresponding data, which avoids much unnecessary computation on wrong data.
- a global surface of the object such as the jaw, tooth or gum of a patient, an implant or a preparation in dental application, would be shown to the user on display 30 for further analysis.
- Available prior information from the assistant sensing unit helps reduce the computation and speed up the display of a more accurate result.
- Fig.5 illustrates the merging results with and without the device placement information according to one embodiment of the invention. It is to be noted that the example in Fig.5 is shown only in 2D for the sake of simplicity, which is still enough to provide a clear view of the benefits of the present invention.
- the example illustrated in Fig.5 is the application of the 3D imaging system in the dental field, where the goal is to reconstruct the tooth surface from images acquired by the handheld scanning device operated by the dentist.
- the terms “plurality” and “a plurality” as used herein may include, for example, “multiple” or “two or more”.
- the terms “plurality” or “a plurality” may be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Physics & Mathematics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a three-dimensional (3D) imaging system comprising: a data processing unit; a handheld scanning device for acquiring a plurality of two-dimensional (2D) image sets of an object from a plurality of perspectives, one 2D image set corresponding to one perspective; wherein the handheld scanning device is equipped with an assistant sensing unit which is configured to provide placement information of the handheld scanning device, and wherein said placement information is used by the data processing unit to control the acquisition of the 2D image sets and the stitching of the 3D views converted from each 2D image set so as to obtain a global surface of the object. The invention also provides a handheld scanning device for 3D imaging.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2012/087331 WO2014100950A1 (fr) | 2012-12-24 | 2012-12-24 | Three-dimensional imaging system and handheld scanning device for three-dimensional imaging |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2012/087331 WO2014100950A1 (fr) | 2012-12-24 | 2012-12-24 | Three-dimensional imaging system and handheld scanning device for three-dimensional imaging |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014100950A1 (fr) | 2014-07-03 |
Family
ID=51019639
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2012/087331 WO2014100950A1 (fr), ceased | Three-dimensional imaging system and handheld scanning device for three-dimensional imaging | 2012-12-24 | 2012-12-24 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2014100950A1 (fr) |
- 2012-12-24: WO PCT/CN2012/087331, patent WO2014100950A1 (fr), not active (ceased)
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1900741A (zh) * | 2005-11-18 | 2007-01-24 | Beihang University | Hyperspectral full-polarization three-dimensional imaging integrated detection system |
| US20090087050A1 (en) * | 2007-08-16 | 2009-04-02 | Michael Gandyra | Device for determining the 3D coordinates of an object, in particular of a tooth |
| US20100128109A1 (en) * | 2008-11-25 | 2010-05-27 | Banks Paul S | Systems And Methods Of High Resolution Three-Dimensional Imaging |
| CN202218880U (zh) * | 2011-07-27 | 2012-05-16 | Shenzhen Emperor Electronic Technology Co., Ltd. | Ultrasonic three-dimensional imaging probe |
Cited By (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3186783A4 (fr) * | 2014-08-27 | 2018-01-17 | Carestream Dental Technology Topco Limited | Automatic remeshing of 3D surfaces |
| US10758126B2 (en) | 2015-04-10 | 2020-09-01 | 3M Innovative Properties Company | Dental irradiation device |
| WO2016164238A1 (fr) * | 2015-04-10 | 2016-10-13 | 3M Innovative Properties Company | Dental light irradiation device |
| US10890444B2 (en) | 2015-05-15 | 2021-01-12 | Tata Consultancy Services Limited | System and method for estimating three-dimensional measurements of physical objects |
| US10853957B2 (en) * | 2015-10-08 | 2020-12-01 | Carestream Dental Technology Topco Limited | Real-time key view extraction for continuous 3D reconstruction |
| US11791042B2 (en) | 2015-11-25 | 2023-10-17 | ResMed Pty Ltd | Methods and systems for providing interface components for respiratory therapy |
| US11103664B2 (en) | 2015-11-25 | 2021-08-31 | ResMed Pty Ltd | Methods and systems for providing interface components for respiratory therapy |
| US10220172B2 (en) | 2015-11-25 | 2019-03-05 | Resmed Limited | Methods and systems for providing interface components for respiratory therapy |
| ITUB20160307A1 (it) * | 2016-01-18 | 2017-07-18 | Gabriel Maria Scozzarro | Device for the three-dimensional reconstruction of organs of the human body |
| WO2018053046A1 (fr) * | 2016-09-14 | 2018-03-22 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with fault condition detection |
| CN109982642A (zh) * | 2016-09-14 | 2019-07-05 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with fault condition detection |
| US10390788B2 (en) | 2016-09-14 | 2019-08-27 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with operation based on detection of placement in mouth |
| US10932733B2 (en) | 2016-09-14 | 2021-03-02 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with operation based on movement detection |
| US10925571B2 (en) | 2016-09-14 | 2021-02-23 | Dental Imaging Technologies Corporation | Intra-oral imaging sensor with operation based on output of a multi-dimensional sensor |
| US10299742B2 (en) | 2016-09-14 | 2019-05-28 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with fault condition detection |
| US10213180B2 (en) | 2016-09-14 | 2019-02-26 | Dental Imaging Technologies Corporation | Multiple-dimension imaging sensor with operation based on magnetic field detection |
| JP2019532790A (ja) * | 2016-09-28 | 2019-11-14 | Cleverdent Ltd. | Dental suction device with camera |
| WO2019037582A1 (fr) * | 2017-08-25 | 2019-02-28 | Ouyang Congxing | Image processing method and device |
| TWI691933B (zh) * | 2017-08-25 | 2020-04-21 | Beijing Keeyoo Technologies Co., Ltd. | Image processing method and device |
| CN107644454B (zh) * | 2017-08-25 | 2020-02-18 | Beijing Keeyoo Technologies Co., Ltd. | Image processing method and device |
| US10937238B2 (en) | 2017-08-25 | 2021-03-02 | Beijing Keeyoo Technologies Co., Ltd. | Image processing method and device |
| CN107644454A (zh) * | 2017-08-25 | 2018-01-30 | Ouyang Congxing | Image processing method and device |
| CN107909609B (zh) * | 2017-11-01 | 2019-09-20 | Ouyang Congxing | Image processing method and device |
| US11107188B2 (en) | 2017-11-01 | 2021-08-31 | Beijing Keeyoo Technologies Co., Ltd | Image processing method and device |
| CN107909609A (zh) * | 2017-11-01 | 2018-04-13 | Ouyang Congxing | Image processing method and device |
| WO2020042943A1 (fr) * | 2018-08-27 | 2020-03-05 | Shanghai United Imaging Healthcare Co., Ltd. | System and method for determining a target point for a needle biopsy |
| US12496132B2 (en) | 2018-08-27 | 2025-12-16 | Shanghai United Imaging Healthcare Co., Ltd. | System and method for determining a target point for a needle biopsy |
| CN113613578A (zh) * | 2019-01-24 | 2021-11-05 | Koninklijke Philips N.V. | Method for determining the position and/or orientation of a handheld device with respect to a subject, corresponding apparatus and computer program product |
| CN113824946A (zh) * | 2020-06-18 | 2021-12-21 | Pegatron Corporation | Electronic stylus |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014100950A1 (fr) | | Three-dimensional imaging system and handheld scanning device for three-dimensional imaging |
| JP6845895B2 (ja) | | Image-based position detection method, apparatus, device and storage medium |
| EP2543483B1 (fr) | | Information processing apparatus and information processing method |
| CN111442721B (zh) | | Calibration device and method based on multi-laser distance and angle measurement |
| Ait-Aider et al. | | Simultaneous object pose and velocity computation using a single view from a rolling shutter camera |
| US6094215A (en) | | Method of determining relative camera orientation position to create 3-D visual images |
| CN111951326B (zh) | | Method and device for locating skeleton key points of a target object based on multiple cameras |
| CN110230983B (zh) | | Vibration-resistant optical three-dimensional positioning method and device |
| CN113379822A (zh) | | Method for acquiring 3D information of a target object based on pose information of the acquisition device |
| JP2004157850A (ja) | | Motion detection device |
| CN111445529B (zh) | | Calibration device and method based on multi-laser ranging |
| JP6969121B2 (ja) | | Imaging system, image processing device and image processing program |
| CN113327291A (zh) | | Calibration method for 3D modeling of a distant target object based on continuous shooting |
| CN110268701B (zh) | | Imaging device |
| Samson et al. | | The agile stereo pair for active vision |
| JP2003296708A (ja) | | Data processing method, data processing program and recording medium |
| JPH1023465A (ja) | | Imaging method and apparatus |
| JP2003006618A (ja) | | Method and apparatus for generating a three-dimensional model, and computer program |
| JP2009216480A (ja) | | Three-dimensional position and orientation measurement method and apparatus |
| Castanheiro et al. | | Modeling hyperhemispherical points and calibrating a dual-fish-eye system for close-range applications |
| JPH09119819A (ja) | | Three-dimensional information reconstruction device |
| JP2697917B2 (ja) | | Three-dimensional coordinate measuring device |
| Urquhart | | The active stereo probe: the design and implementation of an active videometrics system |
| CN111325780B (zh) | | Rapid 3D model construction method based on image screening |
| Fabian et al. | | One-point visual odometry using a RGB-depth camera pair |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12890704; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 12890704; Country of ref document: EP; Kind code of ref document: A1 |