
WO2016152288A1 - Material detection device, material detection method, and program - Google Patents

Material detection device, material detection method, and program

Info

Publication number
WO2016152288A1
WO2016152288A1 · PCT/JP2016/053810 · JP2016053810W
Authority
WO
WIPO (PCT)
Prior art keywords: target object, material detection, shape, information, unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/053810
Other languages
French (fr)
Japanese (ja)
Inventor
尚子 菅野
浩平 宮本
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of WO2016152288A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G01B11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 — Measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
    • G01B11/255 — Measuring contours or curvatures for measuring radius of curvature
    • G01N21/3563 — Investigating relative effect of material at wavelengths characteristic of specific elements or molecules using infrared light, for analysing solids; preparation of samples therefor

Definitions

  • The present disclosure relates to a material detection device, a material detection method, and a program.
  • A basic 3D data format holds the coordinate positions of the point cloud, polygon sizes, and the distances between points of a three-dimensional object. In addition to this basic information, formats that can also hold color, shadow, and material information for each point in the cloud are becoming widespread.
  • Current 3D scanners detect three-dimensional shape, but none can also detect material. To hold material information for each point, the 3D data produced by the scanner must first be imported into a data-editing environment, where the material is set point by point.
  • Patent Document 1 describes a material recognition technique that provides material information about an object contained in video captured by a user.
  • The technique described in that document emits an incident wave from the transmitter of an exploration radar unit toward the object and receives a surface-reflected wave that has lower amplitude than, and the same phase as, the incident wave. From this reflection the physical-property information of the object is detected and compared with reference physical-property information corresponding to each material to recognize what the object is. Because the physical properties are detected only at a tiny specific point, it is difficult to acquire the properties of the entire three-dimensional object.
  • According to the present disclosure, there is provided a material detection device including: a shape acquisition unit that acquires the three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region division unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object for each region based on information about the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
  • Also provided is a material detection method including: acquiring the three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
  • Also provided is a program for causing a computer to function as: means for acquiring the three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
  • According to the present disclosure, the material of each element of a three-dimensional object can thus be detected with simple processing. The above effect is not necessarily limiting; together with or instead of it, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be obtained.
  • Brief description of the drawings: FIG. 1 is a schematic diagram showing the configuration of a system according to an embodiment of the present disclosure. FIG. 2 is a schematic diagram showing an example in which scissors are captured by the system of the embodiment. FIG. 3 is a schematic diagram showing how the system detects the material of the scissors for each of regions A, B, and C. FIG. 4 is a flowchart showing the processing of the embodiment. FIG. 5 is a schematic diagram showing an example in which the system is realized on a mobile device such as a smartphone. FIG. 6 is a schematic diagram showing the configuration of the mobile device of FIG. 5. FIG. 7 is a schematic diagram showing another example in which the system is realized on a mobile device. FIG. 8 is a schematic diagram showing the configuration of the mobile device of FIG. 7.
  • FIG. 1 is a schematic diagram showing a configuration of a system 1000 according to the present embodiment.
  • The system 1000 according to this embodiment includes recording hardware 100 and an analysis unit (software) 200.
  • The recording hardware 100 includes a shape-recognition infrared light-emitting unit (projector) 102, a shape-recognition infrared light-receiving unit (sensor) 104, an RGB color-recognition camera 106, a material-recognition infrared light-emitting unit (projector) 108, and a material-recognition infrared light-receiving unit (sensor) 110.
  • The analysis unit 200 includes a target-object shape detection unit 202, a specific-point material detection unit 204, a material determination library 206, and a region-specific material allocation unit 208. Each component of the analysis unit 200 shown in FIG. 1 can be implemented with a central processing unit such as a CPU and software (a program) that makes it function. The program can be stored in a memory (not shown) in the recording hardware 100 or on an externally connected recording medium. The recording hardware 100 and the analysis unit 200 may be configured as an integrated device or as separate devices.
  • 3D scanners may be contact or non-contact; the system 1000 of this embodiment uses a non-contact type as an example. In the non-contact type, the shape detection unit 202 detects the shape and surface irregularities of the target object (a three-dimensional object) by irradiating it with infrared light from the shape-recognition infrared light-emitting unit 102 and analyzing the infrared light received by the shape-recognition infrared light-receiving unit 104.
  • Two shape-recognition approaches are common: scanning 360° around the object with a moving laser beam, and projecting a regular pattern onto the object while a sensor receives the reflected infrared light. The moving-laser approach estimates the shape from the time/phase difference between the emitted and reflected light: light is received in time with the infrared emission, the phase difference between the emitter and receiver is calculated from the received signal intensity, and the time of flight (hence distance) is calculated from that phase difference.
  • The pattern-projection approach estimates the distance to the object by triangulation from the difference between the projected and reflected patterns, yielding shape and depth information: light is received in time with the emission, the emitted and received patterns are matched, and the distance is calculated from the projection angle and the incident angle by the principle of triangulation.
  • Pattern projection is generally considered faster than moving a laser beam. The moving-laser scanning type is called the ToF (Time of Flight) method, and the pattern-projection type is also called the active stereo method. Light other than infrared may be used for shape recognition; the shape detection method is not particularly limited, and methods other than the above can also be used.
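As a rough illustration of the two ranging principles just described, the sketch below computes distance from a ToF phase shift and from active-stereo angles. The formulas are standard ones assumed for illustration; the patent itself gives no equations.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_diff_rad: float, modulation_hz: float) -> float:
    """ToF: the round trip of amplitude-modulated IR light adds a phase
    delay of 2*pi*f*(2d/c), so d = c * phase / (4*pi*f)."""
    return C * phase_diff_rad / (4.0 * math.pi * modulation_hz)

def active_stereo_depth(baseline_m: float, proj_angle_rad: float,
                        incident_angle_rad: float) -> float:
    """Active stereo: depth of a surface point by triangulation from the
    projector angle and the camera's incident angle over a known baseline."""
    a, b = proj_angle_rad, incident_angle_rad
    return baseline_m * math.sin(a) * math.sin(b) / math.sin(a + b)

print(tof_distance(math.radians(12.0), 20e6))   # ~0.25 m at 20 MHz modulation
print(active_stereo_depth(0.08, math.radians(75), math.radians(80)))
```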
  • Next, the target object shape detection unit 202 analyzes shape information from the images/video captured by the recording hardware 100. Specifically, the RGB color-recognition camera 106 images the target object, shape information is analyzed from the captured data, and the shape is divided into regions. Performing shape analysis at this stage speeds up the later step of writing material information into the 3D data format.
  • FIG. 2 is a schematic diagram showing an example in which scissors 300 are captured by the system 1000 of the present embodiment. Here, the shape detection unit 202 analyzes shape information that segments the metal portion of the scissors 300 from the handle portion, and further segments the handles by color.
  • For this purpose, the shape detection unit 202 includes an imaging information acquisition unit 202a that acquires the imaging information captured by the RGB color-recognition camera 106, and a region division unit 202b that divides (segments) the three-dimensional shape of the target object into regions using color information, edge information, luminance/saturation information, contrast, depth information, background information, light reflectance, flat-area (low-texture) information, temperature, and the like as reference information. The shape detection unit 202 extracts edge information of the background-separated, segmented target object and divides it into regions based on that edge information.
  • The region division unit 202b divides the three-dimensional shape into regions by drawing boundary lines in three-dimensional information consisting of mesh information or depth information representing the shape. The boundary lines are set based on imaging information such as color, edges, luminance/saturation, and contrast, which is captured by the RGB color-recognition camera 106 and obtained through the imaging information acquisition unit 202a. The user can also input auxiliary hints such as object/background labels, and segmentation can then be performed by the graph-cut method: for example, selecting the target object with a touch operation at capture time identifies it against the background, enabling highly accurate background separation. A sketch of this step follows below.
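To make the segmentation step concrete, the sketch below uses OpenCV's grabCut, a widely available graph-cut implementation, for the touch-assisted background separation, and Canny edges plus connected components for the region split. This is one plausible realization, not the patent's specified algorithm.

```python
import cv2
import numpy as np

def segment_object(image_bgr: np.ndarray, touch_rect: tuple) -> np.ndarray:
    """Graph-cut background separation seeded by the user's touch selection;
    touch_rect is an (x, y, w, h) box around the touched object."""
    mask = np.zeros(image_bgr.shape[:2], np.uint8)
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(image_bgr, mask, touch_rect, bgd, fgd, 5, cv2.GC_INIT_WITH_RECT)
    fg = (mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD)
    return np.where(fg, 255, 0).astype(np.uint8)

def split_regions(object_mask: np.ndarray, image_bgr: np.ndarray) -> list:
    """Split the separated object into closed regions along edge boundary
    lines, as the region division unit 202b does."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    interior = cv2.bitwise_and(object_mask, cv2.bitwise_not(edges))
    n, labels = cv2.connectedComponents(interior)
    return [(labels == i).astype(np.uint8) * 255 for i in range(1, n)]
```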
  • As a result of shape detection, the scissors 300 are segmented, as shown in FIG. 2, into a metal portion (region A), a handle portion (region B), and a resin portion other than the handles (region C). Regions B and C are both made of resin but differ in color.
  • Next, the material at a specific point is measured, using the material-detection infrared light emitted by the material-recognition infrared light-emitting unit 108 and the material-recognition infrared light-receiving unit 110 that receives it. Using a known method, the material is estimated by irradiating a specific point on the three-dimensional object: the infrared light emitted by the material-recognition infrared light-emitting unit 108 strikes the specific point on the target object and is received by the material-recognition infrared light-receiving unit 110. The material detection unit 204 obtains the reflectance of the emitted infrared light from the emitted and received light, and predicts the material at the specific point by comparing that reflectance with the values held in the material determination library 206.
  • Conventional object-material detection generally irradiates the object with a signal (material detection wave) such as infrared, ultrasonic, or electromagnetic waves, measures the reflectance (response), and predicts the material by comparing the coefficients of the reflected components. Such detection is realized by a system that irradiates part of the target object with a point-source infrared laser and measures the intensity, transfer function, and attenuation of the reflected infrared light. Because point-source irradiation detects the material only in the small area (specific point) the laser can reach, measuring the material of an entire three-dimensional object takes an enormous amount of time.
  • The time required to detect the material of a given object is proportional to the object's size. As an example, it is known that detecting the material of an area of 256 pixels takes about 20 ms with the conventional method: the time needed to irradiate the object with infrared light, measure the reflectance, and identify the material by comparing how close the measured reflectance is to each material coefficient.
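A back-of-the-envelope comparison shows why per-region measurement matters, taking the ~20 ms per 256-pixel patch figure above at face value:

```python
PATCH_MS = 20.0   # ~20 ms to identify the material of a 256-pixel patch
PATCH_PX = 256

def point_by_point_ms(surface_pixels: int) -> float:
    """Time to sweep the whole surface patch by patch."""
    return PATCH_MS * surface_pixels / PATCH_PX

print(point_by_point_ms(1_000_000))  # ~78 s for a 1-megapixel surface
print(3 * PATCH_MS)                  # 60 ms: one measurement per region A, B, C
```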
  • To suppress this enormous processing time, this embodiment divides (segments) the target object into regions and performs material determination once per region, greatly reducing processing time and load. In the scissors example, the scissors 300 divide roughly into a metal part and handle parts: the metal part is first irradiated with infrared light (one example of a material detection wave) and judged to be metal, and the handle part is then irradiated and judged to be plastic, resin, or the like. Material recognition performed once for each of the regions A, B, and C shown in FIG. 2 thus determines the material of each region.
  • Each material determined in this way is stored in association with the world coordinate system. Only a specific point in each limited region undergoes material recognition, and the result is used later when the data is written into the 3D data format; material detection therefore does not have to be run on every point of the point cloud representing the shape, which greatly simplifies processing.
  • In this embodiment, the material of the target object is estimated per region and then written into the 3D data format. In particular, if the shape is to be printed on a 3D printer, a material must be assigned to every point in the cloud; determining the material per region and introducing the region-specific material allocation unit 208 speeds up material prediction for the three-dimensional object. The segmentation result from the shape detection unit 202 is reused for this speed-up.
  • As described above, the shape detection unit 202 extracts edge information from the background-separated, segmented target object. Edge information can be extracted by applying Sobel, Laplacian, or Canny operators to the first derivative of the image; extraction based on pixel luminance differences, saturation differences, template matching, and the like is also possible. Once the background is separated, the extracted edges of the 3D object form continuous closed regions, meaning, for example, that a line starting at point A returns to point A. A closed region surrounded by such a line can be judged likely to be one component of the object.
  • The region-specific material allocation unit 208 assigns the material information detected at a specific point by the material detection unit 204 to the whole closed region. A paint-routine algorithm can be used for the assignment: it fills a closed space with a color, as in the Paint application among the Windows (registered trademark) accessories. In this way, material information can be attached to all the point-cloud data as if it were being painted on, as the sketch below illustrates.
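The paint-routine assignment is essentially a flood fill. This sketch spreads one measured material label over every point of the closed region containing the measured point; the data layout is assumed for illustration.

```python
from collections import deque

def flood_fill_material(region_labels, start, material):
    """Assign `material` to every cell of the closed region containing
    `start`, the point where reflectance was actually measured."""
    h, w = len(region_labels), len(region_labels[0])
    target = region_labels[start[0]][start[1]]
    seen, queue, painted = set(), deque([start]), {}
    while queue:
        y, x = queue.popleft()
        if (y, x) in seen or not (0 <= y < h and 0 <= x < w):
            continue
        if region_labels[y][x] != target:
            continue  # stop at the region's boundary line
        seen.add((y, x))
        painted[(y, x)] = material
        queue.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return painted
```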
  • A 3D data format generally carries world coordinate information, polygon sizes and positions, texture information, and so on. Recent formats such as 3DS, FBX, and OBJ can also hold material information. Since the world coordinates of each object separated by segmentation (regions A, B, and C in FIG. 2) can be identified, material information can be assigned to each object. A format such as the AMF (Additive Manufacturing File) format, which is based on the STL format used by 3D printers and can describe information about colors, materials, and internal structure, may also be used.
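As a sketch of what the per-region assignment might look like in AMF, the snippet below writes a minimal material table; the element names follow the public AMF schema as I understand it, and the mesh geometry, whose `<volume materialid="...">` entries would reference these ids (one volume per region), is omitted.

```python
import xml.etree.ElementTree as ET

def write_amf_materials(materials: dict, path: str) -> None:
    """Emit an AMF skeleton holding one <material> per segmented region,
    e.g. {1: "metal", 2: "resin (blue)", 3: "resin"}."""
    amf = ET.Element("amf", unit="millimeter")
    for mat_id, name in materials.items():
        mat = ET.SubElement(amf, "material", id=str(mat_id))
        meta = ET.SubElement(mat, "metadata", type="name")
        meta.text = name
    ET.ElementTree(amf).write(path, xml_declaration=True, encoding="utf-8")

write_amf_materials({1: "metal", 2: "resin (blue)", 3: "resin"}, "scissors.amf")
```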
  • FIG. 3 is a schematic diagram showing how the system 1000 according to this embodiment detects the material of the scissors 300 for each of the regions A, B, and C. Infrared light is emitted toward each of the regions A, B, and C by the material-recognition infrared light-emitting unit 108 and received by the material-recognition infrared light-receiving unit 110, and the material detection unit 204 detects the material of each region by comparison with the material coefficients held in the material determination library 206 (ABS resin, PLA resin, nylon, plastic, gypsum, rubber, metal, and so on).
  • To detect the material of each region, light is emitted only once, at a specific point in each of the regions A, B, and C, and the detected material is assigned to the entire region. As a result, processing time and load are significantly reduced.
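The library comparison itself can be as simple as a nearest-coefficient lookup. The reflectance values below are placeholders, not data from the patent; the material determination library 206 would hold calibrated coefficients.

```python
# Illustrative coefficients only; real values depend on sensor and wavelength.
MATERIAL_LIBRARY = {
    "ABS resin": 0.42, "PLA resin": 0.47, "nylon": 0.55,
    "plastic": 0.40, "gypsum": 0.70, "rubber": 0.12, "metal": 0.92,
}

def predict_material(measured_reflectance: float) -> str:
    """Return the library entry whose coefficient is closest to the measurement."""
    return min(MATERIAL_LIBRARY,
               key=lambda m: abs(MATERIAL_LIBRARY[m] - measured_reflectance))

print(predict_material(0.90))  # -> "metal", as for region A
```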
  • FIG. 4 is a flowchart showing processing in the present embodiment.
  • First, in step S10, the shape-recognition infrared light-emitting unit 102 irradiates infrared light and the shape-recognition infrared light-receiving unit 104 receives it, so that the shape detection unit 202 detects the shape of the target object. In step S12, imaging information from the RGB color-recognition camera 106 is acquired. In step S14, the shape is divided into regions and classified based on the imaging information. In step S16, background separation is performed. In step S18, the shape is divided into regions and segments based on the edge information. The region division by the region division unit 202b is thus performed in steps S14 to S18.
  • In step S20, the material determination library 206 is consulted and material detection is performed for each region. In steps S22, S24, and S26 the process branches on whether the region is A, B, or C: for region A it proceeds to step S28 and the material of region A is applied by the paint routine; for region B, to step S30; and for region C, to step S32, each likewise applying the region's material by the paint routine. After steps S28, S30, and S32, the process proceeds to step S34, where the 3D data format is created.
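Putting the flowchart into pseudo-runnable form, the sketch below chains the helpers from the earlier sketches; the hw.* calls are hypothetical stand-ins for the hardware units described above, not an API from the patent.

```python
def capture_pipeline(hw) -> dict:
    shape = hw.detect_shape()                         # S10: IR shape detection
    image = hw.capture_rgb()                          # S12: imaging information
    mask = segment_object(image, hw.touch_rect())     # S14-S16: divide, separate background
    regions = split_regions(mask, image)              # S18: closed regions from edges
    materials = {}
    for i, region in enumerate(regions):              # S20-S32: one shot per region,
        r = hw.measure_reflectance_at(region)         # then paint the whole region
        materials[i] = predict_material(r)
    return {"shape": shape, "materials": materials}   # S34: write 3D data format
```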
  • FIG. 5 is a schematic diagram illustrating an example in which the system 1000 is realized by a mobile device 1010 such as a smartphone.
  • FIG. 6 is a schematic diagram showing the configuration of the mobile device 1010.
  • As shown in FIG. 6, the mobile device 1010 includes a display unit 114, a touch sensor 116, and an irradiation position guide unit 210 in addition to the system 1000 illustrated in FIG. 1, and has a touch panel in which the touch sensor 116 is provided on the display unit 114. A captured image of the scissors 300 taken by the RGB color-recognition camera 106 is displayed on the display unit 114.
  • The irradiation position guide unit 210 is an application that guides the irradiation position of the sensor signal to the area whose material is to be detected: the infrared irradiation position of the material-recognition infrared light-emitting unit 108 (the cross-shaped mark 400 in FIG. 5) is superimposed on the captured image shown on the display unit 114. The position of the mark 400 can be calculated from, for example, the infrared irradiation direction and the distance to the target object (the scissors 300). The user can therefore detect the material of the region containing any desired position by aligning the mark 400 with that position of the scissors 300 in the captured image, and can tell the mobile device 1010 that the intended area is being irradiated by operating the touch sensor 116.
  • The mobile device 1010 shown here includes the shape-recognition infrared light-emitting unit 102, the shape-recognition infrared light-receiving unit 104, and the RGB color-recognition camera 106, although it does not have to include them.
  • FIG. 7 is a schematic diagram illustrating another example in which the system 1000 is realized on a mobile device 1020 such as a smartphone. Whereas the example of FIG. 5 uses the built-in material-recognition infrared light-emitting unit 108 and light-receiving unit 110 to detect the material, this example uses an external infrared sensor unit 1030 instead.
  • FIG. 8 is a schematic diagram showing the configuration of the mobile device 1020. As shown in FIG. 8, unlike the mobile device 1010 of FIG. 6, the mobile device 1020 does not include the material-recognition infrared light-emitting unit 108 or the material-recognition infrared light-receiving unit 110. Instead, it includes a communication unit 118 for communicating with the infrared sensor unit 1030 via Bluetooth (registered trademark) or the like.
  • The infrared sensor unit 1030 includes a material-recognition infrared light-emitting unit 1032, a material-recognition infrared light-receiving unit 1034, and a communication unit 1036. The units 1032 and 1034 correspond to the material-recognition infrared light-emitting unit 108 and light-receiving unit 110 of FIGS. 1 and 6, and the communication unit 1036 communicates with the mobile device 1020 via Bluetooth (registered trademark) or the like.
  • The infrared sensor unit 1030 sends the mobile device 1020, from the communication unit 1036, information about the infrared light emitted by the material-recognition infrared light-emitting unit 1032 and received by the material-recognition infrared light-receiving unit 1034. The material detection unit 204 of the mobile device 1020 obtains the infrared reflectance from this information, compares it with the reflectances held in the material determination library 206, and detects the material of the object for each region. Even though the mobile device 1020 has no infrared sensor of its own, the material can thus be estimated as long as it can communicate with the infrared sensor unit 1030 and receive its signal.
  • As described above, in this embodiment the three-dimensional shape obtained by shape detection is divided into regions based on the imaging information from the RGB color-recognition camera 106, infrared light is directed at a specific point in each divided region, and material detection is performed per region. The material detected at the specific point is then assigned to the 3D data format as the material of the entire region. It is therefore unnecessary to detect the material of every point in the 3D data format, greatly reducing processing time and load.
  • The present technology may also be configured as follows.
  • (1) A material detection device including: a shape acquisition unit that acquires the three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region division unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object for each region based on information about the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
  • (2) The material detection device according to (1), wherein the material detection wave is infrared light.
  • (3) The material detection device according to (1), wherein the region division unit divides the three-dimensional shape into regions by providing boundary lines in three-dimensional information consisting of mesh information or depth information representing the three-dimensional shape.
  • (4) The material detection device according to (3), wherein the region division unit provides the boundary lines based on the imaging information.
  • (5) The material detection device according to (1), further including a region-specific material allocation unit that allocates, region by region, the material of the target object acquired for each region to the three-dimensional shape data acquired by the shape acquisition unit.
  • (6) The material detection device according to (1), further including …
  • (7) The material detection device according to (2), wherein the material acquisition unit acquires the material based on the reflectance, at the target object, of the infrared light irradiated onto the target object.
  • (8) An infrared reflectance for each material is held in advance, and the material acquisition unit compares the reflectance of the infrared light irradiated onto the target object with the pre-held infrared reflectance for each material.
  • (9) The material detection device wherein the shape acquisition unit acquires the three-dimensional shape based on information about a light beam irradiated onto the target object and the light beam reflected by the target object.
  • (10) The material detection device wherein the light beam is infrared light, further including a shape-recognition infrared light-emitting unit that emits infrared light for acquiring the three-dimensional shape of the target object, and a shape-recognition infrared light-receiving unit that receives the infrared light emitted by the shape-recognition infrared light-emitting unit and reflected by the target object.
  • (11) The material detection device according to (10), wherein the shape-recognition infrared light-emitting unit irradiates infrared light in a predetermined pattern, and the shape acquisition unit acquires the three-dimensional shape by triangulation from the difference between the infrared irradiation pattern and the reflection pattern.
  • (12)–(14) …
  • (15) The material detection device according to (1), further including a communication unit that receives, from an external infrared sensor unit, information about the infrared light irradiated onto the target object and the infrared light reflected by the target object.
  • (16) A material detection method including: acquiring the three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
  • (17) A program for causing a computer to function as: means for acquiring the three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.
  • Reference numerals: 102 shape-recognition infrared light-emitting unit; 104 shape-recognition infrared light-receiving unit; 106 RGB color-recognition camera; 108 material-recognition infrared light-emitting unit; 110 material-recognition infrared light-receiving unit; 114 display unit; 202 shape detection unit; 202a imaging information acquisition unit; 202b region division unit; 204 material detection unit; 208 region-specific material allocation unit; 1000 system; 1010, 1020 mobile devices

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed is a material detection device provided with: a shape acquisition unit that acquires the three-dimensional shape of an object; an imaging information acquisition unit that acquires imaging information obtained by imaging the object; a region division unit that divides the three-dimensional shape into regions on the basis of the three-dimensional shape and the imaging information; and a material acquisition unit that acquires, for each region, the material of the object on the basis of information about the material detection waves with which each region of the object is irradiated and the material detection waves reflected by the object. With this configuration, the material of each element of a three-dimensional object can be detected with simple processing.

Description

Material detection device, material detection method, and program

The present disclosure relates to a material detection device, a material detection method, and a program.

Recently, the market for 3D printers has been expanding and they have become widespread; in particular, simultaneous printing with multiple colors and multiple materials has lately become possible, and various data format standards for 3D printers are evolving in step. A basic data format holds the coordinate positions of the point cloud, polygon sizes, and inter-point distances of a three-dimensional object. In addition to this basic information, formats that can also hold color, shadow, and material information for each point in the cloud are becoming widespread.

On the other hand, current 3D scanners detect three-dimensional shape, but none can also detect material at the same time. To hold material information for each point, the 3D data captured by the scanner must first be imported into a data-editing environment and the material set point by point.

In this regard, Patent Document 1 below describes a material recognition technique that provides material information about an object contained in video captured by a user.

Patent Document 1: JP 2013-250263 A

However, the technique described in that document emits an incident wave from the transmitter of an exploration radar unit toward the object and receives a surface-reflected wave that has lower amplitude than, and the same phase as, the incident wave. From this, physical-property information about the object is detected and compared with reference physical-property information corresponding to each material to recognize what the object is. Because such a method detects physical properties only at a tiny specific point, acquiring the properties of the entire three-dimensional object is difficult.

In particular, the more points a 3D scan contains, the higher its accuracy and the finer the object representation it allows. Detecting the material of the object at every point and building a database with a material set for each point would therefore require enormous time and processing.

It has therefore been desirable to detect the material of each element of a three-dimensional object with simple processing.

According to the present disclosure, there is provided a material detection device including: a shape acquisition unit that acquires the three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region division unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object for each region based on information about the material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.

Also according to the present disclosure, there is provided a material detection method including: acquiring the three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.

Also according to the present disclosure, there is provided a program for causing a computer to function as: means for acquiring the three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object for each region based on information about the material detection wave irradiated onto each region and the material detection wave reflected by the target object.

As described above, according to the present disclosure, the material of each element of a three-dimensional object can be detected with simple processing.
Note that the above effect is not necessarily limiting; together with or instead of it, any of the effects shown in this specification, or other effects that can be grasped from this specification, may be obtained.

FIG. 1 is a schematic diagram showing the configuration of a system according to an embodiment of the present disclosure. FIG. 2 is a schematic diagram showing an example in which scissors are captured by the system of the embodiment. FIG. 3 is a schematic diagram showing how the system according to the embodiment detects the material of the scissors for each of regions A, B, and C. FIG. 4 is a flowchart showing the processing of the embodiment. FIG. 5 is a schematic diagram showing an example in which the system is realized on a mobile device such as a smartphone. FIG. 6 is a schematic diagram showing the configuration of the mobile device of FIG. 5. FIG. 7 is a schematic diagram showing another example in which the system is realized on a mobile device such as a smartphone. FIG. 8 is a schematic diagram showing the configuration of the mobile device of FIG. 7.

Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and duplicate description is omitted.

The description proceeds in the following order:
1. Configuration of the material detection system for three-dimensional objects
2. Shape analysis and region division of the three-dimensional object
3. Material detection at a specific point in each region
4. Adding material information to the 3D data format
5. Processing flow in this embodiment
6. Application to mobile devices

1. Configuration of the material detection system for three-dimensional objects

First, the schematic configuration of a material detection system 1000 for three-dimensional objects according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a schematic diagram showing the configuration of the system 1000 according to this embodiment. As shown in FIG. 1, the system 1000 includes recording hardware 100 and an analysis unit (software) 200. The recording hardware 100 includes a shape-recognition infrared light-emitting unit (projector) 102, a shape-recognition infrared light-receiving unit (sensor) 104, an RGB color-recognition camera 106, a material-recognition infrared light-emitting unit (projector) 108, and a material-recognition infrared light-receiving unit (sensor) 110.

The analysis unit 200 includes a target-object shape detection unit 202, a specific-point material detection unit 204, a material determination library 206, and a region-specific material allocation unit 208. Each component of the analysis unit 200 shown in FIG. 1 can be implemented with a central processing unit such as a CPU and software (a program) that makes it function. The program can be stored in a memory (not shown) in the recording hardware 100 or on an externally connected recording medium. In the system 1000 of this embodiment, the recording hardware 100 and the analysis unit 200 may be configured as an integrated device or as separate devices.

2. Shape detection of the three-dimensional object

3D scanners may be contact or non-contact; the system 1000 of this embodiment uses a non-contact type as an example. In the non-contact type, the shape detection unit 202 detects the shape and surface irregularities of the target object (a three-dimensional object) by irradiating it with infrared light from the shape-recognition infrared light-emitting unit 102 and analyzing the infrared light received by the shape-recognition infrared light-receiving unit 104. Two shape-recognition approaches are common: scanning 360° around the object with a moving laser beam, and projecting a regular pattern onto the object while a sensor receives the reflected infrared light. The moving-laser approach estimates the shape from the time/phase difference between the emitted and reflected light: light is received in time with the infrared emission, the phase difference between the emitter and receiver is calculated from the received signal intensity, and the time of flight (hence distance) is calculated from that phase difference. The pattern-projection approach estimates the distance to the object by triangulation from the difference between the projected and reflected patterns, yielding shape and depth information: light is received in time with the emission, the emitted and received patterns are matched, and the distance is calculated from the projection angle and the incident angle by the principle of triangulation. Pattern projection is generally considered faster than moving a laser beam. The moving-laser scanning type is called the ToF (Time of Flight) method, and the pattern-projection type is also called the active stereo method. Light other than infrared may also be used for shape recognition.

There are also methods that obtain shape and depth information by triangulation from the differences between visible-light camera images. Specifically, two cameras receive light simultaneously, feature points of the two images are matched, and the distance is calculated by triangulation from the two incident angles. Alternatively, a single visible-light camera can be moved to capture from several positions; the feature points of the resulting images are matched, the camera's movement is estimated, and the distance is calculated by triangulation from two incident angles. There is also a technique called Depth from Defocus, which obtains shape and depth information from the degree of blur in an image. In this embodiment the shape detection method is not particularly limited, and methods other than the above can also be used.
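For the two-camera variant just mentioned, the standard linear (DLT) triangulation of one matched feature point is sketched below; this is textbook multi-view geometry assumed for illustration, not a formula from the patent.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """P1, P2: 3x4 projection matrices of the two calibrated cameras;
    x1, x2: pixel coordinates of the same feature point in each image."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)      # least-squares null vector of A
    X = vt[-1]
    return X[:3] / X[3]              # inhomogeneous 3D point
```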

2. Shape analysis and region division of the three-dimensional object

Once the shape of the three-dimensional object has been detected with infrared light, the shape is analyzed. The target object shape detection unit 202 analyzes shape information from the images/video captured by the recording hardware 100. Specifically, the RGB color-recognition camera 106 images the target object, shape information is analyzed from the captured data, and the shape is divided into regions. Performing shape analysis at this stage speeds up the later step of writing material information into the 3D data format.

As an example, FIG. 2 is a schematic diagram showing scissors 300 captured by the system 1000 of this embodiment. Here, the shape detection unit 202 analyzes shape information that segments the metal portion of the scissors 300 from the handle portion, and further segments the handles by color. For this purpose, the shape detection unit 202 includes an imaging information acquisition unit 202a that acquires the imaging information captured by the RGB color-recognition camera 106, and a region division unit 202b that divides (segments) the three-dimensional shape of the target object into regions using color information, edge information, luminance/saturation information, contrast, depth information, background information, light reflectance, flat-area (low-texture) information, temperature, and the like as reference information. The shape detection unit 202 extracts edge information of the background-separated, segmented target object and divides it into regions based on that edge information.

The region division unit 202b divides the three-dimensional shape into regions by drawing boundary lines in three-dimensional information consisting of mesh information or depth information representing the shape. The boundary lines are set based on imaging information such as color, edges, luminance/saturation, and contrast, which is captured by the RGB color-recognition camera 106 and obtained through the imaging information acquisition unit 202a. The user can also input auxiliary hints such as object/background labels, and segmentation can then be performed by the graph-cut method: for example, selecting the target object with a touch operation at capture time identifies it against the background, enabling highly accurate background separation.

As a result of shape detection by the shape detection unit 202, the scissors 300 are segmented, as shown in FIG. 2, into a metal portion (region A), a handle portion (region B), and a resin portion other than the handles (region C). Here, regions B and C are both made of resin but differ in color.

3. Material detection at a specific point in each region

Next, the material at a specific point is measured, using the material-detection infrared light emitted by the material-recognition infrared light-emitting unit 108 and the material-recognition infrared light-receiving unit 110 that receives it. Using a known method, the material is estimated by irradiating a specific point on the three-dimensional object with the infrared light emitted by the material-recognition infrared light-emitting unit 108: the light strikes the specific point on the target object and is received by the material-recognition infrared light-receiving unit 110. The material detection unit 204 obtains the reflectance of the emitted infrared light from the emitted and received light and predicts the material at the specific point by comparing that reflectance with the values held in the material determination library 206.

Conventional object-material detection generally irradiates the object with a signal (material detection wave) such as infrared, ultrasonic, or electromagnetic waves, measures the reflectance (response), and predicts the material by comparing the coefficients of the reflected components. Such material detection is realized by a system that irradiates part of the target object with a point-source infrared laser and measures the intensity, transfer function, and attenuation of the reflected infrared light. However, because point-source irradiation detects the material only in the area (specific point) the laser can reach, measuring the material of an entire three-dimensional object takes an enormous amount of time.

The time required to detect the material of a given object is proportional to the object's size. As an example, it is known that detecting the material of an area of 256 pixels takes about 20 ms with the conventional method. This detection time is the time needed to irradiate the object with infrared light, measure the reflectance, and identify the material by comparing how close the measured reflectance is to each material coefficient.

To suppress this enormous processing time, this embodiment divides (segments) the target object into regions and performs material determination for each region, greatly reducing processing time and load. In the scissors example above, the scissors 300 divide roughly into a metal part and handle parts, so the metal part is first irradiated with infrared light (one example of a material detection wave) and judged to be metal; the handle part is then irradiated and judged to be plastic, resin, or the like. Material recognition performed once for each of the regions A, B, and C shown in FIG. 2 thus determines the material of each region. Each material determined in this way is stored in association with the world coordinate system. Only a specific point in each limited region undergoes material recognition, and the result is used later when the data is written into the 3D data format; material detection therefore does not have to be run on every point of the point cloud representing the shape, which greatly simplifies processing.

 4. Adding material information to the three-dimensional data format
 In this embodiment, the material of the target object is estimated per region and written into a 3D data format. As described above, conventional specific-point material prediction can measure the material of a very small area of a three-dimensional object, but since only that specific point is measured, material information is available for only some points of the object. Measuring the materials of all points in the point cloud would require an enormous amount of measurement time.

In particular, when printing with a 3D printer based on the three-dimensional shape information is assumed, a material must be assigned to every point in the shape's point cloud. In this embodiment, the material is determined per region, and the region-specific material allocation unit 208 is introduced, which speeds up material prediction for the three-dimensional object.

This speedup makes use of the segmentation result produced by the shape detection unit 202. As described above, the shape detection unit 202 extracts edge information of the target three-dimensional object after background separation and segmentation. Edge information can be extracted, for example, by applying a Sobel, Laplacian, or Canny operator to the first derivative of the image. Edge extraction based on pixel luminance differences, saturation differences, template matching, and the like is also possible.
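For illustration, edge maps of the kinds named above could be produced with OpenCV as in the following sketch; the file name and the Canny thresholds are arbitrary assumptions:

    import cv2

    # Load the RGB capture as a grayscale image; the file name is a placeholder.
    img = cv2.imread("target_object.png", cv2.IMREAD_GRAYSCALE)

    # The gradient-based operators mentioned in the text.
    sobel_x = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)      # horizontal gradient
    laplacian = cv2.Laplacian(img, cv2.CV_64F)               # second derivative
    edges = cv2.Canny(img, threshold1=100, threshold2=200)   # binary edge map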

After background separation, the edges of the three-dimensional object extracted here form continuous closed regions; for example, a line that starts at a point A eventually returns to point A. In this case, the closed region enclosed by the line can be judged, with high likelihood, to be one component of the object. The region-specific material allocation unit 208 then assigns the material information of the specific point detected by the material detection unit 204 to that closed region. A paint routine algorithm can be used for this assignment. A paint routine algorithm fills a closed space with a color, as used in the Paint application among the Windows (registered trademark) accessories. In this way, material information can be attached to all the point cloud data as if painting it with color.
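A minimal paint-routine (flood-fill) sketch is shown below, under the assumption — made only for illustration — that the segmented data is given as a 2D grid in which boundary cells are marked "edge" and unassigned cells are None:

    from collections import deque

    def paint_material(grid, seed, material):
        """Flood-fill `material` into the closed region containing `seed`.
        Cells holding None are unassigned; cells holding "edge" are
        boundary cells that must not be crossed."""
        rows, cols = len(grid), len(grid[0])
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if not (0 <= r < rows and 0 <= c < cols):
                continue
            if grid[r][c] is not None:      # boundary or already painted
                continue
            grid[r][c] = material
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])

The same idea extends to a 3D voxel or point-cloud neighborhood; only the neighbor offsets change.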

A 3D data format generally holds world coordinate information, polygon sizes and positions, texture information, and so on. Recent 3D formats such as 3DS, FBX, and OBJ can also carry material information. Since the world coordinates of each object separated by segmentation (the regions A, B, and C shown in FIG. 2) can be identified, material information can be assigned per object. A format such as the AMF (Additive Manufacturing File) format, which is based on the STL format used by 3D printers and can describe colors, materials, and internal structure, may also be used.
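As a sketch of the export step, per-region materials could be written as AMF-style XML as follows; the element layout here is simplified and the geometry section is omitted, so this should be read as an assumption about the format rather than a complete AMF writer:

    import xml.etree.ElementTree as ET

    # Result of the per-region detection step (region id -> material name).
    region_materials = {"A": "metal", "B": "plastic", "C": "plastic"}

    amf = ET.Element("amf", unit="millimeter")
    for i, (region, name) in enumerate(region_materials.items(), start=1):
        material = ET.SubElement(amf, "material", id=str(i))
        meta = ET.SubElement(material, "metadata", type="name")
        meta.text = f"region {region}: {name}"
    ET.ElementTree(amf).write("scissors.amf",
                              xml_declaration=True, encoding="utf-8")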

FIG. 3 is a schematic diagram showing how the system 1000 according to this embodiment detects the material of the scissors 300 for each of the regions A, B, and C. After segmentation by the shape detection unit 202 divides the object into regions A, B, and C, the material recognition infrared light emitting unit 108 emits infrared light toward each region and the material recognition infrared light receiving unit 110 receives the reflection, so that the material detection unit 204 detects the material of each region. The material is detected by comparison with the material coefficients held in the material determination library 206 for each material (ABS resin, PLA resin, nylon, plastic, gypsum, rubber, metal, and so on).

Because material detection is performed per region, light is emitted only once for each of the regions A, B, and C. The material is detected by a single emission at a specific point in each region, and the detected material is assigned to the whole of that region. This makes it possible to greatly reduce the processing time and load.

 5. Processing flow in this embodiment
 FIG. 4 is a flowchart showing the processing of this embodiment. First, in step S10, the shape recognition infrared light emitting unit 102 emits infrared light and the shape recognition infrared light receiving unit 104 receives it, so that the shape detection unit 202 detects the shape of the target object. In the next step S12, imaging information from the RGB color recognition camera 106 is acquired. In step S14, the shape is classified into regions based on the imaging information. In step S16, background separation is performed. In step S18, the shape is divided into segments based on the edge information. Through steps S14 to S18, the region dividing unit 202b thus performs region division.

In the next step S20, the material determination library 206 is consulted and material detection is performed per region. In steps S22, S24, and S26, it is determined which of the regions A, B, and C applies. For region A, the process proceeds to step S28, where the material of region A is assigned by the paint routine. For region B, the process proceeds to step S30, where the material of region B is assigned by the paint routine. For region C, the process proceeds to step S32, where the material of region C is assigned by the paint routine. After steps S28, S30, and S32, the process proceeds to step S34, where the 3D data format is created.
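The flow of FIG. 4 can be summarized in the following sketch; every function body is a stub standing in for the corresponding step, and only the control flow mirrors the flowchart:

    def detect_shape(ir_frame):               # S10: shape detection unit 202
        return ir_frame

    def segment(shape, rgb_frame):            # S12-S18: region dividing unit 202b
        return [{"id": "A", "reflectance": 0.90},
                {"id": "B", "reflectance": 0.50},
                {"id": "C", "reflectance": 0.50}]   # placeholder measurements

    def match_library(reflectance):           # S20: library 206 lookup
        lib = {"metal": 0.92, "plastic": 0.45}
        return min(lib, key=lambda m: abs(lib[m] - reflectance))

    def process(ir_frame, rgb_frame):
        regions = segment(detect_shape(ir_frame), rgb_frame)
        return {r["id"]: match_library(r["reflectance"]) for r in regions}  # S22-S32

    print(process(None, None))   # S34 (3D format export) omitted from the stub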

 6. Application to mobile devices
 FIG. 5 is a schematic diagram showing an example in which the system 1000 is realized by a mobile device 1010 such as a smartphone, and FIG. 6 is a schematic diagram showing the configuration of the mobile device 1010. As shown in FIG. 6, the mobile device 1010 includes, in addition to the system 1000 shown in FIG. 1, a display unit 114, a touch sensor 116, and an irradiation position guiding unit 210. The mobile device 1010 has a touch panel formed by providing the touch sensor 116 on the display unit 114. A captured image of the scissors 300 taken by the RGB color recognition camera 106 is displayed on the display unit 114. The irradiation position guiding unit 210 is an application that guides the irradiation position of the sensor signal to the region whose material is to be detected. In the example shown in FIG. 5, the infrared irradiation position of the material recognition infrared light emitting unit 108 (the mark 400 indicated by a cross in FIG. 5) is displayed superimposed on the captured image shown on the display unit 114. The position of the mark 400 can be calculated, for example, from the infrared irradiation direction and the distance to the target object (the scissors 300). The user can thus detect the material of a region by aligning the mark 400 with an arbitrary position on the scissors 300 in the captured image. By operating the touch sensor 116, the user can notify the mobile device 1010 that the desired region is being irradiated.
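One plausible way to compute the mark position — the camera intrinsics and emitter offset below are assumed values, since the patent only states that the irradiation direction and the distance are used — is a pinhole projection of the beam's hit point into image coordinates:

    import numpy as np

    # Assumed camera intrinsics (focal lengths and principal point, in pixels).
    fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0

    def mark_position(direction, distance, emitter_offset):
        """Project the IR beam's hit point into image coordinates.
        `direction` is a unit vector in camera coordinates, `distance` the
        range to the target, `emitter_offset` the emitter's position
        relative to the camera."""
        hit = np.asarray(emitter_offset) + distance * np.asarray(direction)
        u = fx * hit[0] / hit[2] + cx
        v = fy * hit[1] / hit[2] + cy
        return u, v

    print(mark_position((0.0, 0.0, 1.0), 0.5, (0.01, 0.0, 0.0)))  # -> (336.0, 240.0)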

In the example shown in FIG. 6, the mobile device 1010 includes the shape recognition infrared light emitting unit 102, the shape recognition infrared light receiving unit 104, and the RGB color recognition camera 106. However, if the data resulting from shape detection of the target object is obtained from another device, the mobile device 1010 need not include the shape recognition infrared light emitting unit 102, the shape recognition infrared light receiving unit 104, or the RGB color recognition camera 106.

FIG. 7 is a schematic diagram showing an example in which the system 1000 is realized by a mobile device 1020 such as a smartphone, in the case where the mobile device 1020 does not itself include the material recognition infrared light emitting unit 108 and the material recognition infrared light receiving unit 110 used for material detection. FIG. 8 is a schematic diagram showing the configuration of the mobile device 1020. As shown in FIG. 8, unlike the mobile device 1010 of FIG. 6, the mobile device 1020 does not include the material recognition infrared light emitting unit 108 or the material recognition infrared light receiving unit 110. Instead, the mobile device 1020 includes a communication unit 118 for communicating with an infrared sensor unit 1030 via Bluetooth (registered trademark) or the like.

The infrared sensor unit 1030 includes a material recognition infrared light emitting unit 1032, a material recognition infrared light receiving unit 1034, and a communication unit 1036. The material recognition infrared light emitting unit 1032 and the material recognition infrared light receiving unit 1034 correspond to the material recognition infrared light emitting unit 108 and the material recognition infrared light receiving unit 110 of FIGS. 1 and 6. The communication unit 1036 communicates with the mobile device 1020 via Bluetooth (registered trademark) or the like. The infrared sensor unit 1030 sends information about the infrared light emitted by the material recognition infrared light emitting unit 1032 and the infrared light received by the material recognition infrared light receiving unit 1034 from the communication unit 1036 to the mobile device 1020. Based on the information sent from the infrared sensor unit 1030, the material detection unit 204 of the mobile device 1020 obtains the infrared reflectance and compares it with the reflectances held in the material determination library 206, thereby detecting the material of the target object per region. Thus, even when the mobile device 1020 itself has no infrared sensor, the material can be estimated as long as the device can communicate with the infrared sensor unit 1030, which sends its signals to the mobile device 1020.
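On the mobile device side, the handling of a forwarded measurement might look like the following sketch, assuming — purely for illustration — that the sensor unit sends the emitted and received power as a small JSON payload:

    import json

    MATERIAL_LIBRARY = {"metal": 0.92, "plastic": 0.45, "rubber": 0.12}  # placeholders

    def on_sensor_packet(payload: bytes) -> str:
        """Handle one measurement forwarded by the infrared sensor unit 1030."""
        data = json.loads(payload)
        reflectance = data["received"] / data["emitted"]
        return min(MATERIAL_LIBRARY,
                   key=lambda m: abs(MATERIAL_LIBRARY[m] - reflectance))

    print(on_sensor_packet(b'{"emitted": 1.0, "received": 0.44}'))  # -> plastic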

As described above, according to this embodiment, the three-dimensional shape obtained by shape detection is divided into regions based on the imaging information obtained by the RGB color recognition camera 106, and material detection is performed per region by irradiating a specific point in each divided region with infrared light. The material obtained by detecting that specific point is assigned to the 3D data format as the material of the entire region. It is therefore unnecessary to perform material detection for every point in the point cloud of the 3D data format, and the processing time and processing load can be greatly reduced.

The preferred embodiment of the present disclosure has been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to this example. It is clear that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.

The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.

The following configurations also belong to the technical scope of the present disclosure.
(1) A material detection device including: a shape acquisition unit that acquires a three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region dividing unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
(2) The material detection device according to (1), wherein the material detection wave is infrared light.
(3) The material detection device according to (1), wherein the region dividing unit divides the three-dimensional shape into the regions by providing boundary lines in three-dimensional information consisting of mesh information or depth information representing the three-dimensional shape.
(4) The material detection device according to (3), wherein the region dividing unit provides the boundary lines based on the imaging information.
(5) The material detection device according to (1), further including a region-specific material allocation unit that assigns, per region, the material of the target object acquired by the material acquisition unit for each region to the three-dimensional shape data acquired by the shape acquisition unit.
(6) The material detection device according to (1), further including: a material recognition infrared light emitting unit that emits infrared light for acquiring the material of the target object; and a material recognition infrared light receiving unit that receives the infrared light emitted by the material recognition infrared light emitting unit and reflected by the target object.
(7) The material detection device according to (2), wherein the material acquisition unit acquires the material based on the reflectance, at the target object, of the infrared light irradiated onto the target object.
(8) The material detection device according to (7), wherein infrared reflectances for individual materials are held in advance, and the material acquisition unit acquires the material by comparing the reflectance, at the target object, of the infrared light irradiated onto the target object with the infrared reflectances held in advance for the individual materials.
(9) The material detection device according to (1), wherein the shape acquisition unit acquires the three-dimensional shape based on information on a light beam irradiated onto the target object and the light beam reflected by the target object.
(10) The material detection device according to (9), wherein the light beam is infrared light, the device further including: a shape recognition infrared light emitting unit that emits infrared light for acquiring the three-dimensional shape of the target object; and a shape recognition infrared light receiving unit that receives the infrared light emitted by the shape recognition infrared light emitting unit and reflected by the target object.
(11) The material detection device according to (10), wherein the shape acquisition unit acquires the three-dimensional shape based on the phase difference between emitted and reflected infrared light.
(12) The material detection device according to (10), wherein the shape recognition infrared light emitting unit emits infrared light in a predetermined pattern, and the shape acquisition unit acquires the three-dimensional shape by triangulation from the difference between the infrared irradiation pattern and the reflection pattern.
(13) The material detection device according to (1), wherein the shape acquisition unit matches feature points of images of the target object captured at different positions and acquires the three-dimensional shape by triangulation based on the angles of incidence.
(14) The material detection device according to (1), including: an imaging unit that images the target object; and a display unit that displays a captured image taken by the imaging unit with the infrared irradiation position for acquiring the material of the target object superimposed on the captured image.
(15) The material detection device according to (1), further including a communication unit that receives, from an external infrared sensor unit, information on the infrared light irradiated onto the target object and the infrared light reflected by the target object.
(16) A material detection method including: acquiring a three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
(17) A program for causing a computer to function as: means for acquiring a three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.

 102  Shape recognition infrared light emitting unit
 104  Shape recognition infrared light receiving unit
 106  RGB color recognition camera
 108  Material recognition infrared light emitting unit
 110  Material recognition infrared light receiving unit
 114  Display unit
 202  Shape detection unit
 202a Imaging information acquisition unit
 202b Region dividing unit
 204  Material detection unit
 208  Region-specific material allocation unit
 1000 System
 1010, 1020 Mobile device

Claims (17)

1. A material detection device comprising: a shape acquisition unit that acquires a three-dimensional shape of a target object; an imaging information acquisition unit that acquires imaging information obtained by imaging the target object; a region dividing unit that divides the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and a material acquisition unit that acquires the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
2. The material detection device according to claim 1, wherein the material detection wave is infrared light.
3. The material detection device according to claim 1, wherein the region dividing unit divides the three-dimensional shape into the regions by providing boundary lines in three-dimensional information consisting of mesh information or depth information representing the three-dimensional shape.
4. The material detection device according to claim 3, wherein the region dividing unit provides the boundary lines based on the imaging information.
5. The material detection device according to claim 1, further comprising a region-specific material allocation unit that assigns, per region, the material of the target object acquired by the material acquisition unit for each region to the three-dimensional shape data acquired by the shape acquisition unit.
6. The material detection device according to claim 1, further comprising: a material recognition infrared light emitting unit that emits infrared light for acquiring the material of the target object; and a material recognition infrared light receiving unit that receives the infrared light emitted by the material recognition infrared light emitting unit and reflected by the target object.
7. The material detection device according to claim 2, wherein the material acquisition unit acquires the material based on the reflectance, at the target object, of the infrared light irradiated onto the target object.
8. The material detection device according to claim 7, wherein infrared reflectances for individual materials are held in advance, and the material acquisition unit acquires the material by comparing the reflectance, at the target object, of the infrared light irradiated onto the target object with the infrared reflectances held in advance for the individual materials.
9. The material detection device according to claim 1, wherein the shape acquisition unit acquires the three-dimensional shape based on information on a light beam irradiated onto the target object and the light beam reflected by the target object.
10. The material detection device according to claim 9, wherein the light beam is infrared light, the device further comprising: a shape recognition infrared light emitting unit that emits infrared light for acquiring the three-dimensional shape of the target object; and a shape recognition infrared light receiving unit that receives the infrared light emitted by the shape recognition infrared light emitting unit and reflected by the target object.
11. The material detection device according to claim 10, wherein the shape acquisition unit acquires the three-dimensional shape based on the phase difference between emitted and reflected infrared light.
12. The material detection device according to claim 10, wherein the shape recognition infrared light emitting unit emits infrared light in a predetermined pattern, and the shape acquisition unit acquires the three-dimensional shape by triangulation from the difference between the infrared irradiation pattern and the reflection pattern.
13. The material detection device according to claim 1, wherein the shape acquisition unit matches feature points of images of the target object captured at different positions and acquires the three-dimensional shape by triangulation based on the angles of incidence.
14. The material detection device according to claim 1, comprising: an imaging unit that images the target object; and a display unit that displays a captured image taken by the imaging unit with the infrared irradiation position for acquiring the material of the target object superimposed on the captured image.
15. The material detection device according to claim 1, further comprising a communication unit that receives, from an external infrared sensor unit, information on the infrared light irradiated onto the target object and the infrared light reflected by the target object.
16. A material detection method comprising: acquiring a three-dimensional shape of a target object; acquiring imaging information obtained by imaging the target object; dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and acquiring the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
17. A program for causing a computer to function as: means for acquiring a three-dimensional shape of a target object; means for acquiring imaging information obtained by imaging the target object; means for dividing the three-dimensional shape into regions based on the three-dimensional shape and the imaging information; and means for acquiring the material of the target object per region based on information on a material detection wave irradiated onto each region of the target object and the material detection wave reflected by the target object.
PCT/JP2016/053810 2015-03-25 2016-02-09 Material detection device, material detection method, and program Ceased WO2016152288A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-062639 2015-03-25
JP2015062639 2015-03-25

Publications (1)

Publication Number Publication Date
WO2016152288A1 true WO2016152288A1 (en) 2016-09-29

Family

ID=56977576

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/053810 Ceased WO2016152288A1 (en) 2015-03-25 2016-02-09 Material detection device, material detection method, and program

Country Status (1)

Country Link
WO (1) WO2016152288A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007216381A (en) * 2004-07-13 2007-08-30 Matsushita Electric Ind Co Ltd robot
JP2013250263A (en) * 2012-05-31 2013-12-12 Korea Institute Of Science And Technology Object material recognition device and method of the same

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110794418A (en) * 2018-08-03 2020-02-14 株式会社拓普康 Measuring device
CN110794418B (en) * 2018-08-03 2024-05-28 株式会社拓普康 Measuring device
CN112926702A (en) * 2019-12-06 2021-06-08 李雯毓 Active light source type object material identification system and method
JP7483267B2 (en) 2021-05-31 2024-05-15 日本コンベヤ株式会社 Rolling monitoring system for pipe conveyors
CN115578744A (en) * 2022-10-12 2023-01-06 广东三维家信息科技有限公司 Method and device for identifying material of model
CN116311187A (en) * 2023-02-02 2023-06-23 珠海普罗米修斯视觉技术有限公司 Object material identification method, device, electronic device and storage medium

Similar Documents

Publication Publication Date Title
US11915502B2 (en) Systems and methods for depth map sampling
Kang et al. Automatic targetless camera–lidar calibration by aligning edge with gaussian mixture model
US8331652B2 (en) Simultaneous localization and map building method and medium for moving robot
US10288418B2 (en) Information processing apparatus, information processing method, and storage medium
CN117970367A (en) Laser detection method for port machine equipment
KR101054736B1 (en) 3D object recognition and attitude estimation method
JP2021192064A (en) Three-dimensional measuring system and three-dimensional measuring method
KR101272448B1 (en) Apparatus and method for detecting region of interest, and the recording media storing the program performing the said method
WO2016152288A1 (en) Material detection device, material detection method, and program
KR102257746B1 (en) Method for controlling robot group and system thereof
US9805249B2 (en) Method and device for recognizing dangerousness of object
Leens et al. Combining color, depth, and motion for video segmentation
JP6172432B2 (en) Subject identification device, subject identification method, and subject identification program
KR101997048B1 (en) Method for recognizing distant multiple codes for logistics management and code recognizing apparatus using the same
EP3213504B1 (en) Image data segmentation
CN109816697B (en) A system and method for building a map by an unmanned model vehicle
JP2016009474A (en) Object identification system, information processing apparatus, information processing method, and program
US20220398760A1 (en) Image processing device and three-dimensional measuring system
US20230393278A1 (en) Electronic device, method and computer program
WO2021049490A1 (en) Image registration device, image generation system, image registration method and image registration program
KR20220110034A (en) A method of generating an intensity information with extended expression range by reflecting a geometric characteristic of object and a LiDAR device that performs the method
JP4110501B2 (en) Random pattern generation apparatus and method, distance image generation apparatus and method, and program providing medium
WO2014197283A1 (en) Edge preserving depth filtering
Song et al. Estimation of kinect depth confidence through self-training
CN109840463A (en) A kind of Lane detection method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16768191

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16768191

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP