
WO2019203000A1 - External environment recognition device - Google Patents

External environment recognition device

Info

Publication number
WO2019203000A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
image
unit
external environment
recognition device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/014915
Other languages
English (en)
Japanese (ja)
Inventor
健 志磨
琢馬 大里
裕史 大塚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd filed Critical Hitachi Automotive Systems Ltd
Priority to CN201980025028.1A priority Critical patent/CN111937035A/zh
Publication of WO2019203000A1 publication Critical patent/WO2019203000A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/04 Traffic conditions
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • The present invention relates to an external environment recognition device.
  • A vehicular image processing apparatus capable of improving the reliability of information analysis using camera images is known (see, for example, Patent Document 1).
  • Patent Document 1 states that an attention region including a preceding vehicle, a road white line, or a sign and other non-attention regions are set in an image captured by an imaging unit that images the area ahead of the host vehicle; luminance information of the attention region and the non-attention regions in the captured image is detected; and, based on this luminance information, it is determined whether or not the vehicle is in a driving environment in which it is difficult to analyze the situation ahead of the vehicle by image processing.
  • An object of the present invention is to provide an external environment recognition device capable of suppressing transfer delay when image data is output to an external device.
  • To achieve this, the present invention provides an imaging unit that captures an image, a data generation unit that generates a plurality of types of data from the image captured by the imaging unit, a data extraction unit that extracts data from the plurality of types of data, and a data output unit that outputs the extracted data to an external device.
  • According to the present invention, transfer delay can be suppressed when image data is output to an external device.
  • FIG. 1 is a configuration diagram of a system including a stereo camera according to an embodiment of the present invention. FIG. 2 is a configuration diagram of the stereo camera. FIG. 3 is a block diagram showing the functions of the stereo camera. FIG. 4 is a diagram showing an example of a communication interface. FIG. 5 is a flowchart showing the operation of the stereo camera. FIG. 6 is a block diagram showing the functions of a stereo camera according to a modification. FIG. 7 is a flowchart showing the operation of the stereo camera according to the modification.
  • FIG. 1 is a configuration diagram of a system 1 including a stereo camera 100 according to an embodiment of the present invention.
  • The system 1 includes the stereo camera 100, an automatic driving ECU 200 (Electronic Control Unit), a millimeter wave radar 300, a LIDAR 400 (Light Detection and Ranging), and the like.
  • The stereo camera 100 images the same target object from a plurality of different viewpoints and calculates the distance to the target object from the positional shift (parallax) of the object between the obtained images.
  • the stereo camera 100 is connected to the automatic operation ECU 200 with a LAN (Local Area Network) cable compliant with Ethernet (registered trademark).
  • The automatic driving ECU 200 controls automatic driving of the vehicle based on the distance, angle, relative speed, and the like of the target object detected by sensors such as the millimeter wave radar 300 and the LIDAR 400.
  • The automatic driving ECU 200 is connected to sensors such as the millimeter wave radar 300 and the LIDAR 400 via a two-wire bus compliant with CAN (Controller Area Network).
  • Millimeter wave radar 300 detects the distance, angle, relative speed, etc. of the target object using millimeter waves (electromagnetic waves).
  • The LIDAR 400 detects the distance, angle, relative speed, and the like of the target object using light waves (electromagnetic waves) such as ultraviolet, visible, and near-infrared light.
  • FIG. 2 is a configuration diagram of the stereo camera 100 according to the embodiment of the present invention.
  • The stereo camera 100 includes an imaging unit 101, a CPU 102 (Central Processing Unit), a memory 103, an image processing unit 104, and a communication circuit 105.
  • The imaging unit 101 includes a left imaging unit 101L and a right imaging unit 101R, and captures images.
  • Each of the left imaging unit 101L and the right imaging unit 101R includes an optical element such as a lens and an imaging element such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • The optical element refracts light and forms an image on the imaging element.
  • The imaging element receives the light refracted by the optical element and generates an image corresponding to the intensity of the received light.
  • The CPU 102 implements various functions described later by executing a predetermined program.
  • The CPU 102 also generates, for example, commands for controlling the vehicle, as described later.
  • The memory 103 includes, for example, a nonvolatile memory such as a flash memory and a volatile memory such as a RAM (Random Access Memory), and stores various information.
  • The image processing unit 104 includes circuitry such as an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), or an ASIC (Application Specific Integrated Circuit).
  • The image processing unit 104 performs, for example, image correction and parallax calculation. Details of the functions of the image processing unit 104 will be described later with reference to FIG. 3.
  • The communication circuit 105 includes, for example, an Ethernet-compliant transceiver.
  • The communication circuit 105 transmits and receives various data.
  • FIG. 3 is a block diagram illustrating functions of the stereo camera 100 according to the embodiment of the present invention.
  • The left imaging unit 101L captures a first image of the target object.
  • The right imaging unit 101R captures a second image of the target object.
  • The captured first image and second image are temporarily stored, for example, in the memory 103 as “raw images” (RAW data).
  • the image processing unit 104 includes an image correction unit 1041, a parallax calculation unit 1042, a three-dimensional object extraction unit 1043, a three-dimensional grouping unit 1044, and an object identification unit 1045. Note that the image correction unit 1041, the parallax calculation unit 1042, the three-dimensional object extraction unit 1043, the three-dimensional grouping unit 1044, and the object identification unit 1045 may be configured separately.
  • The image correction unit 1041 corrects the first image captured by the left imaging unit 101L and the second image captured by the right imaging unit 101R. Specifically, the image correction unit 1041 corrects, for example, distortion and optical axis deviation between the first image and the second image.
  • The corrected first image and second image are temporarily stored, for example, in the memory 103 as “corrected images”.
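  • The publication does not detail how this correction is computed. A common implementation precomputes a remap table from calibration data (a lens distortion model plus the relative pose of the two imagers) and resamples each raw image through it; the following is a minimal sketch under that assumption, with all names illustrative.

```cpp
// Minimal sketch of image correction via a precomputed remap table.
// The table (source position per output pixel) would come from calibration;
// that derivation is not specified in the publication and is assumed here.
#include <cstdint>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;  // grayscale, row-major
    uint8_t at(int x, int y) const { return pixels[y * width + x]; }
};

// Remap table: for each corrected pixel, the sub-pixel source position.
struct RemapTable {
    std::vector<float> srcX, srcY;  // size = w * h of the corrected image
};

Image correctImage(const Image& raw, const RemapTable& map, int w, int h) {
    Image out;
    out.width = w; out.height = h;
    out.pixels.resize(static_cast<size_t>(w) * h, 0);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sx = map.srcX[y * w + x];
            float sy = map.srcY[y * w + x];
            int x0 = static_cast<int>(sx), y0 = static_cast<int>(sy);
            if (x0 < 0 || y0 < 0 || x0 + 1 >= raw.width || y0 + 1 >= raw.height)
                continue;  // maps outside the raw image: leave black
            float fx = sx - x0, fy = sy - y0;
            // Bilinear interpolation of the four neighboring raw pixels.
            float v = (1 - fx) * (1 - fy) * raw.at(x0, y0)
                    + fx * (1 - fy)       * raw.at(x0 + 1, y0)
                    + (1 - fx) * fy       * raw.at(x0, y0 + 1)
                    + fx * fy             * raw.at(x0 + 1, y0 + 1);
            out.pixels[y * w + x] = static_cast<uint8_t>(v + 0.5f);
        }
    }
    return out;
}
```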
  • The parallax calculation unit 1042 calculates the parallax from the first image and the second image corrected by the image correction unit 1041 by performing stereo matching.
  • The parallax calculation unit 1042 then calculates the distance to the target object from the parallax by triangulation.
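  • As a concrete illustration of these two steps, the sketch below finds the disparity at a pixel by block matching along the epipolar line and converts it to a distance by triangulation, Z = f·B/d. The focal length f (in pixels) and baseline B (in meters) are calibration values assumed for illustration; the publication does not prescribe a specific matching algorithm.

```cpp
// Sketch of block-matching disparity estimation plus triangulation.
// Assumes rectified grayscale images of identical size; the caller must
// keep (x, y) at least `half` pixels away from all image borders.
#include <cstdint>
#include <cstdlib>
#include <limits>
#include <vector>

struct Gray {
    int w = 0, h = 0;
    std::vector<uint8_t> px;  // row-major grayscale
    int at(int x, int y) const { return px[y * w + x]; }
};

// Sum of absolute differences between a block in the left image at (x, y)
// and the same-row block in the right image shifted left by d pixels.
int sad(const Gray& L, const Gray& R, int x, int y, int d, int half) {
    int sum = 0;
    for (int dy = -half; dy <= half; ++dy)
        for (int dx = -half; dx <= half; ++dx)
            sum += std::abs(L.at(x + dx, y + dy) - R.at(x + dx - d, y + dy));
    return sum;
}

// Disparity of the best-matching block, searched along the epipolar line.
int disparityAt(const Gray& L, const Gray& R, int x, int y,
                int maxDisp = 64, int half = 3) {
    int best = 0, bestCost = std::numeric_limits<int>::max();
    for (int d = 0; d <= maxDisp && x - d - half >= 0; ++d) {
        int cost = sad(L, R, x, y, d, half);
        if (cost < bestCost) { bestCost = cost; best = d; }
    }
    return best;
}

// Triangulated distance in meters (0 means no parallax / too far).
double depthFromDisparity(int d, double focalPx, double baselineM) {
    return d > 0 ? focalPx * baselineM / d : 0.0;
}
```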
  • The parallax calculation unit 1042 generates a parallax image in which each position on the image is associated with a distance corresponding to the parallax.
  • The parallax image (first data) is stored in the memory 103, for example.
  • The three-dimensional object extraction unit 1043 extracts, from the parallax image generated by the parallax calculation unit 1042, a region located at approximately the same distance as one solid object.
  • Raw three-dimensional object information (second data), which is the data of the extracted three-dimensional objects, is stored in the memory 103, for example.
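  • One simple way to realize this extraction is to group neighboring pixels of the parallax (distance) image whose distances are nearly equal, for example with a flood fill. The sketch below takes that approach; the distance tolerance is an assumption, since the publication does not quantify how close the distances must be.

```cpp
// Minimal sketch of solid-object extraction: scan the distance image and
// group 4-connected pixels at nearly the same distance into one candidate
// object using a flood fill. Tolerance `tol` is an illustrative assumption.
#include <cmath>
#include <queue>
#include <utility>
#include <vector>

struct DepthImage {
    int w = 0, h = 0;
    std::vector<float> z;  // distance in meters per pixel; 0 = no parallax
    float at(int x, int y) const { return z[y * w + x]; }
};

// Returns a label per pixel: 0 = background, 1..N = object id.
std::vector<int> extractSolidObjects(const DepthImage& d, float tol = 0.5f) {
    std::vector<int> label(d.w * d.h, 0);
    int next = 0;
    for (int sy = 0; sy < d.h; ++sy)
        for (int sx = 0; sx < d.w; ++sx) {
            if (label[sy * d.w + sx] != 0 || d.at(sx, sy) <= 0) continue;
            int id = ++next;                       // start a new object
            std::queue<std::pair<int, int>> q;
            q.push({sx, sy});
            label[sy * d.w + sx] = id;
            while (!q.empty()) {
                auto [x, y] = q.front(); q.pop();
                const int nx[4] = {x - 1, x + 1, x, x};
                const int ny[4] = {y, y, y - 1, y + 1};
                for (int k = 0; k < 4; ++k) {
                    if (nx[k] < 0 || ny[k] < 0 || nx[k] >= d.w || ny[k] >= d.h)
                        continue;
                    int idx = ny[k] * d.w + nx[k];
                    // Same object if the neighbor is at nearly the same distance.
                    if (label[idx] == 0 && d.z[idx] > 0 &&
                        std::fabs(d.z[idx] - d.at(x, y)) < tol) {
                        label[idx] = id;
                        q.push({nx[k], ny[k]});
                    }
                }
            }
        }
    return label;
}
```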
  • The three-dimensional grouping unit 1044 groups the three-dimensional objects extracted by the three-dimensional object extraction unit 1043 based on luminance, edges, and the like.
  • Three-dimensional object grouping information (third data), which is the data of the grouped three-dimensional objects, is stored in the memory 103, for example.
  • The object identification unit 1045 identifies what the target object is (a pedestrian, a vehicle, etc.) from the three-dimensional objects grouped by the three-dimensional grouping unit 1044.
  • The CPU 102 functions as a vehicle control unit 1021 and a data extraction unit 1022 by executing a predetermined program.
  • The vehicle control unit 1021 generates a command for controlling the vehicle according to the type of the object identified by the object identification unit 1045. For example, when the vehicle control unit 1021 determines that a pedestrian is on the path of the vehicle, it generates a command to decelerate the vehicle.
  • The data extraction unit 1022 extracts (selects) data stored in the memory 103 in response to a request from the automatic driving ECU 200 received by a request reception unit 1051 described later. That is, the data extraction unit 1022 extracts predetermined data from the plurality of types of data in response to the request.
  • The communication circuit 105 functions as a request reception unit 1051 and a data output unit 1052.
  • The request reception unit 1051 receives a request from the external automatic driving ECU 200.
  • The data output unit 1052 outputs the data extracted by the data extraction unit 1022 to the external automatic driving ECU 200; that is, the data output unit 1052 outputs data in response to the request.
  • The image correction unit 1041, the parallax calculation unit 1042, the three-dimensional object extraction unit 1043, and the three-dimensional grouping unit 1044 constitute a data generation unit 104a that generates a plurality of types of data from the image captured by the imaging unit 101.
  • FIG. 4 is a diagram illustrating an example of a communication interface.
  • FIG. 4 shows the “request” input from the automatic driving ECU 200 to the communication circuit 105.
  • The third row of the table of FIG. 4 shows the “data according to the request” output from the communication circuit 105 to the automatic driving ECU 200.
  • The request from the automatic driving ECU 200 includes a coordinate value on the image (information indicating a position on the image) as information indicating a region of interest, a type ID as information indicating the output content (type of data), a decimation degree or number of output objects as information indicating the output resolution, and the number of outputs per unit time (per second) as information indicating the output frequency. A sketch of such a request message is shown below.
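  • The following data structure mirrors the four pieces of information listed above. The field names, types, and encodings are illustrative assumptions; the publication specifies the information carried by the request, not its wire format.

```cpp
// Sketch of a request message carrying the four fields described above.
// All names and widths are assumptions for illustration.
#include <cstdint>

enum class DataType : uint8_t {   // "type ID"
    RawImage       = 0,           // RAW data
    ParallaxImage  = 1,           // first data
    RawObjectInfo  = 2,           // second data
    GroupedObjects = 3,           // third data
};

struct Request {
    // Region of interest, as a rectangle in image coordinates.
    uint16_t x0, y0, x1, y1;
    DataType typeId;              // which of the plurality of data types
    uint8_t  decimation;          // output resolution: keep 1 of N pixels/objects
    uint8_t  outputsPerSecond;    // output frequency (times / s)
};
```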
  • The region of interest is, for example, an area in which the automatic driving ECU 200 estimates that a three-dimensional object exists, based on the distance, angle, relative speed, and the like of the target object detected by sensors such as the millimeter wave radar 300 and the LIDAR 400.
  • The type ID indicates the type of data to be output.
  • The data types include, for example, the raw image (RAW data) captured by the imaging unit 101, the parallax image (first data), the raw three-dimensional object information (second data), and the three-dimensional object grouping information (third data).
  • FIG. 5 is a flowchart showing the operation of the stereo camera 100 according to the embodiment of the present invention.
  • First, the stereo camera 100 receives the request shown in FIG. 4 from the automatic driving ECU 200 (S10).
  • Next, the stereo camera 100 extracts (selects) data stored in the memory 103 in response to the request received in S10 (S15).
  • The stereo camera 100 outputs the data extracted in S15 (S20). For example, in response to the request illustrated in FIG. 4, the stereo camera 100 reads from the memory 103 the type of data indicated by the “type ID” for the region of interest indicated by the “coordinate value on the image”, and outputs (transmits) the data to the automatic driving ECU 200 at the output resolution indicated by “decimation degree / number of outputs” and the output frequency indicated by “times / s”. Note that the stereo camera 100 may also output synchronization information.
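  • The S10 → S15 → S20 flow amounts to a simple service loop. The sketch below uses placeholder stubs for the communication circuit 105 and the memory 103 access; their names and signatures are assumptions, not an API defined by the publication.

```cpp
// Sketch of the S10 -> S15 -> S20 loop with stand-in stubs.
#include <cstdint>
#include <optional>
#include <vector>

// (Request and DataType repeat the earlier sketch for self-containment.)
enum class DataType : uint8_t { RawImage, ParallaxImage, RawObjectInfo, GroupedObjects };
struct Request { uint16_t x0, y0, x1, y1; DataType typeId; uint8_t decimation, outputsPerSecond; };

// Stand-ins for the communication circuit 105 and the memory 103; real
// implementations would talk to hardware.
std::optional<Request> receiveRequest() { return std::nullopt; }     // S10
std::vector<uint8_t> extractData(const Request&) { return {}; }      // S15
void sendToEcu(const std::vector<uint8_t>&) {}                       // S20

void serviceLoop() {
    for (;;) {
        auto req = receiveRequest();     // S10: request from the driving ECU
        if (!req) continue;              // nothing received this cycle
        auto data = extractData(*req);   // S15: select type, region, resolution
        sendToEcu(data);                 // S20: output at the requested frequency
    }
}
```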
  • The data extraction unit 1022 extracts data at the “coordinate value on the image” (the position on the image indicated by the position information) included in the request. The automatic driving ECU 200 can thereby acquire data of the region of interest only.
  • The data extraction unit 1022 extracts the type of data indicated by the “type ID” included in the request. The automatic driving ECU 200 can thereby acquire only the desired type of data.
  • The data output unit 1052 outputs the data extracted by the data extraction unit 1022 to the external automatic driving ECU 200 at the output resolution indicated by “decimation degree / number of outputs” included in the request. The automatic driving ECU 200 can thereby acquire data at the desired output resolution.
  • The data output unit 1052 outputs the extracted data to the external automatic driving ECU 200 at the output frequency indicated by “times / s” included in the request. The automatic driving ECU 200 can thereby acquire data at the desired output frequency.
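  • These two output controls can be sketched as follows: decimation keeps one sample in every N, and a rate limiter caps the number of outputs per second. The helper names and the time source are assumptions for illustration.

```cpp
// Sketch of decimation (output resolution) and output-frequency limiting.
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <vector>

// Keep every n-th element; n must be >= 1, and n = 1 keeps everything.
std::vector<uint8_t> decimate(const std::vector<uint8_t>& data, int n) {
    std::vector<uint8_t> out;
    for (std::size_t i = 0; i < data.size(); i += n) out.push_back(data[i]);
    return out;
}

// Simple rate limiter: allow() is true at most `timesPerSecond` per second
// (timesPerSecond must be >= 1).
class RateLimiter {
    std::chrono::steady_clock::time_point next_ = std::chrono::steady_clock::now();
    std::chrono::nanoseconds period_;
public:
    explicit RateLimiter(int timesPerSecond)
        : period_(std::chrono::nanoseconds(1'000'000'000 / timesPerSecond)) {}
    bool allow() {
        auto now = std::chrono::steady_clock::now();
        if (now < next_) return false;   // too soon: skip this output cycle
        next_ = now + period_;
        return true;
    }
};
```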
  • The data output unit 1052 outputs the data extracted by the data extraction unit 1022 to the external automatic driving ECU 200 via Ethernet. As a result, the transfer speed is improved compared with CAN or the like.
  • The data generation unit 104a generates a parallax image (first data) indicating the parallax or a distance corresponding to the parallax, extracts three-dimensional object data from the parallax image as raw three-dimensional object information (second data), and generates three-dimensional object grouping information (third data) by grouping the raw three-dimensional object information.
  • The data extraction unit 1022 extracts (selects) the type of data indicated by the type ID from among the raw image (RAW data), the parallax image (first data), the raw three-dimensional object information (second data), and the three-dimensional object grouping information (third data).
  • The object identification unit 1045 identifies objects from the image captured by the imaging unit 101.
  • The data extraction unit 1022 can also extract the data of objects associated with an importance level equal to or higher than a threshold value, as in the modification described later.
  • As described above, the system 1 includes the stereo camera 100 (external environment recognition device), sensors such as the millimeter wave radar 300 and the LIDAR 400 that detect the position of an object using electromagnetic waves, and the automatic driving ECU 200 (control device).
  • The automatic driving ECU 200 converts the position of the object detected by the sensors into a position on the image, transmits a request including this position information to the stereo camera 100, receives the data extracted by the data extraction unit 1022 from the stereo camera 100, and controls the vehicle based on the received data.
  • In this way, the automatic driving ECU 200 can obtain data from various processing stages (the imaging unit 101, the image correction unit 1041, the three-dimensional grouping unit 1044, and so on).
  • As a result, transfer delay can be suppressed when image data is output to an external device.
  • In addition, the volume of data output from a high-resolution stereo camera can be reduced.
  • FIG. 6 is a block diagram illustrating functions of a stereo camera 100A according to a modification. Note that the hardware configuration of the stereo camera 100A of this modification is the same as that of the stereo camera 100 shown in FIG.
  • In this modification, the communication circuit 105 does not have the request reception unit 1051.
  • The data extraction unit 1022A of this modification extracts data at the position (coordinate value) on the image of objects that are identified by the object identification unit 1045 and associated with an importance level equal to or higher than a threshold. That is, the data extraction unit 1022A extracts the data of highly important objects (pedestrians, dangerous locations), as sketched below.
  • The importance level and the object type are stored in the memory 103 in association with each other, for example.
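  • A minimal sketch of this importance-based selection follows, assuming a simple type-to-importance table standing in for the association stored in the memory 103. The concrete types and importance values are illustrative.

```cpp
// Sketch: keep only identified objects whose type maps to an importance
// at or above a threshold.
#include <map>
#include <string>
#include <vector>

struct IdentifiedObject {
    std::string type;   // e.g. "pedestrian", "vehicle", "sign"
    int x = 0, y = 0;   // position (coordinate value) on the image
};

std::vector<IdentifiedObject> filterByImportance(
        const std::vector<IdentifiedObject>& objects,
        const std::map<std::string, int>& importanceByType,
        int threshold) {
    std::vector<IdentifiedObject> kept;
    for (const auto& obj : objects) {
        auto it = importanceByType.find(obj.type);
        // Unknown types are treated as unimportant in this sketch.
        if (it != importanceByType.end() && it->second >= threshold)
            kept.push_back(obj);
    }
    return kept;
}

// Example table (corresponds to the association in the memory 103):
// {{"pedestrian", 10}, {"vehicle", 5}, {"sign", 1}} with threshold 10
// keeps pedestrians only.
```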
  • FIG. 7 is a flowchart showing the operation of the modified stereo camera 100A.
  • The stereo camera 100A identifies objects (S13).
  • The stereo camera 100A extracts the data of highly important objects (pedestrians, dangerous locations) among the objects identified in S13 (S18).
  • The stereo camera 100A outputs the data extracted in S18 (S20).
  • In the embodiment described above, the stereo camera 100 is adopted as the external environment recognition device, but the number of image sensors is arbitrary, and a monocular camera may be used.
  • In the embodiment, the millimeter wave radar and the LIDAR are exemplified as sensors, but the sensor may instead be a sonar that detects the distance, angle, relative speed, and the like of the target object using sound waves.
  • In the embodiment, data such as the raw images (RAW data), the parallax images (first data), the raw three-dimensional object information (second data), and the three-dimensional object grouping information (third data) is stored in the memory 103, but it may instead be stored in a memory built into the image processing unit 104.
  • In the embodiment, the image correction unit 1041, the parallax calculation unit 1042, the three-dimensional object extraction unit 1043, the three-dimensional grouping unit 1044, and the object identification unit 1045 are realized as functions of the image processing unit 104, but this is only an example and they may be realized in other forms. Note that each of these units may also be configured as a circuit such as an ASIC.
  • In the embodiment, Ethernet is adopted as the communication standard between the stereo camera 100 and the automatic driving ECU 200, but other standards may be used as long as they can achieve a transfer rate equal to or higher than that of Ethernet.
  • In the embodiment, the “request” from the automatic driving ECU 200 includes a coordinate value on the image, but it may instead include a coordinate value in another coordinate system, such as the world coordinate system.
  • In that case, the CPU 102 of the stereo camera 100 converts the coordinate value in the world coordinate system into a coordinate value on the image.
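  • Such a conversion is typically done with the standard pinhole camera model: transform the world point into the camera frame, then project it with the focal length. The sketch below assumes known calibration values (rotation R, translation t, focal length f, principal point cx, cy); the publication does not specify the conversion method, so this is an illustrative assumption.

```cpp
// Sketch of world-to-image conversion with the pinhole camera model.
#include <array>
#include <optional>

struct CameraModel {
    std::array<std::array<double, 3>, 3> R;  // world -> camera rotation
    std::array<double, 3> t;                 // world -> camera translation
    double f, cx, cy;                        // focal length and principal point (pixels)
};

std::optional<std::array<double, 2>> worldToImage(
        const CameraModel& cam, const std::array<double, 3>& pw) {
    // Camera-frame coordinates: pc = R * pw + t.
    std::array<double, 3> pc{};
    for (int i = 0; i < 3; ++i)
        pc[i] = cam.R[i][0] * pw[0] + cam.R[i][1] * pw[1]
              + cam.R[i][2] * pw[2] + cam.t[i];
    if (pc[2] <= 0.0) return std::nullopt;   // behind the camera: no projection
    // Perspective projection onto the image plane.
    return std::array<double, 2>{cam.f * pc[0] / pc[2] + cam.cx,
                                 cam.f * pc[1] / pc[2] + cam.cy};
}
```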
  • Further, the request may include the vehicle speed instead of the coordinate value on the image.
  • In that case, the data extraction unit 1022 extracts (selects), for example, a high-resolution image and parallax image, or a low-resolution image and parallax image, according to the vehicle speed.
  • The request from the automatic driving ECU 200 may also include information specifying either or both of the first image captured by the left imaging unit 101L and the second image captured by the right imaging unit 101R.
  • In that case, the data extraction units 1022 and 1022A extract (select) the data of the specified image or images.
  • the vehicle speed is detected by a vehicle speed sensor.
  • The automatic driving ECU 200 may also refer to the parallax image (first data), the raw three-dimensional object information (second data), the three-dimensional object grouping information (third data), and the like received from the stereo camera 100, and, when a detected object candidate exists, include the coordinate value of the image coordinate system or the world coordinate system corresponding to its position in the “request” as information indicating the region of interest and input it to the stereo camera 100 again.
  • Each of the above-described configurations, functions, and the like may be realized in hardware by designing a part or all of them as, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by a processor (CPU).
  • Information such as programs, tables, and files for realizing each function can be stored in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • DESCRIPTION OF SYMBOLS: 1 ... system, 100, 100A ... stereo camera, 101 ... imaging unit, 101L ... left imaging unit, 101R ... right imaging unit, 103 ... memory, 104 ... image processing unit, 104a ... data generation unit, 105 ... communication circuit, 300 ... millimeter wave radar, 400 ... LIDAR, 1021 ... vehicle control unit, 1022 ... data extraction unit, 1022A ... data extraction unit, 1041 ... image correction unit, 1042 ... parallax calculation unit, 1043 ... three-dimensional object extraction unit, 1044 ... three-dimensional grouping unit, 1045 ... object identification unit, 1051 ... request reception unit, 1052 ... data output unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to an external environment recognition device with which transfer delay when image data is output externally can be suppressed. A stereo camera 100 (external environment recognition device) according to the present invention includes an imaging unit 101, a data generation unit 104a, a data extraction unit 1022, and a data output unit 1052. The imaging unit 101 captures images. The data generation unit 104a generates a plurality of types of data from the images captured by the imaging unit 101. The data extraction unit 1022 extracts data from the plurality of types of data. The data output unit 1052 outputs the data extracted by the data extraction unit 1022 to an external device.
PCT/JP2019/014915 2018-04-17 2019-04-04 External environment recognition device Ceased WO2019203000A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201980025028.1A CN111937035A (zh) 2018-04-17 2019-04-04 External environment recognition device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018079266A JP7106332B2 (ja) 2018-04-17 2018-04-17 External environment recognition device
JP2018-079266 2018-04-17

Publications (1)

Publication Number Publication Date
WO2019203000A1 true WO2019203000A1 (fr) 2019-10-24

Family

ID=68238894

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014915 Ceased WO2019203000A1 (fr) 2018-04-17 2019-04-04 Dispositif de reconnaissance d'environnement externe

Country Status (3)

Country Link
JP (1) JP7106332B2 (fr)
CN (1) CN111937035A (fr)
WO (1) WO2019203000A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014073322A1 (fr) * 2012-11-08 2014-05-15 Hitachi Automotive Systems, Ltd. Object detection device and object detection method
WO2017195650A1 (fr) * 2016-05-13 2017-11-16 Sony Corporation Generation device, generation method, reproduction device, and reproduction method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008059260A (ja) * 2006-08-31 2008-03-13 Fujitsu Ltd Motion-detection image generation device
JP2014130429A (ja) * 2012-12-28 2014-07-10 Ricoh Co Ltd Imaging device and program for detecting three-dimensional object regions
JP6098318B2 (ja) * 2013-04-15 2017-03-22 Omron Corporation Image processing device, image processing method, image processing program, and recording medium
JP6313646B2 (ja) * 2014-04-24 2018-04-18 Hitachi Automotive Systems, Ltd. External environment recognition device
JP2016019009A (ja) * 2014-07-04 2016-02-01 Hitachi Automotive Systems, Ltd. Information processing device
EP3410415B1 (fr) * 2016-01-28 2020-11-18 Ricoh Company, Ltd. Image processing device, imaging device, mobile entity control system, and image processing program


Also Published As

Publication number Publication date
JP7106332B2 (ja) 2022-07-26
JP2019185666A (ja) 2019-10-24
CN111937035A (zh) 2020-11-13

Similar Documents

Publication Publication Date Title
US11616900B2 (en) Electronic device for recording image as per multiple frame rates using camera and method for operating same
TWI540462B (zh) 手勢辨識方法及其裝置
CN108139484B (zh) 测距装置和测距装置的控制方法
JP6795027B2 (ja) 情報処理装置、物体認識装置、機器制御システム、移動体、画像処理方法およびプログラム
JP6156733B2 (ja) 対象物認識装置、機器制御システム、車両、対象物認識方法及び対象物認識用プログラム
US11258962B2 (en) Electronic device, method, and computer-readable medium for providing bokeh effect in video
US8224069B2 (en) Image processing apparatus, image matching method, and computer-readable recording medium
US8977006B2 (en) Target recognition system and target recognition method executed by the target recognition system
CN106911922B (zh) 从单个传感器生成的深度图
KR102472156B1 (ko) 전자 장치 및 그 깊이 정보 생성 방법
KR102552923B1 (ko) 복수의 카메라들 또는 깊이 센서 중 적어도 하나를 이용하여 깊이 정보를 획득하는 전자 장치
CN105321159B (zh) 手持式电子装置、图像提取装置及景深信息的获取方法
EP4148671B1 (fr) Dispositif électronique et son procédé de commande
KR102842566B1 (ko) 얼굴 데이터 획득을 위한 방법 및 이를 위한 전자 장치
CN111479078B (zh) 图像传感器芯片、电子装置及操作图像传感器芯片的方法
US20210250498A1 (en) Electronic device and method for displaying image in electronic device
CN118435023A (zh) 深度传感器设备和用于操作深度传感器设备的方法
CN116243337A (zh) 实时虚拟lidar传感器
KR101420242B1 (ko) 스테레오 카메라를 이용한 차량 검지 방법 및 장치
WO2019203000A1 (fr) External environment recognition device
JP2010060371A (ja) 物体検出装置
US20170078644A1 (en) Controller applied to a three-dimensional (3d) capture device and 3d image capture device
JP2018146495A (ja) 物体検出装置、物体検出方法、物体検出プログラム、撮像装置、及び、機器制御システム
CN118176435A (zh) 包括lidar系统的电子装置以及控制电子装置的方法
JP6620050B2 (ja) 画像処理装置、車載用画像処理装置、プログラム、及び記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19787580

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19787580

Country of ref document: EP

Kind code of ref document: A1