
WO2019077010A1 - Data processing method and associated onboard system - Google Patents

Data processing method and associated onboard system

Info

Publication number
WO2019077010A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
sensor
pos
matrix
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/078465
Other languages
French (fr)
Inventor
Pierre Emmanuel BALANDREAU
Jeremie Pinoteau
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Comfort and Driving Assistance SAS
Original Assignee
Valeo Comfort and Driving Assistance SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Comfort and Driving Assistance SAS filed Critical Valeo Comfort and Driving Assistance SAS
Priority to EP18785645.5A (EP3698321A1)
Publication of WO2019077010A1
Legal status: Ceased


Classifications

    • G: PHYSICS
      • G06: COMPUTING OR CALCULATING; COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
              • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/0304: Detection arrangements using opto-electronic means
        • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T7/00: Image analysis
            • G06T7/20: Analysis of motion
          • G06T2207/00: Indexing scheme for image analysis or image enhancement
            • G06T2207/10: Image acquisition modality
              • G06T2207/10028: Range image; Depth image; 3D point clouds
              • G06T2207/10048: Infrared image
            • G06T2207/30: Subject of image; Context of image processing
              • G06T2207/30248: Vehicle exterior or interior
        • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V10/00: Arrangements for image or video recognition or understanding
            • G06V10/10: Image acquisition
              • G06V10/12: Details of acquisition arrangements; Constructional details thereof
                • G06V10/14: Optical characteristics of the device performing the acquisition or on the illumination arrangements
                  • G06V10/147: Details of sensors, e.g. sensor lenses
          • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/20: Movements or behaviour, e.g. gesture recognition
              • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • The present invention generally relates to the field of processing data acquired by a sensor.
  • It relates more particularly to a data processing method and an associated onboard system.
  • Data processing methods are known in which a sensor (generally a matrix sensor) acquires a set of pixels, which are then processed in order to obtain information relating to the environment facing the sensor.
  • Such solutions are used in particular in the field of vehicles, for example to build a map of the environment in front of the vehicle or to recognize gestures made by the driver of the vehicle.
  • The processing algorithms used are, however, increasingly complex and therefore consume significant hardware resources (microprocessor, memory, etc.).
  • In this context, the invention proposes a data processing method comprising the following steps:
    - acquiring a first set of pixels by a sensor facing an environment;
    - determining a position of an object of the environment;
    - determining a second set of pixels included in the first set of pixels, the second set of pixels covering said position and being distinct from the first set;
    - applying a processing algorithm to the pixels of the second set only.
  • The processing algorithm is thus applied to only a subset of the acquired pixels, which limits the volume of data manipulated. Thanks to the localization of the object, the processing performed focuses on the area of the environment for which applying the processing algorithm is of interest.
  • Other non-limiting and advantageous features of the data processing method, taken individually or in any technically possible combination, are as follows:
  • the sensor is a matrix sensor;
  • the first set of pixels is a matrix of pixels;
  • the step of determining the second set of pixels comprises a step of determining an area comprising said position and/or a step of extracting the pixels of the first set located in said area;
  • the sensor delivers a plurality of first sets of pixels;
  • the method comprises a step of determining, for each of said first sets of pixels, a corresponding second set of pixels by extracting the pixels of the first set concerned located in said area (the same area thus being used for the different first sets);
  • the sensor is part of a time-of-flight sensor further comprising an element emitting electromagnetic radiation;
  • said first sets are acquired by the sensor respectively in correspondence with distinct phases of the (modulated) signal emitted by the electromagnetic-radiation-emitting element;
  • said position is determined by analyzing an image representative of the environment (such as at least part of the first set of pixels, or an image obtained by means of the aforementioned processing algorithm);
  • said position is determined by tracking the object (typically by evaluating a speed of the object between two successive images and estimating its new position on the basis of this speed and its position in the previous image).
  • The invention also proposes an onboard system comprising a processing unit and a sensor facing an environment, wherein the sensor is designed to acquire a first set of pixels and wherein the processing unit comprises a module designed to determine a position of an object of the environment, a module designed to determine a second set of pixels included in the first set of pixels (the second set of pixels covering said position and being distinct from the first set), and a module designed to apply a processing algorithm to the pixels of the second set only.
  • FIG. 1 represents an onboard system including a time-of-flight sensor;
  • FIG. 2 represents the functional modules of a processing unit of the onboard system of FIG. 1;
  • FIG. 3 represents pixel matrices processed by a processing method using the invention.
  • FIG. 1 schematically shows a system embedded in a vehicle, here a motor vehicle, comprising a time-of-flight sensor 1 and a processing unit 10.
  • The time-of-flight sensor 1, here a three-dimensional camera based on the time-of-flight principle ("ToF 3D camera"), comprises at least one element 2 emitting electromagnetic radiation (typically one or more light-emitting diodes emitting in the infrared) and a matrix sensor 4 (such as an image sensor sensitive to the radiation emitted by the emitter element 2, here infrared) defining a set of pixels.
  • The radiation E emitted by the emitter element 2 (generally through an emission optical system, not shown in FIG. 1) is reflected towards the matrix sensor 4 (radiation referenced R in FIG. 1) by the first object encountered on the path of the radiation E.
  • In practice, an optical system (such as a lens) is placed facing the matrix sensor 4 so that each pixel of the matrix sensor 4 receives the reflected signal R coming from a particular direction of the solid angle analyzed by the time-of-flight sensor 1.
  • The processing unit 10 controls the emission of the radiation E by the emitter element 2, then analyzes the signals measured by the matrix sensor 4 (represented here in the form of a plurality of matrices Mk of pixels) so as to determine in particular a matrix D comprising the distances di of the first object encountered, for a plurality of directions of space facing the time-of-flight sensor 1.
  • This matrix D of distances di (as well as possibly other information, such as a luminance matrix L) is transmitted by the processing unit 10 to another electronic system (not shown) for use by the latter.
  • In the automotive field, for example, the time-of-flight sensor 1 can be placed at the front of the vehicle in order to build a map of the environment in front of the vehicle and/or to detect an obstacle and/or to evaluate the speed of another vehicle located ahead (by differentiating a distance to that vehicle evaluated on the basis of some of the distances di).
  • Alternatively, the time-of-flight sensor can be placed in the passenger compartment of the vehicle (for example facing the driver) and the distances di determined by the processing unit 10 can be used within a gesture recognition algorithm.
  • In the example described here, the radiation E emitted by the emitter element 2 (under the control of the processing unit 10) is a modulated signal, and the processing unit 10 controls the acquisition by the matrix sensor 4 of a plurality of matrices Mk of pixels, at times respectively corresponding to a plurality of (pairwise) distinct phases of the modulated signal.
  • FIG. 2 shows the functional modules of the processing unit 10 that are useful for understanding the invention.
  • Each functional module 12, 14, 16 represented in FIG. 2 corresponds to a particular functionality implemented by the processing unit 10. Several (if not all) functional modules can, however, in practice be implemented by the same physical entity, here a processor of the processing unit 10, on which program instructions stored in a memory associated with the processor are executed (each functional module then being implemented by the execution of a particular set of instructions stored in this memory).
  • The processing unit 10 thus comprises a location module 12, a selection module 14 and an analysis module 16.
  • The location module 12 is designed to locate (roughly) an object present in the solid angle observed by the matrix sensor 4.
  • According to a first possibility, the location of the object is achieved by analyzing an image representative of the environment in the field of view of the matrix sensor 4.
  • This representative image may in practice be one of the matrices Mk of pixels produced by the matrix sensor 4, or the luminance matrix L produced as indicated below by the analysis module 16, possibly (in both cases) after a downsampling step to reduce the volume of processed data.
  • This first possibility can be used in particular if no object has been detected beforehand.
  • According to another possibility, the location of the object is achieved by tracking the object and estimating its current position (typically on the basis of its position at the previous iteration and an estimated displacement of the object from one iteration to the next).
  • When the location module 12 locates an object, it transmits position information POS to the selection module 14.
  • The position information POS comprises, for example, coordinates of the detected object expressed with respect to the matrix of pixels acquired by the matrix sensor 4. These coordinates make it possible to define the position and/or the extent of the object concerned in the solid angle observed by the matrix sensor 4.
  • The selection module 14 also receives as input the pixel matrices Mk acquired by the matrix sensor 4.
  • When no object is located, the selection module 14 receives no position information POS and then transmits the pixel matrices Mk received at input to the analysis module 16 (without modification).
  • When an object is located, the location module 12 transmits the position information POS to the selection module 14 as already indicated.
  • The selection module 14 is then designed to determine a zone Z comprising the position defined by the information POS, and to extract from each matrix Mk the pixels located in this zone.
  • For example, the zone Z includes all the pixels located 10 pixels or less (vertically or horizontally) from the position defined by the position information POS (including the pixel or pixels defined by the position information POS).
  • Each set Sk of pixels is therefore a subset of the set of pixels of a corresponding matrix Mk of pixels.
  • The same zone Z is used to extract the pixels from the different matrices Mk of pixels.
  • The analysis module 16 is designed to apply at least one processing algorithm to the sets Sk of pixels received at input.
  • In particular, the analysis module 16 implements a first processing algorithm making it possible to obtain the matrix D of the distances dij on the basis of the sets Sk of pixels received at input (when the selection module 14 carries out the aforementioned extraction), or on the basis of the matrices Mk of pixels (when no object is detected by the location module 12 and the selection module 14 then simply forwards these matrices Mk of pixels).
  • The analysis module 16 here also implements a second processing algorithm, which makes it possible to obtain the above-mentioned luminance matrix L on the basis of the sets Sk of pixels received at input (or, as previously, on the basis of the matrices Mk of pixels).
  • The processing algorithms are thus applied to only part of the pixels (the sets Sk), which makes it possible to reduce the volume of the processing performed.
  • When the selection module 14 extracts pixels from the matrices Mk as indicated above and transmits the sets Sk of pixels to the analysis module 16, the matrix D of the distances dij and/or the luminance matrix L are usually partial (i.e. they do not include data for the entire field of view of the matrix sensor 4).
  • The matrix sensor 4 acquires, at step E2, at least one matrix of pixels, here a plurality of matrices Mk of pixels as explained above.
  • At step E4, the location module 12 detects an object O in the environment facing the matrix sensor 4 and determines its position POS, for example by analyzing one of the matrices Mk of pixels. Alternatively, the detection of the object O could be performed by analyzing a luminance matrix L computed by the processing unit 10.
  • At step E6, the selection module 14 determines a zone Z covering the object O (for example larger than the object O, to allow for a margin of error).
  • For each matrix Mk of pixels, the selection module 14 extracts the pixels located in the zone Z in order to obtain a corresponding set Sk of pixels (step E8).
  • The analysis module 16 can thus apply, at step E10, at least one processing algorithm to the sets Sk of pixels (and to these pixels only), here in order to obtain a (partial) matrix D of distances dij and/or a (partial) luminance matrix L.
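The steps E2 to E10 above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the patented implementation: the 64x64 resolution, the four acquisitions, the detected position and the 10-pixel margin (matching the example zone Z given above) are all hypothetical values.

```python
def extract_zone(matrix, pos, margin=10):
    """Return the pixels of `matrix` located `margin` pixels or less
    (vertically or horizontally) from the position `pos` (row, col),
    i.e. the set Sk corresponding to the zone Z for one matrix Mk."""
    r, c = pos
    h, w = len(matrix), len(matrix[0])
    return [row[max(0, c - margin):min(w, c + margin + 1)]
            for row in matrix[max(0, r - margin):min(h, r + margin + 1)]]

# Step E2: the matrix sensor acquires a plurality of matrices Mk
# (hypothetical 64x64 resolution, four phase acquisitions).
M = [[[(i * j + k) % 7 for j in range(64)] for i in range(64)]
     for k in range(4)]

# Step E4: the location module determines the position POS of an object O
# (here simply assumed to have been detected at row 32, column 40).
POS = (32, 40)

# Steps E6/E8: the same zone Z is used to extract a set Sk from every Mk.
S = [extract_zone(Mk, POS) for Mk in M]

# Step E10: the processing algorithm now runs on the 21x21 sets Sk only,
# instead of the full 64x64 matrices, reducing the processing volume.
```

Each set Sk here keeps roughly a tenth of the pixels of Mk, which is the whole point of the method: the downstream distance and luminance computations touch only the zone around the object.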

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Testing Or Calibration Of Command Recording Devices (AREA)

Abstract

A data processing method comprises the following steps: - acquiring (E2) a first set (Mk) of pixels by a sensor facing an environment; - determining (E4) a position (POS) of an object (O) of the environment; - determining (E6, E8) a second set (Sk) of pixels included in the first set (Mk) of pixels, the second set (Sk) of pixels covering said position (POS) and being distinct from the first set (Mk); - applying (E10) a processing algorithm to the pixels of the second set (Sk) only. An associated onboard system is also described.

Description

Data processing method and associated onboard system

TECHNICAL FIELD TO WHICH THE INVENTION RELATES

The present invention generally relates to the field of processing data acquired by a sensor.

It relates more particularly to a data processing method and an associated onboard system.

BACKGROUND

Data processing methods are known in which a sensor (generally a matrix sensor) acquires a set of pixels, which are then processed in order to obtain information relating to the environment facing the sensor.

Such solutions are used in particular in the field of vehicles, for example to build a map of the environment in front of the vehicle or to recognize gestures made by the driver of the vehicle.

The processing algorithms used are, however, increasingly complex and therefore consume significant hardware resources (microprocessor, memory, etc.).

OBJECT OF THE INVENTION

In this context, the invention proposes a data processing method comprising the following steps:

- acquiring a first set of pixels by a sensor facing an environment;

- determining a position of an object of the environment;

- determining a second set of pixels included in the first set of pixels, the second set of pixels covering said position and being distinct from the first set;

- applying a processing algorithm to the pixels of the second set only.

The processing algorithm is thus applied to only a subset of the acquired pixels, which limits the volume of data manipulated. Thanks to the localization of the object, the processing performed focuses on the area of the environment for which applying the processing algorithm is of interest.

Other non-limiting and advantageous features of the data processing method, taken individually or in any technically possible combination, are as follows:

- the sensor is a matrix sensor;

- the first set of pixels is a matrix of pixels;

- the step of determining the second set of pixels comprises a step of determining an area comprising said position and/or a step of extracting the pixels of the first set located in said area;

- the sensor delivers a plurality of first sets of pixels;

- the method comprises a step of determining, for each of said first sets of pixels, a corresponding second set of pixels by extracting the pixels of the first set concerned located in said area (the same area thus being used for the different first sets);

- the sensor is part of a time-of-flight sensor further comprising an element emitting electromagnetic radiation;

- said first sets are acquired by the sensor respectively in correspondence with distinct phases of the (modulated) signal emitted by the electromagnetic-radiation-emitting element;

- said position is determined by analyzing an image representative of the environment (such as at least part of the first set of pixels, or an image obtained by means of the aforementioned processing algorithm);

- said position is determined by tracking the object (typically by evaluating a speed of the object between two successive images and estimating its new position on the basis of this speed and its position in the previous image).
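The last variant (locating by tracking) can be illustrated with a constant-velocity estimate over two successive images. The helper name and (row, column) coordinate convention are assumptions for the sketch, not from the patent:

```python
def predict_position(pos_prev, pos_before):
    """Estimate the object's position in the current image: the apparent
    speed (in pixels per frame) is evaluated between the two previous
    images, then added to the last known position."""
    vr = pos_prev[0] - pos_before[0]  # vertical speed, pixels/frame
    vc = pos_prev[1] - pos_before[1]  # horizontal speed, pixels/frame
    return (pos_prev[0] + vr, pos_prev[1] + vc)
```

For an object seen at (10, 18) in one image and (12, 20) in the next, the estimated position in the following image is (14, 22); the zone Z can then be centered on that estimate without re-running a full detection.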

The invention also proposes an onboard system comprising a processing unit and a sensor facing an environment, wherein the sensor is designed to acquire a first set of pixels and wherein the processing unit comprises a module designed to determine a position of an object of the environment, a module designed to determine a second set of pixels included in the first set of pixels (the second set of pixels covering said position and being distinct from the first set), and a module designed to apply a processing algorithm to the pixels of the second set only.

The optional features mentioned above in terms of the method may also apply to such an onboard system.

DETAILED DESCRIPTION OF AN EXEMPLARY EMBODIMENT

The following description, given with reference to the accompanying drawings by way of non-limiting example, will make clear what the invention consists of and how it can be implemented.

In the accompanying drawings:

- FIG. 1 represents an onboard system including a time-of-flight sensor;

- FIG. 2 represents the functional modules of a processing unit of the onboard system of FIG. 1; and

- FIG. 3 represents pixel matrices processed by a processing method using the invention.

FIG. 1 schematically shows a system embedded in a vehicle, here a motor vehicle, comprising a time-of-flight sensor 1 and a processing unit 10.

The time-of-flight sensor 1, here a three-dimensional camera based on the time-of-flight principle ("ToF 3D camera"), comprises at least one element 2 emitting electromagnetic radiation (typically one or more light-emitting diodes emitting in the infrared) and a matrix sensor 4 (such as an image sensor sensitive to the radiation emitted by the emitter element 2, here infrared) defining a set of pixels.

The radiation E emitted by the emitter element 2 (generally through an emission optical system, not shown in FIG. 1) is reflected towards the matrix sensor 4 (radiation referenced R in FIG. 1) by the first object encountered on the path of the radiation E.

In practice, an optical system (such as a lens) is placed facing the matrix sensor 4 so that each pixel of the matrix sensor 4 receives the reflected signal R coming from a particular direction of the solid angle analyzed by the time-of-flight sensor 1.

As visible in FIG. 1, a processing unit 10 (for example an electronic control unit or, more generally, a microcontroller) controls the emission of the radiation E by the emitter element 2, then analyzes the signals measured by the matrix sensor 4 (represented here in the form of a plurality of matrices Mk of pixels) so as to determine in particular a matrix D comprising the distances di of the first object encountered, for a plurality of directions of space facing the time-of-flight sensor 1.

This matrix D of distances di (as well as possibly other information, such as a luminance matrix L) is transmitted by the processing unit 10 to another electronic system (not shown) for use by the latter.

For example, in the automotive field, the time-of-flight sensor 1 can be placed at the front of the vehicle in order to build a map of the environment in front of the vehicle and/or to detect an obstacle and/or to evaluate the speed of another vehicle located ahead (by differentiating a distance to that vehicle evaluated on the basis of some of the distances di).
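Deriving the speed of a vehicle ahead from successive distance measurements amounts to differentiating the measured distance over the frame interval. A minimal sketch (the function name and the frame interval are assumptions):

```python
def relative_speed(d_prev, d_curr, dt):
    """Relative speed (m/s) of a vehicle ahead, obtained by
    differentiating its measured distance over the interval dt (s).
    A negative value means the other vehicle is getting closer."""
    return (d_curr - d_prev) / dt
```

With the distance dropping from 30 m to 29 m between two frames 0.1 s apart, the relative speed is -10 m/s: the gap is closing at 10 m/s.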

According to another possibility, the time-of-flight sensor can be placed in the passenger compartment of the vehicle (for example facing the driver) and the distances di determined by the processing unit 10 can be used within a gesture recognition algorithm.

In the example described here, the radiation E emitted by the emitter element 2 (under the control of the processing unit 10) is a modulated signal, and the processing unit 10 controls the acquisition by the matrix sensor 4 of a plurality of pixel matrices Mk, at instants corresponding respectively to a plurality of pairwise-distinct phases of the modulated signal.
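Acquisition at pairwise-distinct phases is what allows depth to be recovered. As a hedged illustration (the standard four-phase demodulation model, which the text does not spell out; the 20 MHz modulation frequency and the sample ordering are assumptions of this sketch), the per-pixel distance can be derived as follows:

```python
import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 20e6        # assumed modulation frequency (not given in the text)

def pixel_distance(m0, m1, m2, m3):
    """Distance for one pixel from four samples taken at modulation
    phases 0, 90, 180 and 270 degrees (hypothetical ordering)."""
    phase = math.atan2(m3 - m1, m0 - m2)   # phase shift of the return signal
    if phase < 0:
        phase += 2 * math.pi               # keep the phase in [0, 2*pi)
    return (C * phase) / (4 * math.pi * F_MOD)

def distance_matrix(M0, M1, M2, M3):
    """Matrix D of the distances dij, computed pixel-wise from four
    phase matrices Mk acquired by the matrix sensor."""
    return [[pixel_distance(a, b, c, d)
             for a, b, c, d in zip(r0, r1, r2, r3)]
            for r0, r1, r2, r3 in zip(M0, M1, M2, M3)]
```

Under this model the unambiguous range is C / (2 * F_MOD), roughly 7.5 m at 20 MHz, which is why the choice of modulation frequency trades range against depth resolution.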

Figure 2 shows the functional modules of the processing unit 10 that are useful for understanding the invention.

Each functional module 12, 14, 16 shown in Figure 2 corresponds to a particular functionality implemented by the processing unit 10. In practice, however, several (or even all) of the functional modules may be implemented by the same physical entity, here a processor of the processing unit 10 that executes program instructions stored in a memory associated with the processor (each functional module then being implemented by the execution of a particular set of instructions stored in this memory).

The processing unit 10 thus comprises a localisation module 12, a selection module 14 and an analysis module 16. The localisation module 12 is designed to locate (coarsely) an object present in the solid angle observed by the matrix sensor 4.

According to a first conceivable possibility, the object is located by analysing an image representative of the environment facing the matrix sensor 4, within the field of view of the matrix sensor 4.

In practice, this representative image may be one of the pixel matrices Mk produced by the matrix sensor 4, or the luminance matrix L produced by the analysis module 16 as described below, possibly (in both cases) after a downsampling step intended to reduce the volume of data processed.

This first possibility may be used in particular when no object has been detected beforehand.

According to a second conceivable possibility, the object is located by tracking it and estimating its current position (typically on the basis of the object's position at the previous iteration and of an estimated displacement of the object from one iteration to the next).

When the localisation module 12 locates an object, it transmits position information POS to the selection module 14.

The position information POS comprises, for example, coordinates of the detected object expressed relative to the pixel matrix acquired by the matrix sensor 4. These coordinates make it possible to define the position and/or the extent of the object concerned within the solid angle observed by the matrix sensor 4.

The selection module 14 also receives as input the pixel matrices Mk acquired by the matrix sensor 4.

As long as no object has been detected, the selection module 14 receives no position information POS and therefore forwards the pixel matrices Mk received as input to the analysis module 16 (without modification).

On the other hand, if an object is detected by the localisation module 12, the localisation module 12 transmits the position information POS to the selection module 14 as already indicated. The selection module 14 is then designed to:

- determine a zone Z covering the position(s) defined by the position information POS;

- extract, from each pixel matrix Mk received as input, the pixels corresponding to zone Z, which yields a set Sk of pixels for each pixel matrix Mk; and

- transmit the sets Sk of pixels to the analysis module 16.

The selection module 14 determines the zone Z in a predefined manner on the basis of the position information POS. For example, zone Z comprises all the pixels located 10 pixels or less (vertically or horizontally) from the position defined by the position information POS (including the pixel or pixels defined by the position information POS).

Each set Sk of pixels is therefore a subset of the set of pixels of the corresponding pixel matrix Mk.
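The zone-Z selection just described can be sketched as follows (a minimal illustration assuming POS is a single (row, column) pixel coordinate and using the square 10-pixel margin of the example; the function names are ours, not the document's):

```python
def zone_around(pos, shape, margin=10):
    """Bounds (r0, r1, c0, c1) of zone Z around POS, clipped to the matrix."""
    r, c = pos
    rows, cols = shape
    return (max(0, r - margin), min(rows, r + margin + 1),
            max(0, c - margin), min(cols, c + margin + 1))

def extract_subset(Mk, zone):
    """Set Sk: the pixels of matrix Mk that fall inside zone Z."""
    r0, r1, c0, c1 = zone
    return [row[c0:c1] for row in Mk[r0:r1]]
```

The same zone bounds would be reused for every matrix Mk, matching the remark below that a single zone Z serves all the acquired matrices.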

Note that, in the example described here, the same zone Z is used to extract the pixels from the different pixel matrices Mk.

The analysis module 16, for its part, is designed to apply at least one processing algorithm to the sets Sk of pixels received as input.

In the example described here, the analysis module 16 implements a first processing algorithm that yields the matrix D of the distances dij on the basis of the sets Sk of pixels received as input (when the selection module 14 performs the above-mentioned extraction), or on the basis of the pixel matrices Mk (when no object is detected by the localisation module 12 and the selection module 14 then simply forwards these pixel matrices Mk).

Here, the analysis module 16 also implements a second processing algorithm, which in turn yields the above-mentioned luminance matrix L on the basis of the sets Sk of pixels received as input (or, as before, on the basis of the pixel matrices Mk).
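The luminance matrix L can likewise be derived from the same four phase samples under the standard four-phase model (again an assumption of this sketch; the document does not give the formula):

```python
import math

def pixel_luminance(m0, m1, m2, m3):
    """Amplitude of the modulated return for one pixel, assuming
    samples at modulation phases 0, 90, 180 and 270 degrees."""
    return math.sqrt((m3 - m1) ** 2 + (m0 - m2) ** 2) / 2.0

def luminance_matrix(M0, M1, M2, M3):
    """Luminance matrix L computed pixel-wise from four phase matrices."""
    return [[pixel_luminance(a, b, c, d)
             for a, b, c, d in zip(r0, r1, r2, r3)]
            for r0, r1, r2, r3 in zip(M0, M1, M2, M3)]
```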

For more information on the processing operations just described, reference may be made, for example, to the technical document "Time-of-Flight Camera - An Introduction", Texas Instruments, ref. SLOA190B.

Thus, when an object is detected by the localisation module 12, the processing algorithms are applied to only part of the pixels (the sets Sk), which reduces the volume of processing performed.

When the selection module 14 extracts pixels from the pixel matrices Mk as indicated above and transmits sets Sk of pixels to the analysis module 16, the matrix D of the distances dij and/or the luminance matrix L are generally partial (i.e. they do not contain data for the entire field of view of the matrix sensor 4).

An example of a processing method implemented by the processing unit 10 is now described with reference to Figure 3.

In step E2, the matrix sensor 4 acquires at least one pixel matrix, here a plurality of pixel matrices Mk as explained above.

The localisation module 12 detects an object O in the environment facing the matrix sensor 4 (step E4) and determines its position POS, for example by analysing one of the pixel matrices Mk. As a variant, the object O could be detected by analysing a luminance matrix L computed by the processing unit 10.

In step E6, the selection module 14 then determines a zone Z covering the object O (and, for example, larger than the object O in order to allow for a margin of error).

For each pixel matrix Mk, the selection module 14 extracts the pixels located in zone Z so as to obtain a corresponding set Sk of pixels (step E8).

In step E10, the analysis module 16 can thus apply at least one processing algorithm to the sets Sk of pixels (and to those pixels only), here in order to obtain a (partial) matrix D of distances dij or a (partial) luminance matrix L.
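Steps E2 to E10 taken together can be sketched end to end as one routine (locate_object is a hypothetical stand-in for the localisation module 12, and the matrices are assumed to have already been acquired in step E2):

```python
def process_frame(matrices, locate_object, margin=10):
    """Steps E4-E8 on already-acquired phase matrices Mk (step E2).

    Returns the subsets Sk to which the processing algorithms of
    step E10 are then applied, or the full matrices when no object
    is detected."""
    pos = locate_object(matrices[0])          # step E4: coarse localisation
    if pos is None:
        return matrices                       # no object: process everything
    r, c = pos                                # step E6: zone Z around POS
    rows, cols = len(matrices[0]), len(matrices[0][0])
    r0, r1 = max(0, r - margin), min(rows, r + margin + 1)
    c0, c1 = max(0, c - margin), min(cols, c + margin + 1)
    # step E8: one subset Sk per matrix Mk, all using the same zone Z
    return [[row[c0:c1] for row in Mk[r0:r1]] for Mk in matrices]
```

This makes the saving explicit: downstream algorithms see at most (2 * margin + 1)^2 pixels per matrix instead of the full frame.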

Claims

1. A data processing method comprising the following steps:
- acquisition (E2) of a first set (Mk) of pixels by a sensor (4) facing an environment;
- determination (E4) of a position (POS) of an object (O) in the environment;
- determination (E6, E8) of a second set (Sk) of pixels included in the first set (Mk) of pixels, the second set (Sk) of pixels covering said position (POS) and being distinct from the first set (Mk);
- application (E10) of a processing algorithm to the pixels of the second set (Sk) only.

2. The processing method as claimed in claim 1, wherein the sensor (4) is a matrix sensor and wherein the first set of pixels is a pixel matrix (Mk).

3. The method as claimed in claim 1 or 2, wherein the step of determining the second set of pixels comprises a step of determining (E6) a zone (Z) comprising said position (POS) and a step of extracting (E8) the pixels of the first set (Mk) located in said zone (Z).

4. The method as claimed in one of claims 1 to 3, wherein the sensor (4) delivers a plurality of first sets (Mk) of pixels.

5. The method as claimed in claim 4 when dependent on claim 3, comprising a step of determining, for each of said first sets (Mk) of pixels, a corresponding second set (Sk) of pixels by extracting the pixels of the first set (Mk) concerned located in said zone (Z).

6. The method as claimed in one of claims 1 to 5, wherein the sensor (4) is part of a time-of-flight sensor (1) comprising an electromagnetic-radiation emitter element (2).

7. The method as claimed in claim 6 when dependent on claim 4 or 5, wherein said first sets (Mk) are acquired by the sensor (4) respectively in correspondence with distinct phases of the signal emitted by the electromagnetic-radiation emitter element (2).

8. The method as claimed in one of claims 1 to 7, wherein said position (POS) is determined by analysing an image representative of the environment.

9. The method as claimed in one of claims 1 to 7, wherein said position (POS) is determined by tracking the object (O).

10. An onboard system comprising a processing unit (10) and a sensor (4) facing an environment, wherein the sensor (4) is designed to acquire a first set (Mk) of pixels and wherein the processing unit (10) comprises:
- a module (12) designed to determine a position (POS) of an object (O) in the environment;
- a module (14) designed to determine a second set (Sk) of pixels included in the first set (Mk) of pixels, the second set (Sk) of pixels covering said position (POS) and being distinct from the first set (Mk); and
- a module (16) designed to apply a processing algorithm to the pixels of the second set (Sk) only.
PCT/EP2018/078465 2017-10-19 2018-10-17 Data processing method and associated onboard system Ceased WO2019077010A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP18785645.5A EP3698321A1 (en) 2017-10-19 2018-10-17 Data processing method and associated onboard system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1759875 2017-10-19
FR1759875A FR3074950B1 (en) 2017-10-19 2017-10-19 DATA PROCESSING METHOD AND EMBEDDED SYSTEM THEREOF

Publications (1)

Publication Number Publication Date
WO2019077010A1 true WO2019077010A1 (en) 2019-04-25

Family

ID=61027878

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/078465 Ceased WO2019077010A1 (en) 2017-10-19 2018-10-17 Data processing method and associated onboard system

Country Status (3)

Country Link
EP (1) EP3698321A1 (en)
FR (1) FR3074950B1 (en)
WO (1) WO2019077010A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11663697B2 (en) 2020-02-03 2023-05-30 Stmicroelectronics (Grenoble 2) Sas Device for assembling two shots of a scene and associated method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160048726A1 (en) * 2014-08-15 2016-02-18 Apple Inc. Three-Dimensional Hand Tracking Using Depth Sequences
WO2017045251A1 (en) * 2015-09-15 2017-03-23 SZ DJI Technology Co., Ltd. Systems and methods for uav interactive instructions and control


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Thomas Hoegg et al., "Real-Time Motion Artifact Compensation for PMD-ToF Images", in "Time-of-Flight and Depth Imaging. Sensors, Algorithms, and Applications", LNCS vol. 8200, Springer International Publishing, Cham, 1 August 2013, pp. 273-288, ISBN 978-3-642-38988-7, ISSN 0302-9743, DOI: 10.1007/978-3-642-44964-2_13 *


Also Published As

Publication number Publication date
FR3074950A1 (en) 2019-06-14
EP3698321A1 (en) 2020-08-26
FR3074950B1 (en) 2019-12-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18785645

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018785645

Country of ref document: EP

Effective date: 20200519