WO2024018164A1 - Procede et systeme de surveillance spatiale infrarouge de jour - Google Patents
Method and system for daytime infrared space surveillance
- Publication number
- WO2024018164A1 (application PCT/FR2023/051139; priority FR2023051139W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- shots
- earth
- light
- objects
- space
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/143—Sensing or illuminating at different wavelengths
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
- G06V10/507—Summing image-intensity values; Histogram projection analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
Definitions
- The invention relates to the daytime monitoring of objects orbiting the Earth using short-wave infrared sensors, and more particularly to an improved monitoring method.
- Such a system makes it possible to follow the evolution of object trajectories, to catalog these objects and to update their trajectories.
- Near-Earth space is defined as the portion of space located up to a few hundred thousand kilometers from the Earth. Near-Earth space monitoring therefore essentially, but not exclusively, concerns the detection of objects in orbit around the Earth.
- The context of the present invention is the ongoing increase in the number of objects orbiting the Earth. These objects can be, for example, debris or operational satellites.
- The size of these objects ranges from a characteristic radius of a few millimeters (for example propulsion residues, paint flakes or meteorites) to several tens of meters (notably satellites or artificial orbital systems, whether operational or not).
- The system must be able, on request, to refine the precision of the knowledge of the orbital parameters of a given object, so as to precisely predict its position in the near future (typically a few days), for example with a view to consolidating a collision risk and planning possible avoidance maneuvers.
- Telescopes or radars are generally used for this purpose.
- Ground-based telescopes are generally unable to see high-altitude satellites during the day because of the bright sky background. Satellites can also be difficult to track by radar, given the limited geographic distribution and range of radar installations.
- Ground-based optical telescopes are forced to operate at night because of increased photon emission noise (quantum noise) and the risk of daytime sky-background saturation. Some ground systems have solved these problems, but they are generally expensive. Space-based surveillance satellites can also be used during the day and do not suffer from photon emission noise, but they too are expensive and are limited by their observation patterns, their need to perform solar avoidance, and their relatively long latency in sending tracking data to the ground. Passive ground-based radio frequency (RF) systems can detect resident space objects (RSOs) during the day, but only if these RSOs actively transmit data to a ground satellite station. As a result, most RSOs are not observed during the day, leaving near-Earth RSOs vulnerable to dangerous and/or harmful activities.
- RF: radio frequency
- The Graz Observatory in Austria carries out laser detection in broad daylight on certain space objects. However, since this observation relies on sensors in the visible band, it is limited to very bright objects in low orbit.
- The main aim of the present invention is therefore to provide a system and a method for detecting space objects in orbit around the Earth that are capable of operating in broad daylight and less expensive than known systems.
- A space surveillance method is proposed for detecting space objects orbiting the Earth in images captured during the day, the method comprising the following steps:
- shots being captured by a shooting device comprising at least one infrared sensor, each infrared shot comprising a matrix of pixels, each pixel being associated with an intensity of light received by an infrared sensor,
- the step of detecting space objects in orbit around the Earth is implemented by an artificial intelligence system operating by deep learning and comprising a plurality of artificial neural network layers linked together, each layer analyzing the information from the previous layer, the step of detecting space objects in orbit around the Earth comprising:
- the discrimination comprising a monitoring of each detected and stationary light point across successive shots, and a recording of the coordinates of the light points detected at possibly different positions and grouped by this monitoring, the recording being made, for each detected light point, upon its disappearance from the following shots.
- Said infrared sensor, such as a SWIR ("short-wave infrared") camera, provides images which can be very different from visible images, in particular in terms of noise characteristics, heterogeneities and defects. This is why classical detection algorithms transposed directly from visible imaging give poor results.
- SWIR: short-wave infrared
- The method according to the invention relies on an artificial-intelligence algorithm: a neural network trained on a set of images specifically simulated to be as representative as possible of real infrared images in the optical configuration used for image acquisition. A set of real images acquired under real conditions with the experimental device was studied to build a realistic model of noise, defects, response, background and non-uniformities. These models were then used to simulate a set of images with known ground truth, used as the training set for the neural network.
- SNR: signal-to-noise ratio
- The method according to the invention has the advantage of being able to operate from a simple calibration carried out on a view of the sky, without requiring a specific module or development.
- Deep learning is one of the main machine-learning technologies, implementing algorithms capable of mimicking the actions of the human brain thanks to artificial neural networks composed of tens or even hundreds of "layers" of neurons, each receiving and interpreting the information from the previous layer. Deep-learning networks are trained on the complex data structures they encounter.
- The method according to the invention also has the advantage of being usable with a system equipped with a conventional reflecting telescope that has not been optimized for operation at short-wave infrared wavelengths.
- The camera device thus makes it possible to easily carry out, from the Earth, space surveillance to detect space objects orbiting the Earth in images captured during the day.
- The artificial neural network layers can be calibrated, prior to their use for detecting the space objects (satellites, debris, stars, etc.) present in the infrared images, by a supervised learning process on a base of different images allowing the artificial intelligence system to determine the typical characteristics of a space object (characteristics differentiating a light point corresponding to a real space object from the image background, hot pixels, electronic noise, etc.).
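As a drastically simplified illustration of such supervised training, the sketch below trains a single logistic neuron — a stand-in for the patent's multi-layer network, not its actual architecture — on simulated 5×5 patches with known ground truth. All function names, patch sizes and parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_patch(has_source):
    """5x5 patch of background noise, optionally with a central point source."""
    patch = rng.normal(0.0, 1.0, (5, 5))
    if has_source:
        patch[2, 2] += 8.0  # bright pixel standing well above the noise
    return patch.ravel()

# Simulated training set with known ground truth (cf. the patent's approach).
X = np.array([make_patch(i % 2 == 0) for i in range(2000)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(2000)])

# One logistic neuron trained by gradient descent: a deliberately minimal
# stand-in for the deep network described in the patent.
w, b = np.zeros(25), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)          # cross-entropy gradient
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

def detect(patch):
    """Return True if the patch is classified as containing a light point."""
    return 1.0 / (1.0 + np.exp(-(patch @ w + b))) > 0.5
```

A real implementation would replace the single neuron with a convolutional network trained on full simulated frames, but the supervised loop — simulate labeled data, fit, then classify — is the same.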
- The method may further comprise, just after capturing the shots, an application of a non-uniformity correction to the captured shots.
- The method may further comprise a filtering of each stacked image obtained.
- The method may further comprise a formation of stacked images by superposing a plurality of said shots, each pixel of a stacked image being associated with an intensity of received light corresponding to the average of the intensities of the superimposed shots for that pixel, the detection of space objects then using the stacked images as the shots to be processed.
- The method may further comprise, before the step of detecting space objects orbiting the Earth, a step of destriping each stacked image to eliminate stripe defects on the stacked image.
- A space surveillance system is also proposed for detecting space objects in orbit around the Earth, the system comprising a reflecting telescope mounted on a mechanical support with motorized movement, a shooting device comprising at least one infrared sensor mounted at the output of the reflecting telescope and configured to take series of shots of the daytime sky at a frequency between 1 Hz and a few hundred hertz, and a processing unit receiving each shot captured by the shooting device.
- The processing unit can comprise an artificial intelligence system operating by deep learning and comprising a plurality of artificial neural network layers connected together, each layer analyzing the information from the previous layer, the processing unit being configured to carry out the following steps on the images received:
- the detection being carried out by the artificial intelligence system operating by deep learning, and
- an identification of each detected object from a catalog of known space objects in orbit around the Earth, comprising a detection of light points on each shot and a discrimination of the detected light points, the discrimination comprising a monitoring of each detected and stationary light point across successive shots, and a recording of the coordinates of the light points detected at possibly different positions and grouped by this monitoring, the recording being carried out, for each detected light point, upon its disappearance from the following shots.
- The imaging device may further comprise at least one visible-light sensor mounted at the output of the reflecting telescope and configured to take series of shots of the night sky, the space surveillance system further comprising a day/night alternation module making it possible to change the type of sensor receiving light from the sky depending on the ambient light intensity.
- Figure 1 represents a space surveillance method according to one mode of implementation of the invention.
- Figure 2 represents a space surveillance system according to a first embodiment of the invention.
- Figure 3 represents a space surveillance system according to a second embodiment of the invention.
- Figure 2 schematically represents a space surveillance system according to a first embodiment of the invention.
- The space surveillance system 1 is configured to detect space objects orbiting the Earth.
- The system 1 comprises a reflecting telescope 2 mounted on a mechanical support 3 with motorized movement, a shooting device 4 mounted at the output of the reflecting telescope 2 and comprising at least one infrared sensor 40, and a processing unit 5 receiving each shot captured by the shooting device 4.
- The shooting device 4 is configured to take series of shots of the daytime sky at a frequency between 1 Hz and a few hundred hertz.
- The exposure time for each shot is adjusted to limit saturation of the infrared sensor 40.
- The processing unit 5 comprises an artificial intelligence system 50 operating by deep learning and comprising a plurality of artificial neural network layers connected together, each layer analyzing the information from the previous layer.
- The processing unit 5 is configured to implement a space surveillance method on images captured during the day.
- Figure 1 shows such a monitoring method according to one mode of implementation of the invention.
- A plurality of infrared shots of the sky in broad daylight are captured using the shooting device 4 and its infrared sensor 40.
- Each infrared shot comprises a matrix of pixels, each associated with an intensity of light received by the infrared sensor.
- A correction of the non-uniformity of response of the pixels is then carried out for each captured shot.
- This correction relies on the prior acquisition of a uniform reference image, acquired directly on the sky in an area devoid of luminous objects (stars or satellites). This calibration requires neither dismantling the sensor nor additional equipment.
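A one-point correction built from such a uniform sky reference can be sketched as follows. The gain model and function names are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np

def non_uniformity_correction(frame, reference, eps=1e-6):
    """Flatten the pixel response using a uniform sky reference frame.

    Each pixel is rescaled by the gain implied by the reference image,
    so that a uniform scene produces a uniform output (one-point correction).
    """
    gain = reference.mean() / np.maximum(reference, eps)
    return frame * gain

# Example: a sensor whose right half responds 20% more strongly.
reference = np.ones((4, 4))
reference[:, 2:] *= 1.2            # uniform sky seen through uneven response
raw = 100.0 * reference             # uniform scene, distorted by the sensor
corrected = non_uniformity_correction(raw, reference)
```

After correction, the uniform scene maps back to a uniform image; real systems would also subtract a dark/offset term, omitted here for brevity.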
- In step 120 of the method, stacks of a plurality of successively captured images are produced.
- Each stacked image is produced by superimposing the infrared images of the corresponding stack.
- Each pixel of a stacked image is associated with an intensity of received light corresponding to the average of the intensities of the superimposed infrared images for that pixel.
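The stacking step amounts to a pixel-wise average; averaging N shots reduces uncorrelated noise by roughly sqrt(N), which is what lifts faint points above the daytime background. The 100-frame burst and noise levels below are illustrative:

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of co-registered shots pixel-wise.

    Averaging N frames reduces uncorrelated (photon/readout) noise by
    roughly sqrt(N), raising the SNR of faint point sources.
    """
    return np.mean(np.stack(frames, axis=0), axis=0)

rng = np.random.default_rng(0)
signal = np.zeros((32, 32))
signal[16, 16] = 5.0                # faint point source, SNR ~ 2.5 per frame
frames = [signal + rng.normal(0, 2.0, signal.shape) for _ in range(100)]
stacked = stack_frames(frames)
# Background noise drops from sigma = 2 to roughly 2 / sqrt(100) = 0.2,
# so the source now stands ~25 sigma above the stacked background.
```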
- A destriping operation is then carried out for each stacked image. Destriping helps eliminate the stripe defects that may appear on the stacked image due to the superposition of infrared images.
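The patent does not specify its destriping algorithm; one common approach, shown here as an illustrative assumption, subtracts a robust per-column offset (the median ignores the few bright star or satellite pixels):

```python
import numpy as np

def destripe(image):
    """Remove column-wise stripe artefacts left by frame stacking.

    Subtracts each column's median, which estimates the stripe offset
    robustly, then restores the global background level.
    """
    col_offsets = np.median(image, axis=0, keepdims=True)
    return image - col_offsets + np.median(image)

# Example: a flat background with a +3 stripe on one column.
img = np.full((8, 8), 10.0)
img[:, 3] += 3.0
clean = destripe(img)
```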
- A detection of the light points is then carried out on each of the destriped stacked images.
- This step consists in extracting the light points from the background noise.
- This light-point detection step 140 can be preceded by conventional image filtering (averaging type) to reduce the background noise and improve detection performance.
- The light-point detection step 140 is implemented by the artificial intelligence system 50 operating by deep learning and comprising a plurality of artificial neural network layers connected together, each layer analyzing the information from the previous layer.
- The light points detected in the previous step 140 are then discriminated into stars or satellites. This discrimination involves tracking the detected and stationary light points across the images (such stationary points belong to the same space object). Once all the points of the same space object have been listed, the coordinates of the light points detected at possibly different positions and grouped by this tracking are recorded. In other words, the recording is made as soon as the detected light point disappears from the following images.
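The discrimination logic described above — track stationary points across successive shots and record a track's grouped coordinates once the point disappears — can be sketched as follows. The matching tolerance and data layout are illustrative assumptions:

```python
def track_stationary_points(detections_per_frame, tol=1.0):
    """Group detections that stay (nearly) stationary across shots.

    `detections_per_frame` is a list of per-shot lists of (x, y) points.
    A track stays open while a matching point keeps reappearing; its
    grouped coordinates are recorded once the point vanishes from a shot.
    """
    open_tracks = []   # each: list of (x, y) observations of one object
    recorded = []      # closed tracks, recorded on disappearance
    for points in detections_per_frame:
        points = list(points)
        still_open = []
        for track in open_tracks:
            tx, ty = track[-1]
            hit = next((p for p in points
                        if abs(p[0] - tx) <= tol and abs(p[1] - ty) <= tol),
                       None)
            if hit is not None:
                track.append(hit)
                points.remove(hit)
                still_open.append(track)
            else:
                recorded.append(track)  # point vanished: record coordinates
        open_tracks = still_open + [[p] for p in points]  # new candidates
    recorded.extend(open_tracks)        # close remaining tracks at the end
    return recorded

frames = [
    [(10.0, 10.0), (40.0, 5.0)],  # two light points detected
    [(10.2, 9.9)],                 # first persists, second has vanished
    [],                            # first vanishes -> recorded
]
tracks = track_stationary_points(frames)
```

A production tracker would also handle brief dropouts and sub-pixel centroids; this sketch keeps only the persist-then-record structure the patent describes.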
- The detection of space objects in step 150 thus yields the space objects orbiting the Earth that are present in the images.
- The detected objects are then compared with the objects listed in a catalog of known space objects orbiting the Earth.
- The catalog lists known space objects in orbit around the Earth, indicating the different characteristics of each space object: its dimensions, its orbit and various intrinsic characteristics.
- In a next step 170, new features are extracted from the objects detected in the stacked images, either to update the information relating to a space object already listed in the catalog, or to add a new space object to the catalog.
- The artificial neural network layers are calibrated, prior to their use for detecting space objects (stars, satellites, debris, etc.), by a supervised learning process on a base of different images allowing the artificial intelligence system to determine the typical characteristics of a space object.
- The calibration is performed on a set of simulated images generated to reproduce typical images from the sensor.
- Each simulated image includes a background and background noise, with light spots corresponding either to stars, to space objects, or to sensor defects.
- Each simulated image is associated with its ground truth, i.e. the positions of the real objects (stars and satellites) in the image.
- A large number of images are thus created and serve as the training base for the neural network.
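Such a simulated training image with known ground truth might be generated as in the sketch below. The background level, noise sigma, hot-pixel rate and source amplitude are illustrative assumptions, not values from the patent:

```python
import numpy as np

def simulate_training_image(shape=(64, 64), n_sources=3, noise_sigma=2.0,
                            background=50.0, hot_pixel_rate=0.001, rng=None):
    """Simulate a SWIR-like frame together with its ground truth.

    Combines a flat background, Gaussian read noise, a few hot pixels
    (sensor defects) and point sources at random positions, returning the
    image and the list of source positions used as labels.
    """
    if rng is None:
        rng = np.random.default_rng()
    img = background + rng.normal(0, noise_sigma, shape)
    # Sensor defects: isolated saturated pixels.
    hot = rng.random(shape) < hot_pixel_rate
    img[hot] = 255.0
    # Point sources (stars / satellites); their positions are the ground truth.
    truth = []
    for _ in range(n_sources):
        y, x = rng.integers(2, shape[0] - 2), rng.integers(2, shape[1] - 2)
        img[y, x] += 30.0
        truth.append((y, x))
    return img, truth

img, truth = simulate_training_image(rng=np.random.default_rng(0))
```

Generating many such (image, truth) pairs gives a labeled training base without any manual annotation, which is the point of the simulation approach described above.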
- The present invention thus makes it possible to provide a system and a method for detecting space objects in orbit around the Earth that are capable of operating in broad daylight and less expensive than known systems.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- General Engineering & Computer Science (AREA)
- Biomedical Technology (AREA)
- Mathematical Physics (AREA)
- Biophysics (AREA)
- Molecular Biology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- Image Analysis (AREA)
- Closed-Circuit Television Systems (AREA)
- Geophysics And Detection Of Objects (AREA)
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2025503082A JP2025528712A (ja) | 2022-07-21 | 2023-07-21 | 昼の赤外線宇宙監視のための方法及びシステム |
| KR1020257004797A KR20250059391A (ko) | 2022-07-21 | 2023-07-21 | 주간 적외선 우주 감시를 위한 방법 및 시스템 |
| AU2023311394A AU2023311394A1 (en) | 2022-07-21 | 2023-07-21 | Method and system for daytime infrared space surveillance |
| EP23754818.5A EP4558970A1 (fr) | 2022-07-21 | 2023-07-21 | Procede et systeme de surveillance spatiale infrarouge de jour |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FR2207482A FR3138215B1 (fr) | 2022-07-21 | 2022-07-21 | Procédé et système de surveillance spaciale infrarouge de jour |
| FRFR2207482 | 2022-07-21 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024018164A1 true WO2024018164A1 (fr) | 2024-01-25 |
Family
ID=84362249
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/FR2023/051139 Ceased WO2024018164A1 (fr) | 2022-07-21 | 2023-07-21 | Procede et systeme de surveillance spatiale infrarouge de jour |
Country Status (6)
| Country | Link |
|---|---|
| EP (1) | EP4558970A1 (fr) |
| JP (1) | JP2025528712A (fr) |
| KR (1) | KR20250059391A (fr) |
| AU (1) | AU2023311394A1 (fr) |
| FR (1) | FR3138215B1 (fr) |
| WO (1) | WO2024018164A1 (fr) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9423341B1 (en) | 2009-11-30 | 2016-08-23 | Oceanit Laboratories, Inc. | Daytime infrared imaging of satellites |
| US10740609B1 (en) | 2019-08-30 | 2020-08-11 | Numerica Corporation | System and method for space object detection in daytime sky images |
- 2022
  - 2022-07-21: FR application FR2207482A → patent FR3138215B1 (active)
- 2023
  - 2023-07-21: WO application PCT/FR2023/051139 → WO2024018164A1 (ceased)
  - 2023-07-21: AU application AU2023311394A → AU2023311394A1 (pending)
  - 2023-07-21: KR application KR1020257004797 → KR20250059391A (pending)
  - 2023-07-21: JP application JP2025503082 → JP2025528712A (pending)
  - 2023-07-21: EP application EP23754818.5 → EP4558970A1 (pending)
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9423341B1 (en) | 2009-11-30 | 2016-08-23 | Oceanit Laboratories, Inc. | Daytime infrared imaging of satellites |
| US10740609B1 (en) | 2019-08-30 | 2020-08-11 | Numerica Corporation | System and method for space object detection in daytime sky images |
| US20210064849A1 (en) * | 2019-08-30 | 2021-03-04 | Numerica Corporation | System and method for space object detection in daytime sky images |
Non-Patent Citations (7)
| Title |
|---|
| DENG QIUQUN ET AL: "Multi-Scale Convolutional Neural Networks for Space Infrared Point Objects Discrimination", IEEE ACCESS, vol. 7, 18 March 2019 (2019-03-18), pages 28113 - 28123, XP011715159, DOI: 10.1109/ACCESS.2019.2898028 * |
| FITZGERALD GARRETT ET AL: "Geosynchronous satellite detection and tracking with WFOV camera arrays using spatiotemporal neural networks (GEO-SPANN)", SPIE SMART STRUCTURES AND MATERIALS + NONDESTRUCTIVE EVALUATION AND HEALTH MONITORING, 2005, SAN DIEGO, CALIFORNIA, UNITED STATES, SPIE, US, vol. 12101, 27 May 2022 (2022-05-27), pages 1210104 - 1210104, XP060160796, ISSN: 0277-786X, ISBN: 978-1-5106-4548-6, DOI: 10.1117/12.2618047 * |
| PENG JIA ET AL: "Detection and Classification of Astronomical Targets with Deep Neural Networks in Wide Field Small Aperture Telescopes", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 21 February 2020 (2020-02-21), XP081621274 * |
| RODRIGUEZ-VILLAMIZAR JULIAN ET AL: "Daylight Measurement Acquisition of Defunct Resident Space Objects Combining Active and Passive Electro-Optical Systems", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE, USA, vol. 60, 1 June 2022 (2022-06-01), pages 1 - 17, XP011911495, ISSN: 0196-2892, [retrieved on 20220602], DOI: 10.1109/TGRS.2022.3179719 * |
| SALVATORE NIKOLAUS ET AL: "Learned Event-based Visual Perception for Improved Space Object Detection", 2022 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), IEEE, 3 January 2022 (2022-01-03), pages 3301 - 3310, XP034086528, DOI: 10.1109/WACV51458.2022.00336 * |
| SHADDIX JEFF ET AL: "Daytime Optical Contributions Toward Timely Space Domain Awareness in Low Earth Orbit", 1 January 2021 (2021-01-01), XP093030794, Retrieved from the Internet <URL:https://amostech.com/TechnicalPapers/2021/SSA-SDA/Shaddix.pdf> [retrieved on 20230310] * |
| TAO JIANG ET AL: "Deep Convolutional Neural Network Based Small Space Debris Saliency Detection", 2019 25TH INTERNATIONAL CONFERENCE ON AUTOMATION AND COMPUTING (ICAC), CHINESE AUTOMATION AND COMPUTING SOCIETY IN THE UK - CACSUK, 5 September 2019 (2019-09-05), pages 1 - 6, XP033649490, DOI: 10.23919/ICONAC.2019.8895100 * |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4558970A1 (fr) | 2025-05-28 |
| KR20250059391A (ko) | 2025-05-02 |
| JP2025528712A (ja) | 2025-09-02 |
| FR3138215B1 (fr) | 2024-08-02 |
| AU2023311394A1 (en) | 2025-02-13 |
| FR3138215A1 (fr) | 2024-01-26 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CA2804991C (fr) | Systeme optique de veille pour systeme de veille spatiale de surveillance de l'espace proche | |
| EP2593368B1 (fr) | Procede de realisation d'un systeme de veille spatiale pour la surveillance de l'espace proche | |
| EP2593367B1 (fr) | Systeme de veille spatiale pour la surveillance de l'espace proche | |
| EP4308460B1 (fr) | Systeme de detection de la trajectoire d'objets mobiles | |
| WO2014179482A1 (fr) | Dispositif d'estimation d'urgence incendie dans une orbite géosynchrone (fuego) | |
| Boër et al. | TAROT: a network for space surveillance and tracking operations | |
| FR3137183A1 (fr) | Procédé et dispositif pour la détermination d’une loi de pointage d’un satellite par détermination d’une distribution spatio-temporelle | |
| US11471717B1 (en) | Early fire detection and suppression | |
| EP2593904A1 (fr) | Procédé et dispositif d'imagerie bi-spectral multifonctions | |
| EP4115327B1 (fr) | Procédé d'aide à la détection d'éléments, dispositif et plateforme associés | |
| WO2024018164A1 (fr) | Procede et systeme de surveillance spatiale infrarouge de jour | |
| Zarcone et al. | Image processing for geo detection | |
| Becker et al. | Improved space object detection using short-exposure image data with daylight background | |
| CN112906521A (zh) | 一种基于生成对抗网络的红外图像生成系统及方法 | |
| Suthakar | IMAGE PROCESSING FOR STRATOSPHERIC BASED SPACE SITUATIONAL AWARENESS (SSA) | |
| FR3114884A1 (fr) | Système de détection de la trajectoire d’objets mobiles | |
| Jansson | Optical Detection and Analysis of Meteors with AllSky7 | |
| FR2701762A1 (fr) | Dispositif de restitution d'orbite de corps célestes, notamment de satellites artificiels, par écartométrie. | |
| Kelly | FUEGO—Fire Urgency Estimator in Geosynchronous Orbit—A Proposed Early-Warning Fire |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 23754818; Country: EP; Kind code: A1 |
| | WWE | WIPO information: entry into national phase | Ref document number: P2025-00211 (AE); Ref document number: 2025503082 (JP); Ref document number: 18997313 (US) |
| | WWE | WIPO information: entry into national phase | Ref document number: AU2023311394 (AU) |
| | ENP | Entry into the national phase | Ref document number: 2023311394; Country: AU; Date of ref document: 20230721; Kind code: A |
| | WWE | WIPO information: entry into national phase | Ref document number: 1020257004797 (KR) |
| | WWE | WIPO information: entry into national phase | Ref document number: 2023754818 (EP) |
| | NENP | Non-entry into the national phase | Country: DE |
| | ENP | Entry into the national phase | Ref document number: 2023754818; Country: EP; Effective date: 20250221 |
| | WWP | WIPO information: published in national office | Ref document number: 2023754818 (EP) |