
EP4646825A1 - Method and system for combining multiple scans into a single data set - Google Patents

Method and system for combining multiple scans into a single data set

Info

Publication number
EP4646825A1
EP4646825A1
Authority
EP
European Patent Office
Prior art keywords
points
environment
scans
alignment points
alignment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP24738979.4A
Other languages
German (de)
English (en)
Inventor
Ryan James Goss
Graham David Ferris
Mark NORGREN
Robert Parker
Daniel John Benjamin
John TAFOYA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brightai Corp
Original Assignee
Brightai Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brightai Corp filed Critical Brightai Corp
Publication of EP4646825A1
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/14Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00002Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
    • H04N1/00026Methods therefor
    • H04N1/00034Measuring, i.e. determining a quantity by comparison with a standard
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/002Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/22Measuring arrangements characterised by the use of optical techniques for measuring depth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/28Measuring arrangements characterised by the use of optical techniques for measuring areas
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/28Measuring arrangements characterised by the use of optical techniques for measuring areas
    • G01B11/285Measuring arrangements characterised by the use of optical techniques for measuring areas using photoelectric detection means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/337Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/38Registration of image sequences
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00002Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
    • H04N1/00007Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for relating to particular apparatus or devices
    • H04N1/00018Scanning arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00002Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
    • H04N1/00071Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for characterised by the action taken
    • H04N1/00082Adjusting or controlling
    • H04N1/00087Setting or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01B11/27Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/26Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
    • G01B11/27Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes
    • G01B11/272Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes for testing the alignment of axes using photoelectric detection means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods

Definitions

  • This invention relates generally to a data processing method of combining multiple data sets and, more particularly, to a method for combining data from multiple scans of a single environment into a single data set.
  • An improved method is provided for combining multiple data sets of a single subject to form a single representation of the subject, such as an environment.
  • The method uses a small set of alignment points: the environment is scanned from a first position, the scanning system is repositioned to take a scan of the same environment from a second position, and the same alignment points are labeled to the extent they are visible from the second location.
  • The alignment points may be static points of interest in the environment, or they may be placed in the environment by the user to denote the location of common reference points.
  • The alignment points may be chalk, tape, signs, or other fiducial markings that are known in advance and can be detected by the scanning system.
  • FIG. 1 is a flowchart showing a preferred embodiment that uses a multi-scan process;
  • FIG. 2A shows two scans of a single environment where the alignment points are not aligned;
  • FIG. 2B shows the two scans depicted in FIG. 2A where the alignment points have been used to transform the scans so they align;
  • FIG. 3 shows an example transformation that would be involved in combining two scans where the scanner has been turned at a 45° angle between the two scans;
  • FIG. 4 is a depiction of a preferred embodiment of the invention in an environment containing a rectangular feature and an obstruction;
  • FIG. 5 is a flowchart showing the process for determining the alignment transformation.
  • One preferred embodiment involves the scanning of an outdoor environment, such as a backyard. Such an environment is too large to capture with sufficient detail from a single location since there are often obstacles, such as trees or buildings, that will obstruct part of the environment from a single scan location.
  • The measurement device is movable around the measurement scene and combines the measurement data from the one or more locations into a single set of measurements or schematics.
  • Another preferred embodiment allows additional measurements and information to be gathered from the scene. Sometimes the measured environment may have changed, such as through a new additional feature, removal of an object, or a change of materials worth noting and measuring with the device.
  • The measuring device allows an operator to return to the same location, or to a different location within the environment, to add measurements at a later time.
  • The measurement device is composed of one or more visual capture devices, one or more depth measuring devices, one or more positional sensors, a data processing engine, and a command and control application.
  • The measurement device may be mounted on a tripod or other fixed position and repositioned around the measurement scene at specific locations. In other embodiments the device may also be hand-held, wheeled, or otherwise mobile during the scanning process to move around obstructions.
  • Particularly preferred is the use of the measuring device disclosed in Provisional Application No. 63/437561 titled “Measuring Device and Method of Use” and filed on the same day as this application and which is incorporated herein by reference.
  • FIG. 1 depicts a flow chart of the overall method of a preferred embodiment of the current invention.
  • The method involves positioning a scanning system in a first location in the environment to be scanned.
  • A small set of alignment points is identified in the environment that are visible from multiple locations.
  • The environment is scanned from the first location and then, since this is the first scan, in step 80 the scanning system is moved to a second location in the environment.
  • The same set of alignment points is labeled from this second location. Since this is not the first scan, as determined in step 40, the system in step 90 then determines a transformation for the second scan to align the alignment points with those of the prior scans.
  • If the transformation is acceptable, as determined in step 100, the environment is scanned from the second location in step 60. If not, the user repositions the measuring device and/or re-labels the alignment points from the second location at step 30 and has the system re-determine a transformation for the second scan in step 90. Once the second scan is completed, the scanner is moved to another location in step 80 and the process is repeated from step 30 by labeling the alignment points from the new location. This process is repeated until the user has obtained the desired number of scans of the environment, which will preferably be determined based on the complexity of the environment and the number of obstructions present. At this point, in step 70 it is determined that no more scans are required and the process proceeds to step 110, where it is completed.
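  • The control flow of FIG. 1 can be sketched as follows. This is an illustrative sketch only: the function names (run_multi_scan, label_points, fit_transform, scan) are hypothetical and are not part of the disclosure; step numbers in the comments refer to the flowchart described above.

```python
def run_multi_scan(num_locations, label_points, fit_transform, scan):
    """Collect scans from several locations, aligning each scan after
    the first to the prior scans via labeled alignment points."""
    scans = []
    for location in range(num_locations):
        points = label_points(location)        # step 30: label alignment points
        if scans:                              # step 40: not the first scan?
            transform = fit_transform(points)  # step 90: align to prior scans
        else:
            transform = None                   # first scan defines the frame
        scans.append((scan(location), transform))  # step 60: scan environment
        # steps 70/80: the loop moves the scanner to the next location
    return scans                               # step 110: done
```

In this sketch, the acceptability check of step 100 would live inside fit_transform, which would raise or retry until the user re-labels the points.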
  • The same set of alignment points is labeled from a selected location at this new time.
  • The location may be the same as the first scan or any prior scan, or may be a new location, as long as the alignment points are labeled and the calibrated transformation looks correct.
  • The alignment points may be static elements that are already present in the environment, such as the corner of a building or patio.
  • Alternatively, the alignment points can be placed in the environment for the purpose of providing common reference points. These might be chalk, tape, signs, or other fiducial markings that the system is known to detect. Particularly preferred is the use of unique fiducial markings that are known to the scanning system in advance and can be automatically identified as alignment points when taking each of the scans. For example, a sheet containing a high-contrast icon or design that would not normally be present in the environment can be used as a fiducial marking.
  • Preferably, all of the selected alignment points are visible from every location from which a scan of the environment will be taken.
  • The minimum number of alignment points visible from each location is at least 3.
  • Preferably, at least 4 alignment points are used in order to improve the alignment accuracy and minimize any error from the transformation. If fewer than 3 alignment points are used, there are multiple possible ways to combine the different locations and inaccurate measurements will occur. If fewer than 3 points are visible, the measurement system will display an error prompting the user to find and label additional alignment points. It is also preferred that the number of alignment points selected be fewer than 100, and more preferably fewer than 6. While a larger number of alignment points can be used, it is less effective: it increases the amount of time the user needs to conduct the scan without providing a significant improvement in the resulting combined scan.
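  • The point-count preferences above can be captured in a small validation routine. This sketch is illustrative only; the function name and return messages are hypothetical, and the thresholds simply mirror the minimum of 3, preference for 4 or more, and diminishing returns beyond roughly 6 described above.

```python
def check_alignment_points(visible_points, min_required=3, preferred=4, max_useful=6):
    """Validate the number of labeled alignment points visible from a
    scan location, per the preferred ranges in the description."""
    n = len(visible_points)
    if n < min_required:
        # Fewer than 3 points leaves the rigid transformation ambiguous:
        # the system should ask the user to label additional points.
        raise ValueError(
            f"only {n} alignment points visible; at least {min_required} required")
    if n < preferred:
        return "usable, but a 4th point would improve accuracy"
    if n > max_useful:
        return "more points than needed; extra labeling adds time, not accuracy"
    return "ok"
```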
  • These alignment point locations are preferably chosen and labeled using a mobile device via an app that uses an application program interface (API) to connect wirelessly to the software on the scanning system.
  • The user can manually select the alignment points from an image of the environment, as viewed from the first scan location, that is depicted on the screen of the mobile device.
  • The scanner will record the position (e.g., XYZ coordinates) of each of the alignment points.
  • At each new location, the set (or a subset) of the alignment points is retargeted and similarly identified, either automatically or by the user.
  • The scanner will now have multiple data sets representing the same collection of alignment points.
  • The measurement device uses the alignment data sets to compute a common frame of reference for all data points and a transformation to combine them. This is accomplished by carrying out iterations of rotational and translational transformations to minimize the difference between the original points and the transformed points. Prior to the transformation, each data set has a different frame of reference.
  • The absolute position of the alignment points is common between the multiple locations, and the data processing engine matches and aligns the data points by attempting to find common characteristics between the labeled alignment points.
  • The characteristics used for alignment include, but are not limited to, distance between points, edge length of segments, normal vector alignment, and error distance after transformation. These characteristics allow the small set of alignment data points to be uniquely identified so that the transformation can be determined.
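  • One way to use distance-based characteristics to uniquely identify a small set of alignment points is to compare pairwise distances, which are invariant under rotation and translation. The sketch below is illustrative only (the disclosure does not prescribe a specific matching algorithm); a brute-force search over permutations is practical for the 3-to-6-point sets described above.

```python
import numpy as np
from itertools import permutations

def match_by_distances(points_a, points_b):
    """Find the correspondence between two small labeled point sets by
    comparing their pairwise-distance matrices. Returns the permutation
    of points_b that best matches the ordering of points_a."""
    a = np.asarray(points_a, float)
    b = np.asarray(points_b, float)
    # Pairwise distance matrix: invariant to rotation and translation.
    d_a = np.linalg.norm(a[:, None] - a[None, :], axis=-1)
    best, best_err = None, np.inf
    for perm in permutations(range(len(b))):
        bp = b[list(perm)]
        d_b = np.linalg.norm(bp[:, None] - bp[None, :], axis=-1)
        err = np.abs(d_a - d_b).sum()  # mismatch of edge lengths
        if err < best_err:
            best, best_err = perm, err
    return best
```

Because the distances between well-chosen alignment points are distinct, the correct correspondence yields a near-zero mismatch and is found unambiguously.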
  • The process of determining the common frame of reference is iterative: calculate a transformation (step 140), apply the computed transformation (step 142), and compare the alignment characteristics (step 144), with the iteration continuing until the transformation alignment is within the prescribed error threshold (as determined at step 144).
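  • A standard way to realize the calculate/apply/compare step for corresponding labeled point sets is a least-squares rigid fit, e.g. the SVD-based Kabsch method. The disclosure does not name a specific algorithm, so the following is a sketch under that assumption; the function names are hypothetical.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch method). src, dst: (N, 3) arrays of corresponding alignment
    points, N >= 3 and not collinear."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def alignment_error(src, dst, R, t):
    """RMS distance between transformed src and dst points: the 'error
    distance after transformation' characteristic compared in step 144."""
    residual = (np.asarray(src) @ R.T + t) - np.asarray(dst)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))
```

In the iterative loop of FIG. 5, the fit would be recomputed (step 140), applied (step 142), and the returned error compared against the prescribed threshold (step 144), with points added, modified, or deleted between iterations as needed.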
  • The measurement device and the user may add, modify, or delete alignment points (steps 146 and 138). This can include removal of inaccurate or noisy points, which may cause errors in the alignment process.
  • The iteration may also include repositioning or re-measuring a subset of the alignment points.
  • All data points collected utilize the computed transformation information in order to combine the data sets into a common reference frame.
  • The common reference frame may be chosen as the frame of reference of one of the scans, or as a completely independent frame of reference into which all data is translated.
  • FIG. 3 shows a simple example of a transformation: a rotation about one of the axes. If the scanner is simply turned at a 45° angle in the azimuth between the two scans, the transformation is a rotation about the azimuth axis and is calculated as shown in FIG. 3. Specifically, scan 116 is rotated 45° by transformation 118 to produce scan 120. This can also be seen in FIGS. 2A and 2B. In FIG. 2A, two scans have produced data set 112 and data set 114. FIG. 2B shows the result after a transformation has been applied to align and combine data sets 112 and 114.
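  • The FIG. 3 example can be written as a standard rotation matrix about the vertical (azimuth) axis; the helper name below is illustrative, not from the disclosure.

```python
import numpy as np

def azimuth_rotation(degrees):
    """Rotation about the vertical (z) axis by the given azimuth angle,
    as in the FIG. 3 example where the scanner was turned 45 degrees."""
    a = np.radians(degrees)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Rotating a point on the x-axis by 45 degrees in azimuth:
R = azimuth_rotation(45.0)
rotated = R @ np.array([1.0, 0.0, 0.0])  # ≈ [0.7071, 0.7071, 0.0]
```

Whether the second scan's data must be rotated by the scanner's turn angle or its inverse depends on the sign convention chosen for the transformation.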
  • The computed transformation with the common frame of reference is used to adapt all data from the scanning system so that the output information for the user interface and application API is always in the common frame of reference.
  • This augmentation of the data set occurs in real time (or near real time), so that while in the measuring environment the scanning system automation and the user interface can validate the combined data sets for accuracy and validity to ensure the multiple scans look and act as a single continuous scan.
  • FIG. 4 shows an example of a preferred embodiment of the method being used to scan a backyard that contains a rectangular pool.
  • Points 122, 124, 126, and 128 are points of interest in this environment, namely the four corners of the rectangular pool. Points 122, 124, and 126 are visible from the first scan location 130, but point 128 is not visible from the first scan location 130 due to the obstruction 134.
  • Scan 1 begins with setting the alignment points.
  • The visible points of interest (122, 124, and 126) can be labeled in Scan 1 as well.
  • The scanner is then physically moved and reoriented to the second scan location 132, where the same alignment points are labeled.
  • The overlay of the data is now combined between the two scans, so that as Scan 2 labels the point of interest 128 it is shown in relative position to the previously labeled points (122, 124, and 126), forming a single unified data set from a common frame of reference.
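  • The final merge into a common frame can be sketched as follows: each scan's points are mapped through its computed rigid transformation (a rotation R and translation t per scan) and concatenated into one data set. The function name is illustrative, and the (R, t) pairs would come from whatever registration step the system uses.

```python
import numpy as np

def merge_scans(scans, transforms):
    """Map each scan's points into the common reference frame and
    concatenate them into a single data set.

    scans:      list of (N_i, 3) point arrays, one per scan location,
                each in that scan's local frame.
    transforms: list of (R, t) pairs mapping each scan's frame into the
                common frame (identity for the scan whose frame was
                chosen as the common reference)."""
    merged = [np.asarray(pts, float) @ np.asarray(R, float).T + np.asarray(t, float)
              for pts, (R, t) in zip(scans, transforms)]
    return np.vstack(merged)
```

With this in place, point 128 labeled in Scan 2 lands at its true position relative to points 122, 124, and 126 from Scan 1.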

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

An improved method is provided for combining multiple data sets of a single subject to form a single representation of the subject, such as an environment. The method uses a small set of alignment points: the environment is scanned from a first position, the scanning system is repositioned to take a scan of the same environment from a second position, and the same alignment points are labeled to the extent they are visible from the second location. The alignment points may be static points of interest in the environment, or they may be placed in the environment by the user to denote the location of common reference points.
EP24738979.4A 2023-01-06 2024-01-05 Method and system for combining multiple scans into a single data set Pending EP4646825A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363437570P 2023-01-06 2023-01-06
PCT/US2024/010475 WO2024148260A1 (fr) 2023-01-06 2024-01-05 Method and system for combining multiple scans into a single data set

Publications (1)

Publication Number Publication Date
EP4646825A1 (fr) 2025-11-12

Family

ID=91761083

Family Applications (1)

Application Number Title Priority Date Filing Date
EP24738979.4A Pending EP4646825A1 (fr) 2023-01-06 2024-01-05 Method and system for combining multiple scans into a single data set

Country Status (4)

Country Link
US (1) US20240236242A1 (fr)
EP (1) EP4646825A1 (fr)
AU (1) AU2024205978A1 (fr)
WO (1) WO2024148260A1 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2559157A (en) * 2017-01-27 2018-08-01 Ucl Business Plc Apparatus, method and system for alignment of 3D datasets
US10265138B2 (en) * 2017-09-18 2019-04-23 MediVis, Inc. Methods and systems for generating and using 3D images in surgical settings
EP3758351B1 (fr) * 2019-06-26 2023-10-11 Faro Technologies, Inc. System and method of scanning an environment using multiple scanners simultaneously
EP3825730A1 (fr) * 2019-11-21 2021-05-26 Bentley Systems, Incorporated Assigning each point of a point cloud to a scanner position of a plurality of different scanner positions in a point cloud

Also Published As

Publication number Publication date
US20240236242A1 (en) 2024-07-11
WO2024148260A1 (fr) 2024-07-11
AU2024205978A1 (en) 2025-07-24

Similar Documents

Publication Publication Date Title
US8930127B2 (en) Localization method for mobile robots based on landmarks
JP5620200B2 (ja) Point cloud position data processing device, point cloud position data processing method, point cloud position data processing system, and point cloud position data processing program
CN108007344B (zh) Method, storage medium and measuring system for visually representing scan data
US8264537B2 (en) Photogrammetric networks for positional accuracy
US20130113897A1 (en) Process and arrangement for determining the position of a measuring point in geometrical space
US10186027B1 (en) Layout projection
JP2003514234A (ja) Image measurement method and apparatus
US20100061593A1 (en) Extrapolation system for solar access determination
US6256058B1 (en) Method for simultaneously compositing a panoramic image and determining camera focal length
CN113112539A (zh) Oil and gas field video surveillance line-of-sight and regional visibility network analysis system, method, device and storage medium
US7821535B2 (en) Information processing method and apparatus
US20240236242A1 (en) Method and system for combining multiple scans into a single data set
Previtali et al. Rigorous procedure for mapping thermal infrared images on three-dimensional models of building façades
US12112508B2 (en) Calibrating system for colorizing point-clouds
CN110736816A (zh) Methane leakage detection and localization method based on an intelligent inspection robot
Wang et al. An improved two-point calibration method for stereo vision with rotating cameras in large FOV
Kleiner et al. Handheld 3-D Scanning with Automatic Multi-View Registration Based on Visual-Inertial Navigation
CN112697091B (zh) Tracking and scanning system, calibration method and measurement method for a relay tracker
CN115359114B (zh) Positioning method and apparatus, electronic device, and computer-readable storage medium
US12211223B2 (en) System and method for setting a viewpoint for displaying geospatial data on a mediated reality device using geotags
US12249093B2 (en) System and method for geo-referencing object on floor
CN113393529A (zh) Camera calibration method, apparatus, device and medium
Gruen et al. High-accuracy matching of object edges
CN118570078B (zh) SLAM accuracy enhancement method fusing non-continuous TLS
Gruszczyński Method for precise determination of eccentric instrument set-ups

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250722

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR