
WO2021054601A1 - Image-based wireless positioning method and device - Google Patents

Image-based wireless positioning method and device

Info

Publication number
WO2021054601A1
WO2021054601A1 PCT/KR2020/010110 KR2020010110W WO2021054601A1 WO 2021054601 A1 WO2021054601 A1 WO 2021054601A1 KR 2020010110 W KR2020010110 W KR 2020010110W WO 2021054601 A1 WO2021054601 A1 WO 2021054601A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal
image
mobile node
node
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2020/010110
Other languages
English (en)
Korean (ko)
Inventor
이택진
김재헌
김철기
신범주
이정호
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Korea Institute of Science and Technology KIST
Original Assignee
Korea Institute of Science and Technology KIST
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Institute of Science and Technology KIST filed Critical Korea Institute of Science and Technology KIST
Publication of WO2021054601A1 publication Critical patent/WO2021054601A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00 Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/02 Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G01S11/06 Systems for determining distance or velocity not using reflection or reradiation using radio waves using intensity measurements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0252 Radio frequency fingerprinting
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods

Definitions

  • the generating of the one pixel value may include generating the one pixel value by combining the ID of the at least one fixed node, serving as at least one kind of color component, with the intensity of each signal, serving as the value of each color component.
  • the wireless positioning method may further include estimating a relative position of the mobile node at the time point at which the at least one signal is received, and the generating of the movement path image may generate the movement path image by expressing the reception point of the at least one signal, indicated by the estimated relative position of the mobile node, as a pixel having the generated pixel value.
  • the step of searching for the portion most similar to the movement path image may include comparing the values of the pixels of the movement path image with the values of the pixels of each of a plurality of search areas in the map image, thereby calculating a similarity of each of the plurality of search areas to the movement path image, and searching for the search area having the highest similarity among the calculated similarities as the portion most similar to the movement path image.
  • the map image may include pixels having a plurality of pixel values generated from the intensity of signals received at actual signal reception points in the area and pixels having a plurality of pixel values generated from the intensity of signals virtually received at virtual signal reception points in the area.
  • the wireless positioning apparatus may further include a relative position estimating unit for estimating a relative position of the mobile node at the time point at which the at least one signal is received, and the image generating unit may generate the movement path image by expressing the reception point of the at least one signal, indicated by the estimated relative position of the mobile node, as a pixel having the generated pixel value.
  • the prior art proposes a technique that reflects the relative position of the mobile node in the determination of the current position of the mobile node when positioning accuracy is low in such a path environment; however, because the relative position estimation algorithm cannot avoid accumulation of relative position errors, there is a limit to how far a decrease in positioning accuracy can be prevented.
  • FIG. 1 is a block diagram of a wireless communication system according to an embodiment of the present invention.
  • FIG. 2 is a configuration diagram of a wireless positioning apparatus of the mobile node 1 shown in FIG. 1.
  • FIG. 3 is a flowchart of a wireless positioning method according to an embodiment of the present invention.
  • FIG. 10 is a diagram for explaining a principle of estimating signal strength by the positioning server 3 shown in FIG. 1.
  • FIG. 11 is a diagram showing an example of a map image stored in the positioning server 3 shown in FIG. 1.
  • An embodiment of the present invention to be described below relates to a wireless positioning method and apparatus for providing a positioning service using a wireless signal such as a Wi-Fi signal or a Long Term Evolution (LTE) signal, and in particular to an image-based wireless positioning method and apparatus capable of improving the calculation efficiency of a wireless positioning algorithm and of improving positioning accuracy in a wide space, such as a square, or at a branch point where a road divides into several branches.
  • Hereinafter, the wireless positioning method and the wireless positioning apparatus will be briefly referred to as the "wireless positioning method" and the "wireless positioning apparatus". Since the mobile node is carried by the user, such as a smartphone, or is located near the user and moves with the user, such as a navigation system, the location of the mobile node may be interpreted as the user's location.
  • the sampling rate of the time-domain data to be described below is determined according to the length of the scan period of the scan unit 11. The shorter the scan period of the wireless communication unit 10, the higher the sampling rate of the time-domain data, and as a result the accuracy of the estimated absolute position of the mobile node 1 according to the present embodiment may be improved. The scan period of the scan unit 11 is determined in consideration of such factors.
  • the scan unit 11 receives a signal from one fixed node 2 through the scanning process. If there are a plurality of fixed nodes 2 within communication range at the current location of the mobile node 1, the scan unit 11 receives, through the scanning process, as many signals as there are fixed nodes 2. FIG. 1 shows an example in which the mobile node 1 receives three signals from three fixed nodes 21, 22, and 23. The other fixed node 24 is located outside the communication range of the mobile node 1.
  • In step 130, the signal processing unit 12 of the mobile node 1 generates one pixel value from the ID of the at least one fixed node 2 extracted in step 120 and the strength of the at least one signal measured in step 120.
  • the signal processing unit 12 generates any one pixel value by combining the ID of the at least one fixed node 2, serving as at least one kind of color component, with the intensity of each signal, serving as the value of each color component. Any one pixel value generated in this way is a pixel value obtained by combining, for each signal received in step 110, the ID of the fixed node 2 that transmitted the signal and the intensity of that signal measured in step 120 into one set.
  • (ID 1 , RSS 1 ) is a set of the ID of the first fixed node 2 corresponding to one kind of color component and the signal strength of the first fixed node 2 corresponding to the value of the color component.
  • (ID 2 , RSS 2 ) is a set of the ID of the second fixed node 2 corresponding to another kind of color component and the signal strength of the second fixed node 2 corresponding to the value of the color component. The same goes for other sets.
  • a portion matching the movement path image of the mobile node 1 may be found in the map image by using an existing image matching technique.
  • existing image matching techniques include histogram matching, template matching, and feature matching.
  • Each pixel value of a general image is represented by a plurality of types of color components and values of each color component.
  • a pixel value may be expressed as a combination of a value of an R (red) component, a value of a G (green) component, and a value of a B (blue) component.
  • the existing image matching technique finds a part that matches the partial image in the whole image by using the similarity of values for each color component of two pixels corresponding to each other between the whole image and the partial image.
  • For example, assume that the scan unit 11 scans the surrounding signals three times.
  • If the scan unit 11 receives the signal transmitted from the fixed node 2 having the first ID during the first signal scan, the pixel value generated in step 130 becomes {(ID 1 , RSS 1 )}.
  • If the scan unit 11 receives the signal transmitted from the fixed node 2 having the first ID during the second signal scan, the pixel value generated in step 130 again becomes {(ID 1 , RSS 1 )}.
  • If the scan unit 11 receives the signal transmitted from the fixed node 2 having the second ID and the signal transmitted from the fixed node 2 having the third ID during the third signal scan, the pixel value generated in step 130 becomes {(ID 2 , RSS 2 ), (ID 3 , RSS 3 )}.
  • the RSS 1 value in the pixel value {(ID 1 , RSS 1 )} generated at the time of the first signal scan and the RSS 1 value in the pixel value {(ID 1 , RSS 1 )} generated at the time of the second signal scan may be the same as or different from each other.
  • The former is the case in which the wireless positioning method is executed twice while the mobile node 1 is located at the same point, and the latter is the case in which the wireless positioning method is executed twice while the mobile node 1 is located at different points.
  • Even at the same point, the two RSS 1 values may differ from each other for various reasons, such as a change in the wireless environment. A typical example of a change in the wireless environment is signal reception interference caused by nearby moving objects such as pedestrians or vehicles.
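As an illustration of how such pixel values might be assembled in code, the following Python sketch represents one pixel value as a mapping from fixed-node ID to measured signal strength. The function name, data layout, and RSS numbers are assumptions made for illustration; the description does not prescribe a specific data structure.

```python
from typing import Dict, List, Tuple

# One pixel value: the set {(ID_1, RSS_1), (ID_2, RSS_2), ...} held as a mapping
# from fixed-node ID (the "kind" of color component) to the measured signal
# strength (the "value" of that color component).
PixelValue = Dict[str, float]

def build_pixel_value(scan_result: List[Tuple[str, float]]) -> PixelValue:
    """Combine the ID and measured strength of every signal received in one scan
    into a single pixel value. `scan_result` is assumed to be the (fixed-node ID,
    RSS) pairs produced by one scan of the scan unit 11."""
    return {node_id: rss for node_id, rss in scan_result}

# The three scans described above (RSS numbers are made up for illustration):
scan_1 = [("ID_1", -52.0)]                    # first scan  -> {"ID_1": -52.0}
scan_2 = [("ID_1", -55.5)]                    # second scan -> same node, other RSS
scan_3 = [("ID_2", -61.0), ("ID_3", -70.2)]   # third scan  -> two (ID, RSS) pairs
pixel_values = [build_pixel_value(s) for s in (scan_1, scan_2, scan_3)]
```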
  • In step 230, the relative position estimation unit 13 of the mobile node 1 estimates the current relative position of the mobile node 1 with respect to the previous position of the mobile node 1 by calculating the relative change of the current position of the mobile node 1 based on the movement distance and movement direction of the mobile node 1 calculated in step 220.
  • the relative position estimating unit 13 estimates the coordinate value (x 1 , 0) of the current relative position with respect to the coordinate value (0, 0) of the previous position from the relative change in the current position of the mobile node 1 with respect to the previous position of the mobile node 1 calculated as described above.
  • the relative position estimating unit 13 sets the x-coordinate value of the current relative position of the mobile node 1 to "the x-coordinate value of the previous relative position + 1" when the movement distance of the mobile node 1 in the x-axis direction calculated in step 220 is 1 meter or more and less than 2 meters.
  • the relative position estimating unit 13 converts the movement distance of the mobile node 1 calculated in step 220 to the scale of the two-dimensional coordinate system by expressing the movement distance in the x-axis direction in meters, which is the resolution of the two-dimensional coordinate system. Values below the meter unit are truncated, but may be processed in other ways, such as rounding.
  • the relative position estimating unit 13 converts the movement distance and movement direction of the mobile node 1 calculated in step 220 to the scale of the two-dimensional coordinate system by expressing the movement distance in an arbitrary direction in meters along the x-axis and meters along the y-axis of the two-dimensional coordinate system. The values expressed in meters along the x-axis and meters along the y-axis become the x-coordinate and y-coordinate values of the relative position of the mobile node 1.
  • the resolution of the two-dimensional coordinate system may be various values, such as 10 centimeters, 50 centimeters, or 2 meters, in addition to 1 meter, depending on the performance of the wireless positioning apparatus according to the present embodiment.
  • when the mobile node 1 moves in the direction opposite to the "+" direction of the x-axis, the x-axis coordinate value may be a negative number. The vertical upward direction with respect to the "+" direction of the x-axis becomes the "+" direction of the y-axis and the vertical downward direction becomes the "-" direction of the y-axis, so the y-axis coordinate value may also be positive or negative.
  • the relative position coordinate value of the mobile node 1 is determined according to the movement distance and movement direction of the mobile node 1, which are repeatedly calculated in step 220 as the wireless positioning method according to the present embodiment is repeatedly executed.
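The coordinate conversion just described could be sketched as follows. The function name and the use of a heading angle are assumptions; the 1-meter default resolution and the truncation of sub-meter remainders follow the description above.

```python
import math
from typing import Tuple

def to_grid_delta(distance_m: float, heading_rad: float,
                  resolution_m: float = 1.0) -> Tuple[int, int]:
    """Convert a movement of `distance_m` meters in direction `heading_rad`
    (0 rad is taken as the "+" direction of the x-axis) into whole grid steps of
    the two-dimensional coordinate system. Fractions below the resolution are
    truncated, as in the description; rounding would also be possible."""
    dx = distance_m * math.cos(heading_rad)
    dy = distance_m * math.sin(heading_rad)
    return int(dx / resolution_m), int(dy / resolution_m)

# A movement of 1.4 m in the "+" x direction gives (+1, 0): the x coordinate of
# the current relative position becomes "x coordinate of the previous position + 1".
prev = (0, 0)
dx, dy = to_grid_delta(1.4, 0.0)
curr = (prev[0] + dx, prev[1] + dy)   # -> (1, 0)
```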
  • After the absolute position of the mobile node 1 is estimated in step 520, to be described below, the relative position estimating unit 13 estimates the next relative position of the mobile node 1 based on that absolute position. Because the relative position of the mobile node 1 is not continuously estimated from the previous relative position but is estimated from the absolute position whenever the relative position is replaced with an absolute position, the section over which relative position estimation is applied is very short, so an absolute position error of the mobile node 1 caused by the accumulation of relative position errors through repeated relative position estimation hardly occurs.
  • the image generator 15 of the mobile node 1 generates the movement path image of the mobile node 1 by expressing the reception point of the at least one signal in step 110, indicated by the relative position of the mobile node 1 estimated in step 230, as a pixel having the pixel value generated in step 130.
  • the image generation unit 15 may express the reception point of the at least one signal in step 110 as a pixel having the pixel value generated in step 130 at the point indicated by the relative position of the mobile node 1 estimated in step 230, which may be implemented by storing the pixel value generated in step 130 at the corresponding address of the buffer 14.
  • the coordinate system of the movement path image generated in step 310 is a two-dimensional coordinate system, and the coordinate value of each pixel is composed of a coordinate value on the x-axis, the horizontal axis, and a coordinate value on the y-axis, the vertical axis, that is, (x, y). The coordinate system of the movement path image may also be a three-dimensional coordinate system, in which case the coordinate value of each pixel is composed of coordinate values on the x-axis, y-axis, and z-axis, that is, (x, y, z). The coordinate system of the map image is the same as the coordinate system of the movement path image.
  • the image generator 15 generates the movement path image of the mobile node 1 by placing a pixel having the pixel value generated in step 130 at the point in the two-dimensional coordinate system indicated by the coordinate value (0, 0) of the relative position of the mobile node 1 estimated in step 230.
  • Likewise, the image generator 15 generates the movement path image of the mobile node 1 by placing a pixel having the pixel value generated in step 130 at the point (x 1 , 0) of the two-dimensional coordinate system indicated by the coordinate value (x 1 , 0) of the relative position estimated in step 230, that is, by storing the pixel value at the address of the buffer 14 corresponding to the point (x 1 , 0) of the two-dimensional coordinate system.
  • In step 310, the image generator 15 generates and places a pixel having the pixel value generated in step 130 at the point of the relative position estimated in step 230 in the same coordinate system as the map image, so that the reception point of the at least one signal in step 110 is expressed as a pixel having the pixel value generated in step 130. The storage area of the buffer 14 at that address serves as the pixel having the pixel value. This may be implemented by selecting the address of the buffer 14 corresponding to the point of the relative position and storing the pixel value generated in step 130 at the selected address of the buffer 14.
  • When the scan period of the scan unit 11 is long compared to the resolution of the relative position coordinates of the mobile node 1, additional pixel values may be generated in addition to the pixel values generated in step 130, using image interpolation or the like, to improve the resolution of the movement path image in accordance with the resolution of the relative position coordinates of the mobile node 1.
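One plausible realisation of buffer 14 is a sparse mapping from relative grid coordinates to pixel values, with a simple gap-filling step for the case where the scan period is long compared to the coordinate resolution. This is a sketch under those assumptions, not the claimed implementation; in particular the copy-based gap filling below is only a placeholder for whatever image interpolation is actually used.

```python
from typing import Dict, Tuple

Coord = Tuple[int, int]              # relative position (x, y) in the 2D coordinate system
PixelValue = Dict[str, float]        # fixed-node ID -> RSS, as sketched earlier
PathImage = Dict[Coord, PixelValue]  # buffer 14: one address per reception point

def place_pixel(path_image: PathImage, coord: Coord, pixel: PixelValue) -> None:
    """Express the reception point `coord` as a pixel of the movement path image
    by storing the pixel value at the corresponding buffer address."""
    path_image[coord] = pixel

def interpolate_gap(path_image: PathImage, start: Coord, end: Coord) -> None:
    """Crude stand-in for the image interpolation mentioned above: when two
    consecutive reception points are more than one grid step apart (long scan
    period), fill the grid points in between with copies of the start pixel."""
    (x0, y0), (x1, y1) = start, end
    steps = max(abs(x1 - x0), abs(y1 - y0))
    for i in range(1, steps):
        x = x0 + round(i * (x1 - x0) / steps)
        y = y0 + round(i * (y1 - y0) / steps)
        path_image.setdefault((x, y), dict(path_image[start]))
```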
  • Step 310 illustrated in FIG. 3 includes the following detailed steps executed by the image generating unit 15. Whenever the wireless positioning method shown in FIG. 3 is repeatedly executed, the detailed steps of step 310 shown in FIG. 5 are also repeatedly executed as many times as the number of repetitions.
  • In step 311, the image generator 15 receives the pixel value generated in step 130 from the signal processor 12 every scan period of the scan unit 11. In step 312, the image generator 15 receives the coordinate value of the relative position estimated in step 230 from the relative position estimator 13 at each time point at which at least one signal is received in step 110. In this way, the image generator 15 receives a new pixel value and a new relative position coordinate value every scan period of the scan unit 11.
  • In step 313, the image generator 15 checks whether the coordinate value of the relative position received in step 312 is the same as the coordinate value of the relative position of the mobile node 1 estimated before this relative position estimation. If they are the same, the process proceeds to step 314; otherwise, it proceeds to step 315.
  • The case in which the coordinate value of the current relative position of the mobile node 1 and the coordinate value of the previously estimated relative position are the same is mainly the case in which the mobile node 1 stays in one place. In rare cases, when the mobile node 1 returns exactly along its previous path, the two coordinate values may also be the same.
  • Since the pixel whose pixel value is updated in step 314 is also a pixel having a pixel value generated in step 130, the pixel updated in step 314 may be the pixel having the most recent pixel value.
  • In that case, the position of the mobile node 1 corresponds to a pixel at an intermediate position among the pixels representing the movement path.
  • In step 315, the image generating unit 15 updates the movement path image stored in the buffer 14 by placing a pixel having the pixel value received in step 311 at the point indicated by the relative position coordinate value received in step 312 in the two-dimensional coordinate system of the movement path image stored in the buffer 14. As step 315 is repeatedly executed, the length of the movement path indicated by the movement path image increases.
  • At the very first execution, the image generator 15 generates a movement path image having only one pixel by storing the pixel value received in step 311 at the address of the buffer 14 corresponding to the origin (0, 0). Thereafter, the movement path image is generated by updating the existing movement path image stored in the buffer 14.
  • In step 316, the image generating unit 15 checks whether the length of the movement path indicated by the movement path image updated in step 315, that is, the movement path image stored in the buffer 14, exceeds the reference length. If the length of the movement path exceeds the reference length, the process proceeds to step 317; otherwise, it proceeds to step 510.
  • In step 317, the image generator 15 updates the movement path image stored in the buffer 14 by removing the pixel having the oldest generated pixel value from among the pixels of the movement path image stored in the buffer 14.
  • Accordingly, the length of the movement path indicated by the movement path image of the present embodiment gradually increases with the repeated execution of the wireless positioning method shown in FIG. 3 and is then maintained at the reference length.
  • The reference length means the minimum length of the movement path of the mobile node 1 that allows the position of the mobile node 1 to be estimated accurately. If the movement path of the mobile node 1 is too short, the positioning accuracy of the mobile node 1 may be degraded; if it is too long, image matching takes a long time and the real-time performance of positioning the mobile node 1 may be degraded.
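Steps 311 through 317 can be condensed into roughly the following bookkeeping. The class name and the use of a pixel count as the reference length are assumptions made for this sketch; the description speaks of a path length.

```python
from collections import OrderedDict
from typing import Dict, Tuple

Coord = Tuple[int, int]
PixelValue = Dict[str, float]

class MovementPathImage:
    """Sketch of the bookkeeping in steps 311-317. Insertion order stands in for
    pixel age so that the oldest pixel can be dropped once the path grows too long."""

    def __init__(self, reference_length: int = 50) -> None:
        self.pixels: "OrderedDict[Coord, PixelValue]" = OrderedDict()
        self.reference_length = reference_length

    def update(self, coord: Coord, pixel: PixelValue) -> None:
        if coord in self.pixels:
            # Step 314: same relative position as before (e.g. the node stayed in
            # one place) -> overwrite the existing pixel with the newest value.
            self.pixels[coord] = pixel
            self.pixels.move_to_end(coord)
        else:
            # Step 315: new relative position -> the movement path grows by one pixel.
            self.pixels[coord] = pixel
        # Steps 316-317: if the path exceeds the reference length, drop the pixel
        # having the oldest generated pixel value.
        while len(self.pixels) > self.reference_length:
            self.pixels.popitem(last=False)
```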
  • In step 410, the cluster selection unit 16 of the mobile node 1 selects at least one cluster from among the clusters of the entire region in which the positioning service according to the present embodiment is provided, based on the at least one signal received in step 110.
  • the entire area where the wireless positioning service is provided is divided into a plurality of clusters.
  • the cluster selection unit 16 selects the one cluster in which the mobile node 1 is located based on the ID of the at least one fixed node 2 carried in the at least one signal received in step 110. For example, when a certain fixed node 2 transmits a signal only into a specific cluster, or a combination of a plurality of fixed nodes 2 can be received only in a specific cluster, that cluster can be selected.
  • When the cluster selection unit 16 cannot select the one cluster in which the mobile node 1 is located based on the ID of the at least one fixed node 2, it selects the one cluster in which the mobile node 1 is located based on the strength of the at least one signal received in step 110. For example, if a fixed node 2 transmits a signal into two clusters adjacent to each other, or a combination of a plurality of fixed nodes 2 can be received in two adjacent clusters, a cluster may be selected based on the strength of the at least one signal.
  • the cluster selection unit 16 may select a plurality of clusters by adding clusters around the selected cluster, for example, when the mobile node 1 is located at the boundary of two adjacent clusters or when the accuracy of wireless positioning is to be improved by increasing the number of clusters.
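A minimal sketch of this selection logic follows: resolve the cluster from the set of received fixed-node IDs first, and fall back to signal strength only when the IDs are ambiguous. The lookup tables and the strongest-home-node heuristic are assumptions; the description only states that strength is used when the IDs alone are not sufficient. Surrounding clusters could then be added to the returned set when the node may be near a cluster boundary.

```python
from typing import Dict, List, Set, Tuple

def select_clusters(scan: List[Tuple[str, float]],
                    clusters_by_node: Dict[str, Set[str]],
                    home_cluster: Dict[str, str]) -> Set[str]:
    """Pick the cluster(s) the mobile node is assumed to be in.

    scan             -- (fixed-node ID, RSS) pairs from the latest scan
    clusters_by_node -- for each fixed node, the clusters in which it can be heard
    home_cluster     -- the cluster each fixed node is installed in (assumed table)
    """
    # Step 1: clusters consistent with every received fixed-node ID.
    candidates = set.intersection(*(clusters_by_node[nid] for nid, _ in scan))
    if len(candidates) <= 1:
        return candidates
    # Step 2: the IDs alone are ambiguous (e.g. a node audible in two adjacent
    # clusters), so fall back to signal strength: score each candidate cluster by
    # the strongest signal received from a node installed in that cluster.
    def score(cluster: str) -> float:
        return max((rss for nid, rss in scan if home_cluster.get(nid) == cluster),
                   default=float("-inf"))
    return {max(candidates, key=score)}
```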
  • In step 420, the map loader 17 of the mobile node 1 transmits, to the positioning server 3 through the wireless communication unit 10, a signal requesting transmission of the map image corresponding to the at least one cluster selected in step 410. Data indicating the at least one cluster selected in step 410 is carried on this signal.
  • In step 430, when the positioning server 3 receives the map image request signal transmitted from the mobile node 1, it extracts, from the map images of the entire region in which the positioning service according to the present embodiment is provided, the map image of the at least one cluster indicated by the request signal, that is, the map image of the area in which the mobile node 1 is located. The map images of the entire region in which the positioning service according to the present embodiment is provided are stored in the database of the positioning server 3.
  • In step 440, the positioning server 3 transmits the map image extracted in step 430 to the mobile node 1.
  • In step 450, the mobile node 1 receives the map image transmitted from the positioning server 3. If the mobile node 1 has a database large enough to accommodate the map images stored in the database of the positioning server 3, the mobile node 1 may instead extract the map image from the map images stored in its own database. In this case, steps 420, 440, and 450 may be omitted, and step 430 is performed by the mobile node 1. For example, the mobile node 1 may receive a map image as shown in FIG. 6.
  • FIG. 6 is a diagram illustrating an example of a map image received in step 450 of FIG. 3. Since the movement path image stored in the buffer 14 of the mobile node 1 and the map image received from the positioning server 3 must be able to be matched with each other, the map image shown in FIG. 6 is created in the same way as the movement path image is generated in step 310. Therefore, the description of map image generation is replaced by the description of movement path image generation. However, whereas the movement path image generated in step 310 is generated only up to the reference length, the map image of the entire region in which the positioning service according to the present embodiment is provided is generated for all movable paths, such as roads and alleys, in the entire region.
  • a map creation node used for generating the map image generates an image representing all the paths while moving along all possible paths, such as roads and alleys, in the entire region.
  • the coordinate value of each pixel of the movement path image is a relative position coordinate value, whereas the coordinate value of each pixel of the map image is an absolute position coordinate value.
  • other information such as Global Positioning System (GPS) coordinates and an address for each country may be mapped and stored together.
  • the wireless positioning apparatus displays the current location in the region where the positioning service is provided, and can provide the user of the mobile node 1 with information such as the GPS coordinates of the point where the user is currently located and the corresponding address.
  • the comparison unit 18 and the absolute position estimating unit 19 of the mobile node 1 estimate the absolute position of the mobile node 1 based on a comparison of the movement path image generated in step 310 with the map image received in step 450. That is, the comparison unit 18 compares the movement path image generated in step 310 with the map image received in step 450, and the absolute position estimating unit 19 estimates the absolute position of the mobile node 1 based on the comparison result of the comparison unit 18. The map image received in step 450 is the map image of the area in which the mobile node 1 is located.
  • As the wireless positioning method is repeatedly executed, the length of the movement path image generated in step 310 increases. Until the movement path image generated in step 310 reaches a certain length, the estimated value of the absolute position of the mobile node 1 may be inaccurate. However, since the performance of recent smartphones or navigation systems corresponding to the mobile node 1 is very high, the movement path image generated in step 310 can reach a length that guarantees high accuracy of the estimated absolute position of the mobile node 1 almost immediately. The user can therefore be provided with a high-accuracy positioning service as soon as the wireless positioning apparatus shown in FIG. 2 starts operating.
  • Since the movement path image of this embodiment is composed of pixels having pixel values generated from the IDs of the fixed nodes that transmitted the signals received while the mobile node 1 moved to its current position and from the intensities of those signals, the calculation efficiency of the wireless positioning algorithm can be significantly improved.
  • Whereas the prior art uses a pattern of changes in signal strength according to a one-dimensional relative position change of the mobile node, without considering the moving direction of the mobile node, the present embodiment uses a two-dimensional or three-dimensional movement path image in which the two-dimensional or three-dimensional motion of the mobile node is reflected as it is, so positioning accuracy can be greatly improved.
  • Since the calculation efficiency of the wireless positioning algorithm is improved, the real-time performance of positioning is also improved, so that both the accuracy and the real-time performance of wireless positioning can be improved.
  • Since the next relative position of the mobile node 1 is estimated based on the absolute position, accumulation of relative position errors can be avoided.
  • In step 510, the comparison unit 18 of the mobile node 1 compares the movement path image generated in step 310 with the map image received in step 450 to find the part of the map image most similar to the movement path image generated in step 310.
  • the comparison unit 18 compares the values of the pixels of the movement path image generated in step 310 with the values of the pixels of each of a plurality of search areas in the map image received in step 450, thereby calculating the similarity of each of the plurality of search areas in the map image to the movement path image. Subsequently, the comparison unit 18 searches for the search area having the highest similarity among the calculated similarities as the part most similar to the movement path image generated in step 310.
  • each of the plurality of search areas in the map image refers to an area of the same size as the movement path image that overlaps the movement path image as the movement path image is moved over the map image in order to search for the portion most similar to the movement path image.
  • FIG. 8 is a diagram illustrating an example of searching for a similar part in step 510 of FIG. 3.
  • the moving path image is shown as a solid line and the map image is shown as a dotted line in order to distinguish between the moving path image and the map image.
  • the comparison unit 18 positions the movement path image on the first search area 81 at the upper left of the map image and, in that state, calculates the sum of the squared differences between the values of the pixels of the movement path image and the values of the corresponding pixels of the first search area 81 of the map image. The more similar the values of two corresponding pixels are, the closer the squared difference is to zero; if the values of all corresponding pixels are the same, the sum is 0. That is, the similarity between the movement path image and the search area 81 of the map image is the reciprocal of the sum of the squared differences between the values of the corresponding pixels of the two images, and is therefore inversely proportional to the size of that sum.
  • Next, the comparison unit 18 rotates the movement path image by a unit angle. The comparison unit 18 then places the rotated movement path image on the upper-left area of the map image and, in that state, calculates the sum of the squared differences between the values of the corresponding pixels of the movement path image and of the search area 81 of the map image. Subsequently, as described above, the comparison unit 18 moves the movement path image over the map image one pixel at a time in the x-axis and y-axis directions in turn, and calculates the similarity between the movement path image and each search area of the map image at one-pixel intervals.
  • the comparison unit 18 finds the search area having the highest similarity among the similarities calculated so far, that is, the search area having the smallest sum of squared differences calculated so far, as the part most similar to the movement path image generated in step 310.
  • In the example of FIG. 8, the comparison unit 18 finds the search area 83 as the part most similar to the movement path image generated in step 310 and rotated in step 510: the movement path image rotated approximately 45 degrees clockwise lies on the search area 83 of the map image and closely matches the portion of the map image most similar to it.
  • In this embodiment, the first search area 81 is set as the upper-left area, but it may be set to another area, such as the upper-right area. The search direction of the movement path image varies according to the position of the first search area 81 in the map image.
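The search of step 510, together with the position read-out of step 520 described below, might be sketched as follows over sparse images. The sparse representation, the handling of pixels with no counterpart, and the 45-degree unit angle are assumptions; the sum-of-squared-differences score and the reciprocal reading of similarity follow the description.

```python
import math
from typing import Dict, Iterable, Optional, Tuple

Coord = Tuple[int, int]
PixelValue = Dict[str, float]          # fixed-node ID -> RSS
SparseImage = Dict[Coord, PixelValue]  # movement path image or map image

MISSING = 40.0 ** 2   # penalty when a component exists in only one pixel (assumed)

def pixel_ssd(p: PixelValue, q: Optional[PixelValue]) -> float:
    """Sum of squared differences between two pixel values, taken per fixed-node
    'color component'; the treatment of missing components is an assumption."""
    q = q or {}
    return sum((p[i] - q[i]) ** 2 if i in p and i in q else MISSING
               for i in set(p) | set(q))

def rotate(coord: Coord, angle_deg: float) -> Coord:
    a = math.radians(angle_deg)
    x, y = coord
    return (round(x * math.cos(a) - y * math.sin(a)),
            round(x * math.sin(a) + y * math.cos(a)))

def best_match(path: SparseImage, map_img: SparseImage, offsets: Iterable[Coord],
               latest: Coord, unit_angle_deg: int = 45) -> Tuple[float, Coord]:
    """Slide and rotate the movement path image over the map image, score every
    placement by the total SSD (similarity is its reciprocal), and return the
    smallest SSD together with the absolute coordinate of the map pixel that
    falls under the most recently generated path pixel `latest`."""
    best_ssd, best_abs = float("inf"), (0, 0)
    for angle in range(0, 360, unit_angle_deg):
        rotated = {rotate(c, angle): v for c, v in path.items()}
        latest_rot = rotate(latest, angle)
        for ox, oy in offsets:
            ssd = sum(pixel_ssd(v, map_img.get((x + ox, y + oy)))
                      for (x, y), v in rotated.items())
            if ssd < best_ssd:
                best_ssd = ssd
                best_abs = (latest_rot[0] + ox, latest_rot[1] + oy)
    return best_ssd, best_abs   # best_abs is the estimated absolute position
```

For the example of FIG. 8, such a search would be expected to settle on a rotation of roughly 45 degrees and to read out the coordinate of the map pixel under the most recent path pixel, corresponding to the value (-48, -116) given below.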
  • In step 520, the absolute position estimating unit 19 of the mobile node 1 estimates the absolute position of the map image indicated by the portion found in step 510 as the absolute position of the mobile node 1.
  • the absolute position estimating unit 19 estimates, as the absolute position of the mobile node 1, the coordinate value of the pixel, among the pixels of the portion of the map image found in step 510, that corresponds to the pixel having the most recently generated pixel value among the pixels of the movement path image. That is, the absolute position of the map image indicated by the portion found by the comparison in step 510 is the coordinate value of the pixel corresponding to the pixel having the most recently generated pixel value.
  • When the movement path image has been rotated in step 510, the coordinate value of the pixel, among the pixels of the portion of the map image found in step 510, that corresponds to the pixel having the most recently generated pixel value among the pixels of the rotated movement path image is estimated as the absolute position of the mobile node 1. In the example of FIG. 8, the coordinate value (-48, -116) of the pixel of the map image corresponding to the most recently generated pixel 42 among the pixels of the movement path image rotated approximately 45 degrees clockwise is estimated as the absolute position of the mobile node 1.
  • Since the movement path image of the present embodiment is composed of pixels having pixel values generated from the IDs of the fixed nodes that transmitted the signals received while the mobile node 1 moved to its current position and from the intensities of those signals, the coordinate value of the pixel in the map image corresponding to the pixel having the most recently generated pixel value among the pixels of the movement path image is estimated as the absolute position of the mobile node 1.
  • Even for a signal such as an LTE signal, whose strength changes little between measurement points over a wide area, the present embodiment can estimate the absolute position of the mobile node 1 by comparing a movement path image of sufficient length for the change in the LTE signal to appear with the map image. Accordingly, the present embodiment can provide a high-accuracy positioning service even when wireless positioning is performed using a wireless signal with little change in signal strength over a wide area, such as an LTE signal. Since the location of the mobile node can be accurately estimated using an LTE signal with little change in signal strength between measurement points on the movement path, it is possible to provide a wireless positioning service that covers both indoors and outdoors. As a result, the present embodiment can provide a vehicle navigation system capable of both indoor and outdoor positioning, or a wireless positioning service for autonomous driving, and can therefore replace GPS, which is currently the most widely used basis for vehicle navigation but cannot provide positioning indoors.
  • FIG. 9 is a diagram showing an attenuation model of the signal transmitted from the fixed node 2 shown in FIG. 1. The x-axis of the attenuation model represents the distance from the actual reception point of the signal transmitted from the fixed node 2 to a virtual reception point of the signal, and the y-axis represents the strength of the signal.
  • FIG. 10 is a diagram for explaining the principle by which the positioning server 3 shown in FIG. 1 estimates signal strength. As described above, the map image stored in the database of the positioning server 3 is generated in the same manner as the movement path image.
  • the designer of the positioning server 3 walks around the region in which the positioning service according to the present embodiment is provided while carrying the portable map creation node 30, measuring the strength of the signals received at various places. To distinguish it from the mobile node 1 shown in FIG. 1, the map creation node 30 is drawn as a pedestrian in FIG. 10.
  • the signal strength is not measured in the area 100 outside the movement path of the map creation node 30, so pixels representing the area 100 are not generated; that is, a large blank area 100 exists in the map image. Since matching between the two images is performed by comparing the values of the pixels of one image with the values of the pixels of the other image, image matching may not be performed properly when there is a large blank area 100 in the map image.
  • Even along paths that the map creation node 30 could pass, if it does not pass right next to the dotted line shown in FIG. 10, pixels representing the immediate vicinity of that dotted line are not generated.
  • This phenomenon is prominent in large spaces such as plazas, underground parking lots, and large indoor halls.
  • When the mobile node 1 is located at a point in a large square or room, it often happens that no pixel representing that point exists in the map image even though signal reception is possible at that point. In this case, the part of the map image most similar to the movement path image of the mobile node 1 may not be found properly.
  • Accordingly, the map creation node 30 estimates, from the measured strengths of the signals received at the actual signal reception points in the region where the positioning service is provided, the strength of the signals at virtual signal reception points, and generates pixels representing the virtual signal reception points from these strength estimates.
  • the map creation node 30 fills the blank area 100 in the map image with the pixels thus generated.
  • Therefore, the map image can be said to be composed of pixels having pixel values generated from the intensity of signals received at the actual signal reception points in the region where the positioning service is provided and pixels having pixel values generated from the intensity of signals virtually received at the virtual signal reception points in the region.
  • the map creation node 30 estimates the strength of the signal virtually received at the virtual signal reception point 102 from the distance between the actual signal reception point 101 and the virtual signal reception point 102 and the measured strength of the signal received at the actual signal reception point 101, according to Equation 1 below. In this way, the strengths of the signals virtually received at the virtual signal reception points can be estimated from the distances between the real signal reception points and the virtual signal reception points and the measured strengths of the signals received at the real signal reception points.
  • "R 0" is determined according to whether the fixed node 2 is an access point or a base station and according to the model type of the fixed node 2. "e" denotes the exponential function, the inverse of the natural logarithm, and "d" denotes the distance between the actual signal reception point 101 and the virtual signal reception point 102. "a", "b", and "c" are coefficients whose values are determined experimentally; that is, "a", "b", and "c" can be determined by repeatedly substituting the distances between several actual signal reception points and the measured strengths of the signals received at those points into "d" and "R d" of Equation 1.
  • FIG. 11 is a diagram showing an example of a map image stored in the positioning server 3 shown in FIG. 1.
  • Fig. 11(a) shows a virtual three-dimensional view of the map image stored in the positioning server 3, and Fig. 11(b) shows the actual map image stored in the positioning server 3.
  • the measured intensities of the signals received at points in the area where the mobile node 1 is located are expressed as points whose height is proportional to the magnitude of the measured value.
  • Square boxes are displayed at the points representing the measured strengths of the signals received at the actual signal reception points, and the remaining points represent the estimated strengths of the signals received at the virtual signal reception points.
  • Since the map creation node 30 receives signals and measures their intensity while passing along a rectangular circumferential path, and pixel filling of the virtual signal reception points is performed as described above, it can be seen that most of the area around the path is filled with pixels at regular intervals.
  • Since the surroundings of the portion of the map image most similar to the movement path of the mobile node 1 are not empty but are filled with pixels representing virtual signal reception points, the absolute position of the mobile node 1 can be estimated relatively accurately even if the movement path indicated by the movement path image created by the mobile node 1 differs slightly from the actual movement path of the mobile node 1. In other words, since the values of the pixels of the part of the movement path image that deviates slightly from the actual movement path of the mobile node 1 are almost the same as the values of the corresponding pixels in the map image, the absolute position of the mobile node 1 can be estimated relatively accurately.
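Filling the blank area around the measured path with virtual reception points might look roughly like the following; the grid radius, the estimator interface, and the per-node treatment are all assumptions made for illustration.

```python
import math
from typing import Callable, Dict, Tuple

Coord = Tuple[int, int]
PixelValue = Dict[str, float]
SparseImage = Dict[Coord, PixelValue]

def fill_blank_area(map_img: SparseImage,
                    estimate: Callable[[float, float], float],
                    radius: int = 5) -> SparseImage:
    """Add pixels for virtual signal reception points around every real reception
    point. `estimate(distance_m, measured_rss)` returns the estimated strength at
    a point `distance_m` away from a real point where `measured_rss` was measured
    (e.g. built on the attenuation model sketched above). Existing pixels are kept."""
    filled = dict(map_img)
    for (x, y), pixel in map_img.items():
        for dx in range(-radius, radius + 1):
            for dy in range(-radius, radius + 1):
                target = (x + dx, y + dy)
                if target in filled:
                    continue
                d = math.hypot(dx, dy)
                filled[target] = {nid: estimate(d, rss) for nid, rss in pixel.items()}
    return filled

# Example with a placeholder estimator (a fixed loss per meter, for illustration only):
# filled_map = fill_blank_area(map_image, estimate=lambda d, rss: rss - 2.0 * d)
```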

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to an image-based wireless positioning method and device, according to which: a pixel value is generated from an identifier (ID) of at least one fixed node and an intensity of at least one signal received from the fixed node; a movement path image of a mobile node is generated by expressing a reception point of the at least one signal as a pixel having the pixel value; and a position of the mobile node is estimated on the basis of a comparison of the movement path image with a map image of a region in which the mobile node is located, so that it is possible to improve positioning accuracy in a large space, such as a square, or at a branch point where a road divides into several branches, while improving the calculation efficiency of a wireless positioning algorithm.
PCT/KR2020/010110 2019-09-17 2020-07-31 Image-based wireless positioning method and device Ceased WO2021054601A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190114043A KR102308803B1 (ko) 2019-09-17 2019-09-17 이미지 기반의 무선 측위 방법 및 장치
KR10-2019-0114043 2019-09-17

Publications (1)

Publication Number Publication Date
WO2021054601A1 true WO2021054601A1 (fr) 2021-03-25

Family

ID=74883831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/010110 Ceased WO2021054601A1 (fr) 2019-09-17 2020-07-31 Procédé et dispositif de localisation sans fil sur la base d'image

Country Status (2)

Country Link
KR (1) KR102308803B1 (fr)
WO (1) WO2021054601A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102497581B1 (ko) * 2022-07-13 2023-02-08 주식회사 티제이랩스 인공신경망을 이용한 무선 측위 방법 및 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080083953A (ko) * 2007-03-14 2008-09-19 주식회사 셀리지온 네트워크 기반의 단말기 측위 방법
KR20170078116A (ko) * 2015-12-29 2017-07-07 에스케이플래닛 주식회사 무선 핑거프린트 맵 구축 및 위치 측위 방법 및 이를 위한 장치, 이를 수행하는 컴퓨터 프로그램을 기록한 기록 매체
KR20190007306A (ko) * 2017-07-12 2019-01-22 주식회사 케이티 위치 측위 방법 및 장치
KR101945417B1 (ko) * 2018-01-02 2019-02-07 영남대학교 산학협력단 실내 위치 측정 서버와 이를 이용한 실내 위치 측정 방법 및 시스템
JP2019125227A (ja) * 2018-01-18 2019-07-25 光禾感知科技股▲ふん▼有限公司 屋内測位方法及びシステム、ならびにその屋内マップを作成するデバイス

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101901407B1 (ko) * 2016-04-01 2018-10-01 한국정보공학 주식회사 측위 장치 및 방법
KR20180087814A (ko) * 2017-01-25 2018-08-02 한국과학기술연구원 위치 측정 방법 및 시스템


Also Published As

Publication number Publication date
KR102308803B1 (ko) 2021-10-05
KR20210032694A (ko) 2021-03-25

Similar Documents

Publication Publication Date Title
AU2021439678B2 (en) Method for extracting three-dimensional surface deformation by combining unmanned aerial vehicle doms and satellite-borne sar images
JP6906617B2 (ja) 高正確度の無線測位方法及び装置
WO2018139773A1 (fr) Procédé et dispositif slam résistants aux changements dans un environnement sans fil
CN109951830B (zh) 一种多信息融合的室内外无缝定位方法
KR102110813B1 (ko) 무선 환경 변화에 강인한 slam 방법 및 장치
WO2017086561A1 (fr) Détermination d'emplacement de repère
KR101694728B1 (ko) 실내 수집 위치와 이종 인프라 측정정보를 수집하는 장치 및 방법
WO2019212200A1 (fr) Procédé et appareil de positionnement sans fil avec une précision de position améliorée dans différents environnements
CN210321636U (zh) 一种电缆隧道三维激光扫描装置
CN114360093A (zh) 基于北斗rtk、slam定位和图像分析的路侧停车位巡检方法
CN113556680A (zh) 指纹数据的处理方法、介质和移动机器人
WO2018139771A2 (fr) Procédé et dispositif de positionnement sans fil très précis
Tamimi et al. Performance Assessment of a Mini Mobile Mapping System: Iphone 14 pro Installed on a e-Scooter
WO2021054601A1 (fr) Procédé et dispositif de localisation sans fil sur la base d'image
Cho et al. WARP-P: Wireless signal Acquisition with Reference Point by using simplified PDR–system concept and performance assessment
KR102128398B1 (ko) 고정확도의 고속 무선 측위 방법 및 장치
KR102094307B1 (ko) 초기 정확도가 향상된 무선 측위 방법 및 장치
CN110967013A (zh) 一种基于室内地磁场信息和智能手机的室内区域定位系统
KR20130024402A (ko) 실외 연속 측위 방법 및 장치
Ellum et al. Land-based integrated systems for mapping and GIS applications
Glanzer Personal and first-responder positioning: State of the art and future trends
CN108981713B (zh) 一种混合无线自适应导航方法及装置
KR102497581B1 (ko) 인공신경망을 이용한 무선 측위 방법 및 장치
CN113343061A (zh) 一种gps激光融合slam中坐标系动态对齐方法
WO2018139772A2 (fr) Procédé et dispositif de positionnement hybride de haute précision robustes vis-à-vis de changements de trajet

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20866227

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20866227

Country of ref document: EP

Kind code of ref document: A1