
WO2018221204A1 - Mobile body equipped with a radio antenna, and vehicle dispatch system - Google Patents

Mobile body equipped with a radio antenna, and vehicle dispatch system

Info

Publication number
WO2018221204A1
Authority
WO
WIPO (PCT)
Prior art keywords
beacon
processing circuit
signal
signal processing
array antenna
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/018724
Other languages
English (en)
Japanese (ja)
Inventor
伊藤 順治
華璽 劉
朋彦 友金
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nidec Corp
Original Assignee
Nidec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec Corp filed Critical Nidec Corp
Publication of WO2018221204A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/02 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
    • G01S 3/14 Systems for determining direction or deviation from predetermined direction
    • G01S 3/46 Systems for determining direction or deviation from predetermined direction using antennas spaced apart and measuring phase or time difference between signals therefrom, i.e. path-difference systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/09 Arrangements for giving variable traffic instructions
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/123 Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present application relates to a moving object including a movable array antenna and a vehicle allocation system.
  • development of indoor positioning systems that estimate the position of mobile terminals in indoor environments where satellite radio waves cannot be received is actively underway. For example, when a beacon built into a mobile terminal emits a signal wave (an electromagnetic wave such as a microwave or millimeter wave), the position of the mobile terminal can be estimated by receiving the signal wave with a plurality of array antennas fixed in the environment.
  • an array antenna can estimate the direction of a beacon that radiates electromagnetic waves, that is, the arrival direction of the signal wave.
  • with a single array antenna, the exact distance from the array antenna to the beacon cannot be obtained. Therefore, in order to accurately estimate the position of the beacon, it is necessary to use a plurality of array antennas arranged at different positions and to perform geometric calculation from the arrival directions of the signal waves measured at each array antenna.
  • Japanese Patent Application Laid-Open No. 2007-19828 discloses a technique for estimating the direction of an electromagnetic wave radiation source with one array antenna and displaying the estimated position in an image acquired by a camera. According to such a technique, it is possible to estimate the direction or position of the radio wave radiation source with reference to the arrangement of buildings and the like included in the image acquired by the camera.
  • An embodiment of the present disclosure relates to an array antenna that can receive a signal wave radiated from a beacon and estimate the arrival direction of the signal wave while, for example, the user freely moves the position of the array antenna.
  • a vehicle allocation system having a beacon and a moving body.
  • a mobile unit of the present disclosure includes an imaging device that outputs image data; an array antenna that includes a plurality of antenna elements that receive signal waves periodically or intermittently emitted from beacons; and a signal processing circuit that estimates the direction of arrival of the signal wave based on a signal output from the array antenna and determines coordinates that define the direction of arrival. The signal processing circuit outputs a video signal in which information indicating the direction of arrival is added to the image data.
  • a vehicle allocation system includes, in an exemplary non-limiting embodiment, a plurality of beacons and a plurality of vehicles. Each vehicle includes an imaging device that outputs image data; an array antenna having a plurality of antenna elements that receive signal waves emitted periodically or intermittently from any of the plurality of beacons, the signal waves including additional information with identification information related to the beacon or the person carrying the beacon; a signal processing circuit that estimates the arrival direction of the signal wave based on a signal output from the array antenna and determines coordinates defining the arrival direction; and a communication circuit that acquires the additional information from the signal wave. The signal processing circuit outputs a video signal in which information indicating the direction of arrival is added to the image data, and position information of the person carrying the beacon is acquired and transmitted to the vehicle.
  • a signal wave radiated periodically or intermittently from a beacon can be received while the user freely moves the position of the array antenna, and the arrival direction of the signal wave can be estimated.
  • the moving body has an imaging device that outputs image data, and can output a video signal in which information indicating the arrival direction is added to the image data.
  • FIG. 1 is a front view illustrating a basic configuration example of a mobile device according to an embodiment of the present disclosure.
  • FIG. 2 is a side view illustrating a basic configuration example of the mobile device according to the embodiment of the present disclosure.
  • FIG. 3 is a perspective view illustrating a basic configuration example of the mobile device according to the embodiment of the present disclosure.
  • FIG. 4 is a hardware block diagram of the mobile device.
  • FIG. 5 is a diagram illustrating a coordinate system based on the imaging apparatus.
  • FIG. 6 is a diagram illustrating the two-dimensional coordinates uv spanned on the image plane SI of the imaging apparatus.
  • FIG. 7 is a diagram schematically showing the image plane SA of the array antenna in the coordinate system based on the imaging device.
  • FIG. 8 is a diagram illustrating the angles θ and φ that define the estimated value of the arrival direction of the signal wave.
  • FIG. 9 is a diagram schematically showing a luggage rack in the warehouse and an image of the luggage rack displayed on the display device.
  • FIG. 10A is a diagram for illustrating an example of a posture change of the mobile device.
  • FIG. 10B is a diagram for illustrating an example of the posture change of the mobile device.
  • FIG. 11A is a diagram illustrating dots and identification information indicating beacon positions displayed on a display device of a mobile device.
  • FIG. 11B is a diagram showing dots with corrected position coordinates and identification information.
  • FIG. 11C is a diagram illustrating an example of “display misalignment” when correction processing is not performed.
  • FIG. 12 is a flowchart illustrating an example of a beacon position display process.
  • FIG. 13 is a hardware block diagram of the mobile device according to the second example.
  • FIG. 14 is a hardware block diagram of the mobile device according to the third example.
  • FIG. 15 is a hardware block diagram of the mobile device according to the fourth example.
  • FIG. 16 is a schematic diagram for explaining a vehicle allocation system 1000 including a plurality of beacons 10 and a plurality of vehicles 200.
  • FIG. 17 is an external view of the vehicle 200.
  • FIG. 18 is a diagram illustrating an internal configuration of the electronic apparatus 300 connected to the display device 310.
  • FIG. 19 is a schematic diagram showing the smartphone 240 and the electronic device 300 of the passenger 230 with the connection established.
  • FIG. 20 is a diagram showing a display example of the display device 310.
  • FIG. 21 is a diagram showing a transport system 1100 having a plurality of AGVs 400 each mounting the electronic device 300 (FIG. 18).
  • FIG. 22 is an external view of an exemplary AGV 400.
  • FIG. 23 is a diagram illustrating a hardware configuration of the AGV 400.
  • FIG. 24 is a diagram illustrating a relationship between the AGV 400 and the direction P in which the beacon 10 exists.
  • FIG. 25 is a diagram illustrating a relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists.
  • FIG. 26 is a diagram illustrating a configuration example of the search system 1200.
  • FIG. 27 is an external perspective view of an exemplary multicopter 600.
  • FIG. 28 is a side view of the multicopter 600.
  • FIG. 1 to FIG. 3 show a basic configuration example of a mobile device according to an embodiment of the present disclosure; FIG. 1 is a front view, FIG. 2 is a side view, and FIG. 3 is a perspective view.
  • the mobile device 100 is a device including a portable array antenna 20 and has a shape and a size that can be easily carried.
  • the array antenna 20 has a plurality of antenna elements 22 that receive signal waves radiated periodically or intermittently from the beacon 10 shown in FIGS.
  • the beacon 10 is also called a tag.
  • the beacon 10 emits a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard.
  • the beacon 10 may be a device that operates according to another standard.
  • the frequency of the signal wave is, for example, a microwave band or a millimeter wave band.
  • the beacon 10 radiates a 2.4 GHz signal wave at a time interval of, for example, 10 milliseconds to 200 milliseconds, typically 100 milliseconds.
  • the frequency of the signal wave does not need to be constant as long as it can be received by the array antenna 20, and a plurality of frequencies can be hopped.
  • In the figures, the signal wave radiated from the beacon 10 is depicted schematically, but the actual radiation of the signal wave from the beacon 10 is not given any particular directivity. Although it is desirable that the signal wave emitted by the beacon 10 be isotropic, it may have anisotropy depending on the antenna of the beacon 10.
  • the signal wave emitted by the beacon 10 may include additional information having identification information regarding the beacon 10 or the person carrying the beacon 10.
  • An example of the additional information is a beacon ID and / or an owner ID of the beacon 10.
  • the beacon 10 may incorporate various sensors such as a heart rate monitor, a thermometer, an altimeter, and / or an acceleration sensor.
  • the beacon 10 may be electrically connected to various external sensors. In such a case, the beacon 10 can radiate by including various measurement values acquired by these sensors in the signal wave.
  • a typical example of the beacon 10 includes an antenna for radiating a signal wave, a high-frequency circuit, a battery for driving the high-frequency circuit, and a processor for controlling these operations.
  • the beacon 10 may be directly carried by a person, or may be incorporated in an electronic device such as a smartphone. Further, the beacon 10 may be used by being attached to a circulated article or case, like a general IC tag.
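  • As a concrete illustration of such additional information, the following sketch parses a small advertisement payload. The byte layout (a beacon ID, an owner ID, and one sensor reading) is an assumption for illustration only; the disclosure does not specify a payload format.

```python
import struct

# Hypothetical payload layout (an assumption, not defined by this
# disclosure): 2-byte beacon ID, 2-byte owner ID, and a 2-byte
# heart-rate reading, all little-endian unsigned shorts.
def parse_additional_info(payload: bytes) -> dict:
    beacon_id, owner_id, heart_rate = struct.unpack("<HHH", payload[:6])
    return {"beacon_id": beacon_id,
            "owner_id": owner_id,
            "heart_rate_bpm": heart_rate}

# A beacon with ID 0x1234 carried by owner 0x0042, heart rate 72 bpm
info = parse_additional_info(struct.pack("<HHH", 0x1234, 0x0042, 72))
print(info["beacon_id"], info["heart_rate_bpm"])  # 4660 72
```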
  • the array antenna 20 has a diameter of, for example, about 20 centimeters, and includes seven antenna elements 22 arranged two-dimensionally in a plane.
  • the weight of the array antenna 20 is, for example, about 500 grams.
  • the configuration and size of the array antenna 20 are not limited to this example as long as a person can carry it with one or both hands.
  • the external shape of the array antenna 20 viewed from the front is not necessarily circular, and may be an ellipse, a rectangle, a polygon, a star, or other shapes.
  • the number of antenna elements 22 may be 8 or more, or may be in the range of 3-6.
  • the antenna elements 22 in this example are arranged in a plane extending in both the horizontal direction and the vertical direction in the drawing. Specifically, six antenna elements 22 are concentrically arranged at equal intervals around one antenna element 22 located at the center of the array antenna 20. This arrangement is only an example.
  • the antenna elements 22 may be arranged in a straight line along the horizontal direction, for example. When a plurality of antenna elements 22 are arranged linearly along one direction, the arrival direction of the signal wave cannot be estimated with respect to directions intersecting that line. In the example shown, it is possible to estimate the direction of arrival in both the horizontal and vertical directions.
  • the array antenna 20 may incorporate a high-frequency circuit such as a monolithic microwave integrated circuit (not shown) and an AD conversion circuit. Such a circuit may be connected between the signal processing circuit 30 described later and the array antenna 20 instead of being provided in the array antenna 20.
  • the mobile device 100 includes a signal processing circuit 30 that estimates the arrival direction of a signal wave based on a signal output from the array antenna 20 and determines coordinates that define the arrival direction of the signal wave.
  • the signal processing circuit 30 is configured to execute an array signal processing algorithm and estimate the arrival direction of the signal wave.
  • Any array signal processing algorithm may be used.
  • the direction of arrival of the signal wave may be referred to as DOA (Direction Of Arrival) or AOA (Angle Of Arrival).
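  • As a minimal sketch of what such an array signal processing algorithm does (not the specific method of this disclosure): for two antenna elements spaced half a wavelength apart, the arrival angle of a plane wave can be recovered from the measured phase difference between the elements. Names and values are illustrative.

```python
import numpy as np

def estimate_arrival_angle(phase_a, phase_b, spacing_m, wavelength_m):
    """Estimate the arrival angle (radians from broadside) of a plane
    wave from the phase difference between two antenna elements.
    Spacing must be at most half a wavelength, or the solution is
    ambiguous."""
    delta = np.angle(np.exp(1j * (phase_b - phase_a)))  # wrap to (-pi, pi]
    s = delta * wavelength_m / (2 * np.pi * spacing_m)
    return np.arcsin(np.clip(s, -1.0, 1.0))

# 2.4 GHz carrier: wavelength ~ 12.5 cm, half-wavelength element spacing
wavelength = 3e8 / 2.4e9
spacing = wavelength / 2
# A wave arriving 30 degrees off broadside produces this phase difference:
delta_phi = 2 * np.pi * spacing * np.sin(np.deg2rad(30)) / wavelength
estimated = estimate_arrival_angle(0.0, delta_phi, spacing, wavelength)
print(np.rad2deg(estimated))  # close to 30.0
```

With more than two elements, as in the seven-element array described here, the same phase relationships are exploited jointly, which improves accuracy and resolves direction in two dimensions.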
  • the signal processing circuit 30 is disposed inside the casing of the base 72 in the illustrated example.
  • the base 72 is coupled to one end of the columnar grip 70.
  • the grip 70 supports the array antenna 20 via a fixture 74.
  • the grip 70 has a shape and a size (length and diameter) suitable for being gripped by a human hand.
  • When the grip 70 is not held by a person, in other words, when the mobile device 100 is not being carried by the user, the mobile device 100 may be placed on a fixed object such as a table or floor so that the base 72 is in contact with it.
  • the mobile device 100 can be used even when it is mounted on or attached to a mobile body such as a mobile robot, an automatic guided vehicle, a drone, or a car.
  • the mobile device 100 of the present embodiment includes a communication circuit 40 that acquires the above-described additional information from the signal wave of the beacon 10. Therefore, the mobile device 100 can also operate as a “handy scanner” that reads a signal or data emitted from the beacon 10 wirelessly in a contactless manner.
  • a communication circuit 40 is disposed inside the casing of the base 72 in the illustrated example.
  • the communication circuit 40 may transmit and receive other signal waves by an antenna (not shown) other than the array antenna 20, or may be connected to a telephone line or the Internet.
  • the signal processing circuit 30 and the communication circuit 40 are realized by a single or a plurality of semiconductor integrated circuits.
  • the signal processing circuit 30 may be referred to as a CPU (Central Processing Unit) or a computer.
  • the signal processing circuit 30 can be realized by a circuit including a computer such as a general-purpose microcontroller or a digital signal processor, and a memory in which a computer program for causing the computer to execute various instructions is incorporated.
  • the signal processing circuit 30 may include a register (not shown), a cache memory, and / or a buffer.
  • the mobile device 100 includes an imaging device 50 that outputs image data.
  • the imaging device 50 includes a lens 52 and an image sensor 54 as shown in FIGS.
  • a typical example of the imaging device 50 is a digital camera or a digital video camera.
  • the relative arrangement relationship between the imaging device 50 and the array antenna 20 is fixed.
  • the optical axis (camera axis) of the imaging device 50, that is, the Z axis, is parallel to the Za axis of the array antenna 20. "Parallel" in this specification need not be mathematically strictly parallel; some misalignment is allowed.
  • the position or orientation of the imaging device 50 changes together with the array antenna 20.
  • the user can point the array antenna 20 in an arbitrary direction.
  • the signal processing circuit 30 outputs a video signal in which information indicating the arrival direction is added to the image data.
  • the mobile device 100 of this embodiment includes a display device 60 that displays image data. As shown in FIG. 2, the display device 60 in this embodiment is supported by the grip 70 via an angle adjustment device 76.
  • the display device 60 can display information indicating the direction of arrival and image data based on the video signal output from the signal processing circuit 30.
  • the information indicating the arrival direction includes marks such as dots or lines displayed at the estimated position coordinates of the arrival direction in the image defined by the image data. Typical examples of such marks can be graphics such as dots, circles, crosses, and arrows, symbols, letters, numbers, or combinations thereof.
  • the signal processing circuit 30 may cause the display device 60 to display a part or all of the selected additional information.
  • the display device 60 may be a liquid crystal display, an OLED display, or various flexible displays.
  • the display device 60 may be a projector that projects an image on the surface of an object such as a wall surface or a desktop.
  • the display device 60 includes a driver circuit, a memory circuit, an input / output interface circuit, and the like (not shown). All or part of the signal processing circuit 30, the storage device 32, and the communication circuit 40 may be mounted on the same printed circuit board together with the driver circuit of the display device 60 and the like.
  • the mobile device 100 may take the form of a smartphone, a tablet terminal, or a laptop computer provided with the array antenna 20.
  • the mobile device of the present disclosure may take the form of a wearable device such as a wristwatch or a head-mounted display.
  • FIG. 4 is a hardware block diagram of the mobile device.
  • the mobile device 100 includes various components. Each component is connected by an internal bus 34, and each component can exchange data with other components.
  • the mobile device 100 includes a storage device 32.
  • the storage device 32 includes a random access memory (RAM) 32a, a read only memory (ROM) 32b, and a storage 32c.
  • the RAM 32a is a volatile memory that can be used as a work memory when the signal processing circuit 30 performs an operation.
  • the read only memory (ROM) 32b is, for example, a non-volatile memory that stores a computer program.
  • the storage 32c is a non-volatile memory that holds information acquired by the mobile device 100. An example of the information is registration information including beacon 10 or identification information of a user who carries beacon 10 described later. In the present disclosure, the storage device 32 may be simply referred to as “memory”.
  • the signal processing circuit 30 reads out the computer program stored in the ROM 32b, loads it into the RAM 32a, reads out the instructions constituting the computer program from the RAM 32a, and executes them. Thereby, the signal processing circuit 30 can implement the various functions described in this specification.
  • the signal processing circuit 30 performs processing for displaying an image on the display device 60.
  • the signal processing circuit 30 receives the video (moving images) output from the imaging device 50 described later and, for example, adjusts the luminance of the video or reduces its noise before outputting it to the display device 60. Further, the signal processing circuit 30 displays an image indicating the position of the beacon 10, for example an icon, at the corresponding position coordinates on the display device 60. Note that the processing for displaying an image on the display device 60 may be performed by a circuit different from the signal processing circuit 30, for example, an image processing circuit.
  • the mobile device 100 in this embodiment includes a motion sensor 80.
  • An example of the motion sensor 80 is a gyro sensor.
  • a three-axis gyro sensor is employed as the motion sensor 80.
  • the motion sensor 80 detects angular velocities around three mutually orthogonal axes. When the mobile device 100 is placed on a horizontal plane, the three axes are typically an axis parallel to the vertical direction (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
  • the motion sensor 80 outputs a detected value of angular velocity around each axis, for example, every 1 millisecond.
  • the mobile device 100 may include an acceleration sensor.
  • the signal processing circuit 30 can acquire the rotation angle around each axis by time-integrating the detected value of the angular velocity around each axis output from the three-axis gyro sensor.
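  • A minimal sketch of this time integration, assuming 1-millisecond angular-velocity samples as in the example above (function and variable names are illustrative):

```python
def integrate_angular_velocity(samples, dt=0.001):
    """Integrate per-axis angular velocities (rad/s), sampled every dt
    seconds, into rotation angles (rad) about each axis.  Simple
    rectangular integration; real firmware would also compensate for
    sensor bias and drift."""
    angles = [0.0, 0.0, 0.0]
    for wx, wy, wz in samples:
        angles[0] += wx * dt
        angles[1] += wy * dt
        angles[2] += wz * dt
    return angles

# 100 samples of a constant 1 rad/s yaw rate over 0.1 s -> about 0.1 rad
samples = [(0.0, 1.0, 0.0)] * 100
print(integrate_angular_velocity(samples))  # [0.0, ~0.1, 0.0]
```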
  • FIG. 5 shows a coordinate system based on the imaging device 50.
  • the XYZ coordinates shown in the figure are so-called camera coordinates (right-handed system).
  • the origin O of this coordinate is the optical center (principal point) of the imaging device 50, and the Z-axis is the camera optical axis.
  • a point P (X, Y, Z) is assumed to indicate the position of the beacon 10.
  • FIG. 5 shows the image plane SC of the imaging device 50.
  • the image plane SC is separated from the origin O by the focal length f in the Z-axis direction.
  • the two-dimensional coordinates xy spanned on the image plane SC are the coordinates of the camera image.
  • The corresponding point of the point P (X, Y, Z) appearing on the image plane SC is determined by a perspective projection transformation based on the pinhole camera model.
  • When the beacon 10 is visible from the imaging device 50, it is observed at the position of the point Mc (x, y) on the image plane SC.
  • FIG. 6 shows the two-dimensional coordinates uv spanned on the image plane SI of the imaging device 50.
  • This image plane SI corresponds to a pixel area of the image sensor 54.
  • the two-dimensional coordinate uv is a pixel unit coordinate.
  • the coordinates (u0, v0) are the intersections of the Z axis and the image plane SI.
  • Mc (u, v) can be obtained by converting (X, Y, Z) by perspective projection.
  • A basic example of a matrix that defines the perspective projection transformation is shown in the following Equation 1:

        s [u]   [α  0  u0] [X]
          [v] = [0  β  v0] [Y]     (Equation 1)
          [1]   [0  0   1] [Z]

    Here s is a scale factor equal to Z.
  • α and β in Equation 1 are internal parameters of the imaging device 50; specifically, they are determined by the focal length f of the lens, the pixel size of the image sensor 54, and the like.
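  • The perspective projection of Equation 1 can be sketched as follows. This is the standard pinhole model, not code from the disclosure itself; the parameter values in the usage example are illustrative.

```python
import numpy as np

def project_point(X, Y, Z, alpha, beta, u0, v0):
    """Pinhole-model perspective projection of a camera-frame point
    (X, Y, Z) onto pixel coordinates (u, v).  alpha and beta are the
    focal length expressed in horizontal/vertical pixel units; (u0, v0)
    is the principal point, the intersection of the Z axis and the
    image plane SI."""
    K = np.array([[alpha, 0.0, u0],
                  [0.0, beta, v0],
                  [0.0, 0.0, 1.0]])
    uvw = K @ np.array([X, Y, Z])
    # Divide by the scale factor s (equal to Z) to get pixel coordinates
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# A point 2 m ahead and 0.5 m to the right, with an 800-pixel focal length
u, v = project_point(0.5, 0.0, 2.0, 800.0, 800.0, 640.0, 360.0)
print(u, v)  # 840.0 360.0
```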
  • FIG. 7 is a diagram schematically showing the image plane SA of the array antenna 20 in the coordinate system with the imaging device 50 as a reference.
  • the point oa is the center of the array antenna 20, and the Za axis is the center axis of the array antenna 20.
  • the point oa is at a position shifted by a distance d in the positive direction of the Y axis.
  • the distance d is, for example, about 10 centimeters, but may be 10 centimeters or less.
  • the distance from the center oa of the array antenna 20 to the position P of the beacon 10 is unknown.
  • the direction of the position P of the beacon 10 (the arrival direction of the signal wave) can be estimated from the center oa of the array antenna 20.
  • the estimated value of the arrival direction of the signal wave is defined by, for example, the angles θ and φ shown in FIG. 8.
  • the angle parameter that defines the arrival direction of the signal wave is not limited to the angles θ and φ.
  • the arrival direction can also be expressed using the inclination angle θ1 from the Za axis toward the X-axis direction and the inclination angle θ2 from the Za axis toward the Y-axis direction.
  • the antenna elements 22 constituting the array antenna 20 can be arranged on a straight line parallel to the X axis.
  • when an embodiment of the mobile device according to the present disclosure is realized as a miniaturized device such as a smartphone, for example, three or four antenna elements 22 can be arranged on a straight line or on a curve along a direction parallel to the long side of the housing.
  • the image plane SA of the array antenna 20 can be virtually set at a position shifted by, for example, 1 meter from the point oa in the positive direction of the Za axis.
  • the image plane SA is a virtual screen.
  • the image plane SA is, for example, a rectangle 4 meters long and 4 meters wide.
  • such a point Ma (xa, ya) on the image plane SA is determined by the angles θ and φ.
  • the coordinates on the virtual screen can be correlated with the coordinates on the image plane SC or SI of the imaging device 50 by scaling using the value of the Za coordinate of the screen.
  • a point Ma (xa, ya) on the image plane SA corresponds to the X and Y components of the coordinates (X, Y, Z) of the point on the line segment oa-P located 1 meter in front of the array antenna 20. However, the center of the image plane SA and the center of the image plane SC differ by the distance d. By correcting this difference, the position of the beacon 10 in camera coordinates, and thus its position coordinates on the image plane SC and the image plane SI, can be calculated.
  • When the beacon 10 is sufficiently far away, the line segment OP and the line segment oa-P may be approximated as parallel to each other.
  • the position of the beacon 10 on the image plane SC calculated by the above method almost coincides with the position of the point Mc.
  • the closer the actual position P of the beacon 10 is to the mobile device 100, the more the angle between the line segment OP and the line segment oa-P deviates from parallel; consequently, the position of the beacon 10 on the image plane SC calculated from the coordinates of the point Ma on the image plane SA shifts in the vertical direction from the position of the point Mc.
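  • The mapping described above, from the estimated arrival angles through the virtual screen to camera pixel coordinates, can be sketched as follows. This is a hedged illustration under the far-field (parallel-lines) assumption; the angle convention (θ1/θ2 inclinations), function name, and parameter values are assumptions for illustration.

```python
import numpy as np

def beacon_pixel_from_doa(theta1, theta2, d, alpha, beta, u0, v0,
                          screen_z=1.0):
    """Approximate pixel position of the beacon from the estimated
    arrival direction.  theta1/theta2 are inclination angles (radians)
    from the antenna axis Za toward the X and Y directions; d is the
    offset (meters) of the array center oa from the camera center O
    along +Y.  Valid when the beacon is far enough that the line
    segments O-P and oa-P are nearly parallel."""
    # Point Ma on the virtual screen placed screen_z meters ahead of oa
    xa = screen_z * np.tan(theta1)
    ya = screen_z * np.tan(theta2)
    # Express the same point in camera coordinates (correct for offset d),
    # then project with the camera intrinsics (Equation 1 style)
    X, Y, Z = xa, ya + d, screen_z
    return alpha * X / Z + u0, beta * Y / Z + v0

# Beacon 10 degrees off-axis horizontally, array 10 cm below the camera
u, v = beacon_pixel_from_doa(np.deg2rad(10), 0.0, 0.1,
                             800.0, 800.0, 640.0, 360.0)
print(u, v)
```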
  • the accuracy of the direction of the beacon 10 is preferably higher in the horizontal direction than in the vertical direction. This is because the movable range of the beacon 10 or the person having the beacon 10 tends to be constrained in a plane substantially parallel to the horizontal plane. For this reason, the positional relationship between the imaging device 50 and the array antenna 20 is preferably a vertical relationship as in this embodiment.
  • the center of the imaging device 50 and the center of the array antenna 20 may coincide with each other.
  • the Z axis may coincide with the Za axis. If such an arrangement relationship is realized, the estimated direction of the beacon 10 can be superimposed on the image plane SC of the imaging device 50 with a simpler calculation.
  • FIG. 9 is a diagram schematically showing the luggage rack 200 in the warehouse and the image of the luggage rack 200 displayed on the display device 60.
  • In the image, the dot 90 indicating the position of the beacon 10 and the identification information 92 acquired from the additional information included in the signal wave emitted by the beacon 10 are displayed. Based on such an image, a package having the beacon 10 can be identified. For simplicity, the case where there is one beacon 10 is illustrated, but each of a plurality of packages may have its own beacon 10. Since the identification information 92 is unique to each beacon 10, a plurality of packages can be distinguished appropriately based on the identification information 92. Additional information other than the identification information may be selectively displayed on the image, or may be hidden. Further, only beacons 10 having specific identification information 92 may be displayed on the display device 60.
  • the member that transmits electromagnetic waves is substantially transparent to the signal wave radiated from the beacon 10.
  • a wall or the like formed mainly of an insulating material transmits electromagnetic waves. For example, when searching for a person carrying a specific beacon 10 with the mobile device 100, it is possible to receive the signal wave emitted by the beacon 10 and determine its arrival direction even if the person is on the other side of a wall. When searching for a person carrying the beacon 10 inside a building, it is only necessary to find a direction in which an incoming wave can be detected while changing the direction of the mobile device 100. Once an incoming wave is detected and its arrival direction estimated, the person carrying the beacon 10 can eventually be reached by moving the mobile device 100 in that direction. Even if many people or things each have a beacon 10, the intended person or thing can be found from the identification information included in the signal wave emitted by each beacon 10.
  • the use of the mobile device according to the present disclosure is not limited to indoors but may be outdoor. When a person carrying the beacon 10 is lost, the mobile device according to the present disclosure can be used to quickly rescue the victim.
• <Correction of display position 1 (correction of misalignment due to camera shake, etc.)> As described above, a signal wave is intermittently emitted from the beacon 10. For this reason, the estimated value of the direction of arrival based on the signal wave is also calculated intermittently.
  • the estimated value of the arrival direction is updated at intervals of 100 milliseconds. At least one of the position and posture of the mobile device 100 may change due to camera shake or the like.
• Since the image data of each frame is acquired at an interval shorter than 100 milliseconds (for example, an interval of about 8 to 16 milliseconds), it can be updated more frequently than the estimated value of the arrival direction. Due to this difference in update rates, a “display position shift” may occur.
• the signal processing circuit 30 determines, from the arrival direction estimated based on the signals output from the array antenna 20, the position coordinates of the information indicating the arrival direction displayed on the display device 60. In parallel with this position coordinate determination process, when the signal processing circuit 30 detects camera shake or the like, it performs a correction so as to compensate for the influence. Hereinafter, an example of such correction will be described.
  • the output of the motion sensor 80 is used.
  • FIG. 10A and FIG. 10B show examples of changes in the posture of the mobile device 100 due to the user's intentional movement or camera shake.
  • the attitude of the mobile device 100 changes from the state shown in FIG. 10A to the state shown in FIG. 10B in a period shorter than about 100 milliseconds.
  • the mobile device 100 is rotated by an angle R in the clockwise direction of the illustrated Y axis (yaw axis) due to camera shake or the like.
  • the time interval at which the beacon 10 emits a signal wave is about 100 milliseconds.
• When the attitude of the mobile device 100 changes, the field of view of the imaging device 50 also changes.
  • the imaging device 50 outputs image data (frame group) following the change in the visual field.
• Since the frames are updated at intervals of about 8 to 16 milliseconds, the imaging device 50 outputs 6 to 12 frames during the period of about 100 milliseconds from when the mobile device 100 starts rotating until it finishes.
  • the video (background video) from the imaging device 50 displayed on the display device 60 follows the rotation relatively quickly. More specifically, the background image follows the rotation and flows from right to left.
  • the position of the beacon 10 is updated only once during a rotation period of about 100 milliseconds. Until the update is performed, the image indicating the position of the beacon 10 is continuously displayed at the same position on the display device 60. Although the background video flows from right to left on the display device 60, the image indicating the position of the beacon 10 is fixed at a certain point on the display device 60. This is the “display misalignment”. It is preferable for the user that the position of the beacon 10 is always displayed on the display device 60 with a certain degree of accuracy.
  • the inventor of the present application causes the mobile device 100 to perform display position correction processing to increase the accuracy of the position of the beacon 10 displayed on the display device 60.
  • the signal processing circuit 30 detects that the position and / or posture of the mobile device 100 has changed based on the detection value output from the motion sensor 80. Further, the signal processing circuit 30 calculates the rotation angle R by performing time integration of the detected value.
• the image plane SA of the array antenna 20 is set at a position one meter away from the position of the mobile device 100. The signal processing circuit 30 then calculates the position coordinates after shifting the position of the beacon 10 shown in FIG. 10A to the left by a distance corresponding to the rotation angle R (approximately tan R meters on this image plane).
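• The correction described above can be sketched as follows (a non-limiting illustration; the function names, the use of yaw-rate integration, and the small-angle treatment via tan R are assumptions for explanation, not part of the disclosure):

```python
import math

def integrate_yaw(yaw_rates_rad_s, dt_s):
    """Time-integrate gyro yaw-rate samples into a rotation angle R (rad),
    as the signal processing circuit does with the motion sensor output."""
    return sum(rate * dt_s for rate in yaw_rates_rad_s)

def corrected_beacon_x(x_m, rotation_rad, plane_dist_m=1.0):
    """Shift the beacon's horizontal coordinate on a virtual image plane
    plane_dist_m ahead to compensate a clockwise yaw rotation.
    The shift on the plane is plane_dist_m * tan(R), roughly R meters
    for small angles."""
    return x_m - plane_dist_m * math.tan(rotation_rad)
```

The shifted coordinate can be redrawn every frame, so the displayed dot follows the background video between the intermittent arrival-direction updates.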
  • FIG. 11A shows dots 90 indicating the position of the beacon 10 and identification information 92 displayed on the display device 60 of the mobile device 100 in the posture shown in FIG. 10A.
  • FIG. 11B shows the dot 90a and the identification information 92a whose position coordinates are corrected.
• In this way, the “display position shift” on the display device 60 can be corrected within a time shorter than the interval at which the signal wave is emitted from the beacon 10.
• In parallel with this correction, the arrival direction is updated based on the signal wave emitted intermittently from the beacon 10. For this reason, the correction error of the position coordinates using the output of the motion sensor 80 does not accumulate; the correction error is reset every time the update is performed.
  • FIG. 11C shows an example of “display misalignment” when correction processing is not performed.
• the position coordinates of the dot 90a indicating the position of the beacon 10 on the display device 60 are updated based only on the signal wave emitted intermittently from the beacon 10. Even if the posture of the mobile device 100 changes due to camera shake before the update, the position coordinates of the dot 90a on the display device 60 are maintained without being updated. Since a shift occurs between the background image, which changes with the camera shake, and the position coordinates of the dot 90a indicating the position of the beacon 10, the display lacks smoothness and is inferior in visibility compared with the case where the correction process is performed.
• the signal processing circuit 30 can correct the position coordinates of the dot 90 and the like indicating the position of the beacon 10 with respect to rotation about each of the three axes.
  • the signal processing circuit 30 may estimate a change in the position and / or orientation of the mobile device 100 based on the image data, and correct the arrival direction or the position coordinates of the dot 90 according to the change.
  • the signal processing circuit 30 acquires image data of two frames at time t and (t + ⁇ t) from the imaging device 50.
  • the signal processing circuit 30 determines an arbitrary pattern (referred to as a “feature pattern”) that is commonly included in the two frames.
  • the signal processing circuit 30 acquires information on the relative position between the feature pattern in the frame at time t and the dot 90 indicating the position of the beacon 10 superimposed on the frame.
  • the relative position information is, for example, each difference value in the X-axis direction and the Y-axis direction.
  • the signal processing circuit 30 determines the position coordinate of the dot 90 using the position of the feature pattern of the frame at time (t + ⁇ t) and the acquired information on the relative position, and displays it on the display device 60. Since a plurality of continuous frames include the influence of camera shake, if the position of the dot 90 is corrected by the above-described method, the position coordinates of the dot 90 can be corrected in consideration of the influence of camera shake.
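• The feature-pattern-based correction in the preceding steps amounts to keeping the dot's offset from a tracked feature constant across frames. A non-limiting sketch (the function name and coordinate-tuple layout are illustrative):

```python
def reanchor_dot(feature_t, dot_t, feature_t_dt):
    """Re-place the beacon dot using a feature pattern found in both the
    frame at time t and the frame at time t + dt."""
    # Offset of the dot from the feature pattern at time t
    # (the "relative position" difference values in X and Y).
    dx = dot_t[0] - feature_t[0]
    dy = dot_t[1] - feature_t[1]
    # Apply the same offset to the feature's position at time t + dt.
    return (feature_t_dt[0] + dx, feature_t_dt[1] + dy)
```

If camera shake moves the feature pattern 10 pixels left between frames, the dot is moved 10 pixels left as well, so it stays attached to the scene rather than to the screen.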
• <Display position correction 2 (distortion correction by lens)>
  • distortion due to the lens 52 of the imaging device 50 may occur. Such distortion becomes prominent when a wide-angle lens is employed. The distortion causes a misalignment of the beacon 10 in the image.
  • the distortion can be corrected by calculation or a table if internal parameters unique to the imaging device 50 are known.
• For example, a table in which each position on a virtual image plane set 1 meter ahead of the position of the imaging device 50 is associated with the position coordinates of the image data output by the imaging device 50 can be prepared in advance and stored in the storage device 32.
• the signal processing circuit 30 determines the position coordinates on the virtual image plane from the arrival direction of the signal wave estimated based on the signals output from the array antenna 20, and determines the position coordinates on the display device 60 by referring to the above table.
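• A toy version of such a table lookup (the table values are fabricated for illustration only; a real calibration table would densely cover the virtual plane, and a real system would interpolate between entries rather than take the nearest one):

```python
# Hypothetical, sparse calibration table: position on the virtual image
# plane (meters) -> pixel coordinates on the display device.
DISTORTION_TABLE = {
    (0.0, 0.0): (320, 240),
    (0.1, 0.0): (352, 240),
    (0.0, 0.1): (320, 208),
}

def display_coords(virtual_xy, table=DISTORTION_TABLE):
    """Return the display pixel for the nearest tabulated virtual-plane
    position (nearest-neighbour lookup)."""
    key = min(table, key=lambda k: (k[0] - virtual_xy[0]) ** 2 +
                                   (k[1] - virtual_xy[1]) ** 2)
    return table[key]
```

Because only the beacon's dot is mapped through the table, each video frame itself need not be undistorted, which keeps the computational load low.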
  • the position of the beacon 10 may be corrected and superimposed on the image data. This is because if the accuracy of the position or direction of the beacon 10 in the image is high, there is no particular problem even if the image itself is distorted in the vicinity. Since only the process of correcting the position of the beacon 10 is performed without performing the process of correcting the distortion of each frame, the load required for the calculation is greatly reduced.
  • FIG. 12 is a flowchart showing an example of a process for displaying the position of the beacon 10. The process shown in FIG. 12 includes the display position correction process described above.
• In step S1, the signal processing circuit 30 acquires the data of the signal wave radiated from the beacon 10 and received by the array antenna 20.
• In step S2, the signal processing circuit 30 estimates the arrival direction of the signal wave based on the signal wave data.
• In step S3, the signal processing circuit 30 displays information indicating the arrival direction and the image data output from the imaging device 50 on the display device 60.
• In step S4, the signal processing circuit 30 corrects the display position shift caused by the shake of the mobile device 100. Thereafter, the process returns to step S1.
  • FIG. 13 is a hardware block diagram of the mobile device 110 according to the second example.
  • the mobile device 110 is different from the mobile device 100 in that the mobile device 110 includes a haptic device 82.
  • the tactile device 82 receives a command indicating the vibration pattern from the signal processing circuit 30, and generates a stimulus to be given to the user according to the command.
  • a command is a PWM signal.
  • the tactile device 82 includes a vibration motor 82a and a motor drive circuit 82b connected to the vibration motor 82a.
  • the vibration motor 82a is, for example, a horizontal linear actuator.
  • the motor drive circuit 82b supplies current to the vibration motor 82a in accordance with the command PWM signal, and causes the vibration motor 82a to operate with a predetermined vibration pattern.
  • the vibration pattern can be determined by, for example, the rising speed, the amplitude of vibration, the frequency of the applied current or voltage, and / or the frequency of amplitude.
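• One non-limiting way to encode such a vibration pattern as a command sequence for the motor drive circuit (the numeric values and the asymmetric fast-rise/slow-fall shape are illustrative; asymmetric patterns of this kind are associated with directional pull sensations):

```python
def asymmetric_pwm_pattern(n_cycles, period_ms=12, strong=0.9, weak=0.2):
    """Build a list of (duty, duration_ms) PWM commands alternating a
    short strong pulse with a longer weak tail, repeated n_cycles times."""
    pattern = []
    for _ in range(n_cycles):
        pattern.append((strong, period_ms // 4))      # fast, strong rise
        pattern.append((weak, 3 * period_ms // 4))    # slow, weak fall
    return pattern
```

Reversing the order of the strong and weak phases would correspond to pulling in the opposite direction, which is how the estimated arrival direction could select the pattern.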
  • the signal processing circuit 30 estimates the arrival direction of the signal wave of the beacon 10 and drives the vibration motor 82a of the haptic device 82 based on the estimated direction.
• a pseudo lateral force sensation can be induced in the user, that is, the illusion of being pulled can be given.
  • the haptic device 82 can give information indicating the arrival direction to the haptic sense of the user.
  • the mobile device 110 can guide the user to the position of the beacon 10.
• When the signal processing circuit 30 displays the dot 90 on the display device 60, the direction of the beacon 10 can also be presented to the user visually.
  • vibration patterns for giving the illusion of being pulled are known as disclosed in JP2012-143054A, JP2010-2101010, and the like.
  • FIG. 14 is a hardware block diagram of the mobile device 120 according to the third example.
  • the mobile device 120 is configured by omitting the imaging device 50, the display device 60, and the motion sensor 80 from the mobile device 100.
• the mobile device 120 can receive a signal wave emitted from the beacon 10 and transmit data indicating the position or direction of the beacon 10 and additional information to an external device via the communication circuit 40.
• the mobile device 120 can operate as a “wireless handy scanner” for searching for the position of the beacon 10 and acquiring additional information of the beacon 10. Data indicating the position or direction of the beacon 10 may be stored in the storage device 32.
  • the external device can be a smartphone, a tablet terminal, or a laptop computer.
• the external device can receive data indicating the position or direction of the beacon 10 from the mobile device 120 and display information indicating the arrival direction of the signal wave from the beacon 10 on its display.
  • the external device may be a small electronic device having a haptic device.
• As such a haptic device, the haptic device 82 built into the mobile device 110 (FIG. 13) can be employed.
  • FIG. 15 is a hardware block diagram of the mobile device 130 according to the fourth example.
• the mobile device 130 is the mobile device 120 (FIG. 14) to which the haptic device 82 is added.
• the mobile device 130 can receive the signal wave emitted by the beacon 10, estimate the arrival direction of the signal wave, and give the user the illusion of being pulled in the estimated direction.
• <Moving object> The mobile devices described so far are based on the assumption that people can carry them. However, it is not essential that they be carried by a person.
  • the imaging device, the array antenna, and the signal processing circuit may be attached to, for example, a moving body and move together with the moving body.
• Examples of such a moving body include vehicles such as taxis and automatic guided vehicles, and flying bodies such as multicopters.
  • an electronic device including an imaging device, an array antenna, and a signal processing circuit is attached to a moving body.
  • Electronic devices can be manufactured, sold, etc. in a size that can be mounted on a mobile object.
  • a mobile object equipped with an electronic device having the configuration of a mobile device will be described.
  • examples of the moving body include a vehicle such as an automobile, an automatic guided vehicle, and a multicopter.
  • the electronic device described below has the same components as the mobile device 100 (FIG. 4) and the mobile device 120 (FIG. 14) described above.
  • FIG. 16 is a schematic diagram for explaining a vehicle allocation system 1000 including a plurality of beacons 10 and a plurality of vehicles 200.
• the vehicle allocation system 1000 is used to make a taxi, which is the vehicle 200, go to a passenger who has made a vehicle allocation request and pick up that passenger.
  • the solid line indicates a road.
  • the beacon 10 is built in the smartphone.
  • An application program provided by an operator of the vehicle allocation system 1000 is installed on the smartphone.
  • the application program controls a communication circuit built in the smartphone, and radiates a signal wave in accordance with the Bluetooth (registered trademark) Low Energy standard. That is, the beacon 10 is realized by a smartphone communication circuit and an application program.
  • the signal wave emitted by the beacon 10 includes additional information having identification information regarding the person carrying the smartphone.
  • An example of the additional information is a smartphone owner ID.
  • the owner ID may be a unique value issued for each user by an application program, for example.
  • a passenger requests a dispatch using a mobile communication network.
• Consider a passenger having a beacon 10 within the broken-line circle shown in the upper left of FIG. 16. The passenger activates the application program and makes a vehicle allocation request.
• Under the control of the application program, the smartphone transmits a vehicle allocation request to the server 212 installed at the base 210 of the operator of the vehicle allocation system 1000 via the base station 220a.
  • the smartphone sends location information indicating the location of the passenger along with the dispatch request.
  • the position information can be measured and acquired by a GPS module mounted on the smartphone.
• the smartphone may acquire position information by using access to a public Wi-Fi (registered trademark) spot whose position is known in advance, or by using communication with the base stations 220a and 220b whose positions are fixed.
• the server 212 determines the empty vehicle closest to the position of the passenger indicated by the position information in response to receiving the dispatch request. In the example shown in FIG. 16, it is assumed that the server 212 selects the vehicle 200 within the broken-line circle shown near the center of the drawing. The server 212 transmits, via the base station 220b of the mobile communication network, an instruction to the vehicle 200 to go to the position indicated by the position information.
  • the electronic device 300 receives the registration information including the passenger's beacon 10 or the identification information of the passenger from the server 212.
  • the storage device 32 of the electronic device 300 stores the received registration information.
  • FIG. 17 is an external view of the vehicle 200.
  • An electronic device 300 is installed on the ceiling of the vehicle 200.
• a display device 310 is installed inside the vehicle.
  • the electronic device 300 includes the array antenna 20 and the imaging device 50.
  • FIG. 18 shows an internal configuration of the electronic device 300 connected to the display device 310.
  • the internal configuration of the electronic device 300 is substantially the same as the internal configuration of the mobile device 100 (FIG. 4).
  • the display device 60 separated from the mobile device 100 corresponds to the display device 310.
  • Components having the same structure and / or function are denoted by the same reference numerals, and the description thereof is omitted.
  • the difference in configuration between the mobile device 100 and the electronic device 300 is that the electronic device 300 has a driving device 302.
  • the drive device 302 incorporates a motor (not shown) and changes the postures of the array antenna 20 and the imaging device 50 with respect to the moving body.
  • the array antenna 20 and the imaging device 50 are integrally provided in one housing.
• the drive device 302 can simultaneously rotate the array antenna 20 and the imaging device 50 about an axis parallel to the vertical direction (yaw axis), an axis parallel to the horizontal direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
• the communication circuit 40 of the electronic device 300 can perform communication using a mobile communication network in addition to communication according to the Bluetooth (registered trademark) Low Energy standard.
  • a communication module that performs communication using a mobile communication network may be provided independently of the communication circuit 40.
• When the vehicle 200 enters a predetermined range from the position of the passenger specified based on the position information, for example, within 300 m, the electronic device 300 operates the drive device 302 to change the attitudes of the array antenna 20 and the imaging device 50.
• By changing the attitudes and scanning the surroundings, the electronic device 300 searches for the beacon 10 having the identification information of the passenger who requested the dispatch. When the beacon 10 is eventually discovered, a connection in accordance with the Bluetooth (registered trademark) Low Energy standard is established between the passenger's smartphone and the electronic device 300. Thereby, communication according to the above-mentioned standard becomes possible between the smartphone and the electronic device 300.
• FIG. 19 is a schematic diagram showing the smartphone 240 of the passenger 230 and the electronic device 300 with the connection established.
• In FIG. 19, the smartphone 240 is depicted as radiating a signal wave only in the direction of the electronic device 300, but in reality the signal wave is radiated substantially isotropically.
  • FIG. 20 shows a display example of the display device 310.
  • the signal processing circuit 30 of the electronic device 300 displays on the display device 310 a mark indicating the position of the beacon 10 (the arrival direction of the signal wave from the beacon 10), for example, the image 250. Details of the processing are as described in detail with reference to FIGS. 1 to 9, for example. As a result, the driver of the vehicle 200 can accurately find the passenger 230 even if another person exists around the passenger 230.
  • the signal processing circuit 30 may additionally display the identification information 252 on the display device 310.
  • the identification information 252 may include a smartphone owner ID or a user ID of the vehicle allocation system 1000 and a passenger name. As shown in FIG. 20, a leader line for connecting the image of the passenger 230 and the identification information 252 may be provided so that the identification information 252 of the passenger 230 can be easily grasped. With the above-described additional display, it is possible to more accurately identify the passenger 230 who requested the dispatch.
  • the electronic device 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B using the detection value of the motion sensor 80.
  • display position correction processing using changes in image data may be performed. This is because display misalignment may occur when the vehicle 200 travels, as in the case of camera shake. By correcting the display position, the driver of the vehicle 200 can find the passenger 230 more easily and reliably.
• the accuracy of the position information transmitted from the passenger 230 may be coarse. Even in such a case, if the electronic device 300 receives the signal wave from the beacon 10 and can estimate its arrival direction, the approximate position where the passenger 230 is waiting can be displayed on the display device 310. Thereby, the driver of the vehicle 200 can determine whether the passenger 230 is waiting on the right side or the left side of the road, or beyond a right turn or a left turn.
  • the server 212 of the base 210 may associate and register the owner ID of the smartphone or the user ID of the vehicle allocation system 1000 and each passenger's face photograph.
  • the electronic device 300 of the vehicle 200 receives an instruction from the server 212 to go to the position of the passenger 230, the electronic device 300 also receives image data of a registered facial photograph of the passenger 230.
• the signal processing circuit 30 of the electronic device 300 extracts a feature pattern indicating the facial features of the passenger 230 from the facial photograph data, and collates it against the persons in the image defined by the image data output from the imaging device 50. If a match is found as a result of the collation, the signal processing circuit 30 displays a mark indicating the passenger 230, such as the image 250 shown in FIG. 20.
  • the relative positional relationship between the array antenna 20 and the imaging device 50 may be fixed or movable. In the example of FIG. 17, both are fixed.
  • Electronic device 300 may be removed from vehicle 200 and carried by the driver of the vehicle. In this case, the electronic device 300 may transmit an image to, for example, a driver's smartphone instead of the display device 310.
• <AGV (Automatic Guided Vehicle)>
  • FIG. 21 shows a transport system 1100 having a plurality of AGVs 400 each mounting the electronic device 300 (FIG. 18).
  • the transport system 1100 can be installed in the factory 500.
  • the factory 500 is provided with a plurality of shelves 510.
  • the AGV 400 searches for the target beacon 10 using the identification information of the beacon 10 provided in the target package, and thereby discovers the package.
  • the wall of the factory 500 is provided with beacons 10N, 10S, 10W, and 10E indicating directions.
  • the electronic device 300 of the AGV 400 can receive signal waves from the beacons 10N, 10S, 10W, and 10E in an identifiable manner, and can estimate the position of each beacon. Using the estimated position of each beacon, the AGV 400 can recognize the current position of the own apparatus and the direction in which the own apparatus is facing, that is, the own position and attitude.
  • the four beacons 10N, 10S, 10W, and 10E are examples. Even if two beacons are provided, it is possible to recognize the self-position and posture. Further, by using five or more beacons 10, the accuracy of the self position and the posture can be further improved.
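• Why even two beacons can suffice: once the electronic device 300 has estimated each beacon's position in the AGV's own frame, the pose (x, y, θ) follows from a two-dimensional rigid alignment against the beacons' known world positions. A non-limiting sketch under that assumption (names and coordinate conventions are illustrative):

```python
import math

def pose_from_two_beacons(obs1, obs2, world1, world2):
    """Recover the device pose (x, y, theta) from body-frame observations
    of two beacons whose world positions are known."""
    # Heading: difference between the bearing of the beacon pair in the
    # world frame and in the body frame.
    ang_world = math.atan2(world2[1] - world1[1], world2[0] - world1[0])
    ang_body = math.atan2(obs2[1] - obs1[1], obs2[0] - obs1[0])
    theta = ang_world - ang_body
    theta = math.atan2(math.sin(theta), math.cos(theta))  # wrap to (-pi, pi]
    # Translation: rotate one observation into the world frame and
    # subtract it from that beacon's known world position.
    c, s = math.cos(theta), math.sin(theta)
    x = world1[0] - (c * obs1[0] - s * obs1[1])
    y = world1[1] - (s * obs1[0] + c * obs1[1])
    return x, y, theta
```

With more than two beacons, the same alignment can be solved in a least-squares sense, which is why additional beacons improve the accuracy of the self-position and posture.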
• the AGV 400 can recognize the travel path using the image data output from the imaging device and determine its route. For example, a specific color is given to the floor surface, while a color different from that color is given to the shelves 510, luggage, walls, and the like other than the floor.
• the AGV 400 can recognize the specific color of the floor surface from the image data output from the imaging device 50. Thereby, it can recognize that the region having that color is the travel path.
  • the AGV 400 does not need to perform processing that has been conventionally performed in order to acquire information on the self-location.
• the AGV 400 does not need to have map data of the factory 500 in which it travels, or a laser range finder. Since it is not necessary to estimate the current position by comparing sensor data output from a laser range finder with map data, the processing load of the arithmetic circuit is greatly reduced. Since the arithmetic circuit need not have high performance, the cost can be reduced. Further, the AGV 400 does not need to perform estimation or interpolation of the current position using odometry.
• If a laser range finder is used, however, the self-position estimation accuracy can be improved.
  • obstacles can be avoided by using image data output from the imaging device 50 included in the electronic device 300.
  • FIG. 22 is an external view of an exemplary AGV 400 according to the present embodiment.
  • the AGV 400 includes an electronic device 300, four wheels 411a to 411d, a frame 412, a transfer table 413, a travel control device 414, and a laser range finder 415.
  • the electronic device 300 and the laser range finder 415 are installed on the traveling direction side of the AGV 400.
• the AGV 400 also has a plurality of motors, which are not shown in FIG. 22. FIG. 22 shows a front wheel 411a, a rear wheel 411b, and a rear wheel 411c, but the front wheel 411d is not clearly shown because it is hidden behind the frame 412.
  • the traveling control device 414 is a device that controls the operation of the AGV 400, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
  • the laser range finder 415 is an optical device that measures the distance to the target by, for example, irradiating the target with infrared laser light 415a and detecting the reflected light of the laser light 415a.
• the laser range finder 415 of the AGV 400 emits pulsed laser light 415a while changing its direction every 0.25 degrees within a range of 135 degrees to the left and right (270 degrees in total) with respect to the front of the AGV 400, and detects the reflected light of each laser light 415a. Thereby, data of the distance to the reflection point can be obtained for each direction determined by the angle, for a total of 1080 steps at intervals of 0.25 degrees.
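• The scan geometry above (270 degrees at 0.25-degree steps gives 270 / 0.25 = 1080 range readings) can be turned into sensor-frame points as follows (a non-limiting sketch; the function name and the convention that angle 0 points straight ahead are illustrative):

```python
import math

def scan_points(distances_m, fov_deg=270.0, step_deg=0.25):
    """Convert one laser scan (a range reading per angular step across the
    field of view) into Cartesian (x, y) points in the sensor frame."""
    assert len(distances_m) == int(fov_deg / step_deg)  # 1080 steps
    pts = []
    start = -fov_deg / 2.0  # sweep from -135 deg to +135 deg
    for i, d in enumerate(distances_m):
        a = math.radians(start + i * step_deg)
        pts.append((d * math.cos(a), d * math.sin(a)))
    return pts
```

Each reading is a polar coordinate (angle, distance); the conversion above is the orthogonal-coordinate form mentioned later for the laser range finder's output.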
• the AGV 400 can create a map of the factory 500 based on the position and orientation of the AGV 400 and the scan result of the laser range finder 415.
  • the map can reflect the arrangement of objects such as walls around the AGV, structures such as pillars, and shelves placed on the floor.
  • the map data is stored in a storage device provided in the AGV 400.
• the position and orientation of a moving body are called a pose.
  • the position and orientation of the moving body in the two-dimensional plane are expressed by position coordinates (x, y) in the XY orthogonal coordinate system and an angle ⁇ with respect to the X axis.
  • the position and orientation of the AGV 400, that is, the pose (x, y, ⁇ ) may be simply referred to as “position” below.
  • the position of the reflection point seen from the radiation position of the laser beam 415a can be expressed using polar coordinates determined by the angle and the distance.
  • the laser range finder 415 outputs sensor data expressed in polar coordinates.
  • the laser range finder 415 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
  • Examples of objects that can be detected by the laser range finder 415 are people, luggage, shelves, and walls.
• the laser range finder 415 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
  • the traveling control device 414 can estimate its current position by comparing the measurement result of the laser range finder 415 with the map data held by itself.
  • the map data may be acquired by the AGV 400 itself using SLAM (Simultaneous Localization and Mapping) technology.
  • FIG. 23 shows the hardware configuration of AGV400.
• FIG. 23 also shows a specific configuration of the travel control device 414.
  • the AGV 400 includes a travel control device 414, a laser range finder 415, two motors 416a and 416b, and a drive device 417.
  • the traveling control device 414 includes a microcomputer 414a, a memory 414b, a storage device 414c, a communication circuit 414d, and a positioning device 414e.
  • the microcomputer 414a, the memory 414b, the storage device 414c, the communication circuit 414d, and the positioning device 414e are connected by a communication bus 414f and can exchange data with each other.
  • the laser range finder 415 is also connected to the communication bus 414f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 414a, the positioning device 414e, and / or the memory 414b.
  • the electronic device 300 is connected to the communication bus 414f.
• the electronic device 300 transmits information indicating the position of the beacon 10 as a target position to the microcomputer 414a via the communication bus 414f, and also transmits the image data output from the imaging device 50.
  • the microcomputer 414a is a processor or a control circuit (computer) that performs an operation for controlling the entire AGV 400 including the travel control device 414.
  • the microcomputer 414a is a semiconductor integrated circuit.
  • the microcomputer 414a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the driving device 417 to control the driving device 417 and adjust the voltage applied to the motor.
  • the memory 414b is a volatile storage device that stores a computer program executed by the microcomputer 414a.
  • the memory 414b can also be used as a work memory when the microcomputer 414a and the positioning device 414e perform calculations.
  • the storage device 414c is a nonvolatile semiconductor memory device.
  • the storage device 414c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk.
  • the storage device 414c may include a head device for writing and / or reading data on any recording medium and a control device for the head device.
  • the storage device 414c stores map data M of the traveling factory 500.
  • the map data M is created in advance and stored in the storage device 414c.
  • the AGV 400 can travel toward the position of the target beacon 10 while estimating its own position using the created map and the sensor data output from the laser range finder 415 during travel.
  • the positioning device 414e receives the sensor data from the laser range finder 415, and reads the map data M stored in the storage device 414c.
  • the positioning device 414e matches the local map data created from the scan results of the laser range finder 415 against the wider-range map data M, and thereby identifies its self-position (x, y, θ) on the map data M.
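The matching step can be illustrated with a toy brute-force search: transform the local scan by each candidate pose (x, y, θ) and keep the pose under which the scan points best coincide with the map. All names and the scoring scheme here are hypothetical; practical systems use refined methods such as ICP or correlative scan matching over the map data M.

```python
import math

def match_score(map_points, scan_points, pose, tol=0.3):
    """Count scan points that land within `tol` of some map point
    after being transformed by the candidate pose (x, y, theta)."""
    x, y, theta = pose
    score = 0
    for sx, sy in scan_points:
        # Rigid transform from the robot frame into the map frame.
        mx = x + sx * math.cos(theta) - sy * math.sin(theta)
        my = y + sx * math.sin(theta) + sy * math.cos(theta)
        if any(math.hypot(mx - px, my - py) <= tol for px, py in map_points):
            score += 1
    return score

def estimate_pose(map_points, scan_points, candidate_poses):
    """Return the candidate pose whose transformed scan overlaps the map best."""
    return max(candidate_poses,
               key=lambda p: match_score(map_points, scan_points, p))
```

With a scan taken at the true pose, the correct candidate scores highest and is returned as the estimated self-position.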
  • the microcomputer 414a and the positioning device 414e are described here as separate components, but this is only an example; a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both may be used instead.
  • FIG. 23 shows a chip circuit 414g that includes the microcomputer 414a and the positioning device 414e.
  • in the following, the case where the microcomputer 414a and the positioning device 414e are provided separately is described.
  • the two motors 416a and 416b are attached to the two wheels 411b and 411c, respectively, and rotate the respective wheels. That is, the two wheels 411b and 411c are both drive wheels.
  • the motor 416a and the motor 416b are motors that drive the right wheel and the left wheel of the AGV 400, respectively.
  • the drive device 417 has motor drive circuits 417a and 417b for adjusting the voltage applied to each of the two motors 416a and 416b.
  • Each of the motor drive circuits 417a and 417b is a so-called inverter circuit; it switches the current flowing through the corresponding motor on and off according to the PWM signal transmitted from the microcomputer 414a, thereby adjusting the voltage applied to the motor.
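Because the right and left wheels are driven by independent motors, the AGV steers by commanding the two motors at different speeds. A minimal differential-drive sketch follows; the function name and units are assumptions, not part of the disclosure.

```python
def wheel_speeds(v, omega, track_width):
    """Convert a body velocity command into left/right wheel speeds
    for a differential-drive vehicle.

    v           -- forward speed of the vehicle body
    omega       -- turn rate (positive = counterclockwise)
    track_width -- distance between the left and right drive wheels
    """
    v_right = v + omega * track_width / 2.0
    v_left = v - omega * track_width / 2.0
    return v_left, v_right
```

Equal wheel speeds drive the vehicle straight; equal and opposite speeds spin it in place.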
  • the electronic device 300 estimates the arrival direction of the signal wave radiated from the beacon 10.
  • the AGV 400 may not be able to travel along a straight route toward the beacon 10. This is because the signal wave radiated from the beacon 10 can pass through an obstacle that exists between the AGV 400's own position and the beacon 10 and still reach the electronic device 300. Examples of such obstacles are walls, pillars, and shelves.
  • FIG. 24 shows the relationship between the AGV 400 and the direction P in which the beacon 10 exists.
  • the beacon 10 exists on the back side of the shelf 510 in the drawing.
  • a shelf 510 exists on the direction P as viewed from the AGV 400.
  • the microcomputer 414a uses the map data M to determine whether or not there is an obstacle in the direction P of the beacon 10 viewed from its own position. When there is an obstacle, the microcomputer 414a uses the map data M to calculate a route to reach the target position while avoiding the obstacle. In the example of FIG. 24, the AGV 400 does not travel along the route D1, but travels along the route D2.
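The obstacle check along direction P can be sketched as a ray walk over an occupancy-grid form of the map data M (1 = obstacle). The grid representation, step size, and function name are assumptions for illustration only.

```python
import math

def obstacle_on_bearing(grid, start, bearing_deg, max_steps=100, step=0.5):
    """Walk from `start` along a bearing across an occupancy grid and
    report whether an obstacle cell (value 1) is crossed."""
    x, y = start
    dx = math.cos(math.radians(bearing_deg)) * step
    dy = math.sin(math.radians(bearing_deg)) * step
    for _ in range(max_steps):
        x += dx
        y += dy
        cx, cy = int(round(x)), int(round(y))
        if not (0 <= cy < len(grid) and 0 <= cx < len(grid[0])):
            return False  # Left the mapped area without hitting anything.
        if grid[cy][cx] == 1:
            return True
    return False
```

When such a check reports an obstacle on the bearing toward the beacon, a detour route is planned instead of the straight route.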
  • FIG. 25 shows the relationship between the AGV 400 after traveling and the direction P in which the beacon 10 exists.
  • the direction P in which the beacon 10 exists changes.
  • the AGV 400 uses the map data M to travel on a route D3 that avoids the shelf 510. Thereby, the position of the beacon 10 can be reached while avoiding the shelf 510.
  • the microcomputer 414a may also use the image data received from the electronic device 300 to determine the presence or absence of an obstacle. If a stereo camera is employed as the imaging device 50, the position of an obstacle can be recognized with higher accuracy.
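The accuracy gain from a stereo camera comes from depth triangulation: for a rectified stereo pair, the distance Z to a point follows from the focal length f, the baseline B, and the pixel disparity d as Z = f·B/d. A minimal sketch of this standard relation (names are assumptions):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Distance to a point seen by a rectified stereo pair:
    Z = f * B / d (pinhole camera model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, a 35-pixel disparity with a 700-pixel focal length and a 10 cm baseline places the point 2 m away.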
  • FIG. 26 shows a configuration example of the search system 1200.
  • Search system 1200 includes beacon 10, multicopter 600, PC 702 and display device 704 in search and rescue center facility 700.
  • the beacon 10 is carried by a person enjoying leisure in, for example, a mountainous area, at sea, or on a river.
  • the multicopter 600 is equipped with the electronic device 300.
  • the multicopter 600 detects a signal wave radiated from the beacon 10 of a person requiring rescue while flying over, for example, a mountainous area, the sea, or a river, and estimates its direction of arrival.
  • the imaging device 50 of the electronic device 300 starts outputting image data.
  • the signal processing circuit 30 of the electronic device 300 transmits a video signal obtained by adding information indicating the arrival direction to the image data via the communication circuit 40.
  • the multicopter 600 has a GPS module and acquires position information of the multicopter 600.
  • the signal processing circuit 30 also transmits the position information of the multicopter 600 via the communication circuit 40.
  • the video signal and the position information are transmitted to the search and rescue center facility 700.
  • the staff uses the PC 702 to reproduce the video signal on the display device 704.
  • using the position information, the staff can identify the position of the multicopter 600 at that time and the direction, viewed from that position, in which the person requiring rescue is present. Since the estimated position of the beacon 10 is displayed on the display device 704, the position is easy to grasp, and rescue activities such as organizing and dispatching a search team can be started quickly.
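One way a position estimate for the beacon can be obtained from the multicopter's position and the estimated arrival direction is to intersect two bearing lines taken from two points along the flight path. This is an illustrative sketch, not part of the disclosure; the function name and 2-D simplification are assumptions.

```python
import math

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays observed from positions p1 and p2
    to estimate the position of the signal source."""
    t1 = math.radians(bearing1_deg)
    t2 = math.radians(bearing2_deg)
    d1 = (math.cos(t1), math.sin(t1))
    d2 = (math.cos(t2), math.sin(t2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        raise ValueError("bearings are parallel; cannot triangulate")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + s*d1 = p2 + t*d2 for s (2x2 linear system, Cramer's rule).
    s = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])
```

Two bearings that converge on the beacon pin down its position; nearly parallel bearings give an ill-conditioned intersection, which is why observations from well-separated flight positions help.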
  • FIG. 27 is an external perspective view of an exemplary multicopter 600 according to the present disclosure.
  • FIG. 28 is a side view of the multicopter 600.
  • An electronic device 300 is attached to the lower part of the central housing 602 of the multicopter 600 via a driving device 302.
  • the driving device 302 can rotate the array antenna 20 and the imaging device 50 together about an axis parallel to the vertical direction of the multicopter 600 (yaw axis), an axis parallel to the lateral direction (pitch axis), and an axis parallel to the front-rear direction (roll axis).
  • since the general structure of the multicopter 600 is well known, its description is omitted here.
  • the electronic apparatus 300 may perform the display position correction processing described with reference to FIGS. 10A to 11B by using the detection value of the motion sensor 80.
  • display position correction processing using changes in the image data may also be performed. As with camera shake, display misalignment may occur while the multicopter 600 is flying. By correcting the display position, the staff of the search and rescue center facility 700 can find a person requiring rescue more easily and reliably.
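For the yaw axis, the correction can be sketched with a pinhole model: a body rotation of Δψ shifts the scene by roughly f·tan(Δψ) pixels, so the direction marker is shifted by the opposite amount to stay over the beacon's image. The function name, single-axis treatment, and model are assumptions for illustration.

```python
import math

def corrected_marker_x(marker_x, yaw_change_deg, focal_px):
    """Shift a direction marker horizontally to compensate a yaw change
    reported by the motion sensor (pinhole-model sketch)."""
    return marker_x - focal_px * math.tan(math.radians(yaw_change_deg))
```

With no rotation the marker stays put; a positive yaw change moves the marker left by the corresponding pixel offset.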
  • the mobile body of the present disclosure includes an apparatus, a device, or an article that includes a beacon, or an electronic device that guides a person to, or to the vicinity of, the position where the beacon is arranged.
  • Examples of the mobile body include a mobile robot, an automated guided vehicle, a drone, and an automobile.
  • Reference Signs List: 10 beacon (tag), 20 array antenna, 30 signal processing circuit, 32 storage device (memory), 40 communication circuit, 50 imaging device, 52 lens, 54 image sensor, 80 motion sensor, 82 tactile device, 82a vibration motor, 82b motor drive circuit, 100, 110, 120, 130 mobile device, 200 vehicle, 300 electronic device, 302 drive device, 310 display device, 400 AGV, 600 multicopter, 1000 dispatch system, 1100 transport system, 1200 search system

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

This electronic device (300) comprises: an imaging device (50) that outputs image data; an array antenna (20) having multiple array elements (22) that receive a signal wave emitted periodically or intermittently from a beacon (10); and a signal processing circuit (30) that estimates the arrival direction of the signal wave on the basis of a signal output by the array antenna (20) and determines coordinates defining the arrival direction. The signal processing circuit (30) outputs a video signal in which information indicating the arrival direction has been added to the image data.
PCT/JP2018/018724 2017-05-31 2018-05-15 Mobile body provided with radio antenna, and vehicle dispatch system Ceased WO2018221204A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017107952 2017-05-31
JP2017-107952 2017-05-31

Publications (1)

Publication Number Publication Date
WO2018221204A1 true WO2018221204A1 (fr) 2018-12-06

Family

ID=64454559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018724 Ceased WO2018221204A1 (fr) Mobile body provided with radio antenna, and vehicle dispatch system

Country Status (1)

Country Link
WO (1) WO2018221204A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110415534A (zh) * 2019-08-03 2019-11-05 唐伟 Solar-powered intelligent zebra-crossing traffic command robot
JP2023022898A (ja) * 2021-08-04 2023-02-16 ローム株式会社 Transmitting element imaging device and transmitting element imaging method
CN116010433A (zh) * 2022-12-28 2023-04-25 广东嘉腾机器人自动化有限公司 Data update method based on difference data, storage medium, and electronic device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005229449A (ja) * 2004-02-16 2005-08-25 Toyama Prefecture Mountain distress victim search system
JP2006242871A (ja) * 2005-03-04 2006-09-14 Victor Co Of Japan Ltd Beacon receiver and viewer system
JP2015158802A (ja) * 2014-02-24 2015-09-03 国立研究開発法人宇宙航空研究開発機構 Method for preventing misrecognition caused by parallax through viewpoint position correction of camera images, and system implementing the method
JP2015191641A (ja) * 2014-03-31 2015-11-02 Necエンベデッドプロダクツ株式会社 Monitoring device, monitoring system, monitoring method, and program
JP2016040151A (ja) * 2014-08-12 2016-03-24 エイディシーテクノロジー株式会社 Communication system
JP2016181156A (ja) * 2015-03-24 2016-10-13 株式会社Nttドコモ Vehicle dispatch device, vehicle dispatch system, vehicle dispatch method, and program
JP2017027220A (ja) * 2015-07-17 2017-02-02 日立オートモティブシステムズ株式会社 In-vehicle environment recognition device

Similar Documents

Publication Publication Date Title
US11787543B2 (en) Image space motion planning of an autonomous vehicle
US11822353B2 (en) Simple multi-sensor calibration
US10715963B2 (en) Navigation method and device
CN108427123B (zh) LIDAR device and method for operating a LIDAR device
US10739784B2 (en) Radar aided visual inertial odometry initialization
US9401050B2 (en) Recalibration of a flexible mixed reality device
US10435176B2 (en) Perimeter structure for unmanned aerial vehicle
US7374103B2 (en) Object localization
US10508911B2 (en) Apparatus and method for measurement, and program
CN109478068A (zh) System and method for dynamically controlling parameters for processing sensor output data for collision avoidance and path planning
WO2019026714A1 (fr) Information processing device, information processing method, program, and mobile unit
US11859997B2 (en) Electronic device for generating map data and operation method thereof
KR20200015880A (ko) Station apparatus and mobile robot system
WO2020133172A1 (fr) Image processing method, apparatus, and computer-readable storage medium
CN110609562B (zh) Image information collection method and device
US10514456B2 (en) Radar aided visual inertial odometry outlier removal
US20200133300A1 (en) System and method for adaptive infrared emitter power optimization for simultaneous localization and mapping
Wang et al. A survey of 17 indoor travel assistance systems for blind and visually impaired people
CN110389653A (zh) Tracking system for tracking and rendering virtual objects and operating method therefor
WO2018221204A1 (fr) Mobile body provided with radio antenna, and vehicle dispatch system
JP2021117502A (ja) Landing control device, landing control method, and program
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
US20230215092A1 (en) Method and system for providing user interface for map target creation
CN111563934A (zh) Method and device for determining scale of monocular visual odometry
CN113960999A (zh) Mobile robot relocalization method, system, and chip

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18808861

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18808861

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP