
WO2019214642A1 - System and method for guiding autonomous machine - Google Patents


Info

Publication number
WO2019214642A1
WO2019214642A1 (PCT/CN2019/085998)
Authority
WO
WIPO (PCT)
Prior art keywords
communication device
optical communication
light source
image
movable machine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/085998
Other languages
French (fr)
Chinese (zh)
Inventor
牛旭恒
方俊
李江亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Whyhow Information Technology Co Ltd
Original Assignee
Beijing Whyhow Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Whyhow Information Technology Co Ltd filed Critical Beijing Whyhow Information Technology Co Ltd
Publication of WO2019214642A1 publication Critical patent/WO2019214642A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0259Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • G05D1/0285Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404Methods for optical code recognition
    • G06K7/1408Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K7/14172D bar codes

Definitions

  • The present invention relates to guidance for machines capable of autonomous movement, and more particularly to a system and method for guiding an autonomously movable machine through an optical communication device.
  • U.S. Patent No. 95,621,216 B1 describes a drone cargo delivery system that navigates the drone based on GPS and an altimeter, and allows the navigation to be assisted remotely through the drone's camera.
  • However, such a system cannot achieve precise navigation of the drone.
  • In a scheme disclosed by Amazon, the drone is first guided to the destination by GPS, and the unattended machine then looks for a unique "mark" in its field of view, which has been placed by the customer.
  • The above approach requires the buyer to have a courtyard suitable for receiving the goods and to place a unique "mark" in that courtyard. Moreover, since the "mark" itself cannot be used to distinguish between different buyers, if multiple "marks" placed by multiple buyers are located near the destination, the drone cannot determine at which "mark" to place the package. The above scheme is therefore not applicable to people living in urban apartments.
  • Traditional two-dimensional codes can be used to identify different users, but their recognition distance is very limited.
  • When scanning with a camera, the camera must typically be placed at a relatively short distance, usually about 15 times the width of the two-dimensional code.
  • As a result, a drone equipped with a camera needs to travel to within about 3 meters of the two-dimensional code before it can recognize the code. Long-distance recognition is therefore impossible with a two-dimensional code unless a very large code is custom-made, which increases cost and is in many cases impossible due to other restrictions (such as limited space).
  • In addition, the camera needs to photograph the two-dimensional code roughly head-on; if the deviation angle is too large, recognition fails.
  • A CMOS imaging device is a widely used imaging device which, as shown in FIG. 1, includes an array of image-sensitive cells (also referred to as image sensors) and some other components.
  • the image sensor array can be a photodiode array, with each image sensor corresponding to one pixel.
  • Each column of image sensors corresponds to a column amplifier, and the output signal of the column amplifier is sent to an A/D converter (ADC) for analog-to-digital conversion and then output through an interface circuit.
  • CMOS imaging devices typically employ rolling shutter imaging.
  • In a CMOS imaging device, data readout is serial, so clearing, exposure, and readout can only be performed line by line in a pipeline-like manner, and one frame of image is synthesized only after all rows of the image sensor array have been processed. The entire CMOS image sensor array is thus actually exposed progressively (in some cases, a CMOS image sensor array may also be exposed multiple lines at a time), which results in small delays between rows. Because of these small delays, when a light source flashes at a certain frequency, undesired stripes appear on the image taken by the CMOS imaging device, which affects the shooting result.
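The line-by-line delays just described are what produce the stripes this invention later exploits. As a rough illustration (our sketch, not part of the patent; the timing parameters are assumptions based on the 1080p figures given later in the text), the following models each sensor row as a short, staggered exposure window integrating a source that blinks with a 50% duty cycle:

```python
# Simulate rolling-shutter striping: staggered per-row exposure windows
# integrate different portions of a blinking source's on/off cycle.
def row_brightness(row, line_delay, exposure, period, steps=1000):
    """Fraction of this row's exposure window during which the source is ON.
    The source is ON during the first half of each period (50% duty cycle)."""
    start = row * line_delay
    on = sum(1 for i in range(steps)
             if ((start + exposure * i / steps) % period) < period / 2)
    return on / steps

line = 8.7e-6   # assumed per-line readout time (the 1080p figure in the text)
# Exposure equal to the line time, source period 4x the exposure time.
rows = [row_brightness(r, line, line, 4 * line) for r in range(16)]
stripes = ["bright" if b > 0.5 else "dark" for b in rows]
print(stripes)  # runs of bright rows alternating with runs of dark rows
```

A faster blink narrows the stripes; a source that completes a whole on/off cycle inside every exposure window produces none, which is the basis of the stripe-free mode discussed later in the text.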
  • One aspect of the invention relates to a system for guiding a machine capable of autonomous movement, comprising:
  • An optical communication device comprising a light source configured to be operable in at least two modes, the at least two modes comprising a first mode and a second mode,
  • the first mode and the second mode are used to convey different information.
  • In the second mode, a light source control signal having a second frequency different from the first frequency controls an attribute of the light emitted by the light source to change continuously at the second frequency, such that the image of the light source obtained when it is photographed by a rolling shutter camera exhibits no stripes, or exhibits stripes different from the stripes in the first mode.
  • the second frequency is greater than the first frequency.
  • In some embodiments, an attribute of the light emitted by the light source changes continuously at the first frequency, and the image of the light source obtained when it is photographed by the rolling shutter camera presents stripes that differ from the stripes in the first mode.
  • Another aspect of the invention relates to a method of guiding a machine capable of autonomous movement using the above system, comprising:
  • If the optical communication device is the target optical communication device, the autonomously movable machine or a portion thereof is controlled to travel to the optical communication device.
  • the above method further comprises: if the optical communication device is not a target optical communication device, then:
  • The autonomously movable machine or a portion thereof is directed to the target optical communication device based at least in part on a relative positional relationship between the target optical communication device and the autonomously movable machine or the portion thereof.
  • Determining a relative positional relationship between the autonomously movable machine or a portion thereof and the optical communication device comprises: determining, by relative positioning, the relative positional relationship between the autonomously movable machine or the portion thereof and the optical communication device.
  • Collecting, by the rolling shutter camera mounted on the autonomously movable machine, information transmitted by a surrounding optical communication device, and identifying the transmitted information, comprises: obtaining, by the rolling shutter camera, continuous multi-frame images of the optical communication device; for each frame, determining whether the portion of the image corresponding to the position of the light source has stripes and, if so, which type of stripes; and determining the information represented by each frame.
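The per-frame decision described above can be sketched as follows (an illustration only: the stripe test is a simple row-to-row brightness swing, the bit convention of stripes for data 0 and no stripes for data 1 follows the binary example mentioned in the text, and all identifiers are ours):

```python
# Decode the information carried by a sequence of frames: classify each
# frame's light-source region as striped (data 0) or stripe-free (data 1).
def has_stripes(region_rows, threshold=0.3):
    """region_rows: mean brightness of each pixel row in the light-source
    region. Stripes show up as large row-to-row brightness swings."""
    if len(region_rows) < 2:
        return False
    swings = [abs(a - b) for a, b in zip(region_rows, region_rows[1:])]
    return max(swings) > threshold

def decode_frames(frames):
    """frames: list of per-frame row-brightness profiles -> list of bits."""
    return [0 if has_stripes(f) else 1 for f in frames]

striped     = [1.0, 0.1, 1.0, 0.1, 1.0, 0.1]   # alternating bright/dark rows
stripe_free = [0.6, 0.6, 0.6, 0.6, 0.6, 0.6]   # uniform brightness
print(decode_frames([striped, stripe_free, striped]))  # [0, 1, 0]
```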
  • the above method further comprises: first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device.
  • First controlling the autonomously movable machine to travel to the vicinity of the target optical communication device comprises: guiding the autonomously movable machine to the vicinity of the target optical communication device at least partially by a satellite navigation system; and/or directing the autonomously movable machine to the vicinity of the target optical communication device at least partially using a relative positional relationship between another optical communication device and the target optical communication device.
  • the at least partially utilizing the relative positional relationship between the other optical communication device and the target optical communication device to direct the autonomously movable machine to the vicinity of the target optical communication device comprises:
  • the autonomously movable machine identifies other optical communication devices while traveling, and obtains a relative positional relationship between the other optical communication devices and the target optical communication device;
  • the autonomously moveable machine is directed to the vicinity of the target optical communication device based at least in part on a relative positional relationship between the target optical communication device and the autonomously movable machine.
  • determining whether the optical communication device is the target optical communication device based on the transmitted information comprises: determining whether the transmitted information includes the predetermined information explicitly or implicitly.
  • the predetermined information is a predetermined identifier or a verification code.
  • Determining whether the optical communication device is the target optical communication device based on the transmitted information comprises: determining, by the autonomously movable machine itself, whether the optical communication device is the target optical communication device; or the autonomously movable machine transmitting the transmitted information to a server, the server determining, based on the transmitted information, whether the optical communication device is the target optical communication device, and the server transmitting the determination result to the autonomously movable machine.
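The two alternatives, local determination or delegation to a server, can be sketched as follows (the identifiers and the callback shape are our assumptions, not the patent's API; the explicit-containment check mirrors the "predetermined identifier" case mentioned in the text):

```python
# Decide whether an observed optical tag is the target, either locally or
# by deferring to a server.
def is_target(transmitted_info, predetermined_id, server=None):
    if server is not None:
        # Delegate: the machine forwards the information and the server
        # returns the determination result.
        return server(transmitted_info)
    # Local check: the predetermined information is contained explicitly
    # in the transmitted identifier.
    return predetermined_id in transmitted_info

print(is_target("tag-0042", "0042"))                            # True
print(is_target("tag-0041", "0042"))                            # False
print(is_target("tag-0041", "0042", server=lambda info: True))  # True
```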
  • Another aspect of the invention relates to a machine capable of autonomous movement, comprising a rolling shutter camera, a processor, and a memory, wherein the memory stores a computer program that, when executed by the processor, can be used to implement the above method.
  • Another aspect of the invention relates to a storage medium in which is stored a computer program that, when executed, can be used to implement the method described above.
  • FIG. 1 is a schematic view of a CMOS imaging device
  • FIG. 2 shows a pattern of an image acquired by a CMOS imaging device
  • Figure 3 is a light source in accordance with one embodiment of the present invention.
  • FIG. 4 is a light source in accordance with another embodiment of the present invention.
  • FIG. 5 is an imaging timing chart of a CMOS imaging device
  • FIG. 6 is another imaging timing diagram of a CMOS imaging device
  • Figure 7 shows an image of the CMOS imaging device at different stages when the light source is operating in the first mode
  • FIG. 8 illustrates an imaging timing diagram of a CMOS imaging device when the light source operates in the first mode, in accordance with an embodiment of the present invention
  • FIG. 9 illustrates an imaging timing diagram of a CMOS imaging device when the light source operates in the second mode, in accordance with an embodiment of the present invention
  • FIG. 10 illustrates an imaging timing diagram of a CMOS imaging device when a light source operates in a first mode in accordance with another embodiment of the present invention
  • FIG. 11 shows an imaging timing diagram of a CMOS imaging device for implementing a stripe different from that of FIG. 8 in accordance with another embodiment of the present invention
  • Figures 12-13 show two striped images of a light source obtained at different settings
  • Figure 14 shows a streak-free image of the obtained light source
  • Figure 15 is an image view of an optical tag employing three separate light sources, in accordance with one embodiment of the present invention.
  • FIG. 16 is an image view of an optical tag including a positioning mark, in accordance with one embodiment of the present invention.
  • Figure 17 illustrates an optical tag including a reference light source and two data sources in accordance with one embodiment of the present invention
  • FIG. 18 shows an imaging timing chart of the CMOS imaging device for the optical tag shown in FIG. 17;
  • Figure 19 illustrates a method of UAV guidance by optical tags in accordance with one embodiment of the present invention.
  • The optical communication device includes a light source and a controller capable of controlling, by a light source control signal, the light source to operate in two or more modes, including a first mode and a second mode. In the first mode, the light source control signal has a first frequency such that an attribute of the light emitted by the light source changes continuously at the first frequency to deliver first information; in the second mode, the attribute of the light emitted by the light source changes continuously at a second frequency, or does not change, to deliver second information different from the first information.
  • The attribute of light in this application refers to any attribute that the CMOS imaging device can recognize. It may be an attribute the human eye can perceive, such as the intensity, color, or wavelength of the light, or an attribute imperceptible to the human eye, such as the intensity, color, or wavelength of electromagnetic radiation outside the visible range, or any combination of the above attributes.
  • a change in the properties of light can be a single property change, or a combination of two or more properties can change.
  • If the intensity of the light is selected as the attribute, the change can be achieved simply by turning the light source on or off.
  • Hereinafter, turning the light source on and off is used to change the properties of the light, but those skilled in the art will appreciate that other ways of changing the properties of the light are also possible.
  • the attribute of the light varying at the first frequency in the first mode may be the same as or different from the attribute of the light changing at the second frequency in the second mode.
  • the properties of the light that change in the first mode and the second mode are the same.
  • When the light source operates in the first mode or the second mode, the light source can be imaged using a rolling shutter imaging device, such as a CMOS imaging device or a device containing one (e.g., a cell phone, a tablet, or smart glasses), that is, imaged by rolling shutter.
  • Hereinafter, a mobile phone containing a CMOS imaging device is described as an example, as shown in FIG. 2.
  • the line scanning direction of the mobile phone is shown as a vertical direction in FIG. 2, but those skilled in the art can understand that the line scanning direction can also be a horizontal direction depending on the underlying hardware configuration.
  • the light source can be a light source of various forms as long as one of its properties that can be perceived by the CMOS imaging device can be varied at different frequencies.
  • the light source may be an LED light, an array of a plurality of LED lights, a display screen or a part thereof, and even an illuminated area of light (for example, an illuminated area of light on a wall) may also serve as a light source.
  • the shape of the light source may be various shapes such as a circle, a square, a rectangle, a strip, an L shape, a cross shape, a spherical shape, or the like.
  • Various common optical devices can be included in the light source, such as a light guide plate, a soft plate, a diffuser, and the like.
  • The light source may be a two-dimensional array of a plurality of LED lamps, one dimension of which is longer than the other, preferably with a length-to-width ratio of about 6:1 to 12:1.
  • the LED light array can be composed of a plurality of LED lamps arranged in a row.
  • the LED light array can be rendered as a substantially rectangular light source when illuminated, and the operation of the light source is controlled by a controller.
  • Figure 3 illustrates a light source in accordance with one embodiment of the present invention.
  • The light source may also be a combination of a plurality of rectangles, for example, the L-shaped light source shown in FIG. 4.
  • the light source may not be limited to a planar light source, but may be implemented as a stereoscopic light source, for example, a strip-shaped cylindrical light source, a cubic light source, or the like.
  • The light source can be placed, for example, on a square, or suspended at a substantially central location of an indoor venue (e.g., a restaurant or a conference room), so that nearby users in every direction can photograph the light source with a mobile phone and thereby obtain the information it conveys.
  • FIG. 5 shows an imaging timing diagram of a CMOS imaging device, each of which corresponds to a row of sensors of the CMOS imaging device.
  • two stages are mainly involved, namely, exposure time and readout time.
  • the exposure time of each line may overlap, but the readout time does not overlap.
  • The exposure time of a CMOS imaging device can be set or adjusted (for example, by an APP installed on a mobile phone) to select a relatively short exposure time.
  • the exposure time can be made approximately equal to or less than the readout time of each row. Taking the 1080p resolution as an example, the readout time of each line is approximately 8.7 microseconds.
  • FIG. 6 shows an imaging timing chart of the CMOS imaging device in this case.
  • In this case, the exposure times of adjacent lines substantially do not overlap, or overlap only slightly, so that stripes with relatively clear boundaries, which are more easily recognized, can be obtained at imaging time.
  • FIG. 6 is only a preferred embodiment of the present invention; a longer exposure time (for example, twice, three times, or four times the readout time of each row) or a shorter exposure time is also feasible.
  • For example, the readout time per line is approximately 8.7 microseconds, and the exposure time per line is set to 14 microseconds.
  • the length of one cycle of the light source may be set to about twice or more of the exposure time, and preferably may be set to about four times or more of the exposure time.
  • FIG. 7 shows the imaging of the CMOS imaging device at different stages when the controller operates the light source in the first mode, in which the property of the light emitted by the light source changes at a certain frequency, in this example by turning the light source on and off.
  • The upper part of FIG. 7 shows the state change of the light source at the different stages, and the lower part shows the image of the light source on the CMOS imaging device at those stages, where the row direction of the CMOS imaging device is vertical and scanning proceeds from left to right. Since the image captured by the CMOS imaging device is progressively scanned, when the high-frequency flicker signal is captured, the portion of the obtained image corresponding to the imaging position of the light source forms stripes, as shown in the lower part of FIG. 7.
  • In time period 1, the light source is on, and the leftmost scan lines exposed during this period exhibit bright stripes; in time period 2, the light source is off, and the scan lines exposed during this period exhibit dark stripes; in time period 3, the light source is on again, and the scan lines exposed during this period exhibit bright stripes; in time period 4, the light source is off, and the scan lines exposed during this period exhibit dark stripes.
  • The flicker frequency of the light source can be set by the light source control signal, and the width of the stripes can be adjusted by the duration for which the light source stays on or off each time; a longer on or off time generally corresponds to a wider stripe.
  • If the duration of each turn-on or turn-off of the light source is set substantially equal to the exposure time of each line of the CMOS imaging device (the exposure time can be set by an APP installed on the mobile phone or set manually), stripes with a width of only one pixel can be presented when imaging. For long-distance identification of optical tags, the narrower the stripes, the better.
  • the stripe having a width of only one pixel may be less stable or less recognizable due to light interference, synchronization, etc., therefore, in order to improve the stability of recognition, it is preferable to realize a stripe having a width of two pixels.
  • A stripe with a width of about two pixels can be realized by setting the duration of each turn-on or turn-off of the light source to approximately twice the exposure time of each line of the CMOS imaging device.
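Under the assumption that rows are read out back to back, the relationship between the on/off duration and the stripe width reduces to a simple ratio, which the following sketch (our illustration, reusing the 1080p line-readout figure from the text) checks:

```python
# Stripe width in pixels ~ on (or off) duration / per-line readout time,
# assuming consecutive rows start exposing one line-readout apart.
def stripe_width_px(on_duration_s, line_readout_s):
    return round(on_duration_s / line_readout_s)

line = 8.7e-6                           # 1080p per-line readout time (text)
print(stripe_width_px(2 * line, line))  # on-time = 2x line time -> 2 px
print(stripe_width_px(1 * line, line))  # on-time = 1x line time -> 1 px
```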
  • In FIG. 8, the signal in the upper portion is the light source control signal, whose high level corresponds to the light source being turned on and whose low level corresponds to the light source being turned off.
  • Here, the duty ratio of the light source control signal is set to about 50%, and the exposure duration of each line is set substantially equal to the readout time of each line, but those skilled in the art will understand that other settings are also possible, as long as distinguishable stripes can be shown.
  • Synchronization between the light source and the CMOS imaging device is assumed in FIG. 8, so that the turn-on and turn-off times of the light source substantially correspond to the start or end of the exposure time of a certain line of the CMOS imaging device, but those skilled in the art will understand that even if the two are not synchronized as shown in FIG. 8, significant stripes can still appear on the CMOS imaging device.
  • The darkest stripes and the brightest stripes (the lines exposed while the light source is always on) are separated by one pixel.
  • the light and dark variations (i.e., fringes) of such pixel rows can be easily detected (e.g., by comparing the brightness or grayscale of some of the pixels in the imaged area of the source).
  • The light/dark stripe difference threshold and the ratio threshold are related to the illumination intensity of the optical label, the properties of the photosensitive device, the shooting distance, and the like. Those skilled in the art will appreciate that other thresholds are also possible, as long as computer-resolvable stripes are presented. When stripes are identified, the information conveyed by the light source at that time, such as binary data 0 or data 1, can be determined.
  • An exemplary stripe recognition method is as follows: obtain an image of the optical label and segment the imaging area of the light source by projection; collect striped and non-striped pictures in different configurations (for example, different distances, different light source flicker frequencies, etc.); normalize all collected pictures to a specific size, such as 64x16 pixels; extract each pixel as an input feature and build a machine learning classifier; and perform two-class discrimination to determine whether a picture is striped or non-striped.
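A toy stand-in for such a two-class discriminator is sketched below (pure Python; the nearest-centroid rule, the 4-pixel "pictures", and the training samples are all illustrative assumptions, not the patent's classifier, but the shape matches the pipeline: normalize, use every pixel as a feature, discriminate two classes):

```python
# Nearest-centroid two-class discriminator over per-pixel features.
def centroid(samples):
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def train(striped, unstriped):
    return centroid(striped), centroid(unstriped)

def classify(pic, model):
    c_striped, c_unstriped = model
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(pic, c))
    return "striped" if dist(c_striped) < dist(c_unstriped) else "non-striped"

# Toy 4-pixel "pictures" (already normalized): stripes alternate bright/dark.
striped_set   = [[1, 0, 1, 0], [0.9, 0.1, 0.9, 0.1]]
unstriped_set = [[0.5, 0.5, 0.5, 0.5], [0.6, 0.6, 0.6, 0.6]]
model = train(striped_set, unstriped_set)
print(classify([1, 0, 1, 0.1], model))         # striped
print(classify([0.55, 0.5, 0.6, 0.5], model))  # non-striped
```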
  • For a strip light source 5 cm in length, when shooting with a typical current mobile phone at 1080p resolution from 10 meters away (that is, at a distance of 200 times the length of the light source), the strip light source occupies about 6 pixels along its length direction. If each stripe is 2 pixels wide, at least one distinct, easily identifiable stripe will appear within those 6 pixels. If a higher resolution is set, or optical zoom is used, the stripes can be recognized at an even greater distance, for example, at a distance of 300 or 400 times the length of the light source.
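The "about 6 pixels at 10 meters" figure can be sanity-checked with a small-angle estimate (the field-of-view value and the choice of sensor dimension below are our assumptions, not stated in the patent):

```python
import math

# Pixels covered ~ (source length / distance) / field-of-view * resolution,
# using the small-angle approximation.
def pixels_covered(source_len_m, distance_m, fov_deg, resolution_px):
    angular = source_len_m / distance_m          # angular size, radians
    return angular / math.radians(fov_deg) * resolution_px

# Assume ~55 degrees of view across the 1080-pixel sensor dimension along
# which the 5 cm source lies; both values are illustrative guesses.
px = pixels_covered(0.05, 10.0, 55.0, 1080)
print(round(px))  # ~6, consistent with the figure quoted in the text
```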
  • the controller can also operate the light source in the second mode.
  • In the second mode, the light source control signal can have a different frequency than in the first mode to change the properties of the light emitted by the light source, for example by turning the light source on and off.
  • the controller can increase the turn-on and turn-off frequencies of the light source in the second mode compared to the first mode.
  • the frequency of the first mode may be greater than or equal to 8000 times/second, and the frequency of the second mode may be greater than the frequency of the first mode.
  • In the second mode, the light source can be configured to be turned on and off at least once within the exposure time of each row of the CMOS imaging device.
  • FIG. 9 shows a case where the light source is turned on and off exactly once during the exposure time of each line. The signal in the upper portion of FIG. 9 is the light source control signal, whose high level corresponds to turning the light source on and whose low level corresponds to turning it off. Since the light source is turned on and off in the same way during the exposure time of each line, the exposure energy obtained in each line's exposure time is roughly equal, so there is no significant difference in brightness between the individual pixel rows of the final image of the light source, and therefore no stripes. In FIG. 9, the turn-on time of the light source substantially corresponds to the start of the exposure time of a certain line of the CMOS imaging device, but those skilled in the art can understand that even if the two are not synchronized as in FIG. 9, there is still no significant difference in brightness between the pixel rows of the final image, so no stripes exist. When no stripes are recognized, the information conveyed by the light source at that time, such as binary data 1 or data 0, can be determined. To the human eye, the light source of the present invention does not exhibit any perceptible flicker when operating in either the first mode or the second mode described above.
  • The duty cycles of the first mode and the second mode may be set to be substantially equal, thereby realizing substantially the same luminous flux in the different modes and avoiding flicker that might otherwise be perceived by the human eye when switching between the first mode and the second mode. Alternatively, in the second mode, direct current may be supplied to the light source so that it emits light whose properties do not substantially change; no stripes then appear on a frame image of the light source obtained when it is photographed by the CMOS image sensor.
  • Figure 8 above describes an embodiment in which stripes are rendered by varying the intensity of the light emitted by the source (e.g., by turning the light source on or off).
  • Alternatively, stripes may be presented by changing the wavelength or color of the light emitted by the light source.
  • the light source includes a red light that emits red light and a blue light that emits blue light.
  • the two signals in the upper portion of FIG. 10 are a red light control signal and a blue light control signal, respectively, wherein a high level corresponds to the turn-on of the corresponding light source and a low level corresponds to the turn-off of the corresponding light source.
  • the red light control signal and the blue light control signal are phase shifted by 180°, that is, the two levels are opposite.
  • the red light control signal and the blue light control signal enable the light source to alternately emit red light and blue light outward, so that when the light source is imaged by the CMOS imaging device, red and blue stripes can be presented.
  • By determining whether or not stripes are present in the portion of each frame image, taken by the CMOS imaging device, that corresponds to the light source, the information transmitted by that frame, such as binary data 1 or data 0, can be determined. Further, by taking a continuous multi-frame image of the light source with the CMOS imaging device (for example, a mobile phone), a sequence of binary 1s and 0s can be determined, realizing information transmission from the light source to the CMOS imaging device.
  • The controller may control the light source so that the switching time interval between its operating modes is equal to the length of time in which one complete frame of the CMOS imaging device is imaged, thereby realizing frame synchronization between the light source and the imaging device, that is, transmitting 1 bit of information per frame.
  • The information can include, for example, a start frame mark (frame header), an optical tag ID, a password, a verification code, URL information, address information, timestamps, or different combinations thereof, and so on.
  • the order relationship of the above various kinds of information can be set in accordance with a structuring method to form a packet structure. Each time a complete packet structure is received, it is considered to obtain a complete set of data (a packet), which can be read and verified.
  • the following table shows the packet structure in accordance with one embodiment of the present invention:
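As a rough, hypothetical sketch of encoding and verifying such a packet (the field set follows the list above, but the delimiter, field order, and checksum scheme are assumptions, not the patent's actual table layout):

```python
from dataclasses import dataclass

# Hypothetical field set drawn from the information list above; the actual
# table's field order, widths, and verification algorithm differ.
@dataclass
class Packet:
    header: str   # start frame mark (frame header)
    tag_id: str   # optical tag ID
    payload: str  # e.g. URL / address / timestamp information

def encode(p):
    """Serialize the fields and append a simple modulo-256 verification code."""
    body = f"{p.header}|{p.tag_id}|{p.payload}"
    return f"{body}|{sum(body.encode()) % 256}"

def decode(s):
    """Return the Packet if the verification code matches, else None."""
    header, tag_id, payload, check = s.split("|")
    body = f"{header}|{tag_id}|{payload}"
    ok = int(check) == sum(body.encode()) % 256
    return Packet(header, tag_id, payload) if ok else None

msg = encode(Packet("10101010", "tag-001", "floor-3"))
print(decode(msg))                        # round-trips to the original fields
print(decode(msg.replace("001", "002")))  # corrupted ID fails verification -> None
```

This mirrors the behavior described above: each time a complete packet structure is received, it is treated as one complete set of data, read, and verified.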
  • the information transmitted by the frame image is determined by judging whether or not there is a streak at the imaging position of the light source in each frame of the image.
  • different information conveyed by the frame image may be determined by identifying different fringes at the imaging location of the light source in each frame of image.
  • In the first mode, the property of the light emitted by the light source is controlled by a light source control signal having a first frequency so as to change continuously at that frequency, so that a first stripe is present on the image of the light source obtained when it is photographed by the CMOS image sensor; in the second mode, a second stripe different from the first stripe is present on the image of the light source.
  • the difference in stripes may be based, for example, on different widths, colors, brightnesses, etc., or any combination thereof, as long as the difference can be identified.
  • stripes of different widths may be implemented based on different light source control signal frequencies.
  • For example, in the first mode, the light source may operate as shown in FIG. 8 to achieve a first stripe with a width of approximately 2 pixels; in the second mode, the durations of the high level and the low level in each period of the light source control signal of FIG. 8 can each be doubled, thereby achieving a second stripe with a width of approximately 4 pixels.
  • stripes of different colors may be implemented.
  • For example, the light source may be set to include a red lamp that emits red light and a blue lamp that emits blue light. In the first mode, the blue lamp may be turned off while the red lamp operates as shown in FIG. 8, realizing red-and-black stripes; in the second mode, the red lamp may be turned off while the blue lamp operates as shown in FIG. 8, realizing blue-and-black stripes.
  • In the above example, the red-and-black stripes and the blue-and-black stripes are realized using light source control signals of the same frequency in the first mode and the second mode, but it is understood that the first mode and the second mode may also use light source control signals of different frequencies.
  • A third mode can further be set, in which the red and blue lamps are controlled in the manner shown in FIG. 10 to achieve red-blue stripes, conveying a third type of information.
  • Another type of information, that is, a fourth type, can further be transmitted through a fourth mode in which no stripes are presented.
  • Any of the above four modes can be selected for information transmission, and further modes can be combined, as long as different modes generate different stripe patterns.
  • Figure 12 shows stripes of approximately 2-3 pixels in width obtained by imaging, at 1080p resolution, an LED lamp flashing at a frequency of 16,000 times per second (each period lasts 62.5 microseconds, with an on duration and an off duration of approximately 31.25 microseconds each).
  • Figure 13 shows the stripes on the image obtained by experiment when the flashing frequency of the LED lamp of Figure 12 is adjusted to 8,000 times per second (each period lasts 125 microseconds, with an on duration and an off duration of approximately 62.5 microseconds each), other conditions being unchanged.
  • Figure 14 shows the image obtained by experiment when the flashing frequency of the LED lamp of Figure 12 is adjusted to 64,000 times per second (each period lasts 15.6 microseconds, with an on duration and an off duration of approximately 7.8 microseconds each), other conditions being unchanged. No stripes appear on this image, because the exposure time of each line is about 14 microseconds, which essentially covers one complete on time and one complete off time of the LED lamp.
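The observations in Figures 12-14 are consistent with a simple approximation: stripe width in pixel rows is roughly half the flicker period divided by the per-row time (about 14 microseconds in the experiments above). This rule of thumb is an inference from the reported numbers, not a formula stated in the text:

```python
def stripe_width_px(flicker_hz, row_time_us=14.0):
    """Approximate stripe width in pixel rows: the number of consecutive rows
    whose exposure falls within one on (or off) half-period of the flicker."""
    half_period_us = 0.5e6 / flicker_hz
    return half_period_us / row_time_us

print(stripe_width_px(16000))  # ~2.2 rows: matches the 2-3 pixel stripes of Fig. 12
print(stripe_width_px(8000))   # ~4.5 rows: wider stripes, as in Fig. 13
print(stripe_width_px(64000))  # ~0.6 rows (< 1): stripes vanish, as in Fig. 14
```

When the estimate drops below one row, every row integrates at least a full on/off cycle and receives nearly equal energy, which is why the 64,000 times/second image shows no stripes.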
  • the square wave is taken as an example to describe the light source control signal having the corresponding frequency, but those skilled in the art can understand that the light source control signal can also use other waveforms such as a sine wave, a triangular wave or the like.
  • Figure 15 is an image view of an optical tag employing three independent light sources in which the imaging positions of the two light sources have streaks, and the imaging position of one light source has no streaks, according to one embodiment of the present invention.
  • One frame of image can be used to convey information, such as binary data 110.
  • The optical tag may further include one or more positioning identifications located near the information transfer light source; a positioning identification may be, for example, a lamp of a particular shape or color, which may remain steadily lit during operation.
  • the location identification can help a user of a CMOS imaging device, such as a cell phone, to easily discover optical tags.
  • When the CMOS imaging device is set to a mode suitable for photographing the optical tag, the imaging of the positioning identification is conspicuous and easy to recognize.
  • the one or more location markers disposed adjacent the information transfer light source can also assist the handset in quickly determining the location of the information transfer light source to facilitate identifying whether the imaged region corresponding to the information transfer light source has streaks.
  • the location identification may first be identified in the image such that the approximate location of the optical tag is found in the image. After the location identification is identified, one or more regions in the image may be determined based on the relative positional relationship between the location identification and the information delivery light source, the region encompassing an imaging location of the information delivery light source. These areas can then be identified to determine if there are streaks or what stripes are present.
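The region determination described above can be sketched as follows, assuming two positioning marks have been detected on either side of the information transfer light sources; all coordinates, padding values, and the box height are illustrative assumptions:

```python
def roi_between_marks(mark_a, mark_b, height=40, pad=10):
    """Given the image centers (x, y) of two positioning marks, return a
    bounding box (x0, y0, x1, y1) expected to contain the information
    transfer light sources lying between them."""
    (xa, ya), (xb, yb) = mark_a, mark_b
    x0, x1 = sorted((xa, xb))
    yc = (ya + yb) / 2  # sources assumed roughly level with the marks
    return (x0 + pad, yc - height / 2, x1 - pad, yc + height / 2)

# Two detected marks; the returned box is then checked for stripes.
print(roi_between_marks((100, 200), (500, 204)))  # -> (110, 182.0, 490, 222.0)
```

Stripe detection then only needs to run inside the returned box rather than over the whole frame, which is the speed-up the positioning marks provide.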
  • FIG. 16 is an image of an optical tag in accordance with an embodiment of the present invention, which includes three horizontally arranged information transfer light sources and two vertically arranged positioning identification lamps located on the two sides of the information transfer light sources.
  • an ambient light detection circuit can be included in the optical tag that can be used to detect the intensity of ambient light.
  • the controller can adjust the intensity of the light emitted by the light source when it is turned on based on the intensity of the detected ambient light. For example, when the ambient light is relatively strong (for example, during the day), the intensity of the light emitted by the light source is relatively large, and when the ambient light is relatively weak (for example, at night), the intensity of the light emitted by the light source is relatively small.
  • an ambient light detection circuit can be included in the optical tag that can be used to detect the frequency of ambient light.
  • The controller can adjust the frequency of the light emitted by the light source when it is turned on based on the detected frequency of the ambient light. For example, when the ambient light contains a light source flashing at the same frequency, the light emitted by the light source can be switched to another unoccupied frequency.
  • In some cases the accuracy of the recognition may be affected. Therefore, in order to improve the accuracy of the recognition, in one embodiment of the present invention, in addition to the above-described light source used for transmitting information (hereinafter referred to as the "data light source" for clarity), the optical tag may also include at least one reference light source.
  • the reference source itself is not used to convey information, but rather to aid in identifying the information conveyed by the data source.
  • the reference source can be physically similar to the data source, but operates in a predetermined mode of operation, which can be one or more of various modes of operation of the data source. In this way, the decoding of the data source can be converted to a calculation that matches (eg, correlates) the image of the reference source, thereby improving the accuracy of the decoding.
  • FIG. 17 shows an optical tag according to one embodiment, including a reference light source and two data light sources, wherein the three light sources are arranged side by side: the first light source serves as the reference light source, and the other two serve as the first data light source and the second data light source, respectively.
  • the number of reference light sources in the optical label may be one or more, and is not limited to one; likewise, the number of data light sources may be one or more, and is not limited to two.
  • the reference source is used to provide auxiliary recognition, its shape and size do not have to be the same as the data source. For example, in one embodiment, the length of the reference source can be half of the data source.
  • Each of the first data light source and the second data light source shown in FIG. 17 is configured to operate in three modes so as to display, for example, a stripe-free image, an image with a stripe width of 2 pixels, and an image with a stripe width of 4 pixels, respectively.
  • the reference light source may be configured to always operate in one of three modes to display one of the three images described above, or alternately operate in different modes to alternately display any two or all of the above three images in different frames, This provides a baseline or reference for image recognition of the data source.
  • the reference light source alternately displays an image with a stripe width of 2 pixels and an image with a stripe width of 4 pixels in different frames.
  • To determine the type of the image of a data light source in each frame, that image may be compared with the images of the reference light source in the current frame and in an adjacent frame (for example, the previous or the following frame); those reference images necessarily include an image with a stripe width of 2 pixels and an image with a stripe width of 4 pixels. Alternatively, successive multi-frame images of the reference light source may be collected over a period of time and grouped into odd-numbered frames and even-numbered frames, the features of each group averaged (for example, finding the average stripe width of each group); the stripe widths then indicate which group corresponds to images with a 2-pixel stripe width and which to images with a 4-pixel stripe width, thereby obtaining reference images of both types.
  • Since the reference light source is located at substantially the same position as the data light sources and is subject to the same ambient lighting conditions, interference, noise, etc., it can provide one or more reference images or benchmarks for image recognition in real time, thereby improving the accuracy and stability of the identification of the information conveyed by the data light sources.
  • By comparing the imaging of a data light source with the imaging of the reference light source, the data it is transmitting can be accurately identified.
  • For example, the reference light source can be controlled to operate in a predetermined working mode in which, for example, stripes 4 pixels in width are present on its image. If a data light source is controlled to operate simultaneously in the same working mode and in phase with the reference light source, the stripes appearing on its image are similar to those on the image of the reference light source (e.g., also 4 pixels wide) with no phase difference; if the data light source operates simultaneously in the same working mode but out of phase with the reference light source (for example, inverted, i.e., 180° out of phase), the stripes on its image are similar to those on the image of the reference light source (e.g., also 4 pixels wide) but with a phase difference.
  • FIG. 18 shows an imaging timing chart of the CMOS imaging device for the optical tag shown in FIG.
  • the respective control signals of the reference source, the first data source, and the second data source are shown in the upper portion of FIG. 18, wherein a high level may correspond to the turn-on of the light source and a low level may correspond to the turn-off of the light source.
  • the three control signals have the same frequency, and the first data source control signal is in phase with the reference source control signal, and the second data source control signal is 180 degrees out of phase with the reference source control signal.
  • In this case, the reference light source, the first data light source, and the second data light source will each exhibit stripes approximately 4 pixels wide, but the stripe phases in the imaging of the first data light source and the reference light source are consistent (e.g., the lines of the reference light source's bright stripes coincide with the lines of the first data light source's bright stripes, and the lines of its dark stripes coincide with the lines of the first data light source's dark stripes), while the stripe phases of the second data light source and the reference light source are inverted (e.g., the lines of the reference light source's bright stripes coincide with the lines of the second data light source's dark stripes, and the lines of its dark stripes coincide with the lines of the second data light source's bright stripes).
  • each data source can deliver one of two types of data, such as 0 or 1, in one frame of image.
  • By providing a reference light source operating in the second mode, and further controlling the phase of a data light source operating in the second mode, the second mode itself can be used to deliver more than one type of data. Taking the approach shown in FIG. 18 as an example, the second mode combined with phase control can by itself deliver one of two kinds of data, so that each data light source can deliver one of three kinds of data in one frame of image.
  • In other words, without a reference light source, each data light source can deliver one of two kinds of data in one frame of image, so an optical tag containing three data light sources can deliver one of 2^3 = 8 combinations per frame; with a reference light source, each data light source can deliver one of three kinds of data, so an optical tag containing two data light sources can deliver one of 3^2 = 9 combinations per frame.
  • In the correlation calculation, positive values represent positive correlation and negative values represent negative correlation. If the data light source and the reference light source have the same frequency and phase, then in the ideal case the images of the two sources are exactly the same, so the result of the correlation calculation is +1, indicating complete positive correlation. If they have the same frequency but opposite phases, then in the ideal case the images of the two sources have the same stripe width but exactly opposite positions of bright and dark stripes, so the result of the correlation calculation is -1, indicating complete negative correlation. It can be understood that in the actual imaging process, due to the presence of interference, errors, and the like, it is difficult to obtain completely positively or completely negatively correlated images. If the data light source and the reference light source operate in different working modes to display stripes of different widths, or one of them displays no stripes, their images are typically only weakly correlated.
  • Tables 1 and 2 below show the correlation calculation results when the data light source and the reference light source are at the same frequency and in phase, and when they are at the same frequency but in opposite phase, respectively. For each case, five frames of images were taken, and the reference light source image in each frame was correlated with the data light source image in the same frame.
  • Table 1 (data light source and reference light source at the same frequency, in phase):

    Reference image 1 vs. Data image 1: 0.7710
    Reference image 2 vs. Data image 2: 0.7862
    Reference image 3 vs. Data image 3: 0.7632
    Reference image 4 vs. Data image 4: 0.7883
    Reference image 5 vs. Data image 5: 0.7967

  • Table 2 (data light source and reference light source at the same frequency, opposite phase):

    Reference image 1 vs. Data image 1: -0.7849
    Reference image 2 vs. Data image 2: -0.7786
    Reference image 3 vs. Data image 3: -0.7509
    Reference image 4 vs. Data image 4: -0.7896
    Reference image 5 vs. Data image 5: -0.7647
  • In the first case, the correlation results indicate that the images are significantly positively correlated; in the second case, they indicate that the images are significantly negatively correlated.
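The correlation in Tables 1 and 2 can be computed as a normalized (Pearson-style) correlation between the reference image and the data image; the exact metric used in the experiments is not stated, and the stripe images below are synthetic ideal cases:

```python
import numpy as np

def correlate(a, b):
    """Normalized correlation between two image patches (flattened)."""
    a = a.ravel().astype(float) - a.mean()
    b = b.ravel().astype(float) - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rows = (np.arange(16) // 4) % 2              # ideal 4-pixel-wide stripe pattern
ref  = np.repeat(rows[:, None], 8, axis=1)   # reference light source image
same = ref.copy()                            # data source in phase with reference
inv  = 1 - ref                               # data source 180 degrees out of phase

print(correlate(ref, same))  # close to +1: complete positive correlation
print(correlate(ref, inv))   # close to -1: complete negative correlation
```

With real images, interference and noise pull these values toward the ±0.75-0.80 range seen in the tables, but the sign still cleanly separates in-phase from inverted data sources.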
  • In contrast, the recognition distance of at least 200 times the light source length achieved by the optical label of the present invention offers obvious advantages.
  • This long-distance recognition capability is especially suitable for outdoor use. Taking a recognition distance of 200 times as an example, for a light source 50 cm in length set up on a street, anyone within 100 meters of the light source can interact with it through a mobile phone.
  • the solution of the present invention does not require the CMOS imaging device to be located at a fixed distance from the optical tag, nor does it require time synchronization between the CMOS imaging device and the optical tag, and does not require accurate detection of the boundaries and widths of the individual stripes. Therefore, it has extremely strong stability and reliability in actual information transmission.
  • The solution of the present invention also does not require the CMOS imaging device to be precisely aligned with the optical tag for identification, especially for optical tags having strip-shaped, columnar, or spherical light sources. For example, a strip-shaped or columnar optical tag standing in a square can be identified by a CMOS imaging device anywhere within 360° around it; if placed against a wall, it can be identified by a CMOS imaging device within roughly 180° around it; and a spherical optical tag placed in a square can be identified by a CMOS imaging device at any location in the three-dimensional space around it.
  • One embodiment of the present invention is directed to a system for guiding an autonomously movable machine by means of an optical tag, comprising a machine capable of autonomous movement and an optical tag as described in any of the above embodiments.
  • the autonomously movable machine is equipped with a CMOS camera capable of collecting and identifying information transmitted by the optical tag.
  • Buyers can use their apartment as the shipping address and fill in the shipping address information in the online shopping platform, such as some of the following information: geographic location information, cell information, building number, floor, and so on.
  • Buyers can place a light tag at the apartment (such as the balcony, exterior wall, etc. of the apartment) as the target light tag for the drone to deliver the goods.
  • The optical tag may be configured to deliver predetermined information by continuously working in different modes; the predetermined information may be, for example, the ID information of the optical tag itself or the ID information of the buyer on the online shopping platform.
  • the online shopping platform can transmit the predetermined information to the drone.
  • the method for guiding the drone through the optical tag of the present invention can be as shown in FIG. 19, which includes the following steps:
  • Step 101: Control the drone to travel to the vicinity of the target optical tag.
  • After the drone picks up the goods to be sent to the buyer, it can first fly to the buyer's shipping address (i.e., the buyer's apartment).
  • the delivery address may preferably be geographic location information of the target optical tag itself (eg, precise latitude and longitude, height information, etc. of the optical tag), and may also include other information, such as the target optical tag. Orientation information, etc.
  • Step 101 can be implemented in various possible existing ways in the art.
  • the drone can fly to the vicinity of the shipping address (ie, near the buyer's light tag) by means of GPS navigation or the like.
  • Existing GPS navigation can reach an accuracy on the order of tens of meters, while the optical tag of the present invention can be recognized at a distance of at least 200 times the light source length. Taking a recognition distance of 200 times as an example, for a light source 20 cm in length, the drone can achieve recognition as long as it can fly to within 40 meters of the light source.
  • the drone can also be guided to the vicinity of the target optical tag by using the relative positional relationship between the other optical tags and the target optical tag.
  • the relative positional relationship between the individual optical tags can be stored, for example, in advance and can be obtained by the drone.
  • When the drone is flying, it can recognize other optical tags along its flight path and obtain the relative positional relationships between those optical tags and the target optical tag. The drone can then determine, by relative positioning (also called reverse positioning), its own positional relationship to each such optical tag, and from this the relative positional relationship between itself and the target optical tag. Based on that relationship, the drone can be guided to the vicinity of the target optical tag. Those skilled in the art will appreciate that a combination of the various means described above may also be used to guide the drone to the vicinity of the target optical tag.
  • For relative positioning, the drone can use its imaging device to image an optical tag and obtain the relative distance to the optical tag based on the acquired image (the larger the imaging, the closer the distance; the smaller the imaging, the farther the distance). The drone's current orientation information can be obtained from its built-in sensors, and the relative direction between the drone and the optical tag derived from that orientation information (preferably, the position of the optical tag in the image can further be taken into account to determine the relative direction more accurately), so that the relative positional relationship between the drone and the optical tag can be obtained from the relative distance and the relative direction.
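The distance and direction estimates described above can be sketched with a pinhole camera model. The focal length in pixels, field of view, and all numeric values below are illustrative assumptions:

```python
def distance_from_image(source_len_m, image_len_px, focal_px):
    """Pinhole model: distance = real length * focal length / imaged length.
    The larger the imaging, the closer the distance."""
    return source_len_m * focal_px / image_len_px

def bearing_from_image(cx_px, image_w_px, hfov_deg):
    """Horizontal angle of the tag relative to the camera's optical axis,
    from the tag's horizontal position in the frame."""
    return (cx_px / image_w_px - 0.5) * hfov_deg

# A 50 cm source imaged 100 px long, with an assumed 1250 px focal length:
print(distance_from_image(0.5, 100, 1250))      # -> 6.25 (meters)
# Tag centered at x = 1440 in a 1920 px frame with an assumed 60 degree HFOV:
print(bearing_from_image(1440, 1920, 60.0))     # -> 15.0 (degrees right of center)
```

Combining the distance with the bearing (and the drone's own orientation from its sensors) yields the relative position of the optical tag in the drone's frame of reference.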
  • the orientation information of the optical label may be stored in the server. After the user identifies the identification information of the optical label, the orientation information may be obtained from the server using the identification information. Then, based on the orientation information of the optical tag and the perspective distortion of the imaging of the optical tag on the user's mobile phone, the relative direction of the user and the optical tag can be calculated.
  • This step 101 is not an essential step of the present invention and may be omitted in some cases, for example, if the target optical tag is already within the field of view of the drone.
  • Step 102: Collect information transmitted by surrounding optical tags through the CMOS camera installed on the drone, and identify the transmitted information.
  • After flying to the vicinity of the buyer's optical tag, the drone can look for optical tags within its field of view, collect the information transmitted by any discovered optical tag through the CMOS camera installed on it, and identify the transmitted information. For example, the drone can obtain continuous multi-frame images of a certain optical tag through its CMOS camera and determine, for each frame, whether the portion of the image corresponding to the light source position has stripes, or which type of stripes, thereby determining the information represented by each frame. In one embodiment, if the drone finds an optical tag within its field of view but the distance is too great to recognize the information it transmits, the drone can appropriately approach the optical tag so as to recognize the information it conveys.
  • Step 103: Determine whether the optical tag is the target optical tag based on the transmitted information.
  • the drone can determine whether the optical tag is a target optical tag based on information transmitted by the optical tag. For example, the drone can determine whether the predetermined information is explicitly or implicitly included in the transmitted information. If included, it can be determined that the optical tag is the target optical tag, otherwise, it can be determined that the optical tag is not the target optical tag. In one embodiment, it may be up to the drone itself to determine if the optical tag is a target optical tag. In another embodiment, the drone can transmit the information conveyed by the optical tag to a server capable of communicating with the drone, and the server determines whether the optical tag is the target optical tag based on the transmitted information, and The judgment result is sent to the drone.
  • the information transmitted by the optical tag can be encrypted information.
  • Step 104: If the optical tag is the target optical tag, control the drone to travel to the optical tag.
  • the drone can fly to the optical tag without error, for example by visual guidance of the optical tag.
  • the drone can be stopped at a distance from the light tag using existing ranging techniques, such as a position tens of centimeters from the light tag, to avoid collisions with the light tag.
  • The drone can relatively position itself and adjust its flight path based on the perspective distortion of the captured image of the optical tag, so that it eventually stops in a certain direction relative to the optical tag, for example, directly in front of it.
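Steps 101-104 can be summarized as a simple control flow. Every function and class name here (fly_to, scan_tags, decode_tag, approach, StubDrone) is a hypothetical stand-in invented for illustration, not an actual drone API:

```python
def deliver(drone, shipping_address, expected_info, max_scans=10):
    """Guidance flow of steps 101-104 with hypothetical drone operations."""
    drone.fly_to(shipping_address)                 # step 101: coarse GPS navigation
    for _ in range(max_scans):
        for tag in drone.scan_tags():              # step 102: find tags via CMOS camera
            if drone.decode_tag(tag) == expected_info:  # step 103: target check
                drone.approach(tag, stop_m=0.3)    # step 104: visually guided approach
                return True
        drone.search_nearby()                      # no target found yet: widen search
    return False

class StubDrone:
    """Minimal stand-in so the flow can be exercised without hardware."""
    def __init__(self, tags): self.tags, self.log = tags, []
    def fly_to(self, addr): self.log.append(("fly", addr))
    def scan_tags(self): return list(self.tags)
    def decode_tag(self, tag): return tag["info"]
    def approach(self, tag, stop_m): self.log.append(("approach", tag["info"], stop_m))
    def search_nearby(self): pass

drone = StubDrone([{"info": "ID-A"}, {"info": "ID-42"}])
print(deliver(drone, "apt-3F", "ID-42"))  # -> True; log ends with the approach
```

The 0.3 m stopping distance mirrors the "tens of centimeters" standoff mentioned above; a real implementation would replace each stub with the ranging and visual-guidance techniques described in the text.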
  • a shelf for receiving goods may be disposed directly in front of the optical tag, and the drone can easily deliver the goods into the shelf.
  • the drone can identify other optical tags in the vicinity thereof, which is similar to the above process and will not be described again.
  • the drone can determine the relative positional relationship between the optical tag and the target optical tag.
  • The drone can determine its relative positional relationship with that optical tag by relative positioning, can identify the optical tag based on the information it transmits (e.g., obtain the identification information of the optical tag), and can obtain the relative positional relationship between that optical tag and the target optical tag (the relative positional relationships between optical tags can, for example, be pre-stored and obtained by the drone), thereby determining the relative positional relationship between the drone and the target optical tag.
  • the drone can fly to the vicinity of the target light tag using the relative positional relationship and optionally other navigation information (eg, GPS information).
  • the drone delivery scheme of the present invention guided by optical tags is not limited to optical tags disposed at the buyer's apartment (for example, on the balcony or outer wall of the apartment); it is obviously also suitable for optical tags arranged in more open areas, for example, optical tags placed in a courtyard.
  • if the buyer does not have his own optical tag, or wishes the goods to be delivered to a location where another optical tag is located (for example, a public optical tag in a square or park, or an optical tag at a friend's home), he can provide the online shopping platform with the relevant information of the optical tag at the delivery address (i.e., the target optical tag), e.g., the ID information of the target optical tag, its geographic location information, and so on.
  • the online shopping platform can pass the corresponding information to the drone; after flying to the vicinity of the target optical tag, the drone can recognize the information transmitted by nearby optical tags (for example, their transmitted ID information) and thereby finally determine the target optical tag.
  • the drone delivery scheme of the present invention guided by optical tags can be applied not only to optical tags having a fixed position, but also to non-fixed optical tags (for example, an optical tag that can be carried by a person).
  • the optical tag may be configured to transmit predetermined information, which may be, for example, the ID information of the optical tag itself, the buyer's ID on the online shopping platform, or a verification code the buyer receives from the platform after making a purchase, and so on, as long as the predetermined information is known to the online shopping platform and can be used to identify the buyer or the purchased goods.
  • after the drone arrives near the buyer's location, it can identify the information transmitted by nearby optical tags and finally determine the target optical tag (that is, the optical tag carried by the buyer) to complete the delivery of the goods.
  • the online shopping platform can inform the buyer of the drone's estimated arrival time so that the buyer can move freely during this period, as long as he returns to the vicinity of the previous location by the expected arrival time.
  • alternatively, the buyer may not return to the previous location but instead send his new location to the online shopping platform, which can notify the drone of the new location so that the drone can fly to the vicinity of the new location.
  • the buyer may also set the goods delivery address to an address that is expected to arrive at a certain time and instruct the online shopping platform to ship the goods to the address at that time.
  • the drone delivery application for online shopping is taken as an example, but it can be understood that drone guidance through optical tags is not limited to this application; it can be used in any application requiring precise positioning of a drone, such as automatic charging of drones, automatic mooring of drones, drone route navigation, and so on.
  • the optical tag-based guidance of the present invention is not only applicable to the drone, but can also be applied to other types of autonomously movable machines, such as driverless cars, robots, and the like.
  • a CMOS camera can be mounted on a driverless car or robot and can interact with optical tags in a similar manner to drones.
  • a portion of the autonomously movable machine is moveable, but another portion is fixed.
  • the autonomously movable machine may be a machine having a fixed position on an assembly line or in a warehouse; the body of the machine is fixed in most cases but has one or more movable mechanical arms.
  • a CMOS camera can be mounted on a fixed portion of the machine for determining the position of the optical tag so that the movable portion of the machine (eg, a robotic arm) can be directed to the position of the optical tag.
  • the CMOS camera can also be mounted on a movable portion of the machine, for example, on each robotic arm.
  • appearances of the phrases "in various embodiments", "in some embodiments", "in one embodiment", or "in an embodiment" are not necessarily referring to the same embodiment.
  • the particular features, structures, or properties may be combined in any suitable manner in one or more embodiments.
  • the particular features, structures, or properties shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or properties of one or more other embodiments without limitation, as long as the combination is not illogical or non-functional.


Abstract

A system for guiding an autonomous machine. The system comprises: an autonomous machine on which a rolling shutter camera is mounted; and an optical communication device comprising a light source, wherein the light source is configured to operate in at least two modes, the at least two modes comprising a first mode and a second mode. In the first mode, a light source control signal having a first frequency causes an attribute of the light emitted by the light source to change continuously at the first frequency, such that stripes are present on an image of the light source acquired when the rolling shutter camera photographs the light source; in the second mode, no stripes, or stripes different from those in the first mode, are present on an image of the light source acquired when the rolling shutter camera photographs the light source.

Description

对能够自主移动的机器进行导引的系统和方法 System and Method for Guiding an Autonomously Movable Machine

技术领域 Technical Field

本发明涉及对能够自主移动的机器的导引，更具体地涉及一种通过光通信装置对能够自主移动的机器进行导引的系统和方法。The present invention relates to the guidance of machines capable of autonomous movement, and more particularly to a system and method for guiding an autonomously movable machine by means of an optical communication device.

背景技术 Background Art

对于能够自主移动的机器(例如,无人机)的导引,目前通常是通过GPS、IMU等技术来实现,但这些技术的定位精度有限。例如,GPS通常会有若干米甚至几十米的误差,而且,其信号传播会受使用环境的影响,常常会产生延时误差。因此,当前的导航技术通常只能将无人机引导到目标位置的附近(例如,目标位置附近方圆几十米的范围内),但难以将无人机最终引导到一个非常精确的目标位置。The guidance of machines capable of autonomous movement (for example, drones) is currently implemented by technologies such as GPS and IMU, but the positioning accuracy of these technologies is limited. For example, GPS usually has errors of several meters or even tens of meters, and its signal propagation is affected by the environment in which it is used, often resulting in delay errors. Therefore, current navigation techniques typically only direct the drone to a location near the target location (eg, within a few tens of meters of the target location), but it is difficult to ultimately direct the drone to a very precise target location.

近年来，许多厂商都在考虑使用无人机来进行货物配送。例如，在亚马逊的一个专利US9536216B1中介绍了一种无人机货物投递系统，其基于GPS和高度计对无人机进行导航，并可以通过无人机的摄像头进行远程人工辅助导航。但上述系统无法实现无人机的精准导航。在亚马逊公开的另一种方案中，首先通过GPS将无人机引导到目的地附近，然后，无人机会在其视野中寻找一个独特的“标记”，该“标记”是客户在一个良好的着陆地点放置的具有预定图案的电子识别欢迎垫。如果找到了“标记”，无人机便会通过视觉引导来飞到标记处并放下包裹。然而，上述方式需要买家具有一个适合于收货的庭院，并在庭院中放置独特的“标记”。而且，由于该“标记”本身并不能用于区分不同的买家，因此，如果目的地附近有多个买家放置的多个“标记”，无人机便无法确定要将包裹放置到哪个“标记”处。因此，上述方案对于居住在城市公寓中的人而言并不适用。In recent years, many companies have been considering using drones for goods delivery. For example, Amazon's patent US9536216B1 describes a drone cargo delivery system that navigates the drone based on GPS and an altimeter, and allows remote human-assisted navigation through the drone's camera. However, this system cannot achieve precise navigation of the drone. In another scheme disclosed by Amazon, the drone is first guided to the vicinity of the destination by GPS; the drone then looks for a unique "mark" in its field of view, the "mark" being an electronic identification welcome mat with a predetermined pattern placed by the customer at a good landing site. If the "mark" is found, the drone flies to it under visual guidance and drops the package. However, this approach requires the buyer to have a courtyard suitable for receiving goods and to place the unique "mark" in it. Moreover, since the "mark" itself cannot be used to distinguish between different buyers, if there are multiple "marks" placed by multiple buyers near the destination, the drone cannot determine at which "mark" to place the package. Therefore, this scheme is not applicable to people living in urban apartments.

传统的二维码可以用来识别不同的用户，但二维码的识别距离很受限制。例如，对于二维码而言，当用摄像头对其进行扫描时，该摄像头通常必须置于一个比较近的距离内，该距离通常只是二维码的宽度的15倍左右。例如，对于一个宽度为20厘米的二维码，配置有摄像头的无人机需要行进到距离该二维码3米左右时才能够识别出该二维码。因此，对于远距离识别，二维码不能实现，或者必须定制非常大的二维码，但这会带来成本的提升，并且在许多情形下由于其他各种限制（例如空间大小的限制）是不可能实现的。而且，当识别二维码时，摄像头需要大致正对该二维码进行拍摄，如果偏离角度过大将导致无法进行识别。Traditional two-dimensional codes can be used to identify different users, but their recognition distance is very limited. For example, when scanning a two-dimensional code with a camera, the camera must usually be placed within a relatively short distance, generally only about 15 times the width of the code. Thus, for a two-dimensional code 20 cm wide, a drone equipped with a camera needs to travel to within about 3 meters of the code before it can be recognized. Long-distance recognition is therefore impossible, unless a very large two-dimensional code is custom-made, which increases cost and in many cases is infeasible due to various other constraints (for example, space limitations). Moreover, when recognizing a two-dimensional code, the camera needs to face it roughly head-on; too large a deviation angle makes recognition impossible.
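As a quick sanity check of the distances quoted above (an illustrative sketch, not part of the original disclosure), the rule of thumb that the recognition distance is about 15 times the code width gives 3 meters for a 20 cm code:

```python
def qr_recognition_distance(width_m, factor=15):
    """Approximate maximum recognition distance for a two-dimensional code,
    using the ~15x-width rule of thumb quoted in the text."""
    return width_m * factor

d = qr_recognition_distance(0.20)  # a 20 cm wide code -> about 3.0 m
```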

CMOS成像器件是目前广泛采用的成像器件，其如图1所示，包括像敏单元（也称为图像传感器）阵列以及一些其他元件。图像传感器阵列可以是光电二极管阵列，每一个图像传感器对应于一个像素。每一列图像传感器都对应于一个列放大器，列放大器的输出信号之后被送往A/D转换器（ADC）进行模数转换，然后通过接口电路输出。对于图像传感器阵列中的任一图像传感器，在曝光开始时先将其清零，然后等待曝光时间过后，将信号值读出。CMOS成像器件通常采用滚动快门成像方式。在CMOS成像器件中，数据的读出是串行的，所以清零/曝光/读出也只能以类似于流水线的方式逐行顺序进行，并在图像传感器阵列的所有行都处理完成后将其合成为一帧图像。因此，整个CMOS图像传感器阵列实际上是逐行曝光的（在某些情况下CMOS图像传感器阵列也可能采用每次多行一起曝光的方式），这导致了各个行之间存在小的时延。由于该小的时延，当光源以一定频率闪动时，会在CMOS成像器件拍摄的图像上呈现出一些不期望的条纹，影响到拍摄效果。A CMOS imaging device is a widely used imaging device which, as shown in FIG. 1, includes an array of image-sensitive cells (also referred to as image sensors) and some other components. The image sensor array may be a photodiode array, with each image sensor corresponding to one pixel. Each column of image sensors corresponds to a column amplifier; the output signal of the column amplifier is sent to an A/D converter (ADC) for analog-to-digital conversion and then output through an interface circuit. Any image sensor in the array is first cleared at the beginning of exposure, and its signal value is read out after the exposure time has elapsed. CMOS imaging devices typically employ rolling-shutter imaging. In a CMOS imaging device, data readout is serial, so clearing/exposure/readout can only proceed row by row in a pipeline-like manner, and one frame of image is synthesized after all rows of the image sensor array have been processed. Thus, the entire CMOS image sensor array is in fact exposed row by row (in some cases multiple rows may be exposed together at a time), which results in a small delay between rows. Because of this small delay, when the light source flickers at a certain frequency, some undesired stripes appear on the image captured by the CMOS imaging device, affecting the shooting effect.
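The row-by-row delay described above can be put into rough numbers. The sketch below is not from the patent; the 50% duty cycle is an assumption, and the row readout time of about 8.7 microseconds is the 1080p figure quoted later in the text. It estimates how many sensor rows fall inside one bright or dark band when the source flickers at a given frequency:

```python
T_ROW = 8.7e-6  # seconds to read out one sensor row at 1080p (figure from the text)

def stripe_height_rows(flicker_hz, duty=0.5, t_row=T_ROW):
    """Rows read out during one 'on' (or 'off') phase of a source flickering
    at flicker_hz with the given duty cycle; roughly one stripe's height."""
    return (duty / flicker_hz) / t_row

rows = stripe_height_rows(1000.0)  # at 1 kHz, each band spans roughly 57 rows
```

The faster the flicker, the thinner the bands, until they average out within a single row and no stripes are visible.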

人们已经发现了理论上可以利用CMOS成像器件拍摄的图像上的条纹来传递信息（类似于条形码那样），并试图通过条纹来传递尽可能多的信息，但是这通常需要使得CMOS成像器件与光源尽量接近，并最好始终处于大致固定的距离处，并且还需要精细的时间同步、对各个条纹的边界的精确识别、对各个条纹的宽度的精确检测等等，因此，在实践中其稳定性和可靠性并不令人满意，也未获得广泛使用。而且，这种方式显然也不适用于由行进中的无人机来进行远距离识别。It has been found that, in theory, the stripes on an image captured by a CMOS imaging device can be used to convey information (similar to a barcode), and attempts have been made to convey as much information as possible through such stripes. However, this usually requires that the CMOS imaging device be as close as possible to the light source, preferably always at a roughly fixed distance, and also requires fine time synchronization, precise identification of the boundaries of individual stripes, accurate detection of the width of each stripe, and so on. In practice, therefore, the stability and reliability of this approach are unsatisfactory, and it has not been widely used. Moreover, this approach is obviously not suitable for long-distance recognition by a traveling drone.

发明内容 Summary of the Invention

本发明的一个方面涉及一种对能够自主移动的机器进行导引的系统,包括:One aspect of the invention relates to a system for guiding a machine capable of autonomous movement, comprising:

能够自主移动的机器,其上安装有滚动快门摄像头;以及Autonomously movable machine having a rolling shutter camera mounted thereon;

光通信装置,其包括光源,所述光源被配置为能够工作于至少两种模式,所述至少两种模式包括第一模式和第二模式,An optical communication device comprising a light source configured to be operable in at least two modes, the at least two modes comprising a first mode and a second mode,

以及其中，在所述第一模式下，通过具有第一频率的光源控制信号控制所述光源发出的光的属性以第一频率持续变化，以在通过所述滚动快门摄像头对所述光源拍摄时所获得的所述光源的图像上呈现出条纹，在所述第二模式下，在通过所述滚动快门摄像头对所述光源拍摄时所获得的所述光源的图像上不呈现条纹或者呈现出与所述第一模式下的条纹不同的条纹。And wherein, in the first mode, a light source control signal having a first frequency controls an attribute of the light emitted by the light source to change continuously at the first frequency, such that stripes are present on an image of the light source obtained when the light source is photographed by the rolling shutter camera; in the second mode, no stripes, or stripes different from those in the first mode, are present on an image of the light source obtained when the light source is photographed by the rolling shutter camera.

优选地,所述第一模式和所述第二模式用于传递不同的信息。Preferably, the first mode and the second mode are used to convey different information.

优选地，在所述第二模式下，通过具有与所述第一频率不同的第二频率的光源控制信号控制所述光源发出的光的属性以第二频率持续变化，以在通过所述滚动快门摄像头对所述光源拍摄时所获得的所述光源的图像上不呈现条纹或者呈现出与所述第一模式下的条纹不同的条纹。Preferably, in the second mode, a light source control signal having a second frequency different from the first frequency controls an attribute of the light emitted by the light source to change continuously at the second frequency, such that no stripes, or stripes different from those in the first mode, are present on an image of the light source obtained when the light source is photographed by the rolling shutter camera.

优选地,所述第二频率大于所述第一频率。Preferably, the second frequency is greater than the first frequency.

优选地,在所述第二模式下,所述光源发出的光的属性以所述第一频率持续变化,并在通过所述滚动快门摄像头对所述光源拍摄时所获得的所述光源的图像上呈现出与所述第一模式下的条纹不同的条纹。Preferably, in the second mode, an attribute of light emitted by the light source continuously changes at the first frequency, and an image of the light source obtained when the light source is photographed by the rolling shutter camera Stripes that differ from the stripes in the first mode are presented.

本发明的另一个方面涉及一种使用上述系统对能够自主移动的机器进行导引的方法,包括:Another aspect of the invention relates to a method of guiding a machine capable of autonomous movement using the above system, comprising:

通过所述能够自主移动的机器上安装的滚动快门摄像头对周围的某个光通信装置传递的信息进行采集,并识别所传递的信息;Collecting information transmitted by a surrounding optical communication device through a rolling shutter camera mounted on the autonomously movable machine, and identifying the transmitted information;

基于所传递的信息判断所述光通信装置是否是目标光通信装置;以及Determining whether the optical communication device is a target optical communication device based on the transmitted information;

如果所述光通信装置是目标光通信装置,则控制所述能够自主移动的机器或者其部分向所述光通信装置行进。If the optical communication device is a target optical communication device, the autonomously movable machine or a portion thereof is controlled to travel to the optical communication device.

优选地,上述方法还包括:如果所述光通信装置不是目标光通信装置,则:Preferably, the above method further comprises: if the optical communication device is not a target optical communication device, then:

基于该光通信装置传递的信息识别该光通信装置,并获得该光通信装置与所述目标光通信装置之间的相对位置关系;Identifying the optical communication device based on information transmitted by the optical communication device, and obtaining a relative positional relationship between the optical communication device and the target optical communication device;

确定所述能够自主移动的机器或者其部分与该光通信装置之间的相对位置关系；Determining a relative positional relationship between the autonomously movable machine or a portion thereof and the optical communication device;

确定所述目标光通信装置与所述能够自主移动的机器或者其部分之间的相对位置关系;以及Determining a relative positional relationship between the target optical communication device and the autonomously movable machine or a portion thereof;

至少部分地基于所述目标光通信装置与所述能够自主移动的机器或者其部分之间的相对位置关系将所述能够自主移动的机器或者其部分引导向所述目标光通信装置。The autonomously moveable machine or portion thereof is directed to the target optical communication device based at least in part on a relative positional relationship between the target optical communication device and the autonomously movable machine or portion thereof.
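The chain of relative positions described in the preceding paragraphs can be sketched as follows. This is an illustrative sketch only; the tag IDs, the pre-stored offsets, and the simple vector model are assumptions, not the patented implementation:

```python
import numpy as np

# Hypothetical pre-stored relative positions between optical tags, keyed by
# (observed tag, target tag) ID pairs; values are offsets in metres.
TAG_OFFSETS = {("tag_A", "tag_target"): np.array([12.0, -3.0, 0.5])}

def offset_to_target(machine_to_tag, tag_id, target_id):
    """Combine the measured machine->tag offset (from relative positioning)
    with the stored tag->target offset to estimate machine->target."""
    return machine_to_tag + TAG_OFFSETS[(tag_id, target_id)]

# The machine measures a 2 m / 1 m / -0.5 m offset to the recognized tag:
vec = offset_to_target(np.array([2.0, 1.0, -0.5]), "tag_A", "tag_target")
```

The resulting vector can then be fed, possibly together with other navigation information such as GPS, into the machine's path planning.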

优选地，其中，确定所述能够自主移动的机器或者其部分与该光通信装置之间的相对位置关系包括：通过相对定位来确定所述能够自主移动的机器或者其部分与该光通信装置之间的相对位置关系。Preferably, determining the relative positional relationship between the autonomously movable machine or a portion thereof and the optical communication device comprises: determining, by relative positioning, the relative positional relationship between the autonomously movable machine or the portion thereof and the optical communication device.

优选地,其中,通过所述能够自主移动的机器上安装的滚动快门摄像头对周围的某个光通信装置传递的信息进行采集并识别所传递的信息包括:通过所述滚动快门摄像头获得所述光通信装置的连续的多帧图像;针对每一帧图像,判断所述图像上与所述光源的位置对应的部分是否存在条纹或者存在哪种类型的条纹;以及确定每一帧图像所表示的信息。Preferably, wherein collecting, by the rolling shutter camera mounted on the autonomously movable machine, information transmitted by a surrounding optical communication device and identifying the transmitted information comprises: obtaining the light by the rolling shutter camera a continuous multi-frame image of the communication device; for each frame image, determining whether a portion of the image corresponding to the position of the light source has stripes or which type of stripes exist; and determining information represented by each frame image .
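The per-frame stripe decision described above can be sketched as follows. The row-averaging approach and the threshold are illustrative assumptions, not the patent's concrete algorithm:

```python
import numpy as np

def has_stripes(roi, rel_threshold=0.2):
    """roi: 2-D grayscale array covering the imaged light source. Average
    each pixel row and test whether brightness varies strongly across rows
    (stripes present) or stays roughly flat (no stripes)."""
    row_means = roi.mean(axis=1)
    spread = row_means.max() - row_means.min()
    return bool(spread > rel_threshold * max(row_means.mean(), 1e-9))

striped = np.tile(np.array([[200.0], [40.0]]), (10, 8))  # alternating rows
flat = np.full((20, 8), 180.0)                           # uniform region
```

Running this test on the light-source region of each frame yields one symbol per frame, from which the transmitted information can be assembled.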

优选地,上述方法还包括:首先控制所述能够自主移动的机器行进到目标光通信装置附近。Preferably, the above method further comprises: first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device.

优选地，其中，首先控制所述能够自主移动的机器行进到目标光通信装置附近包括：至少部分地通过卫星导航系统将所述能够自主移动的机器引导到所述目标光通信装置附近；和/或，至少部分地利用其他光通信装置与所述目标光通信装置之间的相对位置关系将所述能够自主移动的机器引导到所述目标光通信装置附近。Preferably, first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device comprises: guiding the autonomously movable machine to the vicinity of the target optical communication device at least partially by means of a satellite navigation system; and/or guiding the autonomously movable machine to the vicinity of the target optical communication device at least partially using the relative positional relationship between another optical communication device and the target optical communication device.

优选地,其中,至少部分地利用其他光通信装置与所述目标光通信装置之间的相对位置关系将所述能够自主移动的机器引导到所述目标光通信装置附近包括:Preferably, wherein the at least partially utilizing the relative positional relationship between the other optical communication device and the target optical communication device to direct the autonomously movable machine to the vicinity of the target optical communication device comprises:

所述能够自主移动的机器在行进时识别其他光通信装置,并获得该其他光通信装置与所述目标光通信装置之间的相对位置关系;The autonomously movable machine identifies other optical communication devices while traveling, and obtains a relative positional relationship between the other optical communication devices and the target optical communication device;

确定所述能够自主移动的机器与该其他光通信装置之间的相对位置关系;Determining a relative positional relationship between the autonomously movable machine and the other optical communication device;

确定所述目标光通信装置与所述能够自主移动的机器之间的相对位置关系;以及Determining a relative positional relationship between the target optical communication device and the autonomously movable machine;

至少部分地基于所述目标光通信装置与所述能够自主移动的机器之间的相对位置关系将所述能够自主移动的机器引导到所述目标光通信装置附近。The autonomously moveable machine is directed to the vicinity of the target optical communication device based at least in part on a relative positional relationship between the target optical communication device and the autonomously movable machine.

优选地,其中,基于所传递的信息判断所述光通信装置是否是目标光通信装置包括:判断所传递的信息中是否显式地或隐式地包含预定信息。Preferably, determining whether the optical communication device is the target optical communication device based on the transmitted information comprises: determining whether the transmitted information includes the predetermined information explicitly or implicitly.

优选地,其中,所述预定信息是预定的标识符或验证码。Preferably, wherein the predetermined information is a predetermined identifier or a verification code.

优选地,其中,基于所传递的信息判断所述光通信装置是否是目标光通信装置包括:由所述能够自主移动的机器判断所述光通信装置是否是目标光通信装置;或者,所述能够自主移动的机器将所传递的信息传送到服务器,由所述服务器基于所传递的信息判断所述光通信装置是否是目标光通信装置,并将判断结果发送给所述能够自主移动的机器。Preferably, determining whether the optical communication device is a target optical communication device based on the transmitted information comprises: determining, by the autonomously movable device, whether the optical communication device is a target optical communication device; or The autonomously moving machine transmits the transmitted information to the server, and the server determines whether the optical communication device is the target optical communication device based on the transmitted information, and transmits the determination result to the autonomously movable machine.
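A minimal sketch of the judgment step above. For illustration it assumes that "implicitly included" simply means the predetermined identifier or verification code appears as a substring of the transmitted information; real deployments could use encrypted or otherwise transformed forms, as the text notes:

```python
def is_target_tag(transmitted_info: str, predetermined_info: str) -> bool:
    """True if the predetermined identifier/verification code appears in the
    information transmitted by the optical tag (substring = 'implicit';
    an exact match is the 'explicit' special case)."""
    return predetermined_info in transmitted_info
```

The same function could run either on the machine itself or on a server that receives the transmitted information and returns the verdict.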

本发明的另一个方面涉及一种能够自主移动的机器，包括滚动快门摄像头、处理器和存储器，所述存储器中存储有计算机程序，所述计算机程序在被所述处理器执行时能够用于实现上述的方法。Another aspect of the invention relates to a machine capable of autonomous movement, comprising a rolling shutter camera, a processor, and a memory, the memory storing a computer program which, when executed by the processor, can be used to implement the above method.

本发明的另一个方面涉及一种存储介质,其中存储有计算机程序,所述计算机程序在被执行时能够用于实现上述的方法。Another aspect of the invention relates to a storage medium in which is stored a computer program that, when executed, can be used to implement the method described above.

附图说明 Brief Description of the Drawings

以下参照附图对本发明的实施例作进一步说明,其中:The embodiments of the present invention are further described below with reference to the accompanying drawings, in which:

图1为CMOS成像器件的示意图；FIG. 1 is a schematic view of a CMOS imaging device;

图2为CMOS成像器件获取图像的方向图；FIG. 2 is a diagram of the direction in which a CMOS imaging device acquires an image;

图3为根据本发明的一个实施例的光源；FIG. 3 is a light source in accordance with one embodiment of the present invention;

图4为根据本发明的另一个实施例的光源；FIG. 4 is a light source in accordance with another embodiment of the present invention;

图5为CMOS成像器件的成像时序图；FIG. 5 is an imaging timing chart of a CMOS imaging device;

图6为CMOS成像器件的另一成像时序图；FIG. 6 is another imaging timing chart of a CMOS imaging device;

图7示出了当光源工作于第一模式时在不同阶段在CMOS成像器件上的成像图；FIG. 7 shows images formed on the CMOS imaging device at different stages when the light source operates in the first mode;

图8示出了根据本发明的一个实施例当光源工作于第一模式时CMOS成像器件的成像时序图；FIG. 8 illustrates an imaging timing diagram of the CMOS imaging device when the light source operates in the first mode, in accordance with an embodiment of the present invention;

图9示出了根据本发明的一个实施例当光源工作于第二模式时CMOS成像器件的成像时序图；FIG. 9 illustrates an imaging timing diagram of the CMOS imaging device when the light source operates in the second mode, in accordance with an embodiment of the present invention;

图10示出了根据本发明的另一个实施例当光源工作于第一模式时CMOS成像器件的成像时序图；FIG. 10 illustrates an imaging timing diagram of the CMOS imaging device when the light source operates in the first mode, in accordance with another embodiment of the present invention;

图11示出了根据本发明的另一个实施例的用于实现与图8不同的条纹的CMOS成像器件的成像时序图；FIG. 11 shows an imaging timing diagram of the CMOS imaging device for producing stripes different from those of FIG. 8, in accordance with another embodiment of the present invention;

图12-13示出了在不同设置下获得的光源的两种有条纹图像；FIGS. 12-13 show two striped images of the light source obtained under different settings;

图14示出了获得的光源的一种无条纹图像；FIG. 14 shows a stripe-free image of the light source;

图15是根据本发明的一个实施例的采用三个独立光源的光标签的一个成像图；FIG. 15 is an image of an optical tag employing three separate light sources, in accordance with one embodiment of the present invention;

图16是根据本发明的一个实施例的包括定位标识的光标签的一个成像图；FIG. 16 is an image of an optical tag including a positioning mark, in accordance with one embodiment of the present invention;

图17示出了根据本发明的一个实施例的包括了一个参考光源和两个数据光源的光标签；FIG. 17 illustrates an optical tag including one reference light source and two data light sources, in accordance with one embodiment of the present invention;

图18示出了针对图17所示的光标签的CMOS成像器件的一个成像时序图；以及FIG. 18 shows an imaging timing chart of the CMOS imaging device for the optical tag shown in FIG. 17; and

图19示出了根据本发明的一个实施例的通过光标签进行无人机导引的方法。FIG. 19 illustrates a method of guiding a drone by means of an optical tag, in accordance with one embodiment of the present invention.

具体实施方式 Detailed Description

为了使本发明的目的、技术方案及优点更加清楚明白，以下结合附图通过具体实施例对本发明进行进一步详细说明。In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described in detail below through specific embodiments with reference to the accompanying drawings.

本发明的一个实施例涉及一种光通信装置,其能够通过发出不同的光来传输不同的信息。该光通信装置在本文中也被称为“光标签”,两者在整个本申请中可以互换使用。光通信装置包括光源和控制器,该控制器能够通过光源控制信号来控制所述光源工作于两个或更多个模式,所述两个或更多个模式包括第一模式和第二模式,其中,在所述第一模式下,光源控制信号具有第一频率,使得所述光源发出的光的属性以第一频率持续变化,以传递第一信息,在所述第二模式下,所述光源发出的光的属性以第二频率持续变化或者不发生改变,以传递与第一信息不同的第二信息。One embodiment of the present invention is directed to an optical communication device capable of transmitting different information by emitting different lights. The optical communication device is also referred to herein as a "light tag" and both are used interchangeably throughout this application. The optical communication device includes a light source and a controller capable of controlling the light source to operate in two or more modes by a light source control signal, the two or more modes including the first mode and the second mode, Wherein, in the first mode, the light source control signal has a first frequency such that an attribute of light emitted by the light source continuously changes at a first frequency to deliver first information, and in the second mode, The property of the light emitted by the light source continues to change at the second frequency or does not change to deliver second information that is different from the first information.
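One way to use the two modes for data transfer is to emit one bit per camera frame, driving the source at the first, stripe-producing frequency for a '1' and at a much higher, stripe-free frequency for a '0'. This encoding and the concrete frequency values are assumptions for illustration; the patent does not fix them:

```python
F_STRIPE = 1000.0      # assumed first-mode frequency: slow enough for stripes
F_NO_STRIPE = 64000.0  # assumed second-mode frequency: averages out, no stripes

def frequencies_for_bits(bits):
    """One camera frame per bit: drive the source at F_STRIPE for a '1'
    (stripes visible in that frame) and F_NO_STRIPE for a '0'."""
    return [F_STRIPE if b else F_NO_STRIPE for b in bits]
```

The receiving camera then only needs the per-frame stripe/no-stripe decision, not precise stripe widths, which is what makes the scheme robust at a distance.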

光的属性在本申请中指的是CMOS成像器件能够识别的任何一种属 性,例如其可以是光的强度、颜色、波长等人眼可感知的属性,也可以是人眼不可感知的其他属性,例如在人眼可见范围外的电磁波长的强度、颜色或波长改变,或者是上述属性的任一组合。因此,光的属性变化可以是单个属性发生变化,也可以是两个或更多个属性的组合发生变化。当选择光的强度作为属性时,可以简单地通过选择开启或关闭光源实现。在下文中为了简单起见,以开启或关闭光源来改变光的属性,但本领域技术人员可以理解,用于改变光的属性的其他方式也是可行的。需要说明的是,在上述第一模式中以第一频率变化的光的属性可以与在上述第二模式中以第二频率变化的光的属性相同或不相同。优选地,在所述第一模式和第二模式中发生变化的光的属性是相同的。The attribute of light in this application refers to any property that the CMOS imaging device can recognize, for example, it may be an attribute that the human eye can perceive, such as the intensity, color, and wavelength of light, or other attributes that are not perceptible to the human eye. For example, the intensity, color or wavelength of the electromagnetic wavelength outside the visible range of the human eye changes, or any combination of the above properties. Thus, a change in the properties of light can be a single property change, or a combination of two or more properties can change. When the intensity of the light is selected as an attribute, it can be achieved simply by selecting to turn the light source on or off. In the following, for the sake of simplicity, the light source is turned on or off to change the properties of the light, but those skilled in the art will appreciate that other ways to change the properties of the light are also possible. It should be noted that the attribute of the light varying at the first frequency in the first mode may be the same as or different from the attribute of the light changing at the second frequency in the second mode. Preferably, the properties of the light that change in the first mode and the second mode are the same.

当光源以第一模式或第二模式工作时,可以使用滚动快门成像设备(例如CMOS成像器件或者具有CMOS成像器件的设备(例如手机、平板电脑、智能眼镜等))对光源进行成像,也即,通过滚动快门的方式进行成像。在下文中以手机作为CMOS成像器件为例进行说明,如图2所示。该手机的行扫描方向在图2中示出为垂直方向,但本领域技术人员可以理解,依据底层硬件配置的不同,行扫描方向也可以是水平方向。When the light source operates in the first mode or the second mode, the light source can be imaged using a rolling shutter imaging device such as a CMOS imaging device or a device having a CMOS imaging device (eg, a cell phone, a tablet, smart glasses, etc.), ie , imaging by rolling the shutter. Hereinafter, a mobile phone as a CMOS imaging device will be described as an example, as shown in FIG. 2 . The line scanning direction of the mobile phone is shown as a vertical direction in FIG. 2, but those skilled in the art can understand that the line scanning direction can also be a horizontal direction depending on the underlying hardware configuration.

光源可以是各种形式的光源,只要其某一可被CMOS成像器件感知的属性能够以不同频率进行变化即可。例如,该光源可以是一个LED灯、由多个LED灯构成的阵列、显示屏幕或者其中的一部分,甚至光的照射区域(例如光在墙壁上的照射区域)也可以作为光源。该光源的形状可以是各种形状,例如圆形、正方形、矩形、条状、L状、十字状、球状等。光源中可以包括各种常见的光学器件,例如导光板、柔光板、漫射器等。在一个优选实施例中,光源可以是由多个LED灯构成的二维阵列,该二维阵列的一个维度长于另外一个维度,优选地,两者之间的比例约为6-12:1。例如,该LED灯阵列可以由排成一列的多个LED灯构成。在发光时,该LED灯阵列可以呈现为一个大致为长方形的光源,并由控制器控制该光源的操作。The light source can be a light source of various forms as long as one of its properties that can be perceived by the CMOS imaging device can be varied at different frequencies. For example, the light source may be an LED light, an array of a plurality of LED lights, a display screen or a part thereof, and even an illuminated area of light (for example, an illuminated area of light on a wall) may also serve as a light source. The shape of the light source may be various shapes such as a circle, a square, a rectangle, a strip, an L shape, a cross shape, a spherical shape, or the like. Various common optical devices can be included in the light source, such as a light guide plate, a soft plate, a diffuser, and the like. In a preferred embodiment, the light source may be a two-dimensional array of a plurality of LED lamps, one dimension of which is longer than the other dimension, preferably a ratio of between about 6-12:1. For example, the LED light array can be composed of a plurality of LED lamps arranged in a row. The LED light array can be rendered as a substantially rectangular light source when illuminated, and the operation of the light source is controlled by a controller.

图3示出了根据本发明的一个实施例的光源。在使用CMOS成像器件对图3所示的光源进行成像时,优选地使图3所示的光源的长边与CMOS成像器件的行方向(例如,图2所示的手机的行扫描方向)垂直或大致垂 直,以在其他条件相同的情况下成像出尽量多的条纹。然而,有时用户并不了解其手机的行扫描方向,为了保证手机在各种姿态下都能够进行识别,并且在竖屏和横屏下都能够达到最大的识别距离,光源可以为多个长方形的组合,例如,如图4所示的L状光源。Figure 3 illustrates a light source in accordance with one embodiment of the present invention. When imaging the light source shown in FIG. 3 using a CMOS imaging device, it is preferable to make the long side of the light source shown in FIG. 3 perpendicular to the row direction of the CMOS imaging device (for example, the line scanning direction of the mobile phone shown in FIG. 2) Or roughly vertical, to image as many stripes as possible under otherwise identical conditions. However, sometimes the user does not understand the line scanning direction of the mobile phone. In order to ensure that the mobile phone can recognize in various postures, and the maximum recognition distance can be achieved under both the vertical screen and the horizontal screen, the light source can be a plurality of rectangular shapes. Combination, for example, an L-shaped light source as shown in FIG.

In another embodiment, the light source is not limited to a planar light source but may be implemented as a three-dimensional light source, for example a strip-shaped cylindrical light source, a cubic light source, and so on. Such a light source may, for example, be placed in a public square, or suspended at roughly the center of an indoor venue (such as a restaurant or conference room), so that nearby users in all directions can photograph the light source with their mobile phones and thereby obtain the information it conveys.

Figure 5 shows an imaging timing diagram of a CMOS imaging device, in which each row corresponds to one row of sensors of the CMOS imaging device. The imaging of each row of the CMOS sensor array mainly involves two stages: the exposure time and the readout time. The exposure times of different rows may overlap, but the readout times do not.

It should be noted that Figure 5 schematically shows only a small number of rows; an actual CMOS imaging device typically has thousands of rows of sensors, depending on the resolution. For example, 1080p resolution has 1920×1080 pixels, where 1080 indicates 1080 scan rows and 1920 indicates 1920 pixels per row. At 1080p resolution, the readout time of each row is approximately 8.7 microseconds (i.e., 8.7×10⁻⁶ seconds).
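The timing figures above can be checked with a short calculation. This is a sketch using only the numbers quoted in the text; the frame-rate figure is an upper bound that ignores vertical blanking, which a real sensor adds between frames:

```python
# Rolling-shutter timing arithmetic for a 1080p CMOS sensor,
# using the per-row readout time quoted in the text (8.7 microseconds).
ROWS = 1080            # scan rows at 1080p
ROW_READOUT_US = 8.7   # readout time per row, in microseconds

# Total time to read out one full frame (ignoring vertical blanking).
frame_readout_us = ROWS * ROW_READOUT_US
print(f"frame readout: {frame_readout_us / 1000:.2f} ms")    # frame readout: 9.40 ms

# Readout-limited frame rate; a real device is slower (e.g. 30 fps)
# because of blanking and processing overhead.
max_fps = 1e6 / frame_readout_us
print(f"readout-limited frame rate: {max_fps:.0f} fps")      # readout-limited frame rate: 106 fps
```

This makes concrete why a 30 frames-per-second capture rate, as used later in the text, is comfortably within the sensor's readout capability.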

If the exposure time is too long, the exposure times of adjacent rows overlap substantially, and the image may then show stripes with gradual transitions, for example several rows of differing gray levels between a purely black pixel row and a purely white pixel row. The present invention aims to produce pixel rows that are as sharply delineated as possible. To this end, the exposure time of the CMOS imaging device (for example a mobile phone) can be set or adjusted (for example, through an APP installed on the phone) so as to select a relatively short exposure time. In a preferred embodiment, the exposure time may be made approximately equal to or less than the readout time of each row. Taking 1080p resolution as an example, the readout time of each row is approximately 8.7 microseconds; in this case, the exposure time of the phone may be adjusted to approximately 8.7 microseconds or less. Figure 6 shows the imaging timing diagram of the CMOS imaging device in this case. Here, the exposure times of the rows essentially do not overlap, or overlap only slightly, so that stripes with relatively sharp boundaries, which are easier to recognize, can be obtained in the image. It should be noted that Figure 6 is only a preferred embodiment of the present invention; longer exposure times (for example, equal to or less than two, three, or four times the readout time of each row) or shorter exposure times are also feasible. For example, in the imaging of the striped images shown in Figures 12 and 13 of this application, the readout time of each row is approximately 8.7 microseconds, while the exposure time of each row is set to 14 microseconds. In addition, in order for stripes to appear, the duration of one period of the light source may be set to approximately twice the exposure time or longer, and preferably to approximately four times the exposure time or longer.

Figure 7 shows the images formed on the CMOS imaging device at different stages when the controller operates the light source in a first mode, in which a property of the light emitted by the light source is changed at a certain frequency; in this example, the light source is turned on and off.

The upper part of Figure 7 shows the state changes of the light source at the different stages, and the lower part shows the corresponding images of the light source on the CMOS imaging device, where the row direction of the CMOS imaging device is vertical and scanning proceeds from left to right. Since the CMOS imaging device captures images row by row, when a high-frequency flicker signal is captured, the part of the captured frame corresponding to the imaging position of the light source forms stripes as shown in the lower part of Figure 7. Specifically, in period 1 the light source is on, and the leftmost scan rows exposed during this period form a bright stripe; in period 2 the light source is off, and the scan rows exposed during this period form a dark stripe; in period 3 the light source is on, and the scan rows exposed during this period form a bright stripe; in period 4 the light source is off, and the scan rows exposed during this period form a dark stripe.

The frequency at which the light source flickers, or the duration of each on and off period of the light source, can be set via the light source control signal in order to adjust the width of the resulting stripes; longer on or off durations generally correspond to wider stripes. For example, in the situation shown in Figure 6, if both the on duration and the off duration of the light source are set approximately equal to the exposure time of each row of the CMOS imaging device (which can be set through an APP installed on the phone, or set manually), stripes only one pixel wide can appear in the image. To enable long-distance recognition of the optical label, the narrower the stripes, the better. In practice, however, due to light interference, synchronization issues, and so on, stripes only one pixel wide may be unstable or difficult to recognize; therefore, to improve the stability of recognition, stripes two pixels wide are preferably implemented. For example, for the situation shown in Figure 6, stripes approximately two pixels wide can be obtained by setting both the on duration and the off duration of the light source to approximately twice the exposure time of each row of the CMOS imaging device, as shown in Figure 8, where the signal in the upper part of Figure 8 is the light source control signal, whose high level corresponds to the light source being on and whose low level corresponds to it being off. In the embodiment shown in Figure 8, the duty cycle of the light source control signal is set to about 50% and the exposure time of each row is set approximately equal to the readout time of each row, but those skilled in the art will understand that other settings are also feasible, as long as distinguishable stripes can be produced. For simplicity of description, Figure 8 assumes synchronization between the light source and the CMOS imaging device, so that the on and off times of the light source roughly correspond to the start or end of the exposure of some row of the CMOS imaging device. However, those skilled in the art will understand that even without the synchronization of Figure 8, distinct stripes still appear on the CMOS imaging device. In that case there may be some transition stripes, but there must be rows exposed while the light source is off throughout (i.e., the darkest stripes) and rows exposed while the light source is on throughout (i.e., the brightest stripes), the two separated by one pixel. Such brightness variations between pixel rows (i.e., stripes) can easily be detected (for example, by comparing the brightness or gray levels of some pixels within the imaging region of the light source). Furthermore, even if there are no rows exposed while the light source is off throughout (the darkest stripes) or rows exposed while the light source is on throughout (the brightest stripes), stripes can still be detected as long as there are rows for which the on duration t1 of the light source within the exposure time is less than a certain length of time, or a relatively small fraction of the whole exposure time (i.e., darker stripes), and rows for which the on duration t2 of the light source within the exposure time is greater than a certain length of time, or a relatively large fraction of the whole exposure time (i.e., brighter stripes), with t2−t1 greater than a light/dark stripe difference threshold (for example 10 microseconds), or t2/t1 greater than a light/dark stripe ratio threshold (for example 2); the brightness variations between such pixel rows can likewise be detected. The above difference and ratio thresholds are related to the luminous intensity of the optical label, the properties of the photosensitive device, the shooting distance, and so on. Those skilled in the art will understand that other thresholds are also feasible, as long as computer-distinguishable stripes are produced. When stripes are recognized, the information conveyed by the light source at that moment, for example binary data 0 or data 1, can be determined.
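The row-brightness comparison just described can be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation: the patent's t1/t2 thresholds operate on on-durations within the exposure, whereas this sketch applies the analogous brightness-ratio threshold (assumed here to be 2) directly to the mean gray level of each pixel row in the light source's imaging region:

```python
import numpy as np

def has_stripes(region, ratio_threshold=2.0):
    """Decide whether the imaging region of the light source shows
    light/dark stripes, by comparing mean brightness across pixel rows.

    region: 2-D array (rows x cols) of gray levels, with rows along the
    sensor's scan direction. ratio_threshold plays the role of the
    light/dark stripe ratio threshold mentioned in the text."""
    row_means = region.mean(axis=1)
    darkest = row_means.min()
    brightest = row_means.max()
    if darkest <= 0:                 # avoid dividing by zero
        return brightest > 0
    return brightest / darkest >= ratio_threshold

# A striped region: alternating bright (200) and dark (40) pixel rows,
# two rows per stripe, as in the two-pixel-wide stripes described above.
striped = np.array([[200] * 8, [200] * 8, [40] * 8, [40] * 8] * 3)
# An unstriped region: uniform brightness (second mode, no stripes).
uniform = np.full((12, 8), 120)

print(has_stripes(striped))   # True  -> e.g. binary data 0
print(has_stripes(uniform))   # False -> e.g. binary data 1
```

A real decoder would first segment the light source's imaging region out of the frame, as described below; the thresholds would then be tuned to the label's intensity and the shooting distance, as the text notes.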

A stripe recognition method according to one embodiment of the present invention is as follows: obtain an image of the optical label and segment out the imaging region of the light source by projection; collect striped and non-striped images under different configurations (for example, different distances, different light source flicker frequencies, etc.); normalize all collected images to a specific size, for example 64×16 pixels; extract each pixel as an input feature and build a machine learning classifier; and perform binary classification to decide whether an image is striped or non-striped. For stripe recognition, those of ordinary skill in the art may also use any other method known in the art, which is not described in detail here.
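The pipeline just described (normalize to 64×16, one feature per pixel, binary classifier) can be sketched with a minimal logistic-regression classifier trained on synthetic patches. The synthetic data generator, the fixed stripe period and phase, the plain gradient-descent training loop, and all parameter values below are illustrative assumptions, not the patent's classifier:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 16          # normalized patch size from the text (64x16 pixels)

def make_patch(striped):
    """Synthetic light-source patch: two-pixel-wide horizontal stripes
    with a fixed phase, or a uniform patch of random brightness."""
    if striped:
        rows = (np.arange(H) // 2) % 2                 # alternating 2-row bands
        img = np.where(rows, 0.8, 0.2)[:, None] * np.ones((H, W))
    else:
        img = np.full((H, W), rng.uniform(0.2, 0.8))   # no stripes
    return img + rng.normal(0.0, 0.05, (H, W))         # sensor noise

# Labelled training set: each pixel is one input feature.
X = np.array([make_patch(i % 2 == 0).ravel() for i in range(200)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(200)])

# Minimal logistic regression trained by batch gradient descent.
w, b = np.zeros(H * W), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid
    grad = p - y
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

def is_striped(patch):
    return (patch.ravel() @ w + b) > 0

print(is_striped(make_patch(True)))    # expect True  (striped)
print(is_striped(make_patch(False)))   # expect False (non-striped)
```

In practice the training images would be real photographs collected at different distances and flicker frequencies, as the text specifies, and any off-the-shelf classifier could stand in for the hand-rolled one here.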

For a strip-shaped light source 5 centimeters in length, when a mobile phone currently common on the market is used with the resolution set to 1080p and the photograph is taken from 10 meters away (i.e., a distance of 200 times the length of the light source), the strip-shaped light source occupies approximately 6 pixels along its length direction. If each stripe is 2 pixels wide, at least one distinct stripe appears within those 6 pixels, and it can easily be recognized. If a higher resolution is set, or optical zoom is used, stripes can be recognized at even greater distances, for example at a distance of 300 or 400 times the length of the light source.
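The pixel-occupancy figure above can be approximated with a small pinhole-camera calculation. The horizontal field of view assumed below (70°) is an illustrative guess at a typical phone camera, not a value given in the text, so the result only roughly matches the approximately 6 pixels quoted:

```python
import math

def pixels_occupied(source_len_m, distance_m, sensor_px=1920, fov_deg=70.0):
    """Approximate number of pixels a light source of the given length
    occupies, under a pinhole-camera model. fov_deg is an assumed
    typical phone horizontal field of view, not a value from the text."""
    half_fov = math.radians(fov_deg / 2)
    scene_width_m = 2 * distance_m * math.tan(half_fov)  # width imaged at that distance
    return source_len_m / scene_width_m * sensor_px

# A 5 cm strip light source photographed from 10 m (200x its length)
# at 1080p: close to the ~6 pixels quoted above.
px = pixels_occupied(0.05, 10.0)
print(f"{px:.1f} pixels")
```

Doubling the distance halves the pixel count under this model, which is why higher resolution or optical zoom is needed to reach 300 or 400 times the source length.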

The controller may also operate the light source in a second mode. In one embodiment, in the second mode the light source control signal may have another frequency, different from that of the first mode, for changing a property of the light emitted by the light source, for example turning the light source on and off. In one embodiment, the controller may increase the on/off frequency of the light source in the second mode compared with the first mode. For example, the frequency of the first mode may be greater than or equal to 8000 times per second, and the frequency of the second mode may be greater than the frequency of the first mode. For the situation shown in Figure 6, the light source may be configured to turn on and off at least once within the exposure time of each row of the CMOS imaging device. Figure 9 shows the case in which the light source turns on and off exactly once within the exposure time of each row, where the signal in the upper part of Figure 9 is the light source control signal, whose high level corresponds to the light source being on and whose low level corresponds to it being off. Since the light source turns on and off once, in the same manner, within the exposure time of every row, the exposure energy captured during each row's exposure is roughly equal; there is therefore no significant difference in brightness between the pixel rows of the final image of the light source, and hence no stripes. Those skilled in the art will understand that higher on/off frequencies are also feasible. In addition, for simplicity of description, Figure 9 assumes synchronization between the light source and the CMOS imaging device, so that the turn-on time of the light source roughly corresponds to the start of the exposure of some row of the CMOS imaging device; however, those skilled in the art will understand that even without the synchronization of Figure 9, there is no significant difference in brightness between the pixel rows of the final image of the light source, and thus no stripes. When no stripes are recognized, the information conveyed by the light source at that moment, for example binary data 1 or data 0, can be determined. To the human eye, the light source of the present invention does not exhibit any perceptible flicker when operating in either the first mode or the second mode. Moreover, to avoid flicker that the human eye might perceive when switching between the first mode and the second mode, the duty cycles of the first mode and the second mode may be set approximately equal, so that the luminous flux is roughly the same in the different modes.

In another embodiment, in the second mode a direct current may be supplied to the light source, so that the light source emits light whose properties are essentially unchanging; consequently, no stripes appear in a frame of the image of the light source captured by the CMOS image sensor. In this case, too, roughly the same luminous flux can be achieved in the different modes, to avoid flicker that the human eye might perceive when switching between the first mode and the second mode.

Figure 8 above describes an embodiment in which stripes are produced by varying the intensity of the light emitted by the light source (for example, by turning the light source on and off). In another embodiment, as shown in Figure 10, stripes may also be produced by varying the wavelength or color of the light emitted by the light source. In the embodiment shown in Figure 10, the light source includes a red lamp that emits red light and a blue lamp that emits blue light. The two signals in the upper part of Figure 10 are a red-light control signal and a blue-light control signal, respectively, where a high level corresponds to the corresponding lamp being on and a low level corresponds to it being off. The red-light control signal and the blue-light control signal are 180° out of phase, that is, their levels are opposite. With these two control signals, the light source alternately emits red light and blue light, so that red and blue stripes appear when the light source is imaged by a CMOS imaging device.

By determining whether stripes are present in the part of a frame captured by the CMOS imaging device that corresponds to the light source, the information conveyed by each frame, for example binary data 1 or data 0, can be determined. Further, by capturing consecutive frames of the light source with the CMOS imaging device, an information sequence consisting of binary 1s and 0s can be determined, realizing information transfer from the light source to the CMOS imaging device (for example a mobile phone). In one embodiment, when consecutive frames of the light source are captured by the CMOS imaging device, the controller may control the switching interval between the operating modes of the light source to be equal to the duration of one complete frame of the CMOS imaging device, thereby achieving frame synchronization between the light source and the imaging device, i.e., transmitting 1 bit of information per frame. At a shooting speed of 30 frames per second, 30 bits of information can be transferred per second, giving an encoding space of 2³⁰. The information may include, for example, a start-of-frame marker (frame header), the ID of the optical label, a password, a verification code, URL information, address information, a timestamp, or various combinations thereof. The order of these pieces of information can be arranged in a structured manner to form a packet structure. Each time a complete packet structure is received, a complete set of data (one packet) is considered to have been obtained, and the data can then be read and checked. The table below shows a packet structure according to one embodiment of the present invention:

| Frame header | Attribute (8 bit) | Data bits (32 bit) | Check bits (8 bit) | Frame tail |
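The packet layout in the table can be exercised with a small encode/decode sketch. The header and tail bit patterns and the XOR checksum below are illustrative assumptions; the text fixes only the field order and widths (8-bit attribute, 32-bit data, 8-bit check):

```python
# Sketch of the packet structure from the table: frame header, 8-bit
# attribute, 32-bit data, 8-bit check, frame tail. One bit is conveyed
# per captured frame, so one packet spans 56 frames.
HEADER = [1, 1, 1, 0]   # assumed framing pattern
TAIL = [0, 1, 1, 1]     # assumed framing pattern

def to_bits(value, width):
    return [(value >> i) & 1 for i in range(width - 1, -1, -1)]

def from_bits(bits):
    out = 0
    for bit in bits:
        out = (out << 1) | bit
    return out

def checksum(attribute, data):
    """Assumed 8-bit check: XOR of the attribute and data bytes."""
    return (attribute ^ (data & 0xFF) ^ ((data >> 8) & 0xFF)
            ^ ((data >> 16) & 0xFF) ^ ((data >> 24) & 0xFF))

def encode_packet(attribute, data):
    return (HEADER + to_bits(attribute, 8) + to_bits(data, 32)
            + to_bits(checksum(attribute, data), 8) + TAIL)

def decode_packet(bits):
    """Return (attribute, data) if framing and check pass, else None."""
    if bits[:4] != HEADER or bits[-4:] != TAIL:
        return None
    attribute = from_bits(bits[4:12])
    data = from_bits(bits[12:44])
    check = from_bits(bits[44:52])
    return (attribute, data) if check == checksum(attribute, data) else None

bits = encode_packet(0x5A, 0xDEADBEEF)
print(len(bits))           # 56 -> just under 2 seconds at 30 fps
print(decode_packet(bits)) # (90, 3735928559)
```

A corrupted bit anywhere in the attribute or data fields makes the check fail, so the receiver simply waits for the next repetition of the packet.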

In the description above, the information conveyed by a frame is determined by judging whether stripes are present at the imaging position of the light source in that frame. In other embodiments, different information conveyed by a frame may be determined by recognizing different stripes at the imaging position of the light source in that frame. For example, in the first mode, a light source control signal having a first frequency drives a property of the emitted light to vary continuously at the first frequency, so that a first kind of stripe appears in the image of the light source captured by the CMOS image sensor; in the second mode, a light source control signal having a second frequency drives the property of the emitted light to vary continuously at the second frequency, so that a second kind of stripe, different from the first, appears in the image of the light source captured by the CMOS image sensor. The stripes may differ, for example, in width, color, brightness, etc., or any combination thereof, as long as the difference can be recognized.

In one embodiment, stripes of different widths can be produced using light source control signals of different frequencies. For example, in the first mode the light source may operate as shown in Figure 8, producing a first kind of stripe approximately two pixels wide; in the second mode, the durations of the high level and the low level in each period of the light source control signal of Figure 8 may each be doubled, as shown in Figure 11, producing a second kind of stripe approximately four pixels wide.

In another embodiment, stripes of different colors can be produced. For example, the light source may include a red lamp that emits red light and a blue lamp that emits blue light. In the first mode, the blue lamp may be turned off while the red lamp operates as shown in Figure 8, producing red and black stripes; in the second mode, the red lamp may be turned off while the blue lamp operates as shown in Figure 8, producing blue and black stripes. In the above embodiment, the red/black and blue/black stripes are produced using light source control signals of the same frequency in the first and second modes, but it will be understood that control signals of different frequencies may also be used in the first and second modes.

In addition, those skilled in the art will understand that more than two kinds of information can be represented by producing more than two kinds of stripes. For example, in the embodiment above in which the light source includes a red lamp and a blue lamp, a third mode may additionally be provided, in which the red and blue lamps are controlled in the manner shown in Figure 10 to produce red and blue stripes, i.e., a third kind of information. Clearly, a further kind of information, i.e., a fourth kind, may optionally be conveyed through a fourth mode in which no stripes are produced. Any combination of the above four modes may be selected for information transfer, and other modes may be added as well, as long as different modes produce different stripe patterns.

Figure 12 shows the stripes on an image obtained experimentally for an LED lamp flickering 16,000 times per second (each period lasting 62.5 microseconds, with the on duration and the off duration each approximately 31.25 microseconds), using an imaging device at 1080p resolution with the exposure time of each row set to 14 microseconds. As can be seen from Figure 12, stripes approximately 2-3 pixels wide appear. Figure 13 shows the stripes on an image obtained experimentally after the flicker frequency of the LED lamp of Figure 12 is adjusted to 8000 times per second (each period lasting 125 microseconds, with the on duration and the off duration each approximately 62.5 microseconds), with other conditions unchanged. As can be seen from Figure 13, stripes approximately 5-6 pixels wide appear. Figure 14 shows the image obtained experimentally after the flicker frequency of the LED lamp of Figure 12 is adjusted to 64,000 times per second (each period lasting 15.6 microseconds, with the on duration and the off duration each approximately 7.8 microseconds), with other conditions unchanged; no stripes appear on this image, because the 14-microsecond exposure time of each row essentially covers one full on duration and one full off duration of the LED lamp.
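These experimental stripe widths are roughly consistent with a simple estimate: the rows exposed during one half-period of the flicker form one stripe, so the stripe width in rows is about the half-period divided by the per-row readout time, and stripes vanish once the exposure time covers a full flicker period. The calculation below uses only figures quoted in the text; it ignores the blurring effect of the 14-microsecond exposure, which is presumably why the estimates come out slightly above the observed widths:

```python
ROW_READOUT_US = 8.7   # per-row readout time at 1080p (from the text)
EXPOSURE_US = 14.0     # per-row exposure time used in the experiments

def stripe_width_rows(flicker_hz):
    """Rough stripe width in pixel rows: one half-period of the flicker
    spread over consecutive row readouts."""
    period_us = 1e6 / flicker_hz
    return (period_us / 2) / ROW_READOUT_US

for hz in (16000, 8000, 64000):
    width = stripe_width_rows(hz)
    # Stripes narrower than one row, or an exposure spanning a whole
    # on+off cycle, leave no resolvable stripes (the Figure 14 case).
    resolvable = width >= 1 and EXPOSURE_US < 1e6 / hz
    print(f"{hz} Hz: ~{width:.1f} rows per stripe, visible: {resolvable}")
```

The estimates of roughly 3.6 and 7.2 rows bracket the observed 2-3 and 5-6 pixel widths for Figures 12 and 13, and the sub-pixel width at 64,000 Hz matches the absence of stripes in Figure 14.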

In the above, for convenience of description, a square wave was used as an example of a light source control signal having a corresponding frequency, but those skilled in the art will understand that other waveforms, such as a sine wave or a triangular wave, may also be used for the light source control signal.

The above describes the use of a single light source; in some embodiments, two or more light sources may also be used, with the controller controlling the operation of each light source independently. Figure 15 is an image of an optical label employing three independent light sources according to one embodiment of the present invention, in which stripes appear at the imaging positions of two of the light sources and no stripes appear at the imaging position of the third; this frame of the image of the group of light sources can be used to convey information, for example binary data 110.

In one embodiment, the optical label may further include one or more positioning marks located near the information-transfer light source. A positioning mark may, for example, be a lamp of a particular shape or color that remains constantly lit during operation. The positioning marks can help a user of a CMOS imaging device (such as a mobile phone) find the optical label easily. Moreover, when the CMOS imaging device is set to a mode for photographing the optical label, the image of a positioning mark is relatively conspicuous and easy to recognize. Therefore, one or more positioning marks arranged near the information-transfer light source can also help the phone quickly determine the position of the information-transfer light source, and thereby help determine whether stripes are present in the imaging region corresponding to the information-transfer light source. In one embodiment, when determining whether stripes are present, the positioning marks may first be recognized in the image, so as to find the approximate position of the optical label. After the positioning marks are recognized, one or more regions of the image covering the imaging position of the information-transfer light source can be determined based on the relative positional relationship between the positioning marks and the information-transfer light source. These regions can then be examined to determine whether stripes are present, and if so, what kind. Figure 16 is an image of an optical label including positioning marks according to one embodiment of the present invention, comprising three horizontally arranged information-transfer light sources and two vertically arranged positioning lamps located on either side of the information-transfer light sources.

In one embodiment, the optical label may include an ambient light detection circuit, which may be used to detect the intensity of the ambient light. The controller may then adjust the intensity of the light emitted by the light source when it is on, based on the detected ambient light intensity. For example, when the ambient light is relatively strong (for example in the daytime), the light source emits light of relatively high intensity; when the ambient light is relatively weak (for example at night), the light source emits light of relatively low intensity.

In one embodiment, the optical label may include an ambient light detection circuit, which may be used to detect the frequency of the ambient light. The controller may then adjust the frequency of the light emitted by the light source when it is on, based on the detected ambient light frequency. For example, when the ambient light contains a flickering source at the same frequency, the light emitted by the light source is switched to another, unoccupied frequency.

In a real application environment, heavy noise, or a very long recognition distance, may degrade recognition accuracy. To improve accuracy, in one embodiment of the present invention the optical tag includes, besides the light source used to transfer information described above (for clarity, hereinafter called the "data light source"), at least one reference light source. The reference light source does not itself carry information; it assists in recognizing the information carried by the data light sources. It can be physically similar to a data light source, but it operates in a predetermined working mode, which may be one or more of the data light sources' working modes. In this way, decoding a data light source can be turned into a matching computation (for example, a correlation) against the image of the reference light source, which improves decoding accuracy.
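The matching computation mentioned here could be sketched as a Pearson correlation, under the simplifying assumption that each light source's image is summarized as a one-dimensional column of pixel intensities (function names and the decoding rule are illustrative, not from the patent):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length intensity sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def decode_by_reference(data_col, ref_cols):
    """Match a data-source column against candidate reference columns and
    return the index of the best-matching working mode (largest |r|)."""
    scores = [pearson(data_col, r) for r in ref_cols]
    return max(range(len(scores)), key=lambda i: abs(scores[i]))
```

A noisy column resembling the reference pattern of one mode will correlate strongly with it and only weakly with the others, so the decoder is robust to moderate noise.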

Fig. 17 shows an optical tag according to one embodiment of the present invention that includes one reference light source and two data light sources. Three light sources are arranged side by side: the first serves as the reference light source, and the other two serve as the first data light source and the second data light source, respectively. Note that the number of reference light sources in an optical tag may be one or more, not limited to one; likewise, the number of data light sources may be one or more, not limited to two. In addition, because the reference light source only provides assistance for recognition, its shape and size need not match those of the data light sources. For example, in one implementation the reference light source may be half the length of a data light source.

In one embodiment, each of the first and second data light sources shown in Fig. 17 is configured to operate in three modes, producing, for example, a stripe-free image, an image with 2-pixel-wide stripes, or an image with 4-pixel-wide stripes. The reference light source may be configured either to operate constantly in one of the three modes, displaying one of these three images, or to alternate between modes, displaying any two or all three of the images alternately in different frames, thereby providing a comparison baseline or reference for recognizing the data light sources' images. Take as an example a reference light source that alternates, frame by frame, between an image with 2-pixel stripes and an image with 4-pixel stripes. The image of a data light source in each frame can be compared against the reference light source's images in the current frame and one adjacent frame (for example, the preceding or following frame) — between them these necessarily include both a 2-pixel-stripe image and a 4-pixel-stripe image — to determine its type. Alternatively, consecutive frames of the reference light source can be collected over a period, with the odd-numbered and even-numbered frames treated as two groups; the features of each group are averaged (for example, the mean stripe width of each group), and the stripe widths indicate which group corresponds to the 2-pixel-stripe images and which to the 4-pixel-stripe images, yielding an average feature for each. The image of a data light source in each frame can then be tested against these average features.
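The odd/even grouping and averaging described above can be sketched as follows, assuming a per-frame stripe-width estimate for the alternating reference light source is already available (a toy model; in practice the averaged feature could be any image statistic):

```python
def average_by_parity(stripe_widths):
    """Group per-frame stripe-width estimates of the alternating reference
    source by frame parity and return (even-frame mean, odd-frame mean)."""
    even = stripe_widths[0::2]
    odd = stripe_widths[1::2]
    return sum(even) / len(even), sum(odd) / len(odd)
```

With the reference source alternating 2-pixel and 4-pixel stripes, one group's mean settles near 2 and the other's near 4, identifying which parity carries which mode.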

Because the reference light source sits in essentially the same location as the data light sources and experiences the same ambient lighting conditions, interference, noise, and so on, it can provide, in real time, one or more baseline or reference images for image recognition, improving the accuracy and stability of recognizing the information carried by the data light sources. For example, the working mode of a data light source, and hence the data it carries, can be identified accurately by comparing its imaging with that of the reference light source.

Further, by the imaging principle of CMOS sensors, when several light sources vary their attributes at the same frequency but different phases, they produce stripe patterns of the same width but different phases, and such patterns can be distinguished accurately with a matching method. In one embodiment, the reference light source can be controlled to operate in a predetermined working mode in which its image shows, for example, 4-pixel-wide stripes. If a data light source is simultaneously controlled to operate in that mode, in phase with the reference light source, its image shows stripes similar to those of the reference light source (for example, also 4 pixels wide) with no phase difference. If it is simultaneously controlled to operate in the same mode but out of phase with the reference light source (for example, inverted, i.e. 180° apart), its image shows similar stripes (for example, again 4 pixels wide) but with a phase difference.

Fig. 18 shows an imaging timing diagram of a CMOS imaging device for the optical tag of Fig. 17. The upper part of Fig. 18 shows the respective control signals of the reference light source, the first data light source, and the second data light source, where a high level may correspond to the light source being on and a low level to it being off. As shown in Fig. 18, the three control signals share the same frequency; the first data light source's control signal is in phase with the reference light source's, while the second data light source's control signal is 180° out of phase with it. In this way, when a CMOS imaging device images this optical tag, the reference light source, the first data light source, and the second data light source all show stripes roughly 4 pixels wide, but the stripe phases of the first data light source and the reference light source coincide (for example, the rows holding the reference light source's bright stripes coincide with the rows holding the first data light source's bright stripes, and likewise for the dark stripes), while the stripe phase of the second data light source is inverted relative to the reference light source (for example, the rows holding the reference light source's bright stripes coincide with the rows holding the second data light source's dark stripes, and vice versa).
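The timing behavior of Fig. 18 can be illustrated with a toy rolling-shutter model, under the assumption that each sensor row samples the source's on/off state at its own readout time and that one half-period of the control signal spans four row times (the model and its parameters are illustrative only):

```python
def rolling_shutter_rows(phase, n_rows=16, rows_per_half_period=4):
    """Toy model: row r images bright (1) if the square-wave source is on
    when row r is read out; `phase` shifts the waveform by whole row times."""
    period = 2 * rows_per_half_period
    return [1 if (r + phase) % period < rows_per_half_period else 0
            for r in range(n_rows)]

ref = rolling_shutter_rows(phase=0)          # reference light source
in_phase = rolling_shutter_rows(phase=0)     # first data source: same phase
anti_phase = rolling_shutter_rows(phase=4)   # second source: half a period off
```

The in-phase source reproduces the reference stripes row for row, while the anti-phase source inverts every row, exactly the bright/dark alignment described for Fig. 18.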

Providing a reference light source and applying phase control to the data light sources further increases the amount of information each data light source can convey per frame, while also improving recognizability. For the optical tag of Fig. 17, suppose the first and second data light sources are configured to operate in a first mode, showing no stripes, and a second mode, showing stripes. Without a reference light source, each data light source can convey one of two values per frame, for example 0 or 1. By providing a reference light source operating in the second mode, and additionally controlling phase when a data light source operates in the second mode, the second mode itself can be used to convey more than one value. In the scheme of Fig. 18, for instance, the second mode combined with phase control can convey one of two values, so each data light source can convey one of three values per frame.

By introducing a reference light source, the scheme above makes phase control of the data light sources possible, raising the coding density of each data light source and, correspondingly, of the whole optical tag. For the embodiment described above: without a reference light source (that is, using the reference light source as a third data light source), each data light source conveys one of two values per frame, so the whole tag (three data light sources) can convey one of 2³ data combinations per frame; with a reference light source, each data light source conveys one of three values per frame, so the whole tag (two data light sources) can convey one of 3² combinations per frame. The effect grows with the number of data light sources. For example, for a tag containing five light sources, the whole tag without a reference light source (five data light sources) can convey one of 2⁵ combinations per frame, whereas selecting one light source as the reference (leaving four data light sources) lets the whole tag convey one of 3⁴ combinations per frame. Similarly, increasing the number of reference light sources in the tag can raise the coding density of the whole tag further. Some experimental data for matching calculations (for example, correlation calculations) between data light source images and reference light source images are given below, where the computed values are interpreted as follows:
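The coding-density comparison above reduces to counting combinations, as this small sketch shows:

```python
def combinations_per_frame(n_data_sources, states_per_source):
    """Distinct data combinations one frame of the tag can carry."""
    return states_per_source ** n_data_sources

# Three-source tag: all sources as data vs. one source spent as reference.
no_ref = combinations_per_frame(3, 2)     # 2**3 = 8
with_ref = combinations_per_frame(2, 3)   # 3**2 = 9
# Five-source tag: the advantage of the reference source grows.
no_ref5 = combinations_per_frame(5, 2)    # 2**5 = 32
with_ref5 = combinations_per_frame(4, 3)  # 3**4 = 81
```

Giving up one source as a reference costs a factor of the per-source states but triples each remaining source's alphabet, so the net capacity rises (9 > 8, 81 > 32).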

0.00 to ±0.30: slight correlation
±0.30 to ±0.50: real correlation
±0.50 to ±0.80: significant correlation
±0.80 to ±1.00: high correlation

Here a positive value indicates positive correlation and a negative value indicates negative correlation. If the data light source and the reference light source agree in both frequency and phase then, ideally, their images are identical and the correlation evaluates to +1, a perfect positive correlation. If they agree in frequency but are opposite in phase then, ideally, the two images have the same stripe width but with the bright and dark stripes in exactly opposite positions, and the correlation evaluates to -1, a perfect negative correlation. It will be appreciated that in actual imaging, interference, error, and the like make perfectly positively or negatively correlated images hard to obtain. If the data light source and the reference light source operate in different working modes, showing stripes of different widths, or if one of them shows no stripes, their images are generally only slightly correlated.
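A trivial helper mapping a computed coefficient onto the bands listed above might look like this (the band names are loose English renderings of the text's terms, and the boundary handling is an arbitrary choice):

```python
def correlation_band(r):
    """Map a correlation coefficient in [-1, 1] onto the text's bands."""
    a = abs(r)
    if a > 1.0:
        raise ValueError("correlation coefficient must lie in [-1, 1]")
    if a < 0.30:
        return "slight"
    if a < 0.50:
        return "real"
    if a < 0.80:
        return "significant"
    return "high"
```

The experimental values in Tables 1 and 2 (around ±0.77) fall in the "significant" band, matching the conclusion drawn from them below.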

Tables 1 and 2 below show the correlation results when the data light source and the reference light source use the same frequency and the same phase, and when they use the same frequency but opposite phases, respectively. For each case, five images were captured, and the reference light source image in each frame was correlated with the data light source image in the same frame.

Table 1 – Correlation results at the same frequency and the same phase:

  Frame          1        2        3        4        5
  Correlation   0.7710   0.7862   0.7632   0.7883   0.7967

Table 2 – Correlation results at the same frequency and opposite phases:

  Frame          1         2         3         4         5
  Correlation  -0.7849   -0.7786   -0.7509   -0.7896   -0.7647

As the tables show, when the data light source and the reference light source use the same frequency and the same phase, the correlation results indicate that the two are significantly positively correlated; when they use the same frequency but opposite phases, the results indicate that the two are significantly negatively correlated.

Compared with the roughly 15x recognition distance of prior-art two-dimensional codes, the at-least-200x recognition distance of the optical tag of the present invention is a clear advantage. This long-range capability is especially suited to outdoor recognition. Taking a 200x recognition distance as an example, for a 50-cm-long light source installed on a street, anyone within 100 meters of it can interact with it through a mobile phone. Moreover, the scheme of the present invention does not require the CMOS imaging device to sit at a fixed distance from the optical tag, does not require time synchronization between the CMOS imaging device and the optical tag, and does not need the boundary and width of each stripe to be detected precisely; it is therefore highly stable and reliable in actual information transmission. Nor does the scheme require the CMOS imaging device to face the optical tag roughly head-on in order to recognize it, which matters especially for tags with strip-shaped or spherical light sources. For example, a strip-shaped or columnar optical tag set up in a plaza can be recognized by CMOS imaging devices anywhere in the 360° range around it. If such a tag is mounted on a wall, CMOS imaging devices within roughly 180° around it can recognize it. A spherical optical tag set up in a plaza can be recognized by a CMOS imaging device at any position in the three-dimensional space around it.
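The distance claim reduces to a simple proportionality; a sketch, taking the text's at-least-200x figure at face value:

```python
def recognition_range_m(source_length_m, ratio=200):
    """Recognition range as a multiple of the light source's physical length
    (at least 200x per the text, vs. roughly 15x for a QR code)."""
    return source_length_m * ratio

street_source = recognition_range_m(0.5)  # 50 cm source -> 100.0 m
```

The same proportion gives the 40 m figure quoted later for a 20 cm source.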

Thanks to the advantages above, the optical tag of the present invention can be used to achieve precise guidance within a field of view, for example, guidance of machines capable of autonomous movement. One embodiment of the present invention concerns a system for guiding an autonomously movable machine by means of an optical tag, comprising a machine capable of autonomous movement and an optical tag as described in any of the embodiments above. The machine is fitted with a CMOS camera that can capture and recognize the information transferred by the optical tag.

The following takes drone delivery for online shopping as an example. A buyer can use their apartment as the delivery address and fill in delivery address information on the online shopping platform, for example some of the following: geographic location, residential-complex information, building number, floor, and so on. The buyer can mount an optical tag at the apartment (for example, on the balcony or an outer wall) to serve as the target optical tag for the drone's delivery. After the buyer completes a purchase over the network, this optical tag can be configured to transfer predetermined information by operating continuously in different modes. The predetermined information may be, for example, the ID of the optical tag itself, the buyer's ID on the shopping platform, a verification code the buyer received from the platform after the purchase, and so on, as long as the information is known to the platform and can be used to identify the buyer or the purchased goods. The platform can pass this predetermined information to the drone.

The method of the present invention for guiding a drone by means of an optical tag may proceed as shown in Fig. 19, which comprises the following steps:

Step 101: control the drone to travel to the vicinity of the target optical tag.

After the drone picks up the goods to be sent to the buyer, it can first fly to the vicinity of the buyer's delivery address (that is, the buyer's apartment). In one embodiment, the delivery address is preferably the geographic location of the target optical tag itself (for example, the tag's precise longitude, latitude, and altitude), and may also include other information, such as the target tag's orientation.

Step 101 can be implemented in any of various existing ways in the art. For example, the drone may fly to the vicinity of the delivery address (that is, near the buyer's optical tag) by GPS navigation or the like. Existing GPS navigation achieves an accuracy of a few tens of meters, while the optical tag of the present invention achieves a recognition distance of at least 200 times its size; at 200x, a 20-cm light source can be recognized as soon as the drone flies anywhere within 40 meters of it. In step 101, the drone may also be guided to the vicinity of the target optical tag using the relative positions between other optical tags and the target tag. The relative positions of the various tags can, for example, be stored in advance and made available to the drone. In flight, the drone can recognize other optical tags along its path and obtain their positions relative to the target tag; it can then determine its own position relative to those tags by relative positioning (also called reverse positioning), and from this determine the position of the target tag relative to the drone. Based on that relative position, the drone can be guided to the vicinity of the target tag. Those skilled in the art will appreciate that a combination of the above approaches may also be used to guide the drone to the vicinity of the target tag.

Various relative-positioning approaches known in the art can be used to determine the position of the drone relative to an optical tag. In one embodiment, the drone uses its imaging device to capture images of the tag and derives its distance to the tag from the captured image (the larger the imaging, the closer the distance; the smaller the imaging, the farther). It can obtain its current heading from built-in sensors and derive from it the drone's direction relative to the tag (preferably refined further using the tag's position within the image). From the relative distance and relative direction, the relative position between the drone and the tag is obtained. In addition, many imaging devices on the market are equipped with binocular or depth cameras; capturing the tag with such a device also yields the device-to-tag distance easily. In another embodiment, to determine the direction of a user relative to the tag, the tag's orientation can be stored on a server; after the user recognizes the tag's identification information, that information can be used to fetch the orientation from the server, and the user's direction relative to the tag can then be computed from the tag's orientation and the perspective distortion of the tag's imaging on the user's phone.
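The size-to-distance relationship used here ("the larger the imaging, the closer the distance") is the standard pinhole-camera proportion; a sketch, assuming the tag's physical size and the camera's focal length in pixels are known (values are illustrative):

```python
def distance_from_imaging(real_size_m, imaged_size_px, focal_length_px):
    """Pinhole-camera proportion: object distance equals its physical size
    times the focal length (in pixels) divided by its imaged size (pixels)."""
    return real_size_m * focal_length_px / imaged_size_px

# A 0.5 m tag imaged 10 px tall by a 1000 px focal-length camera is ~50 m away.
```

Doubling the imaged size halves the estimated distance, which is the monotone relationship the text relies on.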

It should be noted that step 101 is not an essential step of the present invention and may be omitted in some cases, for example, if the target optical tag is already within the drone's field of view.

Step 102: capture, with the CMOS camera mounted on the drone, the information transferred by some nearby optical tag, and recognize the transferred information.

Having flown to the vicinity of the buyer's optical tag, the drone can search for optical tags within its field of view and, with its onboard CMOS camera, capture and recognize the information transferred by any tag it finds. For example, the drone can obtain consecutive frames of a tag through its CMOS camera and, for each frame, determine whether the part of the image corresponding to the light source contains stripes and, if so, of which type, and thus determine the information each frame represents. In one embodiment, if the drone spots an optical tag in its field of view but is too far away to recognize the information it transfers, the drone can move suitably closer to the tag until recognition succeeds.

Step 103: determine, based on the transferred information, whether the optical tag is the target optical tag.

The drone can judge from the transferred information whether a tag is the target tag. For example, the drone can check whether the transferred information explicitly or implicitly contains the predetermined information described above. If it does, the tag can be determined to be the target tag; otherwise, it can be determined not to be. In one embodiment, the drone itself makes this judgment. In another embodiment, the drone forwards the transferred information to a server it can communicate with; the server judges, based on that information, whether the tag is the target tag and sends the result back to the drone. The information transferred by the optical tag may be encrypted.

Step 104: if the optical tag is the target optical tag, control the drone to travel toward it.

Having established that a tag in its field of view is the flight destination, the drone can fly toward it without error, for example under the tag's visual guidance. In one embodiment, the drone uses existing ranging techniques to stop at some distance from the tag, for example a few tens of centimeters away, to avoid colliding with it. In one embodiment, the drone can perform relative positioning based on the perspective distortion of the tag's captured image and adjust its flight path so that it ultimately stops in a particular direction relative to the tag, for example directly in front of it. A shelf for receiving goods can be placed directly in front of the tag, into which the drone can easily deposit the goods.

If a tag turns out not to be the target tag, the drone can go on to recognize other tags nearby, in a process similar to the above, not repeated here. Preferably, in one embodiment, if the tag is determined not to be the target tag, the drone can use it to determine the relative position between the drone and the target tag. For example, the drone can determine its position relative to this tag by relative positioning, identify the tag from the information it transfers (for example, obtain its identification information), and obtain the position of the target tag relative to this tag (the relative positions of the various tags can, for example, be stored in advance and made available to the drone); from these, the position of the target tag relative to the drone can be determined. Having obtained this relative position, the drone can use it, optionally together with other navigation information (for example, GPS), to fly to the vicinity of the target tag.
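Determining the drone-to-target relation by chaining the two known relations can be sketched as a vector sum, assuming all displacements are expressed in a shared coordinate frame (a simplification; a real implementation would also handle rotation between frames):

```python
def compose_relative(drone_to_tag, tag_to_target):
    """Chain two relative displacements (e.g. east, north, up, in metres):
    drone->nearby-tag plus nearby-tag->target gives drone->target."""
    return tuple(a + b for a, b in zip(drone_to_tag, tag_to_target))
```

The first displacement comes from relative positioning against the recognized tag, the second from the pre-stored tag-to-tag positions.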

Those skilled in the art will appreciate that the drone delivery scheme of the present invention, guided by optical tags, is not limited to tags mounted at a buyer's apartment (for example, on the balcony or an outer wall); it clearly also applies to tags placed in more open areas, for example, a tag placed in a courtyard.

In addition, if the buyer does not have an optical tag of their own, or wishes to have the goods delivered to the location of another optical tag (for example, a public optical tag in a square or park, or an optical tag at a friend's home), they can provide the online shopping platform with the relevant information of the optical tag at the delivery address (that is, the target optical tag), such as its ID information or geographic location information. The online shopping platform can pass the corresponding information to the drone; after flying to the vicinity of the target optical tag, the drone can recognize the information conveyed by nearby optical tags (for example, the ID information they transmit) and finally determine the target optical tag.

In addition, the optical-tag-guided drone delivery scheme of the present invention is applicable not only to optical tags with fixed positions but also to non-fixed optical tags (for example, an optical tag that a person carries with them). For example, if a buyer at a square wishes to shop online and have the goods delivered to their current location, they can inform the online shopping platform of their current geographic location and switch on the optical tag they carry. That optical tag can be configured to convey predetermined information, which may be, for example, the ID information of the optical tag itself, the buyer's ID on the online shopping platform, or a verification code the buyer received from the platform after placing the order — any information will do, as long as it is known to the online shopping platform and can be used to identify the buyer or the purchased goods. After flying to the vicinity of the buyer's location, the drone can recognize the information conveyed by nearby optical tags and finally determine the target optical tag (that is, the one the buyer carries), thereby completing the delivery. In one embodiment, the online shopping platform can inform the buyer of the drone's estimated arrival time, so that the buyer can move about freely in the meantime, provided they return to the vicinity of their previous location by the estimated arrival time. In one embodiment, instead of returning to the previous location, the buyer can send their new location to the online shopping platform, which can notify the drone of the new location so that the drone can fly there. In one embodiment, the buyer can also set the delivery address to an address they expect to reach at a certain time and instruct the online shopping platform to deliver the goods to the vicinity of that address at that time.
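The matching of decoded tag information against the predetermined information known to the platform can be pictured as a simple membership test. The sketch below is a minimal illustration; the function name and code values are hypothetical:

```python
def is_target_tag(decoded_info, known_values):
    """True if the information decoded from a nearby optical tag matches
    any predetermined value the platform knows for this delivery
    (a tag ID, a buyer ID, or a one-time verification code)."""
    return decoded_info in known_values

# Hypothetical values the platform associated with the order:
known = {"buyer-4711", "ORDER-7F3A-VERIF"}
print(is_target_tag("ORDER-7F3A-VERIF", known))  # -> True
print(is_target_tag("public-tag-001", known))    # -> False
```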

The description above uses drone delivery for online shopping as an example, but it will be understood that optical-tag-based drone guidance is not limited to that application; it can be used in any application that requires precise positioning of a drone, such as automatic drone charging, automatic drone docking, and drone route navigation. Moreover, those skilled in the art will understand that the optical-tag-based guidance of the present invention is not only applicable to drones but can also be applied to other types of autonomously movable machines, for example, driverless cars and robots. A driverless car or robot can be fitted with a CMOS camera and can interact with optical tags in a manner similar to a drone. In one embodiment, one part of the autonomously movable machine is movable while another part is fixed. For example, the machine may be one that normally has a fixed position on an assembly line or in a warehouse: its main body may be fixed in most cases, but it has one or more movable robotic arms. The CMOS camera can be mounted on the fixed part of the machine to determine the position of an optical tag, so that the movable part of the machine (for example, a robotic arm) can be guided to that position. For such a machine, step 101 described above is clearly not needed. It will also be appreciated that the CMOS camera can instead be mounted on a movable part of the machine, for example, on each robotic arm.
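Taken together, the guidance logic of this description — scan nearby tags, decode each one, approach the target directly if found, otherwise fall back to a pre-stored tag-to-target offset — could be outlined as below. This is a hedged sketch: the `machine.*` methods are hypothetical stand-ins for the stripe-based capture, decoding, and motion control, not an API defined by the invention:

```python
def guide(machine, target_id, tag_offsets):
    """One pass of the optical-tag guidance loop; returns True if the
    machine (or its movable part) was given a motion command."""
    for tag in machine.scan_tags():
        tag_id = machine.decode(tag)
        if tag_id == target_id:
            machine.approach(tag)      # target in view: head straight for it
            return True
        if (tag_id, target_id) in tag_offsets:
            # Not the target, but its offset to the target is pre-stored:
            rel = machine.relative_position(tag)
            move = tuple(a + b for a, b in
                         zip(rel, tag_offsets[(tag_id, target_id)]))
            machine.move_by(move)      # travel toward the target's vicinity
            return True
    return False                       # no usable tag in view this pass
```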

References herein to "various embodiments", "some embodiments", "one embodiment", or "an embodiment" mean that a particular feature, structure, or property described in connection with that embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments", "in some embodiments", "in one embodiment", or "in an embodiment" throughout this text do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or properties may be combined in any suitable manner in one or more embodiments. Accordingly, a particular feature, structure, or property shown or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or properties of one or more other embodiments without limitation, provided the combination is not illogical or inoperative. Expressions such as "according to A" or "based on A" appearing herein are non-exclusive; that is, "according to A" may mean "based only on A" or "based on A and B", unless it is specifically stated, or clear from the context, that the meaning is "based only on A". In this application, some illustrative operational steps are described in a certain order for clarity of explanation, but those skilled in the art will appreciate that not every one of these steps is essential; some of them may be omitted or replaced by other steps. Nor must these steps be performed sequentially in the manner shown; rather, some of them may be performed in a different order, or in parallel, as actual needs dictate, provided the new mode of execution is not illogical or inoperative.

Having thus described several aspects of at least one embodiment of the present invention, it will be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the invention.

Claims (18)

1. A system for guiding an autonomously movable machine, comprising: an autonomously movable machine on which a rolling-shutter camera is mounted; and an optical communication device comprising a light source configured to operate in at least two modes, the at least two modes including a first mode and a second mode, wherein in the first mode, a light source control signal having a first frequency controls an attribute of the light emitted by the light source so that it varies continuously at the first frequency, such that stripes appear in an image of the light source obtained when the light source is photographed by the rolling-shutter camera, and in the second mode, no stripes, or stripes different from those of the first mode, appear in an image of the light source obtained when the light source is photographed by the rolling-shutter camera.

2. The system of claim 1, wherein in the second mode, a light source control signal having a second frequency different from the first frequency controls the attribute of the light emitted by the light source so that it varies continuously at the second frequency, such that no stripes, or stripes different from those of the first mode, appear in an image of the light source obtained when the light source is photographed by the rolling-shutter camera.

3. The system of claim 2, wherein the second frequency is greater than the first frequency.

4. The system of claim 1, wherein in the second mode, the attribute of the light emitted by the light source varies continuously at the first frequency, and stripes different from those of the first mode appear in an image of the light source obtained when the light source is photographed by the rolling-shutter camera.

5. The system of claim 1, wherein the light source is a strip-shaped light source or a spherical light source.

6. The system of claim 1, wherein the autonomously movable machine includes a machine of which only a part is movable.

7. A method of guiding an autonomously movable machine using the system of any one of claims 1-6, comprising: capturing, by the rolling-shutter camera mounted on the autonomously movable machine, information conveyed by a nearby optical communication device, and identifying the conveyed information; determining, based on the conveyed information, whether the optical communication device is a target optical communication device; and if the optical communication device is the target optical communication device, controlling the autonomously movable machine, or a part thereof, to travel toward the optical communication device.

8. The method of claim 7, further comprising, if the optical communication device is not the target optical communication device: identifying the optical communication device based on the information it conveys, and obtaining the relative positional relationship between the optical communication device and the target optical communication device; determining the relative positional relationship between the autonomously movable machine, or a part thereof, and the optical communication device; determining the relative positional relationship between the target optical communication device and the autonomously movable machine, or a part thereof; and guiding the autonomously movable machine, or a part thereof, toward the target optical communication device based at least in part on the relative positional relationship between the target optical communication device and the autonomously movable machine or the part thereof.

9. The method of claim 8, wherein determining the relative positional relationship between the autonomously movable machine, or a part thereof, and the optical communication device comprises: determining the relative positional relationship between the autonomously movable machine, or a part thereof, and the optical communication device by relative positioning.

10. The method of claim 7, wherein capturing, by the rolling-shutter camera mounted on the autonomously movable machine, information conveyed by a nearby optical communication device and identifying the conveyed information comprises: obtaining consecutive multi-frame images of the optical communication device by the rolling-shutter camera; determining, for each frame of image, whether stripes are present, or which type of stripes is present, in the portion of the image corresponding to the position of the light source; and determining the information represented by each frame of image.

11. The method of claim 7, further comprising: first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device.

12. The method of claim 11, wherein first controlling the autonomously movable machine to travel to the vicinity of the target optical communication device comprises: guiding the autonomously movable machine to the vicinity of the target optical communication device at least in part by a satellite navigation system; and/or guiding the autonomously movable machine to the vicinity of the target optical communication device at least in part by using the relative positional relationships between other optical communication devices and the target optical communication device.

13. The method of claim 12, wherein guiding the autonomously movable machine to the vicinity of the target optical communication device at least in part by using the relative positional relationships between other optical communication devices and the target optical communication device comprises: identifying, by the autonomously movable machine while traveling, another optical communication device, and obtaining the relative positional relationship between that other optical communication device and the target optical communication device; determining the relative positional relationship between the autonomously movable machine and that other optical communication device; determining the relative positional relationship between the target optical communication device and the autonomously movable machine; and guiding the autonomously movable machine to the vicinity of the target optical communication device based at least in part on the relative positional relationship between the target optical communication device and the autonomously movable machine.

14. The method of claim 7, wherein determining, based on the conveyed information, whether the optical communication device is a target optical communication device comprises: determining whether the conveyed information explicitly or implicitly contains predetermined information.

15. The method of claim 14, wherein the predetermined information is a predetermined identifier or a verification code.

16. The method of claim 7, wherein determining, based on the conveyed information, whether the optical communication device is a target optical communication device comprises: determining, by the autonomously movable machine, whether the optical communication device is the target optical communication device; or transmitting, by the autonomously movable machine, the conveyed information to a server, the server determining, based on the conveyed information, whether the optical communication device is the target optical communication device and sending the determination result to the autonomously movable machine.

17. An autonomously movable machine, comprising a rolling-shutter camera, a processor, and a memory, the memory storing a computer program which, when executed by the processor, can be used to implement the method of any one of claims 7-16.

18. A storage medium in which a computer program is stored, the computer program, when executed, being usable to implement the method of any one of claims 7-16.
PCT/CN2019/085998 2018-05-09 2019-05-08 System and method for guiding autonomous machine Ceased WO2019214642A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810435227.7A CN110471402A (en) 2018-05-09 2018-05-09 The system and method that the machine for capableing of autonomous is guided
CN201810435227.7 2018-05-09

Publications (1)

Publication Number Publication Date
WO2019214642A1 true WO2019214642A1 (en) 2019-11-14

Family

ID=68468461

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/085998 Ceased WO2019214642A1 (en) 2018-05-09 2019-05-08 System and method for guiding autonomous machine

Country Status (3)

Country Link
CN (1) CN110471402A (en)
TW (1) TWI702805B (en)
WO (1) WO2019214642A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI738271B (en) * 2020-03-30 2021-09-01 新能量科技股份有限公司 Automatically guiding method for self-propelled apparatus
US11553824B2 (en) 2020-06-25 2023-01-17 Power Logic Tech, Inc. Automatic guiding method for self-propelled apparatus
WO2022134057A1 (en) * 2020-12-25 2022-06-30 Intel Corporation Re-localization of robot
WO2023005301A1 (en) * 2021-07-28 2023-02-02 广东奥普特科技股份有限公司 Agv forklift intelligent guide device, agv forklift intelligent guide method and agv forklift intelligent guide system
CN113607158B (en) * 2021-08-05 2024-11-19 中铁工程装备集团有限公司 Flat panel light source visual recognition matching positioning method and system based on visible light communication

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060113386A1 (en) * 2004-12-01 2006-06-01 Psc Scanning, Inc. Illumination pulsing method for a data reader
CN102324013A (en) * 2005-03-11 2012-01-18 手持产品公司 Bar code reading device with global electronic shutter control
US8150255B2 (en) * 2010-06-25 2012-04-03 Apple Inc. Flash control for electronic rolling shutter
CN104395910A (en) * 2012-03-23 2015-03-04 Opto电子有限公司 Image reading device capable of producing illumination including a continuous, low-intensity level illumination component and one or more pulsed, high-intensity level illumination components
WO2016031359A1 (en) * 2014-08-29 2016-03-03 ソニー株式会社 Control device, control method, and program
WO2017111201A1 (en) * 2015-12-24 2017-06-29 엘지전자 주식회사 Night image display apparatus and image processing method thereof
CN107370913A (en) * 2016-05-11 2017-11-21 松下知识产权经营株式会社 Camera device, camera system, and light detection method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7739034B2 (en) * 2007-04-17 2010-06-15 Itt Manufacturing Enterprises, Inc. Landmark navigation for vehicles using blinking optical beacons
EP2910019A4 (en) * 2012-10-19 2016-08-24 Daniel Ryan Self-identifying one-way authentication method using optical signals
CN103427902A (en) * 2013-04-09 2013-12-04 北京半导体照明科技促进中心 Method, device and system of utilizing visible light to transmit information and light source
CN104661000B (en) * 2015-03-17 2018-01-09 珠海横琴华策光通信科技有限公司 Alignment system, based on the image of located space come the method and apparatus that is positioned
CN105515657B (en) * 2015-11-19 2018-01-02 广东顺德中山大学卡内基梅隆大学国际联合研究院 A kind of visible light camera communication system using LED lamp MIMO array framework
CN109964321A (en) * 2016-10-13 2019-07-02 六度空间有限责任公司 Method and apparatus for indoor positioning

Also Published As

Publication number Publication date
TWI702805B (en) 2020-08-21
CN110471402A (en) 2019-11-19
TW201947893A (en) 2019-12-16

Similar Documents

Publication Publication Date Title
TWI702805B (en) System and method for guiding a machine capable of autonomous movement
US11338920B2 (en) Method for guiding autonomously movable machine by means of optical communication device
US10371504B2 (en) Light fixture commissioning using depth sensing device
CN110476148B (en) Display system and method for providing multi-view content
WO2018041136A1 (en) Optical communication device and system and corresponding information transferring and receiving method
CN110662162B (en) Dual mode optical device for time-of-flight sensing and information transfer, and apparatus, systems, and methods utilizing the same
WO2019120156A1 (en) Optical tag-based positioning method and system
CN110943778B (en) Optical communication device and method for transmitting and receiving information
WO2019120052A1 (en) Method and apparatus for decoding information transmitted by optical source
WO2019120053A1 (en) Optical communication apparatus having reference light source, and corresponding information transmitting and receiving methods
US20180054876A1 (en) Out of plane sensor or emitter for commissioning lighting devices
US10990774B2 (en) Optical communication device and system, and corresponding information transmitting and receiving methods
WO2020062876A1 (en) Service provision method and system based on optical label
WO2019120051A1 (en) Optical label security determination method and system
HK40013131A (en) Systems and methods for guiding machines capable of autonomous movement
HK40013135A (en) Method for guiding a machine capable of autonomous movement through optical communication devices
HK40005608B (en) Optical tag-based positioning method and system
HK40005608A (en) Optical tag-based positioning method and system
HK40005606B (en) An optical communication apparatus including reference light source, and corresponding information transmitting and receiving methods
HK40005606A (en) An optical communication apparatus including reference light source, and corresponding information transmitting and receiving methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19799188

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19799188

Country of ref document: EP

Kind code of ref document: A1