
CN110967703A - Indoor navigation method and indoor navigation device using laser radar and camera - Google Patents


Info

Publication number
CN110967703A
CN110967703A (application CN201811131837.4A)
Authority
CN
China
Prior art keywords
information
detection
preset
detection information
adjusted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811131837.4A
Other languages
Chinese (zh)
Inventor
韩翰
江旭
马向阳
刁飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Midea Life Electric Manufacturing Co Ltd
Original Assignee
Guangdong Midea Life Electric Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Midea Life Electric Manufacturing Co Ltd filed Critical Guangdong Midea Life Electric Manufacturing Co Ltd
Priority to CN201811131837.4A
Publication of CN110967703A

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4008 Arrangements of switches, indicators or the like
    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an indoor navigation method and an indoor navigation device using a laser radar and a camera. The method includes: acquiring a preset navigation path of a current scene, and acquiring the current time or the current illumination intensity value; when the current time is within a first preset time period or the current illumination intensity value is greater than a first preset illumination threshold, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information; when the current time is within a second preset time period or the current illumination intensity value is less than or equal to the first preset illumination threshold, acquiring the first detection information through the first detection device, acquiring the second detection information through the second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information; and adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.

Description

Indoor navigation method and indoor navigation device using laser radar and camera
Technical Field
The invention relates to the technical field of household appliances, in particular to an indoor navigation method using a laser radar and a camera and an indoor navigation device using the laser radar and the camera.
Background
With the continuous improvement of living standards and the development of science and technology, the sweeping robot, as a household device, has become widely popular because it reduces the labor intensity of housework and improves its efficiency.
Because the sweeping robot needs to run autonomously indoors, where various obstacles exist, it must be able to detect obstacles ahead of it. A traditional sweeping robot is often provided with an infrared sensor for this purpose, but infrared detection has great limitations and detection deviation, so the automatically planned path of the sweeping robot deviates considerably. The prior art also includes schemes that collect image information in front of the sweeping robot through a camera, but the definition of the image information collected by a camera depends strongly on the surrounding illumination intensity, so the application scenarios of such schemes are very limited. Existing products therefore cannot satisfy users' demands for accuracy and adaptability, and the user experience is poor.
Disclosure of Invention
In order to overcome the defects or shortcomings in the prior art, the invention provides an indoor navigation method using the laser radar and the camera and an indoor navigation device using the laser radar and the camera.
In order to achieve the above object, the present invention provides an indoor navigation method using a laser radar and a camera, which is applied to a sweeping robot, and the indoor navigation method includes: acquiring a preset navigation path of a current scene, and acquiring the current time or the current illumination intensity value of the current scene; under the condition that the current time is within a first preset time period or the current illumination intensity value is larger than a first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information; under the condition that the current time is within a second preset time period or the current illumination intensity value is smaller than or equal to the first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information; and adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
Preferably, the acquiring the preset navigation path of the current scene includes: judging the working state of the sweeping robot; under the condition that the sweeping robot is in an initial working state: controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene; under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene; and generating a preset navigation path of the current scene based on the map information.
Preferably, the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
Preferably, the first preset time period is a preset day time period, and the second preset time period is a preset night time period; the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information; the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
Preferably, the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path includes: judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information; under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle; and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
Preferably, the indoor navigation method further includes: judging whether the current illumination intensity value is lower than a second preset illumination threshold; when it is, judging whether a forced auxiliary detection instruction has been acquired; when the instruction has been acquired, generating corresponding auxiliary lighting prompt information or starting a corresponding auxiliary lighting device, and enabling the second detection device; and when the instruction has not been acquired, controlling the second detection device to remain disabled.
In addition, the invention also provides an indoor navigation device using the laser radar and the camera, which is applied to the sweeping robot, and the indoor navigation device comprises: a processor to: acquiring a preset navigation path of a current scene, and acquiring the current time or the current illumination intensity value of the current scene; under the condition that the current time is within a first preset time period or the current illumination intensity value is larger than a first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information; under the condition that the current time is within a second preset time period or the current illumination intensity value is smaller than or equal to the first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information; and adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
Preferably, the acquiring the preset navigation path of the current scene includes: judging the working state of the sweeping robot; under the condition that the sweeping robot is in an initial working state: controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene; under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene; and generating a preset navigation path of the current scene based on the map information.
Preferably, the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
Preferably, the first preset time period is a preset day time period, and the second preset time period is a preset night time period; the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information; the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
Preferably, the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path includes: judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information; under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle; and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
Preferably, the processor is further configured to: judge whether the current illumination intensity value is lower than a second preset illumination threshold; when it is, judge whether a forced auxiliary detection instruction has been acquired; when the instruction has been acquired, generate corresponding auxiliary lighting prompt information or start a corresponding auxiliary lighting device, and enable the second detection device; and when the instruction has not been acquired, control the second detection device to remain disabled.
In addition, the present invention also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor implements the method provided by the present invention.
According to the invention, an additional detection device is configured on the traditional sweeping robot, so that the current scene is detected from multiple aspects. The detection data better suited to the application scene are preferentially adopted, while the other detection data assist the detection. This improves the detection accuracy of the sweeping robot in different scenes, reduces detection deviation, improves navigation accuracy, and improves the user experience.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic bottom structure diagram of a conventional sweeping robot according to an embodiment of the present invention;
fig. 2 is a schematic view illustrating a conventional sweeping robot moving in a current scene according to an embodiment of the present invention;
fig. 3 is a flowchart of a specific implementation of an indoor navigation method using a laser radar and a camera according to an embodiment of the present invention.
Description of reference numerals:
101 housing 102 travelling wheel
103 universal wheel 104 dust brush
105 air inlet 106 cleaning brush
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
The terms "system" and "network" in embodiments of the present invention may be used interchangeably. "Plurality" means two or more; in view of this, "plurality" may also be understood as "at least two" in the embodiments of the present invention. "And/or" describes an association relationship between objects and indicates three possible relationships; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" generally indicates that the preceding and following objects are in an "or" relationship, unless otherwise specified. Finally, the terms "first", "second", etc. in the description of the embodiments are used only to distinguish between descriptions and are not to be construed as indicating or implying relative importance or order.
The background of the invention is first described below.
Referring to fig. 1, the bottom structure of a conventional sweeping robot includes a housing 101, for example a circular housing adopted to ensure optimal traversing capability when moving in the current scene (see fig. 2). A driving motor (not shown) is disposed inside the sweeping robot; by controlling the rotation of the travelling wheels 102, the driving motor drives the entire robot to move in the current scene. A universal wheel 103 is further disposed at the front of the bottom to better control operations such as turning. In the moving process, on one hand, the dust brush 104 at the bottom continuously rotates to sweep garbage or dust near the robot under its bottom; on the other hand, the robot generates strong suction through the air inlet 105 at the bottom, so that the garbage swept under it, or located directly in front of it, is sucked in and stored, completing the sweeping of the current scene. Further, to effectively clean large or strip-shaped garbage and improve the cleaning effect, a cleaning brush 106 is arranged in the middle of the bottom, so that the garbage swept under the robot is cleaned thoroughly.
The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 3, the present invention provides an indoor navigation method using a laser radar and a camera, which is applied to a sweeping robot, and the indoor navigation method includes:
S10) acquiring a preset navigation path of the current scene, and acquiring the current time or the current illumination intensity value of the current scene;
S20) when the current time is within a first preset time period or the current illumination intensity value is greater than a first preset illumination threshold, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information;
S30) when the current time is within a second preset time period or the current illumination intensity value is less than or equal to the first preset illumination threshold, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information;
S40) adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
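Steps S10-S40 amount to a time- or illumination-gated choice of which sensor's data is corrected by the other. The following is a minimal sketch of that selection logic; the function names, threshold values, and the placeholder `adjust` fusion are assumptions for illustration and do not appear in the disclosure:

```python
# Hypothetical sketch of the day/night sensor-priority selection in S10-S40.
# Names and numeric thresholds are illustrative assumptions.

DAY_START, DAY_END = 8, 20   # assumed first preset time period: 8:00-20:00
LIGHT_THRESHOLD = 5.0        # assumed first preset illumination threshold (lux)

def adjust(primary, auxiliary):
    # Placeholder fusion: a real system might remove image noise using
    # lidar ranges, or refine lidar returns with visual features.
    return {"primary": primary, "auxiliary": auxiliary}

def select_adjusted_detection(lidar_info, camera_info,
                              hour=None, illuminance=None):
    """Return the adjusted detection information.

    In bright conditions the camera data refines the lidar data (the
    adjusted lidar info is used); in dark conditions the lidar data
    refines the camera data.
    """
    is_bright = ((hour is not None and DAY_START <= hour < DAY_END) or
                 (illuminance is not None and illuminance > LIGHT_THRESHOLD))
    if is_bright:
        return adjust(primary=lidar_info, auxiliary=camera_info)
    return adjust(primary=camera_info, auxiliary=lidar_info)
```

A usage example: calling `select_adjusted_detection(scan, image, illuminance=2.0)` falls into the night branch of S30, so the image information becomes the data being corrected.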
In one possible embodiment, the sweeping robot is started according to a user's cleaning instruction or a preset program, and enters a to-be-cleaned state after initialization. The robot then obtains the preset navigation path of the current scene and the current time, determines that the current time is within the first preset time period, and starts the sweeping operation. During sweeping, it obtains first detection information in real time through the first detection device and second detection information in real time through the second detection device, adjusts the first detection information using the second detection information to obtain more accurate detection information, and takes this as the adjusted detection information. It then adjusts the preset navigation path according to the adjusted detection information, so that the path tracks real-time changes in the current scene, obtaining the adjusted navigation path.
In another possible embodiment, the sweeping robot is started according to a user's sweeping instruction or a preset program, and enters a to-be-swept state after initialization. The robot then obtains the preset navigation path of the current scene and the current illumination intensity value; for example, the current illumination intensity value is 2 lux, which is less than the first preset illumination threshold (for example, 5 lux). The robot starts the sweeping operation, obtains first detection information in real time through the first detection device and second detection information in real time through the second detection device, adjusts the second detection information using the first detection information to obtain more accurate detection information, and takes this as the adjusted detection information. It then adjusts the preset navigation path according to the adjusted detection information, so that the path tracks real-time changes in the current scene, obtaining the adjusted navigation path.
In the embodiment of the invention, a plurality of different detection devices are arranged on the sweeping robot, detection information in the current scene is collected in real time from different aspects, and this information is combined to generate more accurate detection information. This overcomes the insufficient accuracy, large deviation, and poor scene adaptability of a single detection device and greatly improves the adaptability of the sweeping robot to different scenes, so that it can acquire accurate detection information, and hence an accurate navigation path, in each of them. Accidents such as collision or becoming trapped in the current scene are thereby avoided, the smoothness of operation is improved, the sweeping effect and sweeping efficiency are improved, and the user experience is improved.
In an embodiment of the present invention, the obtaining of the preset navigation path of the current scene includes: judging the working state of the sweeping robot; under the condition that the sweeping robot is in an initial working state: controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene; under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene; and generating a preset navigation path of the current scene based on the map information.
Further, in this embodiment of the present invention, the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
Before the sweeping robot performs a sweeping operation, it first needs to acquire the map information of the current scene. In a possible implementation, after the robot is started and initialized, it does not detect pre-stored map information of the current scene, i.e., it is in the initial working state. It therefore moves slowly in the current scene according to a preset program, taking its current position as the starting point, while acquiring image acquisition information and laser scanning information in real time. Preliminary map information is built from the image acquisition information and then optimized based on the laser scanning information: for example, regions of noise, shadow, and ghosting in the image information are corrected to eliminate interference or deviation, so that more accurate map information of the current scene is obtained and stored, after which a corresponding navigation path is automatically generated from the map information. If the robot does detect pre-stored map information of the current scene after initialization, i.e., it is in a non-initial working state, it generates the navigation path of the current scene directly from that map information.
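The initial/non-initial working-state flow above can be sketched as follows; the classes, method names, and storage format are illustrative assumptions, not taken from the disclosure:

```python
# Illustrative sketch of the map-acquisition flow: explore and build a map
# only when no pre-stored map exists, otherwise reuse the stored map.

class MapStore:
    """Trivial in-memory stand-in for the robot's map storage."""
    def __init__(self):
        self._maps = {}

    def load(self, scene_id):
        return self._maps.get(scene_id)

    def save(self, scene_id, map_info):
        self._maps[scene_id] = map_info

def optimize_map(preliminary_map, laser_scan):
    # Placeholder for removing noise/shadow/ghost regions from the
    # image-based preliminary map using laser ranges.
    return {"map": preliminary_map, "refined_with": laser_scan}

def get_navigation_path(scene_id, store, explore_scene, plan_path):
    """Return a preset navigation path for the scene.

    explore_scene() stands in for the exploratory run that collects image
    and laser data; plan_path() stands in for path generation from a map.
    """
    map_info = store.load(scene_id)
    if map_info is None:                     # initial working state
        image_map, laser_scan = explore_scene()
        map_info = optimize_map(image_map, laser_scan)
        store.save(scene_id, map_info)       # store for later runs
    return plan_path(map_info)
```

The design point illustrated here is simply that the costly exploratory run happens once; subsequent starts find the stored map and skip straight to path generation.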
In the embodiment of the invention, by combining two detection devices, map information of the current scene is collected from different aspects and the two sources are used to correct each other, yielding more accurate map information. When generating a navigation path from this map information, the sweeping robot can greatly reduce path errors; for example, it avoids generating a detour path because noise points in the image were identified as obstacles. The accuracy of the map of the current scene is thus greatly improved.
In the embodiment of the present invention, the first preset time period is a preset day time period, and the second preset time period is a preset night time period; the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information; the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
In this embodiment of the present invention, the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path includes: judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information; under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle; and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
In a possible implementation, the sweeping robot is started according to a user's sweeping instruction; after the initialization operation is executed and the to-be-swept mode is entered, the robot determines that the current time is 10:30 p.m., i.e., within the preset night time period (for example, 8 p.m. to 8 a.m. the next morning). The robot starts the sweeping operation, collecting laser scanning information in real time through the laser radar and image information in real time through the camera. Because the surrounding illumination intensity is low at night, the definition of the image information collected by the camera may be poor; the image information is therefore adjusted using the acquired laser scanning information, for example by eliminating error information such as noise and shadow, so as to obtain accurate scene information.
Further, the sweeping robot determines, according to the adjusted detection information, whether an obstacle exists on the preset navigation path. For example, after the adjusted detection information is compared with the preset scene information, abnormal information that differs from the preset scene information is found; after this abnormal information is compared with the preset navigation path, it is found to lie on the preset navigation path, so it is determined that an obstacle exists on the preset navigation path. The sweeping robot then obtains the contour information of the obstacle and adjusts the preset navigation path according to that contour information; for example, if the obstacle occupies a large area, the navigation path is re-planned along the contour of the obstacle to obtain an adjusted navigation path that bypasses the obstacle during the sweeping operation.
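One concrete way to realize the contour-based re-planning described above is a breadth-first search on an occupancy grid that treats the obstacle contour as blocked cells. This is only an illustrative stand-in under assumed data structures; the patent does not name a planning algorithm:

```python
from collections import deque

def replan_around_obstacle(path, obstacle_cells, grid_size):
    """If any cell of the preset path lies inside the obstacle contour,
    re-plan from the path's start to its end with a BFS that treats the
    obstacle cells as blocked, so the robot sweeps around the obstacle."""
    blocked = set(obstacle_cells)
    if not any(cell in blocked for cell in path):
        return path                       # no obstacle on the path
    start, goal = path[0], path[-1]
    w, h = grid_size
    queue, prev = deque([start]), {start: None}
    while queue:
        x, y = queue.popleft()
        if (x, y) == goal:
            break
        for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = n
            if 0 <= nx < w and 0 <= ny < h and n not in blocked and n not in prev:
                prev[n] = (x, y)
                queue.append(n)
    if goal not in prev:
        return None                       # no detour exists
    out, node = [], goal
    while node is not None:               # walk predecessors back to start
        out.append(node)
        node = prev[node]
    return out[::-1]
```

BFS yields a shortest detour in grid steps; a real implementation might instead inflate the contour by the robot radius before planning.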
In the embodiment of the invention, by adjusting the priority of the different detection devices in different scenes, the sweeping robot performs comprehensive detection with the detection device better suited to the current scene as the primary device and the other detection devices as auxiliary devices, thereby obtaining more accurate detection information. Compared with a sweeping robot in the prior art that has only a single detection device, this provides higher detection accuracy and stronger scene adaptability: the robot adapts well to different scenes and achieves a good sweeping effect in each of them. This greatly improves the scene adaptability of the sweeping robot, improves its detection accuracy, reduces the probability of abnormalities and faults, and, by generating a more accurate navigation path, improves the cleaning efficiency of the sweeping robot and the user experience.
Further, in the embodiment of the present invention, the indoor navigation method further includes: judging whether the current illumination intensity value is lower than a second preset illumination threshold value; under the condition that the current illumination intensity value is lower than the second preset illumination threshold value, judging whether a forced auxiliary detection instruction is acquired; under the condition that the forced auxiliary detection instruction is acquired, generating corresponding auxiliary lighting prompt information or starting a corresponding auxiliary lighting device, and starting the second detection device; and controlling the second detection device to be in a disabled state under the condition that the forced auxiliary detection instruction is not acquired.
Because the definition of the image information acquired by the camera is strongly correlated with the illumination intensity of the current scene, if the illumination intensity is too low the reliability of the acquired image information may be greatly reduced; for example, there may be more noise or large unrecognizable shadow regions, which greatly interferes with the detection result.
In a possible embodiment, during the sweeping operation the sweeping robot further makes a judgment according to the current illumination intensity value detected in real time. For example, the illumination intensity value detected at a certain moment is 3 lux, which is lower than the second preset illumination threshold (for example, 5 lux), so the sweeping robot further determines whether a forced auxiliary detection instruction from the user is obtained. For instance, the sweeping robot may send prompt information to the user indicating that auxiliary detection is required at the current position but the illumination intensity is insufficient, and the user may decide according to actual needs whether to perform the forced auxiliary detection. If the position is, say, a shadowed area beside the sofa, the user may decide that forced auxiliary detection is not required and feed back an instruction to cancel it; alternatively, if the sweeping robot receives no feedback instruction from the user within a preset waiting time (for example, 10 s), forced auxiliary detection is not performed by default. In either case, the camera is controlled to remain in the disabled state and the cleaning operation continues along the current navigation path.
In another possible implementation, the user remotely receives, through a user terminal, the prompt information for forced auxiliary detection sent by the sweeping robot. Seeing through the user terminal that the sweeping robot is currently located in a corner of a warehouse, the user decides that forced auxiliary detection is required and feeds back a forced auxiliary detection instruction to the sweeping robot. According to the instruction, the sweeping robot automatically turns on the lighting device at its current position, starts a lighting device with which it is itself equipped, or moves to the corresponding switch position to turn on a light source, so as to raise the illumination intensity value of the current position (to above 5 lux), and at the same time starts the camera to collect image information.
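The forced auxiliary detection decision described in the two examples above can be condensed into a small decision function. The threshold and timeout values are the example figures from the embodiment; the function name and return convention are assumptions:

```python
SECOND_LIGHT_THRESHOLD = 5.0   # lux, example value from the embodiment
WAIT_TIMEOUT_S = 10.0          # example user-feedback timeout from the embodiment

def auxiliary_detection_decision(lux, user_reply):
    """Decide whether to enable the camera (second detection device).

    user_reply is True for a forced auxiliary detection instruction,
    False for an explicit cancellation, and None when no feedback
    arrived within WAIT_TIMEOUT_S. Returns (camera_enabled, light_on).
    """
    if lux >= SECOND_LIGHT_THRESHOLD:
        return True, False       # bright enough: camera on, no extra lighting
    if user_reply:               # forced auxiliary detection requested
        return True, True        # turn on auxiliary lighting and the camera
    return False, False          # camera stays in the disabled state
```

Both the cancellation and the timeout paths collapse to the same outcome, matching the embodiment's default of not performing forced auxiliary detection.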
In the embodiment of the invention, the starting of the camera is further controlled according to the illumination intensity value around the sweeping robot. This effectively avoids the situation in which the camera cannot collect clear image information when the illumination intensity is insufficient, and prevents image information collected in a dark environment from greatly interfering with the detection result. The sweeping robot can therefore maintain high detection accuracy in any scene, which improves the cleaning effect in the current scene, prevents the sweeping robot from colliding with objects or becoming trapped, raises its degree of intelligence, and improves the user experience.
In addition, the invention further provides an indoor navigation device using a laser radar and a camera, applied to a sweeping robot, the indoor navigation device comprising: a processor configured to: acquire a preset navigation path of a current scene, and acquire the current time or the current illumination intensity value of the current scene; under the condition that the current time is within a first preset time period or the current illumination intensity value is larger than a first preset illumination threshold value, acquire first detection information through a first detection device, acquire second detection information through a second detection device, and adjust the first detection information based on the second detection information to obtain adjusted detection information; under the condition that the current time is within a second preset time period or the current illumination intensity value is smaller than or equal to the first preset illumination threshold value, acquire first detection information through the first detection device, acquire second detection information through the second detection device, and adjust the second detection information based on the first detection information to obtain adjusted detection information; and adjust the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
In an embodiment of the present invention, the obtaining of the preset navigation path of the current scene includes: judging the working state of the sweeping robot; under the condition that the sweeping robot is in an initial working state: controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene; under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene; and generating a preset navigation path of the current scene based on the map information.
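The map-acquisition branch above, with its initial-run and non-initial-run cases, can be sketched as follows; every method on the robot object is a hypothetical placeholder for the corresponding step of the embodiment, not an API the patent defines:

```python
def get_preset_navigation_path(robot):
    """On the first run the robot explores by a preset program, builds a
    preliminary map from camera images, refines it with laser scans, and
    stores the result; on later runs the stored map is loaded directly.
    A preset navigation path is then generated from the map."""
    if robot.is_initial_run():
        images, scans = robot.explore()       # move through the scene
        prelim = robot.build_map(images)      # preliminary map from images
        map_info = robot.refine_map(prelim, scans)  # optimize with lidar
        robot.store_map(map_info)
    else:
        map_info = robot.load_map()           # non-initial working state
    return robot.plan_path(map_info)
```

The camera thus provides the coarse map and the lidar the geometric refinement, which is consistent with the fusion priorities described earlier.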
In an embodiment of the present invention, the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
In the embodiment of the present invention, the first preset time period is a preset day time period, and the second preset time period is a preset night time period; the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information; the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
In this embodiment of the present invention, the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path includes: judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information; under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle; and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
In an embodiment of the present invention, the processor is further configured to: judge whether the current illumination intensity value is lower than a second preset illumination threshold value; under the condition that the current illumination intensity value is lower than the second preset illumination threshold value, judge whether a forced auxiliary detection instruction is acquired; under the condition that the forced auxiliary detection instruction is acquired, generate corresponding auxiliary lighting prompt information or start a corresponding auxiliary lighting device, and start the second detection device; and control the second detection device to be in a disabled state under the condition that the forced auxiliary detection instruction is not acquired.
In addition, the present invention further provides a computer-readable storage medium on which a computer program is stored, and the program, when executed by a processor, implements the method provided by the present invention.
The preferred embodiments of the present invention have been described in detail above with reference to the accompanying drawings; however, the present invention is not limited to the specific details of the above embodiments. Various simple modifications can be made to the technical solution of the present invention within the scope of its technical concept, and these simple modifications all fall within the protection scope of the present invention.
It should be noted that the various technical features described in the above embodiments can be combined in any suitable manner without contradiction; in order to avoid unnecessary repetition, the present invention does not separately describe each possible combination.
In addition, any combination of the various embodiments of the present invention is also possible, and such combinations should likewise be considered part of the disclosure of the present invention as long as they do not depart from the spirit of the present invention.

Claims (13)

1. An indoor navigation method using a laser radar and a camera, applied to a sweeping robot, characterized by comprising the following steps:
acquiring a preset navigation path of a current scene, and acquiring the current time or the current illumination intensity value of the current scene;
under the condition that the current time is within a first preset time period or the current illumination intensity value is larger than a first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information;
under the condition that the current time is within a second preset time period or the current illumination intensity value is smaller than or equal to the first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information;
and adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
2. The indoor navigation method of claim 1, wherein the obtaining of the preset navigation path of the current scene comprises:
judging the working state of the sweeping robot;
under the condition that the sweeping robot is in an initial working state:
controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene;
under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene;
and generating a preset navigation path of the current scene based on the map information.
3. The indoor navigation method of claim 1, wherein the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
4. The indoor navigation method according to claim 3, wherein the first preset time period is a preset day time period, and the second preset time period is a preset night time period;
the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information;
the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
5. The indoor navigation method of claim 1, wherein the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path comprises:
judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information;
under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle;
and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
6. The indoor navigation method of any one of claims 1 to 5, further comprising:
judging whether the current illumination intensity value is lower than a second preset illumination threshold value or not;
under the condition that the current illumination intensity value is lower than the second preset illumination threshold value, judging whether a forced auxiliary detection instruction is acquired;
under the condition that the forced auxiliary detection instruction is acquired, generating corresponding auxiliary lighting prompt information or starting a corresponding auxiliary lighting device, and starting the second detection device; and
controlling the second detection device to be in a disabled state under the condition that the forced auxiliary detection instruction is not acquired.
7. An indoor navigation device using a laser radar and a camera, applied to a sweeping robot, characterized in that the indoor navigation device comprises:
a processor configured to:
acquiring a preset navigation path of a current scene, and acquiring the current time or the current illumination intensity value of the current scene;
under the condition that the current time is within a first preset time period or the current illumination intensity value is larger than a first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the first detection information based on the second detection information to obtain adjusted detection information;
under the condition that the current time is within a second preset time period or the current illumination intensity value is smaller than or equal to the first preset illumination threshold value, acquiring first detection information through a first detection device, acquiring second detection information through a second detection device, and adjusting the second detection information based on the first detection information to obtain adjusted detection information;
and adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path.
8. The indoor navigation device of claim 7, wherein the obtaining of the preset navigation path of the current scene comprises:
judging the working state of the sweeping robot;
under the condition that the sweeping robot is in an initial working state:
controlling the sweeping robot to move in a current scene according to a preset program so as to acquire image acquisition information and laser scanning information in the current scene; acquiring preliminary map information of the current scene based on the image acquisition information; optimizing the preliminary map information of the current scene based on the laser scanning information to obtain and store the map information of the current scene;
under the condition that the sweeping robot is in a non-initial working state, acquiring map information of the current scene;
and generating a preset navigation path of the current scene based on the map information.
9. The indoor navigation device of claim 7, wherein the first detection device is a laser radar, the first detection information is laser scanning information, the second detection device is a camera, and the second detection information is image information.
10. The indoor navigation device of claim 9, wherein the first preset time period is a preset day time period, and the second preset time period is a preset night time period;
the adjusting the first detection information based on the second detection information to obtain adjusted detection information includes: adjusting the laser scanning information based on the image information to obtain adjusted laser scanning information, and taking the adjusted laser scanning information as adjusted detection information;
the adjusting the second detection information based on the first detection information to obtain adjusted detection information includes: and adjusting the image information based on the laser scanning information to obtain adjusted image information, and taking the adjusted image information as adjusted detection information.
11. The indoor navigation device of claim 9, wherein the adjusting the preset navigation path based on the adjusted detection information to obtain an adjusted navigation path comprises:
judging whether an obstacle exists on the preset navigation path or not based on the adjusted detection information;
under the condition that the obstacle exists on the preset navigation path, acquiring contour information of the obstacle;
and adjusting the preset navigation path based on the outline information of the obstacle to obtain an adjusted navigation path.
12. The indoor navigation device of any one of claims 7-11, wherein the processor is further configured to:
judging whether the current illumination intensity value is lower than a second preset illumination threshold value or not;
under the condition that the current illumination intensity value is lower than the second preset illumination threshold value, judging whether a forced auxiliary detection instruction is acquired or not;
under the condition that the forced auxiliary detection instruction is acquired, generating corresponding auxiliary lighting prompt information or starting a corresponding auxiliary lighting device, and starting the second detection device; and
controlling the second detection device to be in a disabled state under the condition that the forced auxiliary detection instruction is not acquired.
13. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, is adapted to carry out the method of any one of claims 1 to 6.
CN201811131837.4A 2018-09-27 2018-09-27 Indoor navigation method and indoor navigation device using laser radar and camera Pending CN110967703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811131837.4A CN110967703A (en) 2018-09-27 2018-09-27 Indoor navigation method and indoor navigation device using laser radar and camera


Publications (1)

Publication Number Publication Date
CN110967703A true CN110967703A (en) 2020-04-07

Family

ID=70026653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811131837.4A Pending CN110967703A (en) 2018-09-27 2018-09-27 Indoor navigation method and indoor navigation device using laser radar and camera

Country Status (1)

Country Link
CN (1) CN110967703A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102368158A (en) * 2011-09-15 2012-03-07 西北农林科技大学 Navigation positioning method of orchard machine
WO2012091807A2 (en) * 2010-12-30 2012-07-05 Irobot Corporation Mobile human interface robot
CN105910599A (en) * 2016-04-15 2016-08-31 深圳乐行天下科技有限公司 Robot device and method for locating target
CN106092104A (en) * 2016-08-26 2016-11-09 深圳微服机器人科技有限公司 The method for relocating of a kind of Indoor Robot and device
CN107137026A (en) * 2017-06-26 2017-09-08 深圳普思英察科技有限公司 A kind of sweeping robot and its camera light-supplementing system, method
CN108089586A (en) * 2018-01-30 2018-05-29 北醒(北京)光子科技有限公司 A kind of robot autonomous guider, method and robot
WO2018107479A1 (en) * 2016-12-16 2018-06-21 云鲸智能科技(东莞)有限公司 Cleaning robot and cleaning robot system
WO2018117616A1 (en) * 2016-12-23 2018-06-28 LG Electronics Inc. Mobile robot
CN108290294A (en) * 2015-11-26 2018-07-17 三星电子株式会社 Mobile robot and its control method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
郭彤颖 et al.: "Research progress in mobile robot navigation and positioning technology", 《科技广场》 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111714037A (en) * 2020-06-19 2020-09-29 北京真机智能科技有限公司 Indoor and outdoor universal intelligent sweeping and washing integrated robot and cleaning method thereof
CN112363494A (en) * 2020-09-24 2021-02-12 深圳优地科技有限公司 Method and device for planning advancing path of robot and storage medium
CN116490064A (en) * 2020-12-01 2023-07-25 苏州宝时得电动工具有限公司 Mobile robot intelligent obstacle avoidance method and mobile robot
CN115191884A (en) * 2021-04-09 2022-10-18 美智纵横科技有限责任公司 Movement control method and device, cleaning robot and storage medium
CN114474149A (en) * 2021-12-21 2022-05-13 深圳优地科技有限公司 Automatic testing method, device, server and readable storage medium
CN114474149B (en) * 2021-12-21 2024-04-05 深圳优地科技有限公司 Automatic test method, device, server and readable storage medium

Similar Documents

Publication Publication Date Title
CN110946513B (en) Control method and device of sweeping robot
CN110967703A (en) Indoor navigation method and indoor navigation device using laser radar and camera
CN110946508B (en) Control method and device of sweeping robot using laser radar and camera
CN110968081B (en) Control method and control device of sweeping robot with telescopic camera
CN110955235A (en) Control method and control device of sweeping robot
CN110325938B (en) Electric vacuum cleaner
GB2570240A (en) Electric vacuum cleaner
US11119484B2 (en) Vacuum cleaner and travel control method thereof
CN108606728B (en) Sweeping robot control method and equipment, sweeping robot and storage medium
US20190053683A1 (en) Autonomous traveler
CN113675923B (en) Charging method, charging device and robot
CN112806912B (en) Robot cleaning control method and device and robot
CN113467448A (en) Fixed-point working method, self-moving robot and storage medium
CN111358362A (en) Cleaning control method, device, chip for visual robot and cleaning robot
CN116711996A (en) Operation method, self-mobile device, and storage medium
CN112016375A (en) Floor sweeping robot and method for adaptively controlling floor sweeping robot based on ground material
CN110946512B (en) Control method and device of sweeping robot based on laser radar and camera
CN113741441A (en) Operation method and self-moving equipment
CN112336250A (en) Intelligent cleaning method and device and storage device
CN112043216A (en) A kind of intelligent mechanical cleaning and environmental protection control system and control method
CN110946509A (en) Sweeping method of sweeping robot and sweeping device of sweeping robot
CN112748721A (en) Visual robot and cleaning control method, system and chip thereof
EP4523590A1 (en) Cleaning method, cleaning apparatus, cleaning device, and storage medium
US20250325160A1 (en) Cleaning robot and movement control method thereof
WO2025045264A1 (en) Control method and apparatus for self-propelled device, medium and self-propelled device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200407