WO2025234604A1 - Mobile robot and driving method thereof - Google Patents
Mobile robot and driving method thereof
- Publication number
- WO2025234604A1 (PCT/KR2025/004377)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensing
- data
- driving robot
- driving
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1653—Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1692—Calibration of manipulator
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K19/00—Record carriers for use with machines and with at least a part designed to carry digital markings
- G06K19/06—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code
- G06K19/06009—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking
- G06K19/06037—Record carriers for use with machines and with at least a part designed to carry digital markings characterised by the kind of the digital marking, e.g. shape, nature, code with optically detectable marking multi-dimensional coding
Definitions
- Various embodiments of the present disclosure relate to a driving robot, and more particularly, to a driving robot capable of distance sensing and a driving method thereof.
- Robots that navigate specific spaces to provide services to users can deliver those services accurately by taking various types of information along their path into account.
- Robots were originally developed for industrial use and have been widely used in various industrial settings. However, with the recent expansion of the fields in which robots are used, they are now being utilized not only in homes but also in various stores. For example, robots can move within their respective spaces and perform various functions, such as cleaning and/or serving.
- To provide diverse services to users, robots must be able to perform efficiently even in complex and dynamic environments. For example, accurate distance measurement is required to support various functions, such as positional accuracy, obstacle recognition, and/or path planning. In particular, for autonomous robots, maintaining and managing the performance of distance sensors is crucial for safe operation and efficient path selection.
- Various embodiments of the present disclosure can provide a driving robot that continuously tests sensing performance and performs calibration while driving. Furthermore, various embodiments of the present disclosure can provide a driving robot that tests the sensing performance of a distance sensor under various environments and conditions through a test mode for a specific scenario. Furthermore, various embodiments of the present disclosure can provide a robot control system that provides the results of the sensing performance test and calibration in the form of a report and proactively optimizes the sensing performance of a new robot by reflecting the actual driving environment.
- a driving robot may include a driving unit, at least one sensor, a memory, and at least one processor.
- the at least one processor may be configured to perform sensing performance testing to calculate a sensing error between ground truth (GT) data and sensing data during driving, and to perform calibration for the sensing error.
- the at least one processor may be configured to enter a test mode, acquire sensing data for a test distance using the at least one sensor, confirm the GT data for the test distance, and calculate the sensing error by comparing the sensing data and the GT data.
- the test mode may include at least one of a first test mode that senses the distance from a station, a second test mode that utilizes sensor checkpoints, or a third test mode that utilizes at least one other driving robot.
- the at least one processor may enter the first test mode at each preset unit distance, and in the first test mode, generate the sensing data by sensing the test distance between the current position and the station, determine the GT data by estimating the test distance based on at least one of a received signal strength indicator (RSSI) or odometry, and calculate the sensing error by comparing the sensing data and the GT data.
- the station may be a starting point for the driving robot.
- the station may include at least one of a charging station, a maintenance station, a data synchronization station, or a control center.
- the at least one processor may enter the second test mode based on the driving robot being located at a sensor checkpoint, and in the second test mode, sense the test distance between the sensor checkpoint and a preset target object to generate the sensing data, confirm the GT data by receiving the actual value of the test distance from a server, and calculate the sensing error by comparing the sensing data and the GT data.
- the sensor checkpoint may include a QR code on the floor.
- based on recognizing the QR code on the floor, the at least one processor may generate the sensing data by sensing, from the location of the QR code, the test distance to the target object indicated by the QR code.
- the at least one processor may recognize a marker image on the target object and generate the sensing data by sensing the test distance to the target object bearing the marker image from the sensor checkpoint at which the marker image was recognized.
- the at least one processor may enter the third test mode at preset intervals, and in the third test mode, generate first sensing data by sensing a relative distance to at least one other driving robot, receive second sensing data generated based on the at least one other driving robot sensing the relative distance, determine the GT data based on the first sensing data and the second sensing data, and calculate the sensing error by comparing the first sensing data and the GT data.
- the at least one processor may determine the GT data by calculating an average of first data obtained by applying a first weight to the first sensing data and second data obtained by applying a second weight to the second sensing data.
- the at least one processor may collect sub-sensing data measured using one or more sub-sensors included in the at least one sensor, perform filtering to remove dummy sensing data exceeding a reference deviation from the collected sub-sensing data, and generate the first sensing data based on an average of the filtered sub-sensing data.
- the at least one processor may, based on the sensing error being greater than or equal to a first error rate and less than a second error rate, perform at least one of offset correction, scale factor adjustment, or nonlinear correction for the sensing error.
- the at least one processor may provide a sensor maintenance alert to the user based on the sensing error being greater than or equal to a second error rate.
- the at least one processor may perform a first calibration for a first sensing error calculated through a first sensing performance test, additionally perform a second sensing performance test based on performing the first calibration, and additionally perform a second calibration for a second sensing error calculated through the second sensing performance test.
- the at least one processor may generate a sensing performance test report including a record of the sensing performance test and a record of the calibration.
- the sensing performance test report may be provided to the production process so that a newly manufactured robot reflects the characteristics of the actual driving environment.
- a driving method of a driving robot may include an operation of performing sensing performance testing to calculate a sensing error between ground truth (GT) data and sensing data during driving, and an operation of performing calibration for the sensing error.
- the operation of performing the sensing performance test may include an operation of entering a test mode, an operation of acquiring sensing data for a test distance using at least one sensor, an operation of confirming the GT data for the test distance, and an operation of calculating the sensing error by comparing the sensing data and the GT data.
- the test mode may include at least one of a first test mode that senses the distance from a station, a second test mode that utilizes sensor checkpoints, or a third test mode that utilizes at least one other driving robot.
- the operation of performing the sensing performance test may include an operation of entering the first test mode at each preset unit distance, an operation of generating the sensing data by sensing the test distance between the current position and the station in the first test mode, an operation of determining the GT data by estimating the test distance based on at least one of a received signal strength indicator (RSSI) or odometry, and an operation of calculating the sensing error by comparing the sensing data and the GT data.
- the operation of performing the sensing performance test may include an operation of entering the second test mode based on the driving robot being located at a sensor checkpoint, an operation of generating the sensing data by sensing the test distance between the sensor checkpoint and a preset target object in the second test mode, an operation of confirming the GT data by receiving the actual value of the test distance from a server, and an operation of calculating the sensing error by comparing the sensing data and the GT data.
- the operation of performing the sensing performance test may include an operation of entering the third test mode at preset intervals, an operation of generating first sensing data by sensing a relative distance to at least one other driving robot in the third test mode, an operation of receiving second sensing data generated based on the at least one other driving robot sensing the relative distance, an operation of determining the GT data based on the first sensing data and the second sensing data, and an operation of calculating the sensing error by comparing the first sensing data and the GT data.
- the operation of performing the calibration may include an operation of performing, based on the sensing error being greater than or equal to a first error rate and less than a second error rate, at least one of offset correction, scale factor adjustment, or nonlinear correction for the sensing error.
- the driving robot and its driving method of the present disclosure can ensure that the sensing performance of the driving robot is maintained at an optimal state by continuously checking the sensing performance during driving and performing calibration.
- the driving robot of the present disclosure and its driving method can determine a sensing error and perform effective calibration by testing the sensing performance of a distance sensor in various environments and conditions through a test mode for a specific scenario.
- the driving robot and its driving method of the present disclosure can improve the productivity and reliability of the driving robot by providing the results of the sensing performance test and calibration in the form of a report and optimizing the sensing performance of a new robot in advance by reflecting the actual driving environment.
- FIG. 1 is a diagram illustrating a robot control system according to one embodiment of the present disclosure.
- FIG. 2 is a block diagram showing the configuration of a driving robot according to one embodiment of the present disclosure.
- FIG. 3 is a flowchart showing the operation of a driving robot according to one embodiment of the present disclosure.
- FIG. 4 is a flowchart illustrating a sensing performance test in a first test mode according to one embodiment of the present disclosure.
- FIG. 5 is a drawing showing an operation of a driving robot departing from a station according to one embodiment of the present disclosure.
- FIG. 6 is an exemplary diagram showing an operation of a driving robot performing a sensing performance test in a first test mode according to one embodiment of the present disclosure.
- FIG. 7 is a flowchart illustrating a sensing performance test in a second test mode according to one embodiment of the present disclosure.
- FIG. 8 is an exemplary diagram showing an operation of a driving robot performing a sensing performance test in a second test mode according to one embodiment of the present disclosure.
- FIG. 9 is an exemplary diagram showing an operation of a user performing a sensing performance test in a second test mode according to one embodiment of the present disclosure.
- FIG. 10 is a flowchart illustrating a sensing performance test in a third test mode according to one embodiment of the present disclosure.
- FIG. 11 is an exemplary diagram showing sensing performance testing and data transmission between multiple driving robots according to one embodiment of the present disclosure.
- FIG. 12 is an exemplary diagram showing sensing performance testing and data transmission between multiple driving robots according to one embodiment of the present disclosure.
- FIG. 13 is a flowchart showing a calibration operation of a driving robot according to one embodiment of the present disclosure.
- FIG. 14 is a flowchart showing additional operations of a driving robot according to one embodiment of the present disclosure.
- each of the phrases “A or B”, “at least one of A and B”, “at least one of A or B”, “A, B, or C”, “at least one of A, B, and C”, and “at least one of A, B, or C” may include any one of the items listed together in that phrase, or all possible combinations thereof.
- when a component (e.g., a first component) is referred to as being connected to another component (e.g., a second component), the component can be connected to the other component directly (e.g., wired), wirelessly, or through a third component.
- FIG. 1 is a diagram illustrating a robot control system according to one embodiment of the present disclosure.
- the robot control system (10) may include a driving robot (100), a user's electronic device (hereinafter, user device) (200), and/or a server (300).
- the driving robot (100) may be a robot of various forms capable of performing locomotion functions.
- the driving robot (100) may be a robot capable of performing various computing functions, such as locomotion functions, object organization functions (e.g., cleaning functions), object transport functions, sensing functions, display functions, communication functions, and/or output functions (e.g., voice or audio output functions).
- the present disclosure is not limited thereto, and robots of various forms may be implemented as the driving robot (100).
- a mobile robot may be classified into industrial, medical, domestic, military, and/or exploration robots based on the functions and operations it can perform.
- Industrial robots may be further categorized into, for example, robots used in the product manufacturing process in factories, and robots used to greet customers, take orders, and/or serve food in stores and/or restaurants.
- the driving robot (100) can be implemented as a serving robot capable of transporting service items to a user-desired location (i.e., a target location) in various places such as restaurants, hotels, supermarkets, hospitals, and clothing stores.
- robots can be categorized in various ways depending on their field of application, intended use, and the functions and actions they can perform.
- the driving robot (100) may perform communication using any of a variety of wired and/or wireless communication protocols, such as Ethernet, GSM (global system for mobile communications), EDGE (enhanced data GSM environment), CDMA (code division multiple access), TDMA (time division multiple access), LTE (long term evolution), LTE-A (LTE advanced), NR (new radio), Wi-Fi, and/or Bluetooth.
- the driving robot (100) may perform communication with the user device (200) and/or the server (300) based on the wired and/or wireless communication protocols.
- the user device (200) may be a device capable of performing various computing functions, such as a communication function, a display function, and/or an output function (e.g., a voice or audio output function).
- the user device (200) may be a TV, a wearable device (e.g., earbuds, a hearing aid, or a head mounted display (HMD)), a mobile device (e.g., a smartphone or a mobile phone), a tablet, a personal computer (PC), a desktop computer, a notebook computer, a personal digital assistant (PDA), a laptop, a media player, an e-book reader, a digital broadcasting terminal, a navigation device, a kiosk, a digital camera, or a home appliance.
- the user device (200) is not limited to the above-described devices and may also be another type of electronic device.
- the user device (200) can perform various functions to support the sensing performance test of the driving robot (100).
- the user device (200) can provide a user interface for receiving user feedback for the sensing performance test of the driving robot (100), and can transmit a feedback message including data related to the user feedback obtained through the user interface to the server (300) or the driving robot (100).
- the user device (200) may perform communication using any of a variety of wired and/or wireless communication protocols, such as Ethernet, GSM, EDGE, CDMA, TDMA, LTE, LTE-A, NR, Wi-Fi, and/or Bluetooth.
- the user device (200) may perform communication with the driving robot (100) and/or the server (300) based on wired and/or wireless communication protocols.
- the server (300) may be connected to the driving robot (100) and/or the user device (200).
- the server (300) may be a cloud server.
- the server (300) may receive a feedback message including data associated with user feedback from the driving robot (100) and/or the user device (200).
- the server (300) may support the sensing performance inspection of the driving robot (100) by updating an AI model used in the driving robot (100) based on the data included in the feedback message.
- FIG. 2 is a block diagram showing the configuration of a driving robot (100) according to one embodiment of the present disclosure.
- the driving robot (100) may include a driving unit (110), a sensor unit (120), a memory (130), and/or a processor (140).
- a driving unit (110) may be configured to move a driving robot (100).
- the driving unit (110) may include at least one physical component (e.g., a motor, a wheel, a plurality of wheels, and/or a brake) for controlling the movement of the driving robot (100).
- the driving unit (110) may change the moving direction and/or moving speed of the driving robot (100) under the control of the processor (140).
- a driving robot (100) can move on its own within a specific space by using a motor and wheels included in a driving unit (110).
- the driving robot (100) can stop within a specific space or control its movement speed by using a brake.
- a sensor unit (120) may include at least one sensor.
- the sensor unit (120) may sense data related to the driving robot (100) and/or the surroundings of the driving robot (100) using at least one sensor.
- the sensor unit (120) may include an IMU (inertial measurement unit) sensor.
- the IMU sensor may sense the acceleration and/or angular velocity of the driving robot (100) using an accelerometer, a gyroscope, and/or a magnetometer.
- the sensor unit (120) may include a wheel encoder.
- the wheel encoder may sense the rotational speed and/or rotational direction of each of the plurality of wheels installed on the driving robot (100).
- each of the plurality of wheels may be rotated by a motor to perform a role of moving the driving robot (100).
- the driving robot (100) may generate a driving record (odometry) based on the rotational speed and rotational direction of each of the plurality of wheels sensed by the wheel encoder.
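- As an illustrative sketch only (not part of the disclosed embodiments), a driving record of this kind could be accumulated from wheel-encoder counts with standard differential-drive dead reckoning; the wheel radius, encoder resolution, and track width below are hypothetical example values.

```python
import math

def update_odometry(x, y, heading, left_ticks, right_ticks,
                    ticks_per_rev=4096, wheel_radius=0.05, track_width=0.30):
    """Accumulate a simple differential-drive odometry estimate.

    left_ticks/right_ticks are encoder counts since the last update;
    the geometry constants are hypothetical example values.
    """
    # Distance rolled by each wheel since the last update.
    d_left = 2 * math.pi * wheel_radius * left_ticks / ticks_per_rev
    d_right = 2 * math.pi * wheel_radius * right_ticks / ticks_per_rev

    d_center = (d_left + d_right) / 2.0           # forward travel of the robot body
    d_heading = (d_right - d_left) / track_width  # change in heading (rad)

    x += d_center * math.cos(heading + d_heading / 2.0)
    y += d_center * math.sin(heading + d_heading / 2.0)
    heading += d_heading
    return x, y, heading
```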
- the sensor unit (120) may include a camera.
- the camera may capture images by photographing the surroundings of the driving robot (100).
- the camera may be implemented as a 3D camera and may generate 3D image information about the surroundings of the driving robot (100).
- the sensor unit (120) may include two or more cameras. In this case, the two or more cameras may be implemented in a stereo vision manner, capturing images from different viewpoints and combining the captured images to generate 3D coordinate information.
- the sensor unit (120) may include a lidar sensor.
- the lidar sensor can rotate 360 degrees and output laser light. When the laser light is reflected from an object around the driving robot (100) and received, the lidar sensor can detect the distance to the object based on the time the laser light was received.
- the sensor unit (120) may include a ToF (time of flight) sensor.
- the ToF sensor may output light of an infrared wavelength.
- the ToF sensor may detect the distance to the object based on the time at which the light is received.
- the sensor unit (120) may include an ultrasonic sensor.
- the ultrasonic sensor may output ultrasonic waves.
- the ultrasonic sensor may detect the distance to the object based on the time at which the ultrasonic waves were received.
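- For illustration, the distance measurements described above for the lidar, ToF, and ultrasonic sensors all reduce to the same round-trip time-of-flight relation; the sketch below assumes standard propagation speeds, and the function name is a hypothetical convenience.

```python
SPEED_OF_LIGHT = 299_792_458.0   # m/s, relevant for lidar and infrared ToF sensors
SPEED_OF_SOUND = 343.0           # m/s in air at about 20 degrees C, for ultrasonic sensors

def tof_distance(round_trip_time_s: float, propagation_speed: float) -> float:
    """Distance to the reflecting object from the round-trip time of the emitted signal."""
    return propagation_speed * round_trip_time_s / 2.0

# Example: an ultrasonic echo received 5.8 ms after emission corresponds to roughly 1 m.
print(tof_distance(0.0058, SPEED_OF_SOUND))  # ~0.995 m
```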
- the sensor unit (120) may be mounted on the processor (140) or may be provided on the driving robot (100) separately from the processor (140).
- a memory (130) can store data required for various embodiments of the present disclosure.
- the memory (130) may be implemented in the form of a memory (130) embedded in the driving robot (100) or in the form of a memory (130) that can be attached to or detached from the driving robot (100), depending on the purpose of data storage.
- data for driving the driving robot (100) may be stored in the memory (130) embedded in the driving robot (100).
- data for expanded functions of the driving robot (100) may be stored in the memory (130) that can be attached to or detached from the driving robot (100).
- the memory (130) embedded in the driving robot (100) may be implemented as at least one of volatile memory (e.g., dynamic RAM (DRAM), static RAM (SRAM), or synchronous dynamic RAM (SDRAM)) or non-volatile memory (e.g., one time programmable ROM (OTPROM), programmable ROM (PROM), erasable and programmable ROM (EPROM), electrically erasable and programmable ROM (EEPROM), mask ROM, flash ROM, flash memory (e.g., NAND flash or NOR flash), a hard drive, or a solid state drive (SSD)).
- the memory (130) that can be attached to or detached from the driving robot (100) may be implemented in the form of a memory card (e.g., compact flash (CF), secure digital (SD), micro secure digital (Micro-SD), mini secure digital (Mini-SD), extreme digital (xD), or multi-media card (MMC)), an external memory connectable to a USB port (e.g., a USB memory), etc.
- the memory (130) can store a computer program including at least one instruction for controlling a driving robot (100).
- various data may be stored in the external memory (130) of the processor (140), and some of the data may be stored in the internal memory (130) of the processor (140) and the rest may be stored in the external memory (130).
- the memory (130) may store information about the sensing performance test performed by the processor (140) and/or information about calibration.
- the memory (130) may store a sensing performance test report including information about the sensing performance test and/or information about calibration.
- At least one processor (140) can control the overall operation of the driving robot (100).
- the processor (140) may be implemented as a digital signal processor (DSP) for processing digital signals, a microprocessor, a timing controller (TCON), a microcontroller unit (MCU), a microprocessing unit (MPU), an application processor (AP), a communication processor (CP), an ARM processor, or an artificial intelligence (AI) processor.
- the processor (140) may be implemented as a system on chip (SoC) having a built-in processing algorithm, a large scale integration (LSI), or may be implemented in the form of a field programmable gate array (FPGA).
- the processor (140) can perform various functions by executing computer executable instructions stored in the memory (130).
- At least one processor (140) may include one or more of a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), a many integrated core (MIC), a digital signal processor (DSP), a neural processing unit (NPU), a hardware accelerator, or a machine learning accelerator.
- the one or more processors (140) may control one or any combination of other components of the electronic device and may perform operations related to communication or data processing.
- At least one processor (140) can execute one or more programs or instructions stored in the memory (130). For example, at least one processor (140) can perform a method according to an embodiment of the present disclosure by executing one or more instructions stored in the memory (130). When a method according to an embodiment of the present disclosure includes a plurality of operations, the plurality of operations may be performed by one processor (140) or by a plurality of processors (140).
- the first operation, the second operation, and the third operation may all be performed by the first processor (140), or the first operation and the second operation may be performed by the first processor (140) (e.g., a general-purpose processor (140)) and the third operation may be performed by the second processor (140) (e.g., an artificial intelligence-only processor (140)).
- At least one processor (140) may be implemented as a single core processor (140) including one core, or may be implemented as one or more multi-core processors (140) including multiple cores (e.g., homogeneous multi-cores or heterogeneous multi-cores).
- processors (140) may be implemented as a multi-core processor (140)
- each of the multiple cores included in the multi-core processor (140) may include an internal memory (130) of the processor (140), such as a cache memory (130) or an on-chip memory (130), and a common cache shared by the multiple cores may be included in the multi-core processor (140).
- each of the plurality of cores (or some of the plurality of cores) included in the multi-core processor (140) may independently read and execute a program command for implementing a method according to an embodiment of the present disclosure, or all (or some) of the plurality of cores may be linked to read and execute a program command for implementing a method according to an embodiment of the present disclosure.
- the plurality of operations may be performed by one core among the plurality of cores included in the multi-core processor (140), or may be performed by the plurality of cores.
- the first operation, the second operation, and the third operation may all be performed by a first core included in the multi-core processor (140), or the first operation and the second operation may be performed by a first core included in the multi-core processor (140), and the third operation may be performed by a second core included in the multi-core processor (140).
- At least one processor (140) may perform sensing performance testing to calculate a sensing error between ground truth (GT) data and sensing data during driving. At least one processor (140) may perform calibration for the sensing error.
- At least one processor (140) can continuously check the sensing performance and perform calibration while the driving robot (100) is driving, thereby ensuring that the sensing performance of the driving robot (100) is maintained at an optimal state.
- At least one processor (140) can determine a sensing error and perform effective calibration by testing the sensing performance of the distance sensor in various environments and conditions through a test mode for a specific scenario.
- FIG. 3 is a flowchart showing the operation of a driving robot (e.g., the driving robot (100) of FIG. 1) according to one embodiment of the present disclosure.
- a driving robot (100) can perform sensing performance testing (operation 310) to calculate a sensing error between ground truth (GT) data and sensing data while driving, and perform calibration for the sensing error (operation 320).
- GT data may refer to actual or accurate values used to verify measurement values acquired from the sensor unit (120).
- GT data may be reference data related to the location, distance measurement, and/or other environmental characteristics of the driving robot (100).
- GT data may be an actual distance measured in advance when the driving robot (100) measures a predetermined distance to a specific object.
- the sensing data may be data collected in real time through the sensor unit (120) of the driving robot (100).
- the driving robot (100) may generate sensing data by capturing or measuring information about the surrounding environment using the sensor unit (120).
- the driving robot (100) may use the sensing data for at least one of determining a driving path, detecting obstacles, or measuring distances.
- a sensing error may refer to a difference or error between sensing data and GT data.
- the sensing error may be caused by the accuracy of the sensor unit (120), environmental factors, and/or functional limitations of the sensor unit (120). For example, if the sensing error is large, the driving robot (100) may malfunction or stop.
- the driving robot (100) may perform a sensing performance test to calculate a sensing error between GT data and sensing data during driving.
- the sensing performance test may be a test performed by the driving robot (100) to evaluate the sensing performance of the sensor unit (120).
- the driving robot (100) may evaluate the sensing performance of the sensor unit (120) by calculating the sensing error between the GT data and the sensing data.
- when performing a sensing performance test, the driving robot (100) can enter a test mode.
- the driving robot (100) can sense a test distance using the sensor unit (120), thereby generating (or acquiring) the sensing data for the test distance.
- the driving robot (100) can confirm the GT data for the test distance.
- the driving robot (100) can calculate a sensing error by comparing the sensing data and GT data.
- the sensing performance test may be performed in a test mode of a specific scenario.
- the test mode of the specific scenario may include at least one of a first test mode that senses the distance from the station (400), a second test mode that utilizes sensor checkpoints, or a third test mode that utilizes at least one other driving robot (600).
- the driving robot (100) may perform the sensing performance test by combining multiple test modes among the first test mode, the second test mode, and the third test mode.
- the driving robot (100) may perform calibration for the sensing error while driving.
- Calibration may be a process of adjusting the settings of a sensor based on the sensing error.
- Calibration may include at least one of offset correction, scale factor adjustment, or nonlinear correction.
- the driving robot (100) may determine that calibration is not necessary based on a sensing error being less than a first error rate and may not perform calibration.
- the first error rate may be about 2% to 3%, but is not limited thereto.
- the driving robot (100) may determine that calibration is necessary based on a sensing error being greater than or equal to a first error rate and less than a second error rate, and may perform at least one of offset correction, scale factor adjustment, or nonlinear correction.
- the second error rate may be about 5% to 10%, but is not limited thereto.
- the driving robot (100) may determine that sensor maintenance is required based on the sensing error being greater than or equal to the second error rate, and may provide a sensor maintenance notification to the user.
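- A minimal sketch of this decision logic, assuming the sensing error is expressed as a percentage error rate between the sensing data and the GT data; the concrete threshold values and function names are assumptions for illustration, not part of the disclosure.

```python
def sensing_error_rate(sensing_data: float, gt_data: float) -> float:
    """Sensing error as a percentage of the ground-truth distance."""
    return abs(sensing_data - gt_data) / gt_data * 100.0

def decide_action(error_rate: float,
                  first_error_rate: float = 2.5,   # e.g. somewhere in the ~2-3 % range
                  second_error_rate: float = 7.5   # e.g. somewhere in the ~5-10 % range
                  ) -> str:
    """Map the calculated error rate to the action described in the disclosure."""
    if error_rate < first_error_rate:
        return "no_calibration_needed"
    if error_rate < second_error_rate:
        return "perform_calibration"          # offset / scale factor / nonlinear correction
    return "sensor_maintenance_alert"         # notify the user
```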
- FIG. 4 is a flowchart showing a sensing performance test in a first test mode according to one embodiment of the present disclosure
- FIG. 5 is a diagram showing an operation of a driving robot (100) according to one embodiment of the present disclosure departing from a station (400)
- FIG. 6 is an exemplary diagram showing an operation of a driving robot (100) according to one embodiment of the present disclosure performing a sensing performance test in a first test mode.
- the driving robot (100) can perform a sensing performance test in a first test mode that senses a distance from a station (400).
- the driving robot (100) can enter the first test mode at each preset unit distance (operation 410).
- the driving robot (100) can generate the sensing data (operation 420) by sensing the test distance between the current position and the station (400).
- the driving robot (100) can estimate the test distance based on at least one of a received signal strength indicator (RSSI) or an odometry record, thereby determining the GT data (operation 430).
- the driving robot (100) can calculate the sensing error (operation 440) by comparing the sensing data and the GT data.
- the station (400) may be the starting point of the driving robot (100).
- the station (400) may include at least one of a charging station, a maintenance station, a data synchronization station, or a control center.
- the charging station may supply electrical energy to the driving robot (100) so that the driving robot (100) can operate continuously for an extended period of time.
- the maintenance station may provide regular inspection and necessary repairs of the hardware or software of the driving robot (100).
- the data synchronization station may upload data collected by the driving robot (100) to a central database (e.g., the server (300) of FIG. 1) and provide the driving robot (100) with the latest work instructions or software updates.
- the control center may monitor the movement and operation of the driving robot (100) and control the driving robot (100) to optimize the movement and operation of the driving robot (100).
- the driving robot (100) may enter the first test mode at each preset unit distance.
- the driving robot (100) may start from the station (400) and perform a predetermined operation based on a work instruction.
- the driving robot (100) may enter the first test mode at each preset unit distance and perform a sensing performance test.
- the unit distance may be set and/or changed based on a user's input.
- the driving robot (100) may recognize the unit distance (e.g., UD1, UD2, UD3) based on at least one of a communication strength (received signal strength indicator, RSSI) or an odometry record.
- the driving robot (100) can generate (or obtain) the sensing data by sensing the test distance between the current position and the station (400) in the first test mode.
- the driving robot (100) can rotate from the current position (e.g., UD1, UD2, UD3) toward the station (400) and then sense the test distance (TD) from the station (400) using the sensor unit (120).
- the driving robot (100) can sense the test distance (TD) using at least one of an IMU sensor, a wheel encoder, a camera, a lidar sensor, a ToF sensor, or an ultrasonic sensor.
- the driving robot (100) can generate the sensing data for the test distance (TD) based on the sensing values collected from the sensor unit (120). For example, the driving robot (100) can generate the sensing data for the test distance (TD) by synthesizing the sensing values collected from the sensor unit (120).
- the driving robot (100) can determine the GT data by estimating the test distance (TD) based on at least one of the communication strength or the driving record.
- the communication strength may be the strength of a signal that the driving robot (100) receives from the station (400).
- the communication strength may be proportional to the distance between the station (400) and the driving robot (100). For example, the closer the distance between the station (400) and the driving robot (100), the stronger the communication strength may be. For example, the farther the distance between the station (400) and the driving robot (100) is, the weaker the communication strength may be.
- the driving robot (100) may store reference distance information regarding the communication strength.
- the reference distance information regarding the communication strength may include data representing the actual distance between the station (400) and the driving robot (100) when a specific communication strength is detected.
- the driving robot (100) may estimate the test distance (TD) using the communication strength based on the reference distance information regarding the communication strength. In some embodiments, the driving robot (100) may improve the accuracy of the test distance (TD) estimation by applying a compensation algorithm that compensates for the reduction in communication strength due to environmental factors and/or obstacles. The driving robot (100) may confirm the test distance (TD) estimated from the communication strength as GT data.
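- As one possible way to realize the RSSI-based estimation described above (the disclosure does not specify a model), the stored reference distance information could be combined with a log-distance path-loss model; the reference RSSI at 1 m and the path-loss exponent below are hypothetical values that would come from calibration against the actual environment.

```python
def distance_from_rssi(rssi_dbm: float,
                       rssi_at_1m_dbm: float = -45.0,   # hypothetical reference measurement
                       path_loss_exponent: float = 2.2  # environment-dependent compensation
                       ) -> float:
    """Estimate the distance to the station from received signal strength.

    Uses a log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10*n*log10(d),
    solved for d. The reference RSSI and exponent stand in for the stored
    reference distance information and environmental compensation.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

# Example: a -67 dBm reading with these reference values maps to roughly 10 m.
print(distance_from_rssi(-67.0))
```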
- the driving record may be information about changes in the number of wheel rotations and/or geometric changes in the movement path according to the movement of the driving robot (100).
- the driving record may include the distance traveled by the driving robot (100) starting from the station (400).
- the driving robot (100) may generate the driving record by sensing the number of rotations and/or the direction of rotation of a plurality of wheels using the wheel encoder of the sensor unit (120).
- the driving robot (100) may calculate the distance traveled from the station (400) to the current position using the driving record, and estimate the test distance (TD) based on the travel distance.
- the driving robot (100) may improve the accuracy of the test distance (TD) estimation by applying a correction algorithm that compensates for errors in the driving record caused by wheel slippage and/or changes in the coefficient of friction with the ground.
- the driving robot (100) may confirm the test distance (TD) estimated from the driving record as GT data.
- the driving robot (100) can calculate the sensing error by comparing the sensing data and the GT data.
- the sensing error may refer to an error between the actual test distance estimated using the communication strength and the sensing data.
- the sensing error may refer to an error between the actual test distance estimated using the driving record and the sensing data.
- the driving robot (100) can evaluate the sensing performance of the sensor unit (120) based on the sensing error.
- the driving robot (100) can determine whether calibration of the sensor unit (120) is necessary based on the sensing error.
- FIG. 7 is a flowchart showing a sensing performance test in a second test mode according to one embodiment of the present disclosure
- FIG. 8 is an exemplary diagram showing an operation in which a driving robot (100) according to one embodiment of the present disclosure performs a sensing performance test in the second test mode
- FIG. 9 is an exemplary diagram showing an operation in which a user according to one embodiment of the present disclosure performs a sensing performance test in the second test mode.
- the driving robot (100) can perform a sensing performance test in a second test mode using a sensor checkpoint.
- the driving robot (100) can enter the second test mode (operation 710) based on the driving robot (100) being located at the sensor checkpoint.
- the driving robot (100) can generate the sensing data (operation 720) by sensing the test distance between the sensor checkpoint and a preset target object (500).
- the driving robot (100) can confirm the GT data (operation 730) by receiving the actual distance of the test distance from the server.
- the driving robot (100) can calculate the sensing error (operation 740) by comparing the sensing data and the GT data.
- the driving robot (100) may enter the second test mode based on the driving robot (100) being located at a sensor checkpoint (CP).
- the driving robot (100) may identify that the driving robot (100) is located at a sensor checkpoint (CP).
- a sensor checkpoint (CP) may be pre-positioned in the surrounding environment of the driving robot (100).
- the sensor checkpoint (CP) may be a predetermined inspection site pre-positioned at an appropriate location to inspect the sensing performance of the driving robot (100) in the surrounding environment.
- the distance (e.g., the test distance) between the sensor checkpoint (CP) and the target object (500) may be measured in advance and stored in a server (e.g., the server (300) of FIG. 1).
- the sensor checkpoint (CP) may include a QR code on the floor.
- the driving robot (100) may identify that the driving robot (100) is located at the sensor checkpoint (CP) and enter the second test mode.
- the driving robot (100) can recognize a marker image (510) on a predetermined target object (500) and enter the second test mode at the sensor checkpoint (CP) at which the marker image (510) was recognized.
- the driving robot (100) can identify that the driving robot (100) is located at a sensor checkpoint (CP) based on an ultrasonic signal reflected from a reflective tile and/or a radio-frequency identification (RFID) tag placed at a specific point. For example, the driving robot (100) can enter the second test mode based on receiving an ultrasonic signal reflected from a reflective tile. For example, the driving robot (100) can enter the second test mode based on recognizing an RFID tag placed at a specific point.
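- A brief sketch of floor QR-code recognition for entering the second test mode, using OpenCV's QR detector; the payload convention and function name are assumptions of this sketch rather than part of the disclosure.

```python
import cv2

def checkpoint_from_floor_image(image_path: str):
    """Return the decoded checkpoint payload if a floor QR code is visible, else None.

    The payload format (e.g. a checkpoint ID plus a target-object ID) is a
    hypothetical convention used only for this example.
    """
    frame = cv2.imread(image_path)
    if frame is None:
        return None
    detector = cv2.QRCodeDetector()
    payload, points, _ = detector.detectAndDecode(frame)
    if points is None or not payload:
        return None   # no QR code recognized; keep driving normally
    return payload    # e.g. "CP-03;target=shelf-12" -> enter the second test mode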
- the driving robot (100) can generate (or acquire) the sensing data by sensing the test distance (TD) between the sensor checkpoint (CP) and the preset target object (500) in the second test mode.
- the driving robot (100) can stop at a sensor checkpoint (CP) (e.g., a QR code on the floor) and sense the test distance (TD) from the preset target object (500) using the sensor unit (120).
- while driving, the driving robot (100) can stop at a sensor checkpoint (CP) (e.g., a point where a marker image (510) of a target object (500) is recognized) and sense the test distance (TD) from the preset target object (500) using the sensor unit (120).
- the driving robot (100) can sense the test distance (TD) between the sensor checkpoint (CP) and the target object (500) using at least one of an IMU sensor, a wheel encoder, a camera, a lidar sensor, a ToF sensor, or an ultrasonic sensor.
- the driving robot (100) can generate the sensing data for the test distance (TD) based on some or all of the sensing values collected from the sensor unit (120).
- the driving robot (100) can confirm the GT data by receiving the actual value of the test distance (TD) from the server.
- the distance (e.g., the test distance) between the sensor checkpoint (CP) and the target object (500) may be measured in advance and stored in the server.
- the server can transmit the actual distance to the driving robot (100).
- the driving robot (100) can confirm the test distance (TD) received from the server as GT data.
- the driving robot (100) can calculate the sensing error by comparing the sensing data and the GT data.
- the sensing error may refer to an error between the actual value of the test distance received by the driving robot (100) from the server and the sensing data.
- the driving robot (100) can evaluate the sensing performance of the sensor unit (120) based on the sensing error.
- the driving robot (100) can determine whether calibration of the sensor unit (120) is necessary based on the sensing error.
- FIG. 10 is a flowchart showing a sensing performance test in a third test mode according to one embodiment of the present disclosure
- FIG. 11 is an exemplary diagram showing a sensing performance test and data transmission between a plurality of driving robots (100) according to one embodiment of the present disclosure
- FIG. 12 is an exemplary diagram showing a sensing performance test and data transmission between a plurality of driving robots (100) according to one embodiment of the present disclosure.
- the driving robot (100) can perform a sensing performance test in a third test mode using at least one other driving robot (600).
- the driving robot (100) can enter the third test mode at preset intervals (operation 1010).
- the driving robot (100) can generate first sensing data (operation 1020) by sensing a relative distance with at least one other driving robot (600).
- the driving robot (100) can receive second sensing data generated based on the at least one other driving robot (600) sensing the relative distance (operation 1030).
- the driving robot (100) can determine the GT data (operation 1040) based on the first sensing data and the second sensing data.
- the driving robot (100) can calculate the sensing error (operation 1050) by comparing the first sensing data and the GT data.
- the driving robot (100) may enter the third test mode at preset intervals.
- the interval may be set and/or changed based on user input. For example, if the interval is shortened, the frequency of sensing performance tests in the third test mode may increase. For example, if the interval is lengthened, the frequency of sensing performance tests in the third test mode may decrease.
- the driving robot (100) can generate (or acquire) first sensing data by sensing the relative distance (TD) to at least one other driving robot (600) in the third test mode.
- the driving robot (100) can sense the relative distance (TD) with at least one other driving robot (600) using at least one of an IMU sensor, a wheel encoder, a camera, a lidar sensor, a ToF sensor, or an ultrasonic sensor.
- the driving robot (100) may utilize filtered sub-sensing data to increase the accuracy of the first sensing data.
- the driving robot (100) may collect sub-sensing data measured using at least one sub-sensor included in the sensor unit (120).
- the driving robot (100) may collect sub-sensing data measured from at least one of an IMU sensor, a wheel encoder, a camera, a lidar sensor, a ToF sensor, or an ultrasonic sensor.
- a predetermined deviation may occur between the sub-sensing data measured from different sensors.
- the driving robot (100) may perform filtering to remove dummy sensing data exceeding a reference deviation from the collected sub-sensing data.
- the reference deviation may be a criterion for determining inaccurate sub-sensing data from sub-sensing data measured from different sensors.
- the reference deviation may be set and/or changed according to a user's input.
- the driving robot (100) may generate the first sensing data based on an average of the filtered sub-sensing data.
- the driving robot (100) may obtain the average of the filtered sub-sensing data as the first sensing data.
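- A minimal sketch of this filtering and averaging step, assuming the reference deviation is interpreted as a maximum allowed distance from the median of the sub-sensor readings (the disclosure does not fix the exact filtering criterion, so this is one plausible reading):

```python
from statistics import median, mean

def fuse_sub_sensing_data(sub_readings: list[float], reference_deviation: float = 0.10) -> float:
    """Drop dummy readings that deviate too far from the rest, then average.

    sub_readings: distance measurements (m) from the individual sub-sensors
    (e.g. lidar, ToF, ultrasonic, stereo camera) for the same test distance.
    """
    center = median(sub_readings)
    filtered = [r for r in sub_readings if abs(r - center) <= reference_deviation]
    if not filtered:          # all readings rejected: fall back to the raw median
        return center
    return mean(filtered)     # first sensing data

# Example: the 2.95 m outlier is removed before averaging.
print(fuse_sub_sensing_data([2.41, 2.39, 2.95, 2.40]))  # ~2.40
```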
- the driving robot (100) can receive second sensing data generated based on the sensing of the relative distance (TD) by at least one other driving robot (600).
- at least one other driving robot (600) positioned in an environment surrounding the driving robot (100) can generate (or obtain) the second sensing data by sensing the relative distance (TD) with the driving robot (100).
- the driving robot (100) can directly communicate with at least one other driving robot (600).
- the driving robot (100) can directly transmit first sensing data to at least one other driving robot (600).
- the driving robot (100) can directly receive second sensing data (SD2) from at least one other driving robot (600).
- the driving robot (100) and at least one other driving robot (600) can communicate via the server (300).
- the driving robot (100) can transmit first sensing data (SD1) to the server (300).
- at least one other driving robot (600) can transmit second sensing data (SD2) to the server (300).
- the driving robot (100) can receive the second sensing data (SD2) from the server (300).
- the driving robot (100) can determine the GT data based on the first sensing data and the second sensing data.
- the driving robot (100) can generate the first data by applying a first weight to the first sensing data.
- the driving robot (100) can generate the second data by applying a second weight to the second sensing data.
- the first weight and the second weight can be set and/or changed based on the relative importance or relative accuracy between the first sensing data and the second sensing data.
- the driving robot (100) can determine the GT data by integrating the first sensing data and the second sensing data.
- the driving robot (100) can integrate the first sensing data and the second sensing data by calculating an average of the first data, obtained by applying the first weight to the first sensing data, and the second data, obtained by applying the second weight to the second sensing data. For example, the driving robot (100) can determine the average of the first data and the second data as the GT data.
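- Read literally, the integration above is an average of the two weighted measurements; the sketch below follows that reading, with equal default weights as a hypothetical choice (the actual weights would reflect the relative accuracy of each robot's sensing).

```python
from statistics import mean

def determine_gt_data(first_sensing: float, second_sensing: float,
                      first_weight: float = 1.0, second_weight: float = 1.0) -> float:
    """Integrate the two relative-distance measurements into GT data."""
    first_data = first_weight * first_sensing      # first sensing data with the first weight applied
    second_data = second_weight * second_sensing   # second sensing data with the second weight applied
    return mean([first_data, second_data])         # average of the weighted data as GT data

# Example: robot A measures 3.02 m, robot B measures 2.96 m, equal weights.
print(determine_gt_data(3.02, 2.96))  # 2.99
```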
- although FIGS. 10 to 12 illustrate the case of two driving robots (100), the number of driving robots used for the sensing performance test in the third test mode is not limited thereto.
- the driving robot (100) of the present disclosure can determine the GT data by integrating the first through nth sensing data.
- the integration of the first sensing data and the second sensing data may be performed by the server (300).
- the server (300) may determine, as the GT data, the average of the first data, obtained by applying the first weight to the first sensing data, and the second data, obtained by applying the second weight to the second sensing data.
- the driving robot (100) may receive GT data (GTD) from the server (300).
- the driving robot (100) can calculate the sensing error by comparing the first sensing data and the GT data.
- the sensing error can mean an error between the first sensing data and the second sensing data.
- the sensing error can mean an error between the first sensing data and the integrated data (e.g., the GT data) obtained by integrating the first sensing data and the second sensing data.
- the driving robot (100) can evaluate the sensing performance of the sensor unit (120) based on the sensing error.
- the driving robot (100) can determine whether calibration of the sensor unit (120) is necessary based on the sensing error.
- FIG. 13 is a flowchart showing a calibration operation of a driving robot (100) according to one embodiment of the present disclosure.
- the driving robot (100) can determine whether the sensing error is smaller than a first error rate (operation 1310), determine whether the sensing error is smaller than a second error rate (operation 1320), and, depending on the sensing error, perform calibration (operation 1330) or provide a sensor maintenance notification (operation 1340).
- the driving robot (100) may determine whether the sensing error is less than a first error rate.
- the first error rate may be about 2% to 3%, but is not limited thereto. If the sensing error is less than the first error rate, the driving robot (100) may determine that calibration is not necessary and may not perform calibration.
- the driving robot (100) may determine whether the sensing error is less than a second error rate.
- the second error rate may be about 5% to 10%, but is not limited thereto.
- based on the sensing error being greater than or equal to the first error rate and less than the second error rate, the driving robot (100) may determine that calibration is required.
- based on the sensing error being greater than or equal to the second error rate, the driving robot (100) may determine that sensor maintenance is required.
- based on determining that calibration is required, the driving robot (100) may perform calibration.
- Calibration may be a process of adjusting sensor settings based on sensing errors.
- Calibration may include at least one of offset correction, scale factor adjustment, and nonlinear correction.
- Offset correction may be a calibration that corrects the sensor output by adding or subtracting a certain amount from the sensing value measured by the sensor unit (120). For example, if a given sensor consistently measures a distance 10 cm further than the actual distance, offset correction may be performed by subtracting 10 cm from all sensing values of the given sensor.
- Scale factor adjustment may be a calibration that rescales the sensed values when the sensor unit (120) does not return them at the correct ratio within the sensing range. For example, if a given sensor consistently determines that the distance is about 10% closer than the actual distance, scale factor adjustment may be performed by multiplying the sensed values by a factor of about 1.1 (more exactly, 1/0.9).
- Nonlinear correction may be a calibration that corrects the sensing value by applying a mathematical model and/or a correction curve when the sensing value of the sensor unit (120) is transformed differently depending on the measurement range (e.g., when the sensing value is not a linear function of the input data).
- the driving robot (100) may provide a sensor maintenance notification. For example, if the sensing error is greater than or equal to the second error rate, the driving robot (100) may determine that the sensing unit is operating improperly and may provide a sensor maintenance notification to the user. For example, the driving robot (100) may provide the sensor maintenance notification through a visual signal (e.g., a warning light) or an auditory signal (e.g., a warning sound). For example, the driving robot (100) may transmit a sensor maintenance notification message to a user device (e.g., the user device (200) of FIG. 1) or a server (e.g., the server (300) of FIG. 1).
- the driving robot (100) may perform calibration and additional sensing performance tests.
- the driving robot (100) may perform additional calibration for sensing errors resulting from the additional sensing performance tests.
- the driving robot (100) can perform a first calibration for the first sensing error calculated through the first sensing performance test.
- the driving robot (100) can additionally perform a second sensing performance test based on the first calibration.
- the driving robot (100) can additionally perform a second calibration for the second sensing error calculated through the second sensing performance test. In this way, the driving robot (100) can minimize the sensing error by repeatedly performing the sensing performance test and calibration.
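As a simplified, purely illustrative sketch of the FIG. 13 flow, the code below uses hypothetical threshold values taken from the example ranges above and hypothetical helper names; the actual thresholds, calibration routines, and notification paths are implementation details not fixed by this sketch.

```python
# Illustrative sketch of the FIG. 13 decision flow; thresholds, function names,
# and the simple correction examples are assumptions for illustration only.

FIRST_ERROR_RATE = 0.03    # example value from the ~2-3% range (operation 1310)
SECOND_ERROR_RATE = 0.10   # example value from the ~5-10% range (operation 1320)

def apply_offset(raw_cm: float, offset_cm: float) -> float:
    """Offset correction: add/subtract a fixed amount (e.g. -10 cm if readings run 10 cm long)."""
    return raw_cm + offset_cm

def apply_scale(raw_cm: float, scale: float) -> float:
    """Scale-factor adjustment: e.g. multiply by ~1.1 if readings run ~10% short."""
    return raw_cm * scale

def handle_sensing_error(sensing_error: float) -> str:
    """Decide between no action, calibration (operation 1330), and maintenance (operation 1340)."""
    if sensing_error < FIRST_ERROR_RATE:
        return "calibration not necessary"
    if sensing_error < SECOND_ERROR_RATE:
        # offset / scale-factor / nonlinear correction would be applied here
        return "perform calibration"
    # visual/auditory warning, or a message to the user device (200) / server (300)
    return "provide sensor maintenance notification"

print(handle_sensing_error(0.015))  # -> calibration not necessary
print(handle_sensing_error(0.06))   # -> perform calibration
print(handle_sensing_error(0.12))   # -> provide sensor maintenance notification
```

In practice, as described above, the test-and-calibrate cycle can then be repeated until the sensing error falls below the first error rate.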
- FIG. 14 is a flowchart showing additional operations of a driving robot (100) according to one embodiment of the present disclosure.
- the driving robot (100) of the present disclosure can generate a sensing performance test report based on information about the sensing performance test and information about calibration.
- the driving robot (100) can perform a sensing performance test to calculate a sensing error between GT data and sensing data during driving (operation 1410), perform calibration for the sensing error (operation 1420), and generate a sensing performance test report based on the information about the sensing performance test and/or the information about calibration.
- the driving robot (100) may perform a sensing performance test to calculate a sensing error between GT data and sensing data during driving.
- the sensing performance test may be a test performed by the driving robot (100) to evaluate the sensing performance of the sensor unit (120).
- the driving robot (100) may evaluate the sensing performance of the sensor unit (120) by calculating the sensing error between the GT data and the sensing data.
- the driving robot (100) may perform calibration for the sensing error while driving.
- Calibration may be a process of adjusting the settings of a sensor based on the sensing error.
- Calibration may include at least one of offset correction, scale factor adjustment, or nonlinear correction.
- the driving robot (100) may generate a sensing performance test report.
- the driving robot (100) may generate a sensing performance test report including a record of the sensing performance test and a record of the calibration.
- the sensing performance test report may include the sensing data, the GT data, the sensing errors of a plurality of sensors included in the sensor unit (120), and/or the calibration results; an illustrative report structure is sketched after this group of embodiments.
- the driving robot (100) can transmit a sensing performance test report to a user device (e.g., the user device (200) of FIG. 1) or a server (e.g., the server (300) of FIG. 1).
- the sensing performance test report can be used for the calibration of at least one other driving robot (600) driving in an environment similar to that of the driving robot (100).
- the sensing performance test report can be provided to the production process of at least one newly manufactured robot so that the robot reflects the characteristics of the driving environment.
- the driving robot (100) of the present disclosure and its driving method can ensure that the sensing performance of the driving robot (100) is maintained at an optimal state by continuously checking the sensing performance and performing calibration even while driving.
- the driving robot (100) of the present disclosure and its driving method can accurately determine a sensing error and perform effective calibration by examining the sensing performance of a distance sensor in various environments and conditions through scenario-specific inspection modes.
- the driving robot (100) of the present disclosure and its driving method can improve the productivity and reliability of the driving robot (100) by providing the results of sensing performance inspection and calibration in the form of a report and optimizing the sensing performance of a new robot in advance by reflecting the actual driving environment.
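A minimal sketch of what such a report could look like is given below; the field names, units, and JSON layout are hypothetical illustrations, not a format defined by the disclosure.

```python
# Illustrative sketch of a sensing performance test report; field names and
# layout are assumptions for illustration only.
import json
from dataclasses import dataclass, asdict, field
from typing import List

@dataclass
class SensorTestRecord:
    sensor_id: str
    sensing_data: float       # measured distance (m)
    gt_data: float            # ground-truth distance (m)
    sensing_error: float      # relative error
    calibration_applied: str  # e.g. "offset", "scale_factor", "nonlinear", "none"

@dataclass
class SensingPerformanceReport:
    robot_id: str
    inspection_mode: str
    records: List[SensorTestRecord] = field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

# Example: a report that could be transmitted to the user device (200) or the server (300)
report = SensingPerformanceReport(
    robot_id="robot-100",
    inspection_mode="third",
    records=[SensorTestRecord("lidar_front", 2.04, 2.00, 0.02, "offset")],
)
print(report.to_json())
```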
- The term "module" as used in various embodiments of this document may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- a module may be an integral component, or a minimum unit or part of such a component that performs one or more functions.
- a module may be implemented in the form of an application-specific integrated circuit (ASIC).
- each component (e.g., a module or a program) of the above-described components may include one or more entities, and some of the entities may be separated and placed in other components.
- one or more components or operations of the aforementioned components may be omitted, or one or more other components or operations may be added.
- a plurality of components (e.g., modules or programs) may be integrated into a single component.
- the integrated component may perform one or more functions of each of the plurality of components identically or similarly to those performed by the corresponding component among the plurality of components prior to the integration.
- the operations performed by a module, program, or other component may be executed sequentially, in parallel, iteratively, or heuristically, or one or more of the operations may be executed in a different order, omitted, or one or more other operations may be added.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A mobile robot according to the embodiments disclosed herein may comprise a driving unit, at least one sensor, a memory, and at least one processor. The processor may be configured to perform a sensing performance test to calculate a sensing error between ground truth (GT) data and sensing data while driving, and to perform calibration for the sensing error. To perform the sensing performance test, the processor may be configured to enter an inspection mode, obtain the sensing data for an inspection distance using the at least one sensor, determine the GT data for the inspection distance, and calculate the sensing error by comparing the sensing data and the GT data.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020240062095A KR20250162221A (ko) | 2024-05-10 | 2024-05-10 | 주행 로봇 및 이의 구동 방법 |
| KR10-2024-0062095 | 2024-05-10 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025234604A1 (fr) | 2025-11-13 |
Family
ID=97675012
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2025/004377 Pending WO2025234604A1 (fr) | 2024-05-10 | 2025-04-02 | Robot mobile et son procédé de déplacement |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20250162221A (fr) |
| WO (1) | WO2025234604A1 (fr) |
- 2024-05-10: Application KR1020240062095A filed in KR; published as KR20250162221A (status: active, pending)
- 2025-04-02: Application PCT/KR2025/004377 filed under the PCT; published as WO2025234604A1 (status: active, pending)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006231477A (ja) * | 2005-02-25 | 2006-09-07 | Mitsubishi Heavy Ind Ltd | 移動体における距離検出手段の校正方法 |
| JP2008290184A (ja) * | 2007-05-24 | 2008-12-04 | Fujitsu Ltd | 校正ロボットシステム及び距離センサの校正方法 |
| JP2017521755A (ja) * | 2014-07-10 | 2017-08-03 | アクチエボラゲット エレクトロルックス | ロボット型清掃装置における計測誤差を検出する方法 |
| JP2019168828A (ja) * | 2018-03-22 | 2019-10-03 | カシオ計算機株式会社 | ロボット、ロボットの制御方法及びプログラム |
| KR20210094388A (ko) * | 2020-01-21 | 2021-07-29 | 엘지전자 주식회사 | 자율 이동 로봇 및 그의 제어방법 |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20250162221A (ko) | 2025-11-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2013039306A1 (fr) | Dispositif et procédé de fourniture d'informations basées sur une position | |
| WO2020046038A1 (fr) | Robot et procédé de commande associé | |
| WO2020096365A1 (fr) | Suppression de fuite assistée par mouvement pour applications radar | |
| WO2019112335A1 (fr) | Dispositif électronique pour effectuer un positionnement et procédé de commande de dispositif électronique | |
| WO2020055112A1 (fr) | Dispositif électronique, et procédé pour l'identification d'une position par un dispositif électronique | |
| WO2017048067A1 (fr) | Terminal et procédé pour mesurer un emplacement de celui-ci | |
| JP2016224547A (ja) | 画像処理装置、画像処理システム及び画像処理方法 | |
| WO2020075954A1 (fr) | Système et procédé de positionnement utilisant une combinaison de résultats de reconnaissance d'emplacement basée sur un capteur multimodal | |
| EP3351023A1 (fr) | Terminal et procédé pour mesurer un emplacement de celui-ci | |
| CN112723068A (zh) | 电梯轿厢定位方法、装置及存储介质 | |
| WO2020180021A1 (fr) | Dispositif électronique et procédé de balayage de canal pour effectuer un service basé sur la localisation | |
| WO2022154242A1 (fr) | Robot et procédé de commande de celui-ci | |
| JP2010169521A (ja) | 位置検出装置、位置検出方法及びプログラム | |
| WO2018073900A1 (fr) | Système informatique, méthode et programme de diagnostic d'un sujet | |
| EP4005250A1 (fr) | Dispositif électronique permettant de détecter un emplacement d'utilisateur et procédé associé | |
| WO2019245320A1 (fr) | Dispositif de robot mobile destiné à corriger une position par fusion d'un capteur d'image et d'une pluralité de capteurs géomagnétiques, et procédé de commande | |
| WO2025234604A1 (fr) | Robot mobile et son procédé de déplacement | |
| CN205540275U (zh) | 室内移动定位系统 | |
| WO2021060894A1 (fr) | Procédé de génération de représentation schématique d'une zone et dispositif électronique associé | |
| CN118794424A (zh) | 一种基于多传感器融合的机器人定位方法及系统 | |
| TWM523862U (zh) | 室內移動定位系統 | |
| KR20210067383A (ko) | 이동 로봇의 위치 추정 시스템 및 방법 | |
| WO2021158066A1 (fr) | Appareil électronique et son procédé de commande | |
| WO2021182845A1 (fr) | Procédé permettant de traiter des données utilisées pour une détermination d'emplacement et dispositif électronique prenant en charge celui-ci | |
| US11747362B2 (en) | Method and system for determining vibrations generated by a device |