WO2020008999A1 - Information processing device, information processing system, and information processing program - Google Patents
- Publication number: WO2020008999A1 (application PCT/JP2019/025638)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- article
- information processing
- unit
- flying object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K17/00—Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/14—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
Definitions
- The present invention relates to an information processing apparatus, an information processing system, and an information processing program for performing information processing relating to article management.
- WMS: Warehouse Management System.
- Such an inventory information processing system is disclosed in, for example, Patent Document 1.
- The inventory information processing system disclosed in Patent Document 1 compares the actual stock count measured by the user during stocktaking with the stock count managed in the master, and notifies the user with an alarm when the comparison result satisfies a predetermined condition. This allows the user to easily identify an article whose stock quantity is abnormal.
- The present invention has been made in view of such circumstances, and an object thereof is to provide an information processing apparatus, an information processing system, and an information processing program that allow articles to be managed more easily.
- An information processing device of one embodiment of the present invention includes: receiving means for receiving, from a flying object that has flown over the location of an article, imaging information in which an image captured by the flying object is associated with information indicating the position at which the flying object captured the image; detecting means for detecting identification information of the article from the image included in the imaging information; and specifying means for specifying the location of the article from the identification information detected by the detecting means and the position of the flying object included in the imaging information from which the detection was made.
- FIG. 1 is a schematic diagram illustrating an example of the overall configuration of an information processing system according to an embodiment of the present invention.
- FIG. 2 is a schematic diagram illustrating an example of the configuration of the information processing system according to the embodiment of the present invention as viewed from above.
- FIG. 3 is a block diagram illustrating an example of the configuration of an inventory management device according to the embodiment of the present invention.
- FIG. 4 is a block diagram illustrating an example of the configuration of a drone according to the embodiment of the present invention.
- FIG. 5 is a block diagram illustrating an example of the configuration of a base device according to the embodiment of the present invention.
- FIG. 6 is a block diagram illustrating an example of the configuration of an information processing device according to the embodiment of the present invention.
- FIG. 7 is a flowchart illustrating the flow of a data matching process in the information processing system according to the embodiment of the present invention.
- FIG. 8 is an image diagram illustrating an example of the display of the matching result of the data matching process in the information processing system according to the embodiment of the present invention.
- FIG. 1 is a schematic diagram illustrating an overall configuration of an information processing system S according to the present embodiment.
- the information processing system S includes an inventory management device 10, a drone 20, a base device 30, and an information processing device 40.
- A shelf 53 installed in a place where articles managed by the information processing system S are kept (here, as an example, a warehouse) is also illustrated.
- On the shelf 53, a manager of the warehouse or the like arranges, in a plurality of stages, a plurality of pallets 51, which are storage containers for the articles to be managed. Further, a pallet label 52 for identifying the articles in the pallet is attached to each of the pallets 51.
- The pallet label 52 may be any label; in the present embodiment, however, it is assumed to be a label on which a management number for identifying an article in the warehouse (hereinafter referred to as “article identification information”) is written as a two-dimensional code. In the drawing, for convenience of illustration, reference numerals are given to only one pallet 51 and one pallet label 52; the other pallets 51 and pallet labels 52 are not numbered.
- the inventory management device 10 and the information processing device 40 are communicably connected to each other directly or via a network (not shown).
- the drone 20 and the base device 30 are communicably connected to each other directly or via a network.
- the base device 30 and the information processing device 40 are communicably connected to each other directly or via a network. Communication between these devices may be performed in accordance with an arbitrary communication method, and the communication method is not particularly limited.
- the network is realized by, for example, any one of the Internet, a LAN (Local Area Network), and a mobile phone network, or a combination of these.
- one device may perform communication based on a plurality of communication methods.
- the base device 30 may conform to both a communication method for performing wired communication with the drone 20 and a communication method for performing wireless communication with the information processing device 40.
- the inventory management device 10, the base device 30, and the information processing device 40 are realized by an electronic device having an information processing function, such as a personal computer, a server device, or a device unique to the present embodiment.
- The drone 20 is realized by an unmanned aerial vehicle (flying object) having a photographing function.
- FIG. 2 is a schematic diagram illustrating an example of a configuration in which the information processing system S according to the present embodiment is overlooked.
- shelves 53 are arranged in a plurality of rows (corresponding to shelves 53a to 53e in the figure).
- In each shelf 53, pallets 51 to which pallet labels 52 are attached are arranged in the left and right columns (in the drawing) and stacked in a plurality of vertical stages. That is, the articles stored on the pallets 51 to which the pallet labels 52 are attached are arranged side by side in the horizontal direction and stacked in the vertical direction.
- a drone 20 and a base device 30 corresponding to the drone 20 are arranged corresponding to each of the left and right columns in these figures.
- For example, the drone 20a and the base device 30a are arranged corresponding to the left column (in the drawing) of the shelf 53a. Similarly, the drone 20b and the base device 30b are arranged corresponding to the right column of the shelf 53a and the left column of the shelf 53b.
- the information processing system S having such a configuration performs a data matching process.
- The data matching process refers to a series of processes in which the actual location of the article stored in each pallet 51 is specified based on the imaging performed by the drone 20, and the specified actual location of the article is collated with the managed arrangement position included in the management information.
- Specifically, the drone 20 flies over the section of the warehouse where the shelves 53 are installed and photographs each pallet 51 on each stage of the shelf 53 corresponding to the drone 20 itself. The drone 20 then generates shooting information in which the photographed image is associated with the position information at the time of shooting. When the shooting is completed, the drone 20 moves to the installation location of the base device 30 and transmits the shooting information to the base device 30. The base device 30 transmits the received shooting information to the information processing device 40. Upon receiving the shooting information, the information processing device 40 detects the pallet label 52 by image analysis of the received shooting information and detects the article identification information by decoding the two-dimensional code written on the detected pallet label 52.
- The information processing device 40 then specifies the actual location of each pallet 51 (the articles stored in the pallet 51) based on the detected article identification information and the position information included in the shooting information from which the detection was made. The information processing device 40 also acquires the inventory information of each pallet 51 (the articles stored in the pallet 51) managed by the inventory management device 10 by communicating with the inventory management device 10. Finally, the information processing device 40 collates the actual location of each pallet 51 (the articles stored in the pallet 51) specified based on the shooting information with the managed arrangement position of the articles included in the inventory information, and outputs the result of this collation.
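- As a rough illustration of this flow, a minimal Python sketch follows. The names (ImagingRecord, specify_actual_positions, the field names) are assumptions introduced only for illustration and are not defined in the present disclosure; the decoding step is represented by a pre-decoded field so that the sketch stays self-contained.

```python
# Minimal sketch: shooting information received from the drone (via the base device)
# is turned into "actual location" records per article.
# All names and data shapes are illustrative assumptions, not part of the disclosure.
from dataclasses import dataclass

@dataclass
class ImagingRecord:
    article_id: str   # article identification information decoded from the pallet label 52
    position: str     # location information recorded by the drone 20 at the time of shooting

def specify_actual_positions(imaging_records):
    """Existence-position specification: article id -> position where it was actually seen."""
    return {rec.article_id: rec.position for rec in imaging_records}

# Example: two pallets photographed during one autonomous flight.
records = [ImagingRecord("A-0001", "LOC-01-02"), ImagingRecord("A-0002", "LOC-01-03")]
print(specify_actual_positions(records))
# {'A-0001': 'LOC-01-02', 'A-0002': 'LOC-01-03'}
```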
- By performing the above-described data matching process, the present embodiment eliminates the need for manual visual inspection and manual barcode scanning. The present embodiment can also be applied even when the warehouse is large or the ceiling of the warehouse is high. That is, according to the present embodiment, articles can be managed more easily.
- the configuration illustrated in FIG. 1 is merely an example of the present embodiment, and the present embodiment is not limited to this configuration.
- the number of each device and the number of shelves 53 included in the information processing system S are not limited to those illustrated, and may be any number.
- the articles stored in the pallet 51 may be arbitrary.
- the pallet label 52 may be directly attached to the article.
- The inventory management device 10 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, a communication unit 14, a storage unit 15, an input unit 16, and a display unit 17. These units are connected by a bus via signal lines, and mutually transmit and receive signals.
- the CPU 11 executes various processes according to a program recorded in the ROM 12 or a program loaded from the storage unit 15 into the RAM 13.
- the RAM 13 also appropriately stores data and the like necessary for the CPU 11 to execute various processes.
- the communication unit 14 performs communication control for allowing the CPU 11 to communicate with another device included in the information processing system S.
- the storage unit 15 is composed of a semiconductor memory such as a DRAM (Dynamic Random Access Memory) and stores various data.
- the input unit 16 is configured by various buttons and a touch panel, or an external input device such as a mouse and a keyboard, and inputs various information according to a user's instruction operation.
- the display unit 17 includes a liquid crystal display or the like, and displays an image corresponding to the image data output by the CPU 11.
- When the inventory management device 10 operates, the inventory information management unit 111 functions in the CPU 11, as shown in FIG. 3. In one area of the storage unit 15, an inventory information storage unit 151 is set.
- the stock information storage unit 151 stores stock information of articles in a warehouse managed by the information processing system S.
- This inventory information is, for example, the same information as the inventory information managed in a general WMS. More specifically, data such as the article identification information, the number of articles in stock, and the identification information (location information) of each position where the articles are arranged are stored as inventory information, for example in a table format.
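- As one concrete illustration of such a table, a minimal sketch follows; the field names (article_id, quantity, location) are assumptions chosen to mirror the items above and are not prescribed by the present disclosure.

```python
# Hypothetical in-memory representation of the inventory information held by the
# inventory information storage unit 151 (field names are assumptions).
inventory_information = [
    {"article_id": "A-0001", "quantity": 24, "location": "LOC-01-02"},
    {"article_id": "A-0002", "quantity": 12, "location": "LOC-02-01"},
]

# Look up the managed arrangement position of an article, as the inventory
# information management unit 111 would when answering an inventory request.
def managed_location(article_id):
    for row in inventory_information:
        if row["article_id"] == article_id:
            return row["location"]
    return None

print(managed_location("A-0002"))  # LOC-02-01
```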
- the inventory information management unit 111 includes a function of managing inventory information stored in the inventory information storage unit 151.
- The inventory information management unit 111 performs this management by updating the inventory information stored in the inventory information storage unit 151 to the latest content based on, for example, a user operation received through the input unit 16 or data received via the communication unit 14 from another device (not shown) used by the user. That is, the inventory management device 10 realizes the function of a so-called WMS.
- When a request for inventory information is received from the information processing device 40, the inventory information management unit 111 reads out the latest inventory information from the inventory information storage unit 151 and transmits the read inventory information to the information processing device 40 as a response to the request.
- the drone 20 includes a CPU 21, a ROM 22, a RAM 23, a communication unit 24, a storage unit 25, a photographing unit 26, a driving unit 27, a sensor unit 28, and a battery 29. Have. These units are connected by a bus via signal lines, and mutually transmit and receive signals.
- The hardware functions of the CPU 21, the ROM 22, the RAM 23, the communication unit 24, and the storage unit 25 are equivalent to those of the units of the same names, differing only in reference signs, provided in the inventory management device 10 described above. Duplicate description is therefore omitted.
- the image capturing unit 26 is configured by a camera or the like including an optical lens, an image sensor, and the like, and captures an image around the drone 20 (for example, an image including the pallet 51 and the pallet label 52 arranged on the shelf 53).
- the image obtained by the photographing of the photographing unit 26 is converted into a digital signal and output to, for example, the CPU 21 or the like.
- the drive unit 27 is driven using electric power supplied from a battery 29 described later.
- the drone 20 can fly in space by the driving of the driving unit 27.
- the drive unit 27 is composed of, for example, a set of a propeller that generates lift and thrust and a motor that rotates the propeller.
- the sensor unit 28 is a sensor for detecting a distance from another object (for example, the shelf 53 or the bottom or wall surface of the warehouse). The distance detected by the sensor unit 28 is converted into a digital signal and output to, for example, the CPU 21 or the like.
- the sensor unit 28 includes, for example, a sensor that detects a distance using a radio wave in a microwave band and a sensor that detects a distance using an ultrasonic wave. Further, the sensor unit 28 may include an acceleration sensor, an angular velocity sensor, and the like, for detecting a moving distance, a moving direction, and the like of the drone 20.
- the battery 29 stores electric power, and supplies this electric power to the driving unit 27 and other parts of the drone 20. In addition, the battery 29 stores the consumed power again by receiving power supply from the base device 30.
- When the drone 20 operates, as shown in FIG. 4, a schedule management unit 211, a flight control unit 212, and a shooting information generation unit 213 function in the CPU 21. In one area of the storage unit 25, a schedule storage unit 251, a position detection information storage unit 252, and a shooting information storage unit 253 are set.
- the schedule storage unit 251 stores a schedule for performing photographing or the like by the drone 20.
- a time zone during which the drone 20 should take a picture while flying is stored as a schedule.
- the time period during which the drone 20 should take an image while flying can be set arbitrarily, but may be set to a time period during which the user does not work in the warehouse. For example, a nighttime or early morning time zone may be set. Accordingly, it is possible to perform photographing or the like by the drone 20 without disturbing a user who works in the warehouse.
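- As an illustration only (the schedule format is not prescribed here), a time-zone check of this kind could look as follows; the 22:00–05:00 window is an assumed example of a nighttime zone.

```python
# Illustrative check of whether the current time falls inside the scheduled
# shooting time zone (e.g. nighttime). Window boundaries are assumptions.
from datetime import time, datetime

def in_shooting_window(now: time, start: time = time(22, 0), end: time = time(5, 0)) -> bool:
    if start <= end:                      # window contained within a single day
        return start <= now < end
    return now >= start or now < end      # window crossing midnight

print(in_shooting_window(datetime.now().time()))
print(in_shooting_window(time(23, 30)))   # True: inside the nighttime zone
print(in_shooting_window(time(14, 0)))    # False: working hours
```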
- the information for the drone 20 to detect its own current position is stored in the position detecting information storage unit 252.
- Specifically, the positional relationships among the base devices 30, the shelves 53, and markers (for example, markers placed on a ladder), information indicating the number of stages of each shelf 53, identification information (location information) of each position in the warehouse, and the like are stored as information for detecting the current position.
- These pieces of information are represented, for example, by coordinates in a three-dimensional coordinate system common to all of them.
- the shooting information storage unit 253 stores the shooting information generated by the shooting information generation unit 213.
- the photographing information is information in which a photographed image is associated with positional information at the time of photographing (for example, information on a two-dimensional position or information on a three-dimensional position).
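- One possible shape for a single entry of such shooting information is sketched below; the field names and the use of three-dimensional coordinates are assumptions for illustration, not the disclosed data format.

```python
# Hypothetical structure of one entry of shooting information: the captured image
# is kept together with the position (here a 3D coordinate) at which it was taken.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ShootingInfo:
    image_path: str                        # captured image (stored as a file in this sketch)
    position: tuple[float, float, float]   # (x, y, z) in the warehouse coordinate system
    captured_at: datetime                  # time of shooting

entry = ShootingInfo("img_0001.jpg", (3.2, 10.5, 1.8), datetime(2019, 6, 27, 23, 15))
print(entry.position)
```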
- The schedule management unit 211 manages the execution of photographing and the like by the drone 20 based on the schedule stored in the schedule storage unit 251. For example, when the predetermined time zone set in the schedule arrives, the schedule management unit 211 instructs the flight control unit 212 to start the flight and photographing by the drone 20. Outside the predetermined time zone set in the schedule, the schedule management unit 211 can stop the flight and photographing by giving an instruction to the flight control unit 212. In this case, the schedule management unit 211 can instruct the flight control unit 212 to have the drone 20 receive power supply from the base device 30 corresponding to the drone 20 and store the consumed power in the battery 29 again.
- the flight control unit 212 starts the photography of the drone 20 or the like based on the instruction from the schedule management unit 211.
- the flight control unit 212 causes the drone 20 to autonomously fly around the shelf 53 corresponding to the drone 20, and the photographing unit 26 photographs the shelf 53.
- the flight control unit 212 realizes an autonomous flight based on the information for detecting its own current position stored in the position detection information storage unit 252 and the detection result by the sensor unit 28.
- Specifically, the flight control unit 212 identifies the current three-dimensional position at which the drone 20 is flying by calculating, based on the detection results of the sensor unit 28, the moving distance and the moving direction from the position of the base device 30 corresponding to the drone 20, which is the position where the flight starts.
- Then, the flight control unit 212 realizes the autonomous flight based on the current three-dimensional position of the drone 20 and the positional relationships between the base devices 30 and the shelves 53 stored in the position detection information storage unit 252.
- Note that the flight control unit 212 may detect a marker placed on the floor of the warehouse or the like by analyzing an image captured by the photographing unit 26 (or a detection result of the sensor unit 28), and may correct the current three-dimensional position at which the drone 20 is flying based on the detection result of the marker.
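- The estimation algorithm itself is not specified in the present disclosure; as a hedged illustration, dead reckoning with an occasional marker-based correction could be sketched as follows, where the per-step movement vectors are assumed to have already been derived from the sensor unit 28.

```python
# Illustrative dead reckoning: the current 3D position is the base device position
# plus the accumulated movement measured by the sensor unit 28. When a floor marker
# with known coordinates is recognized, the estimate is reset to that known position.
# The algorithm and data shapes are assumptions, not the disclosed implementation.

def estimate_position(base_position, movements, marker_observations=None):
    """movements: iterable of (dx, dy, dz) steps; marker_observations: {step_index: (x, y, z)}."""
    marker_observations = marker_observations or {}
    x, y, z = base_position
    for i, (dx, dy, dz) in enumerate(movements):
        x, y, z = x + dx, y + dy, z + dz
        if i in marker_observations:          # correction using a marker at a known position
            x, y, z = marker_observations[i]
    return (x, y, z)

steps = [(0.5, 0.0, 0.2)] * 4
print(estimate_position((0.0, 0.0, 0.0), steps, marker_observations={2: (1.5, 0.1, 0.6)}))
# Any drift accumulated before the marker observation is discarded at that step.
```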
- Further, by controlling the photographing unit 26 during the autonomous flight, the flight control unit 212 photographs the surroundings of the drone 20 (for example, the area including the pallets 51 and the pallet labels 52 arranged on the shelf 53) at a predetermined cycle. Then, the flight control unit 212 outputs the photographed image and information on the position at the time of photographing to the shooting information generation unit 213.
- The autonomous flight and photographing are performed until the entire periphery of the shelf 53 corresponding to the drone 20 has been photographed. More specifically, the pallets 51 and the pallet labels 52 are arranged side by side in the horizontal direction and stacked in the vertical direction on the shelf 53. The flight control unit 212 therefore, for example, photographs one stage of the shelf 53 at a time, and when all the stages have been photographed, ends the autonomous flight and photographing and returns the drone 20 to the base device 30 corresponding to itself. The drone 20 then receives power supply from the base device 30 until the predetermined time zone arrives again and an instruction is given again by the schedule management unit 211.
- Note that the drone 20 may photograph all the stages of the shelf 53 in a single flight, or it may return to the base device 30 and receive power each time one stage has been photographed, and then photograph the next stage. In the latter case, the capacity of the battery 29 can be reduced.
- the shooting information generation unit 213 generates shooting information by associating a shot image input from the flight control unit 212 with information on the position at the time of shooting. Then, the shooting information generation unit 213 causes the shooting information storage unit 253 to store the generated shooting information.
- the base device 30 includes a CPU 31, a ROM 32, a RAM 33, a communication unit 34, a storage unit 35, and a power supply unit 36. These units are connected by a bus via signal lines, and mutually transmit and receive signals.
- The hardware functions of the CPU 31, the ROM 32, the RAM 33, the communication unit 34, and the storage unit 35 are equivalent to those of the units of the same names, differing only in reference signs, provided in the inventory management device 10 and the drone 20 described above. Duplicate description is therefore omitted.
- The power supply unit 36 is a unit that supplies power to the drone 20.
- For example, the base device 30 obtains electric power from a household power supply in the warehouse and supplies it to the drone 20 after converting it into a voltage or the like suitable for supplying power to the drone 20.
- the power supply to the drone 20 is continued after the autonomous flight and the photographing by the drone 20 are completed and until the autonomous flight and the photographing by the drone 20 are performed again (or until the battery 29 is sufficiently charged).
- When the base device 30 operates, as shown in FIG. 5, a photographing information relay unit 311 and a power supply control unit 312 function in the CPU 31. In one area of the storage unit 35, a shooting information storage unit 351 is set.
- the shooting information storage unit 351 stores shooting information acquired from the drone 20.
- the photographing information relay unit 311 transmits the photographing information acquired from the drone 20 to the information processing device 40. That is, the photographing information relay unit 311 relays the photographing information.
- the relay of the photographing information can be performed at an arbitrary timing.
- For example, the photographing information relay unit 311 communicates with the drone 20 while the drone 20, having completed the autonomous flight and photographing, is receiving power from the power supply unit 36.
- However, the present invention is not limited to this, and the photographing information relay unit 311 may communicate with the drone 20 while, for example, the drone 20 is performing the autonomous flight and photographing. Through this communication, the photographing information relay unit 311 acquires the shooting information generated by the drone 20 and stored in the shooting information storage unit 253.
- the photographing information relay unit 311 stores the acquired photographing information in the photographing information storage unit 351. Further, the photographing information relay unit 311 transmits the photographing information stored in the photographing information storage unit 351 to the information processing device 40 at a predetermined timing.
- the predetermined timing may be immediately after acquiring the shooting information from the drone 20, or may be a timing at which the information processing device 40 requests the shooting information.
- the power supply control unit 312 is a part that controls power supply by the power supply unit 36 described above. Specifically, the power supply control unit 312 controls the above-described conversion of the voltage and the like, and the start and end of power supply.
- the information processing device 40 includes a CPU 41, a ROM 42, a RAM 43, a communication unit 44, a storage unit 45, an input unit 46, and a display unit 47. These units are connected by a bus via signal lines, and mutually transmit and receive signals.
- The hardware functions of the CPU 41, the ROM 42, the RAM 43, the communication unit 44, the storage unit 45, the input unit 46, and the display unit 47 are equivalent to those of the units of the same names, differing only in reference signs, provided in the inventory management device 10, the drone 20, and the base device 30 described above. Duplicate description is therefore omitted.
- When the information processing device 40 operates, as shown in FIG. 6, a photographing information acquisition unit 411, an identification information detection unit 412, an existence position identification unit 413, an inventory data acquisition unit 414, and a data matching unit 415 function in the CPU 41.
- In one area of the storage unit 45, a shooting information storage unit 451 and a specific information storage unit 452 are set.
- the shooting information storage unit 451 stores shooting information of each drone 20 received from each base device 30.
- the specific information storage unit 452 stores the actual existence position of each of the pallets 51 (articles stored in the pallets 51) identified by the existence position identification unit 413 described below based on the photographing information.
- the photographing information acquisition unit 411 acquires the photographing information of each drone 20 by receiving it from each base device 30, and stores the acquired photographing information of each drone 20 in the photographing information storage unit 451.
- The identification information detection unit 412 detects the pallet label 52 from the image by performing image analysis on the shooting information stored in the shooting information storage unit 451 using an existing image analysis technique. The identification information detection unit 412 then detects the article identification information by decoding the two-dimensional code written on the detected pallet label 52, associates the position information included in the shooting information from which the detection was made with the article identification information, and outputs the associated position information and article identification information to the existence position identification unit 413.
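- The disclosure leaves the image analysis technique open (“an existing image analysis technique”). As one hedged example, if the two-dimensional code on the pallet label 52 were a QR code, OpenCV's QRCodeDetector could be used roughly as follows; the library choice, function names, and file name are assumptions for illustration, not part of the disclosure.

```python
# Hedged example: detecting and decoding a QR-type two-dimensional code from one
# image of the shooting information using OpenCV. This is only one possible
# realization of the identification information detection unit 412.
import cv2  # pip install opencv-python

def detect_article_ids(image_path):
    image = cv2.imread(image_path)
    if image is None:
        return []
    detector = cv2.QRCodeDetector()
    ok, decoded_texts, _, _ = detector.detectAndDecodeMulti(image)
    if not ok:
        return []
    # Each decoded text is assumed to carry the article identification information.
    return [text for text in decoded_texts if text]

print(detect_article_ids("img_0001.jpg"))  # e.g. ['A-0001'] if the label is legible
```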
- the existence position specifying unit 413 specifies the actual existence position of each of the pallets 51 (the items stored in the pallets 51) based on the item identification information and the position information input from the identification information detecting unit 412. That is, the existence position specifying unit 413 specifies that the article stored in the pallet 51 corresponding to the article identification information exists at the position corresponding to the position information. Further, the existence position specifying unit 413 causes the specific information storage unit 452 to store the actual position of each specified article.
- the inventory data acquisition unit 414 acquires the inventory information of the articles stored in each pallet 51 managed by the inventory management device 10 by communicating with the inventory management device 10. For example, the inventory data acquisition unit 414 requests the inventory management device 10 for current inventory information of the article. Further, the inventory data acquisition unit 414 outputs the current inventory information of the article returned from the inventory management device 10 to the data matching unit 415 in response to the request. Note that the current stock information of the article may be spontaneously transmitted from the stock management device 10 without requiring a request from the stock data acquisition unit 414.
- The data matching unit 415 collates the actual location of the article stored in each pallet 51, specified by the existence position identification unit 413 based on the shooting information, with the managed arrangement position of the article included in the inventory information input from the inventory data acquisition unit 414, and outputs the result of this collation.
- The output of the collation result is realized by, for example, display on the display unit 47, output to another terminal (not shown) used by the user, or printing on a paper medium using a printer (not shown). A specific example of the output collation result will be described later with reference to FIG. 8.
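- The collation itself amounts to comparing, article by article, the specified actual location with the managed arrangement position. A minimal sketch under assumed field names follows; it also flags articles that appear in only one of the two sources.

```python
# Illustrative collation by the data matching unit 415: actual locations specified
# from the shooting information vs. managed locations from the inventory information.
# Field names and result labels are assumptions, not the disclosed output format.

def collate(actual_locations, managed_locations):
    results = []
    for article_id, actual in actual_locations.items():
        managed = managed_locations.get(article_id)
        if managed is None:
            results.append((article_id, actual, "not in inventory information"))
        elif managed == actual:
            results.append((article_id, actual, "OK"))
        else:
            results.append((article_id, actual, f"wrong location (managed: {managed})"))
    for article_id, managed in managed_locations.items():
        if article_id not in actual_locations:   # managed but never photographed
            results.append((article_id, managed, "not found in warehouse"))
    return results

actual = {"A-0001": "LOC-01-02", "A-0003": "LOC-03-01"}
managed = {"A-0001": "LOC-01-02", "A-0002": "LOC-02-01"}
for row in collate(actual, managed):
    print(row)
```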
- As described above, the data matching process refers to a series of processes in which the actual location of the article stored in each pallet 51 is specified based on the imaging performed by the drone 20, and the specified actual location of the article is collated with the managed arrangement position and the like included in the management information.
- The data matching process is performed periodically after the operation of the information processing system S starts. As a premise of the data matching process, the management of the inventory information by the inventory information management unit 111 is also performed periodically.
- In step S11, the base device 30 supplies power to the drone 20.
- In step S12, the drone 20 determines whether the predetermined time zone for performing the autonomous flight and photographing has arrived. If the predetermined time zone has arrived, Yes is determined in step S12, and the process proceeds to step S13. If the predetermined time zone has not arrived, No is determined in step S12, and the power supply in step S11 continues.
- In step S13, the drone 20 generates shooting information by performing the autonomous flight and photographing.
- In step S14, the drone 20 determines whether all the positions of the shelf 53 to be photographed have been photographed and the generation of the shooting information based on the photographing has been completed. When the generation of the shooting information is completed, Yes is determined in step S14, and the process proceeds to step S15. If the generation of the shooting information has not been completed, No is determined in step S14, and the generation of the shooting information in step S13 continues.
- In step S15, the drone 20 returns to the base device 30 corresponding to itself and transmits the shooting information generated in step S13 to the base device 30.
- In step S16, the base device 30 transmits the shooting information received in step S15 to the information processing device 40.
- In step S17, the base device 30 supplies power to the drone 20 again.
- In step S18, the information processing device 40 detects the article identification information by performing image analysis and the like on the shooting information received in step S16.
- In step S19, the information processing device 40 specifies the actual location of the article stored in each pallet 51 based on the article identification information detected in step S18 and the corresponding position information.
- In step S20, the information processing device 40 requests the inventory management device 10 for the current inventory information of the articles.
- In step S21, the inventory management device 10 transmits the current inventory information of the articles to the information processing device 40 as a response to the request in step S20.
- In step S22, the information processing device 40 collates the actual location of the article stored in each pallet 51 specified in step S19 with the managed arrangement position of the article included in the inventory information transmitted from the inventory management device 10 in step S21.
- In step S23, the information processing device 40 outputs the result of the collation performed in step S22.
- By performing the above-described data matching process, the present embodiment eliminates the need for manual visual inspection and manual barcode scanning, and can be applied even when the warehouse is large or the ceiling of the warehouse is high. Furthermore, since the drone 20 flies autonomously using sensors and the like, the data matching process can be performed even in an indoor warehouse or the like where positioning by GPS (Global Positioning System) is difficult.
- In addition, since the pallet label 52 generally used for article management in the warehouse is read and used, it is not necessary to separately prepare modules such as RFID (Radio Frequency IDentifier) tags. That is, according to the present embodiment, the data matching process can be performed at low cost. Further, since a two-dimensional code is used instead of a one-dimensional barcode, which must be photographed with high precision in order to be decoded, decoding is possible even when an image photographed during flight has not been captured with such high precision.
- Furthermore, in the present embodiment, since the drone 20 does not need to perform processing such as detecting the two-dimensional code by image analysis and decoding the detected two-dimensional code, the arithmetic processing capability required of the drone 20 can be kept low. Accordingly, the power consumption of the drone 20 can be reduced, and the drone 20 itself can easily be made smaller. That is, according to the present embodiment, articles can be managed more easily.
- FIG. 8 is an image diagram showing an example of the display of the result of the data matching process in the information processing system S. As shown in FIG. 8, this display example includes three display areas, a display area AR1, a display area AR2, and a display area AR3.
- the first operation button is displayed in the display area AR1.
- the first operation button is a user interface for the user to instruct the printing of the matching result.
- the user performs an operation of pressing the first operation button when the user wants to print the matching result.
- a second operation button is displayed in the display area AR2.
- The second operation button is a user interface for the user to instruct output of the collation result data in a predetermined format (here, for example, CSV (comma-separated values) format).
- The user performs an operation of pressing the second operation button when the user wishes to output the result data in the predetermined format.
- The output destination of the data may be the storage unit 45 included in the information processing device 40, or the data may be output to another terminal (not shown) used by the user, as described above.
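- As an illustration, output in CSV format could be realized with Python's standard csv module; the column names below are assumptions and do not reproduce the actual output format of the embodiment.

```python
# Hedged sketch of exporting the collation result in CSV format, as triggered by
# the second operation button. Column names and rows are illustrative assumptions.
import csv

rows = [
    {"PRODUCT CODE": "A-0001", "LOCATION": "LOC-01-02", "RESULT": "OK"},
    {"PRODUCT CODE": "A-0003", "LOCATION": "LOC-03-01", "RESULT": "Wrong Location"},
]

with open("matching_result.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["PRODUCT CODE", "LOCATION", "RESULT"])
    writer.writeheader()
    writer.writerows(rows)
```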
- In the display area AR3, the result of the collation is displayed in a predetermined format (here, a table format).
- The collation result includes, for example, information indicating whether or not an article exists at the position corresponding to the managed arrangement position of the article included in the inventory information managed by the inventory management device 10, information indicating whether there is an article not included in the inventory information, and information indicating whether an article included in the inventory information is not actually present.
- “item: IPA_PRODUCT CODE” stores the product number of the article. “item: IPA_SUB INVENTORY” stores status information of the article.
- “item: IPA_ARRIVED DATE” stores information indicating the date on which the pallet 51 storing the articles arrived at the warehouse. “item: IPA_QUANTITY” stores the carton quantity of the article.
- “item: IPA_PACKAGES” stores the package quantity of the article. “item: IPA_DAMAGE FLAG” stores, as a flag, information indicating whether or not the article has damage such as a scratch or breakage.
- “item: CHECKED DATE” stores information indicating a date on which the autonomous flight and imaging by the drone 20 were performed.
- “item: CHECKED TIME” stores information indicating a time zone in which the autonomous flight and imaging by the drone 20 are performed.
- “item: RESULT” stores information indicating a result of the data matching process.
- the information indicating the matching result is information indicating whether or not the article is present at a position corresponding to the management article arrangement position included in the article stock information.
- “item: RESULT ANALYSIS” stores information indicating a more detailed matching result obtained by analyzing the matching result in the data matching process.
- The information indicating a more detailed matching result is, for example: “1: OK”, indicating that the article is present at a position that coincides with the managed arrangement position of the article included in the inventory information; “2: Wrong Location”, indicating that the article exists at a position that does not match the managed arrangement position included in the inventory information; “6: Decode Failed”, indicating that analysis of the two-dimensional code has failed; or “7: Non-Located”, indicating that location information has not been correctly assigned. In addition, when an article flagged as damaged is found, the information “Found Damaged” is stored.
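- For illustration, the result categories quoted above can be represented as a small enumeration; only the codes explicitly mentioned in the text are reproduced, and the labels are assumptions as to their exact stored form.

```python
# Illustrative enumeration of the detailed matching results quoted above.
# Only the codes explicitly mentioned in the text are included.
from enum import Enum

class ResultAnalysis(Enum):
    OK = "1: OK"                          # article found at the managed arrangement position
    WRONG_LOCATION = "2: Wrong Location"  # article found, but at a different position
    DECODE_FAILED = "6: Decode Failed"    # the two-dimensional code could not be analyzed
    NON_LOCATED = "7: Non-Located"        # location information was not correctly assigned

print(ResultAnalysis.WRONG_LOCATION.value)  # "2: Wrong Location"
```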
- The user who refers to the information indicating the result of the collation in the data matching process can know, for each article, whether or not the article exists at the appropriate position according to the inventory information. Even if the article is not at the appropriate position, the user can know that the article is located at another position, or that there is an article that is not included in the inventory information in the first place. Thereby, the user can take measures such as moving the article in the warehouse to the appropriate position or correcting the inventory information managed by the inventory management device 10 so that it matches the actual location of the article. In addition, the user can visually confirm the actual article that could not be analyzed.
- the information processing device 40 may also notify the user of such specific countermeasures in accordance with the result of the data matching process.
- “item: FINAL RESULT” stores information indicating the result of the collation when the user visually checks the location of the article. As described above, the user may visually check an article or the like that could not be analyzed in the data matching process; in such a case, the information is stored in “item: FINAL RESULT”.
- The types of information stored in “item: FINAL RESULT” are the same as the types of information such as “1: OK” stored in “item: RESULT ANALYSIS” described above. A duplicate description is therefore omitted here.
- “item: PICTURE” functions as a user interface for displaying an image photographed by the drone 20 in the data matching process.
- The user can display the image photographed by the drone 20 by performing an operation of checking the check box provided in “item: PICTURE”.
- Thereby, the user can refer to the images of the pallet 51 and the pallet label 52, visually recognize the state of the pallet 51, and identify the article identification information. This reduces the trouble of the user visually confirming the actual article on site.
- In the above-described embodiment, the drone 20 realizes the autonomous flight by specifying its current position without using GPS.
- the present invention is not limited thereto, and the drone 20 may realize an autonomous flight using a positioning result by the GPS in an environment where a signal from a satellite in the GPS can be received.
- In the above-described embodiment, the drone 20 performs the autonomous flight and photographing when the predetermined time zone arrives, and generates the shooting information.
- the present invention is not limited to this, and the drone 20 may generate the shooting information by performing the flight and the shooting in response to an instruction from the user. Further, the drone 20 may fly in response to a user operation, instead of flying autonomously.
- the photographing information relay unit 311 of the base device 30 transmits the photographing information acquired from the drone 20 to the information processing device 40. That is, the photographing information relay unit 311 relays the photographing information.
- the invention is not limited thereto, and the drone 20 and the information processing device 40 may be communicably connected to each other directly or via a network. Then, the drone 20 may transmit the shooting information to the information processing device 40 without passing through the base device 30.
- the present invention is not limited to the above-described embodiment, and includes modifications and improvements as long as the object of the present invention can be achieved.
- the series of processes described above can be executed by hardware or can be executed by software.
- each of the above-described functional blocks may be configured by hardware alone, may be configured by software alone, or may be configured by a combination thereof.
- The functional configurations illustrated in FIGS. 3, 4, 5, and 6 are merely examples and are not particularly limited. That is, it is sufficient that the information processing system S as a whole has a function capable of executing the above-described series of processes, and the kinds of functional blocks used to realize this function are not limited to the examples of FIGS. 3, 4, 5, and 6.
- the functional configuration included in the present embodiment can be realized by a processor that executes arithmetic processing, and a processor that can be used in the present embodiment includes various types of processors such as a single processor, a multiprocessor, and a multicore processor.
- the present invention also includes a combination of these various processing devices and a processing circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array).
- a program constituting the software is installed in a computer or the like from a network or a recording medium.
- the computer may be a computer embedded in dedicated hardware. Further, the computer may be a computer that can execute various functions by installing various programs, for example, a general-purpose personal computer.
- The recording medium containing such a program may be distributed separately from the apparatus main body in order to provide the program to the user, or may be provided to the user in a state of being incorporated in the apparatus main body in advance.
- the storage medium distributed separately from the apparatus main body is composed of, for example, a magnetic disk (including a floppy disk), an optical disk, a magneto-optical disk, or the like.
- The optical disc is composed of, for example, a CD-ROM (Compact Disc Read-Only Memory), a DVD (Digital Versatile Disc), a Blu-ray (registered trademark) Disc, and the like.
- the magneto-optical disk is composed of an MD (Mini-Disk) or the like.
- The recording medium provided to the user in a state of being pre-installed in the apparatus main body is, for example, the ROM 12 in FIG. 3, the ROM 22 in FIG. 4, the ROM 32 in FIG. 5, or the ROM 42 in FIG. 6, in which the program is recorded.
- In this specification, the steps describing the program recorded on the recording medium include not only processing performed in chronological order according to the described order, but also processing executed in parallel or individually without necessarily being processed in chronological order.
- the term “system” refers to an entire device including a plurality of devices and a plurality of means.
- Reference Signs List 10 inventory management device 20 drone 30 base device 40 information processing device 11, 21, 31, 41 CPU 12, 22, 32, 42 ROM 13, 23, 33, 43 RAM 14, 24, 34, 44 Communication unit 15, 25, 35, 45 Storage unit 16, 46 Input unit 17, 47 Display unit 26 Imaging unit 27 Drive unit 28 Sensor unit 29 Battery 36 Power supply unit 51 Pallet 52 Pallet label 53 Shelf 111 Inventory information management unit 151 Inventory information storage unit 211 Schedule management unit 212 Flight control unit 213 Imaging information generation unit 251 Schedule storage unit 252 Position detection information storage unit 253 Imaging information storage unit 311 Imaging information relay unit 312 Power supply control unit 351 Imaging information Storage unit 411 Photographing information acquisition unit 412 Identification information detection unit 413 Location identification unit 414 Inventory data acquisition unit 415 Data matching unit 451 Photography information storage unit 452 Specific information storage unit S Information processing system
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Economics (AREA)
- Artificial Intelligence (AREA)
- Toxicology (AREA)
- General Health & Medical Sciences (AREA)
- Electromagnetism (AREA)
- Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Strategic Management (AREA)
- Quality & Reliability (AREA)
- Operations Research (AREA)
- Marketing (AREA)
- Entrepreneurship & Innovation (AREA)
- Development Economics (AREA)
- Warehouses Or Storage Devices (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
An object of the present invention is to manage articles more simply. To this end, an information processing device 40 includes an imaging information acquisition unit 411, an identification information detection unit 412, and an existence position identification unit 413. The imaging information acquisition unit 411 receives, from a flying object that has flown over a place where an article is located, imaging information in which an image captured by the flying object is associated with information indicating the position at which the flying object captured the image. The identification information detection unit 412 detects, from the image included in the imaging information, identification information relating to the article. The existence position identification unit 413 identifies the existing position of the article from the identification information detected by the identification information detection unit 412 and the position at which the flying object captured the image, the position being included in the imaging information from which the detection was made.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018128788A JP2020009083A (ja) | 2018-07-06 | 2018-07-06 | 情報処理装置、情報処理システム、及び情報処理プログラム |
| JP2018-128788 | 2018-07-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020008999A1 true WO2020008999A1 (fr) | 2020-01-09 |
Family
ID=69059758
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/025638 Ceased WO2020008999A1 (fr) | 2018-07-06 | 2019-06-27 | Dispositif de traitement d'informations, système de traitement d'informations et programme de traitement d'informations |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2020009083A (fr) |
| WO (1) | WO2020008999A1 (fr) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025115127A1 (fr) * | 2023-11-29 | 2025-06-05 | Necプラットフォームズ株式会社 | Dispositif d'identification d'emplacement, procédé d'identification d'emplacement et support d'enregistrement |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2011150460A (ja) * | 2010-01-20 | 2011-08-04 | Hitachi Information & Control Solutions Ltd | 入出庫管理システムおよび入出庫管理方法 |
| JP2017050718A (ja) * | 2015-09-02 | 2017-03-09 | 株式会社東芝 | ウェアラブル端末、方法及びシステム |
| JP2017218325A (ja) * | 2016-06-03 | 2017-12-14 | 裕之 本地川 | 情報収集装置およびこれを用いた物品管理システム、ならびに巻取装置 |
| JP2018043815A (ja) * | 2016-09-12 | 2018-03-22 | ユーピーアール株式会社 | ドローンを活用した倉庫内の荷物監視システム |
| WO2018100676A1 (fr) * | 2016-11-30 | 2018-06-07 | 株式会社オプティム | Système de commande d'appareil photo, procédé de commande d'appareil photo et programme |
- 2018-07-06: Application JP2018128788A filed in Japan (JP); publication JP2020009083A, status: active, Pending
- 2019-06-27: Application PCT/JP2019/025638 filed (WO); publication WO2020008999A1, status: not active, Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| JP2020009083A (ja) | 2020-01-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11093896B2 (en) | Product status detection system | |
| US9984354B1 (en) | Camera time synchronization system | |
| US9505554B1 (en) | Capturing packaging image via scanner | |
| EP3343481A1 (fr) | Identification de l'emplacement de colis dans un véhicule au chargement et au moment de la livraison | |
| KR102240997B1 (ko) | 배송 서비스 시스템 | |
| EP3702985A1 (fr) | Optimisation de chargement de fret à réalité augmentée | |
| US12430521B2 (en) | Error correction using combination RFID signals | |
| US9892378B2 (en) | Devices, systems and methods for tracking and auditing shipment items | |
| JP2018147138A (ja) | 情報処理システム、情報処理装置、情報処理方法および情報処理プログラム | |
| US20140222709A1 (en) | Method and apparatus for updating detailed delivery tracking | |
| JP2020033173A (ja) | 管理システム、位置算出プログラム、位置算出方法及び管理装置 | |
| US20210065585A1 (en) | Automated user interfaces for efficient packaging of objects | |
| JP6728995B2 (ja) | 自動倉庫システムおよび自動倉庫の管理方法 | |
| JP2017214197A (ja) | 管理システム、管理方法および情報処理装置 | |
| JP2019112231A (ja) | 物品管理システム及び物品管理モジュール | |
| US20250086582A1 (en) | Shelf label management system, shelf label management method, and recording medium | |
| JP2019104625A (ja) | 配置支援システム、配置支援方法、およびプログラム | |
| US20200182623A1 (en) | Method, system and apparatus for dynamic target feature mapping | |
| WO2022107000A1 (fr) | Suivi automatisé d'articles d'inventaire pour l'exécution de commandes et le réapprovisionnement | |
| WO2020008999A1 (fr) | Dispositif de traitement d'informations, système de traitement d'informations et programme de traitement d'informations | |
| JP2018092434A (ja) | 管理システム、情報処理装置、プログラム、管理方法 | |
| JP2017215821A (ja) | 管理システム、管理方法および搬送システム | |
| JP6690411B2 (ja) | 管理システム、管理方法および搬送システム | |
| JP2017024823A (ja) | 船舶の予備品管理システム、プログラム及び船舶の予備品収納構造 | |
| US10277451B2 (en) | Control method and apparatus in a mobile automation system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19830999; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19830999; Country of ref document: EP; Kind code of ref document: A1 |