WO2019225875A1 - Method and apparatus for inventory tracking - Google Patents
Method and apparatus for inventory tracking
- Publication number: WO2019225875A1 (PCT/KR2019/005187)
- Authority: WO (WIPO PCT)
- Prior art keywords: electronic device, information, type, learning model, image
- Prior art date: 2018-05-25
- Legal status: Ceased (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
- G06Q10/0833—Tracking (G06Q10/083—Shipping)
- G06Q50/10—Services (G06Q50/00—ICT specially adapted for implementation of business processes of specific business sectors)
- G06N20/00—Machine learning
- G06N3/02—Neural networks (G06N3/00—Computing arrangements based on biological models)
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/08—Learning methods
- G06N3/086—Learning methods using evolutionary algorithms, e.g. genetic algorithms or genetic programming
- G06N3/09—Supervised learning
- G06N3/091—Active learning
- G06V20/64—Three-dimensional objects (G06V20/60—Type of objects; G06V20/00—Scenes; Scene-specific elements)
Definitions
- Various embodiments of the present disclosure relate to a method and an apparatus for tracking inventory of objects that are inserted into or withdrawn from an electronic device.
- Unlike conventional rule-based smart systems, an artificial intelligence system is a system in which a machine learns, judges, and becomes smarter on its own. As an artificial intelligence system is used, its recognition rate improves and it can understand a user's preferences more accurately, so existing rule-based smart systems are gradually being replaced by deep-learning-based artificial intelligence systems.
- Artificial intelligence technology consists of machine learning (eg, deep learning) and element technologies that utilize machine learning.
- Machine learning is an algorithmic technology that classifies and learns the characteristics of input data on its own.
- Element technologies are technologies that simulate functions of the human brain, such as cognition and judgment, by using machine learning algorithms such as deep learning, and may include technical fields such as linguistic understanding, visual understanding, inference/prediction, knowledge representation, and motion control.
- Linguistic understanding is a technology for recognizing and applying/processing human language and characters, and may include natural language processing, machine translation, dialogue systems, question answering, and speech recognition/synthesis.
- Visual understanding is a technology for recognizing and processing objects in the way human vision does, and may include object recognition, object tracking, image search, person recognition, scene understanding, spatial understanding, and image enhancement.
- Inference/prediction is a technology for logically inferring and predicting information, and may include knowledge/probability-based inference, optimization prediction, preference-based planning, and recommendation.
- Knowledge representation is a technology for automatically processing human experience information into knowledge data, and may include knowledge construction (data generation/classification) and knowledge management (data utilization).
- Motion control is a technology for controlling the autonomous driving of a vehicle and the movement of a robot, and may include movement control (navigation, collision avoidance, driving) and manipulation control (action control).
- An electronic device may include a structure capable of storing an object therein (or on an upper portion thereof). A user may want such an electronic device to automatically manage objects that are inserted into or withdrawn from it.
- Various embodiments of the present disclosure may provide an electronic device that automatically tracks and manages inventory of objects inserted into or withdrawn from the electronic device by using an artificial intelligence system.
- An electronic device according to various embodiments may include a plurality of sensors including a visual sensor and a weight sensor; a memory; and at least one processor. The at least one processor may determine whether at least one object is inserted into the electronic device using at least one of the plurality of sensors, acquire an image of the at least one object using the visual sensor based on the insertion of the at least one object, identify a type of the at least one object corresponding to the image using a learning model trained through an artificial intelligence algorithm, obtain remaining amount information of the at least one object using the learning model and the weight sensor, and store the type of the object and the remaining amount information of the object in the memory.
- An inventory management method of an electronic device according to various embodiments may include: determining whether at least one object is inserted into the electronic device using at least one of a plurality of sensors including a visual sensor and a weight sensor; acquiring an image of the at least one object using the visual sensor; identifying a type of the at least one object based on the acquired image by using a learning model trained through an artificial intelligence algorithm; obtaining remaining amount information of the at least one object by using the learning model and the weight sensor; and storing the type and the remaining amount information of the at least one object in a memory of the electronic device.
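- For illustration, this flow can be sketched in a few lines of Python; the sensor interfaces, the classify_image call, and the catalog of per-type total and container weights below are hypothetical placeholders rather than elements disclosed here.

```python
from dataclasses import dataclass

@dataclass
class InventoryRecord:
    object_type: str        # e.g. "tomato sauce"
    remaining_ratio: float  # 0.0 .. 1.0
    net_weight_g: float

def track_insertion(visual_sensor, weight_sensor, classify_image, catalog, memory):
    """Detect an insertion, identify the object, and store its type and remaining amount."""
    delta_g = weight_sensor.read_weight_change()      # hypothetical weight-sensor API
    if delta_g <= 0:
        return None                                   # nothing was inserted
    image = visual_sensor.capture()                   # image of the inserted object
    object_type, _confidence = classify_image(image)  # learning-model inference
    total_g, container_g = catalog[object_type]       # assumed model-provided capacities
    net_weight_g = delta_g - container_g
    remaining = net_weight_g / (total_g - container_g)
    record = InventoryRecord(object_type, max(0.0, min(1.0, remaining)), net_weight_g)
    memory[object_type] = record                      # persist type + remaining amount
    return record
```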
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100 according to various embodiments of the present disclosure.
- FIG. 2 is a block diagram illustrating an electronic device 200 according to various embodiments of the present disclosure.
- FIG. 3 is a diagram for describing a method of acquiring insertion and withdrawal information of an object in the electronic device 200 according to various embodiments of the present disclosure.
- FIG. 4 is a diagram for describing a method of identifying object type and remaining amount information in an electronic device 200 or 300 according to various embodiments of the present disclosure.
- FIG. 5 is a diagram for describing an information receiving operation of the electronic device 200 according to an embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an information output screen in an electronic device according to various embodiments of the present disclosure.
- FIG. 1 is a block diagram of an electronic device 101 in a network environment 100, according to various embodiments.
- Referring to FIG. 1, the electronic device 101 may communicate with the electronic device 102 through a first network 198 (eg, a short-range wireless communication network), or may communicate with the electronic device 104 or the server 108 through a second network 199 (eg, a long-range wireless communication network).
- According to an embodiment, the electronic device 101 may communicate with the electronic device 104 through the server 108.
- According to an embodiment, the electronic device 101 may include a processor 120, a memory 130, an input device 150, an audio output device 155, a display device 160, an audio module 170, a sensor module 176, an interface 177, a haptic module 179, a camera module 180, a power management module 188, a battery 189, a communication module 190, a subscriber identification module 196, or an antenna module 197.
- In some embodiments, at least one of the components (for example, the display device 160 or the camera module 180) may be omitted from the electronic device 101, or one or more other components may be added.
- In some embodiments, the sensor module 176 may be implemented embedded in the display device 160 (eg, a display).
- The processor 120 may execute, for example, software (eg, the program 140) to control at least one other component (eg, a hardware or software component) of the electronic device 101 connected to the processor 120, and may perform various data processing or computation. According to one embodiment, as at least part of the data processing or computation, the processor 120 may load instructions or data received from another component (eg, the sensor module 176 or the communication module 190) into the volatile memory 132, process the instructions or data stored in the volatile memory 132, and store the resulting data in the nonvolatile memory 134.
- According to an embodiment, the processor 120 may include a main processor 121 (eg, a central processing unit or an application processor) and a coprocessor 123 (eg, a graphics processing unit, an image signal processor, a sensor hub processor, or a communication processor) that may operate independently of or together with the main processor. Additionally or alternatively, the coprocessor 123 may be set to use less power than the main processor 121 or to be specialized for a designated function. The coprocessor 123 may be implemented separately from or as part of the main processor 121.
- The coprocessor 123 may, for example, control at least some of the functions or states related to at least one of the components of the electronic device 101 (eg, the display device 160, the sensor module 176, or the communication module 190) in place of the main processor 121 while the main processor 121 is in an inactive (eg, sleep) state, or together with the main processor 121 while the main processor 121 is in an active (eg, application-executing) state. According to one embodiment, the coprocessor 123 (eg, an image signal processor or a communication processor) may be implemented as part of another functionally related component (eg, the camera module 180 or the communication module 190).
- the memory 130 may store various data used by at least one component (eg, the processor 120 or the sensor module 176) of the electronic device 101.
- the data may include, for example, software (eg, the program 140) and input data or output data for a command related thereto.
- the memory 130 may include a volatile memory 132 or a nonvolatile memory 134.
- the program 140 may be stored as software in the memory 130, and may include, for example, an operating system 142, middleware 144, or an application 146.
- the input device 150 may receive a command or data to be used for a component (for example, the processor 120) of the electronic device 101 from the outside (for example, a user) of the electronic device 101.
- the input device 150 may include, for example, a microphone, a mouse, a keyboard, or a digital pen (eg, a stylus pen).
- the sound output device 155 may output a sound signal to the outside of the electronic device 101.
- the sound output device 155 may include, for example, a speaker or a receiver.
- the speaker may be used for general purposes such as multimedia playback or recording playback, and the receiver may be used to receive an incoming call.
- the receiver may be implemented separately from or as part of a speaker.
- the display device 160 may visually provide information to the outside (eg, a user) of the electronic device 101.
- the display device 160 may include, for example, a display, a hologram device, or a projector and a control circuit for controlling the device.
- According to an embodiment, the display device 160 may include touch circuitry configured to sense a touch, or a sensor circuit (eg, a pressure sensor) configured to measure the strength of a force generated by the touch.
- The audio module 170 may convert sound into an electric signal or, conversely, convert an electric signal into sound. According to an embodiment, the audio module 170 may acquire sound through the input device 150, or may output sound through the sound output device 155 or through an external electronic device (eg, the electronic device 102, such as a speaker or headphones) connected directly or wirelessly to the electronic device 101.
- The sensor module 176 may detect an operating state (eg, power or temperature) of the electronic device 101 or an external environmental state (eg, a user state), and may generate an electrical signal or data value corresponding to the detected state.
- According to an embodiment, the sensor module 176 may include, for example, a gesture sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared sensor, a biometric sensor, a temperature sensor, a humidity sensor, or an illuminance sensor.
- the interface 177 may support one or more designated protocols that may be used for the electronic device 101 to be directly or wirelessly connected to an external electronic device (for example, the electronic device 102).
- the interface 177 may include, for example, a high definition multimedia interface (HDMI), a universal serial bus (USB) interface, an SD card interface, or an audio interface.
- connection terminal 178 may include a connector through which the electronic device 101 may be physically connected to an external electronic device (eg, the electronic device 102).
- the connection terminal 178 may include, for example, an HDMI connector, a USB connector, an SD card connector, or an audio connector (eg, a headphone connector).
- the haptic module 179 may convert an electrical signal into a mechanical stimulus (eg, vibration or movement) or an electrical stimulus that can be perceived by the user through tactile or kinesthetic senses.
- the haptic module 179 may include, for example, a motor, a piezoelectric element, or an electrical stimulation device.
- the camera module 180 may capture still images and videos. According to one embodiment, the camera module 180 may include one or more lenses, image sensors, image signal processors, or flashes.
- the power management module 188 may manage power supplied to the electronic device 101.
- The power management module 188 may be implemented, for example, as at least part of a power management integrated circuit (PMIC).
- the battery 189 may supply power to at least one component of the electronic device 101.
- the battery 189 may include, for example, a non-rechargeable primary cell, a rechargeable secondary cell or a fuel cell.
- The communication module 190 may support establishment of a direct (eg, wired) communication channel or a wireless communication channel between the electronic device 101 and an external electronic device (eg, the electronic device 102, the electronic device 104, or the server 108), and may perform communication through the established communication channel.
- the communication module 190 may operate independently of the processor 120 (eg, an application processor) and include one or more communication processors supporting direct (eg, wired) or wireless communication.
- According to an embodiment, the communication module 190 may include a wireless communication module 192 (eg, a cellular communication module, a short-range wireless communication module, or a global navigation satellite system (GNSS) communication module) or a wired communication module 194 (eg, a local area network (LAN) communication module or a power line communication module).
- The corresponding communication module among these communication modules may communicate with an external electronic device through the first network 198 (eg, a short-range communication network such as Bluetooth, WiFi Direct, or infrared data association (IrDA)) or the second network 199 (eg, a long-range communication network such as a cellular network, the Internet, or a computer network (eg, a LAN or WAN)).
- The wireless communication module 192 may identify and authenticate the electronic device 101 in a communication network such as the first network 198 or the second network 199 by using subscriber information (eg, an international mobile subscriber identifier (IMSI)) stored in the subscriber identification module 196.
- The antenna module 197 may transmit a signal or power to the outside (eg, an external electronic device) or receive a signal or power from the outside.
- According to an embodiment, the antenna module may include one antenna including a radiator made of a conductor or a conductive pattern formed on a substrate (eg, a PCB).
- According to an embodiment, the antenna module 197 may include a plurality of antennas. In this case, at least one antenna suitable for the communication scheme used in a communication network such as the first network 198 or the second network 199 may be selected from the plurality of antennas, for example, by the communication module 190.
- The signal or power may be transmitted or received between the communication module 190 and the external electronic device through the at least one selected antenna.
- According to some embodiments, another component (eg, an RFIC) other than the radiator may be additionally formed as part of the antenna module 197.
- At least some of the above components may be connected to each other through a communication scheme between peripheral devices (eg, a bus, general purpose input and output (GPIO), serial peripheral interface (SPI), or mobile industry processor interface (MIPI)) and may exchange signals (eg, commands or data) with each other.
- the command or data may be transmitted or received between the electronic device 101 and the external electronic device 104 through the server 108 connected to the second network 199.
- Each of the electronic devices 102 and 104 may be a device of the same or different type as the electronic device 101.
- According to an embodiment, all or some of the operations executed in the electronic device 101 may be executed in one or more external devices among the external electronic devices 102, 104, or 108. For example, when the electronic device 101 needs to perform a function or service automatically or in response to a request from a user or another device, the electronic device 101 may, instead of executing the function or service itself or in addition to doing so, request one or more external electronic devices to perform at least a part of the function or service.
- The one or more external electronic devices that receive the request may execute at least a part of the requested function or service, or an additional function or service related to the request, and may transmit a result of the execution to the electronic device 101.
- The electronic device 101 may process the result as it is or additionally, and may provide it as at least part of a response to the request.
- To this end, for example, cloud computing, distributed computing, or client-server computing technology may be used.
- FIG. 2 is a block diagram illustrating an electronic device 200 according to various embodiments of the present disclosure.
- the electronic device 200 may include a plurality of sensors 212 and 214, a memory 220, a communication module 230, and at least one processor 240.
- At least one processor 240 may control the plurality of sensors 212 and 214, the memory 220, and the communication module 230 as a whole.
- At least one processor 240 of the electronic device 200 may determine whether a specific object is inserted into the electronic device 200 using the plurality of sensors 212 and 214.
- For example, the at least one processor 240 may determine that an object is inserted into the electronic device 200 when the object is placed on an upper portion of the electronic device 200 or inside the electronic device 200.
- the plurality of sensors may include a visual sensor 212 and a weight sensor 214.
- For example, at least one processor 240 of the electronic device 200 may identify whether an object is inserted into the electronic device 200 through an image change acquired using the visual sensor 212, or may identify whether an object is inserted based on a weight change sensed by the weight sensor 214.
- In addition, the electronic device 200 may identify whether an object is inserted or withdrawn through various types of sensors such as a proximity sensor, a temperature sensor, and a pressure sensor.
- the visual sensor 212 may include, for example, a camera module (eg, the camera module 180) capable of capturing an environment inside and / or outside the electronic device 200.
- the electronic device 200 may obtain an image of at least one object introduced into the electronic device 200 using the visual sensor 212.
- A plurality of visual sensors 212 may be disposed at various locations in order to acquire internal and/or external images of the electronic device 200.
- the weight sensor 214 may measure, for example, the weight of at least one object introduced into the electronic device 200.
- the electronic device 200 may identify whether the object is inserted or acquire the weight information of the inserted object using the weight sensor 214.
- The electronic device 200 may transmit and receive information to and from the external device 260, or to and from an external server, through the communication module 230.
- the external server may include a learning model 250 learned through artificial intelligence algorithms.
- The electronic device 200 may use the learning model 250 learned through the artificial intelligence algorithm, or may train the learning model 250, through the communication module 230.
- The artificial intelligence algorithm may include at least one of machine learning, neural network, genetic, deep learning, and classification algorithms.
- The electronic device 200 may identify the type of a specific object inserted into the electronic device 200 using the learning model 250 learned through an artificial intelligence algorithm. For example, the electronic device 200 may determine whether an object is inserted by using at least one of the plurality of sensors, and may acquire an image of the inserted object by using the visual sensor 212 among the plurality of sensors. The electronic device 200 may identify the type of the object corresponding to the acquired image by using the learning model 250.
- The electronic device 200 may obtain remaining amount information of an object inserted into the electronic device 200 by using the learning model 250.
- For example, the electronic device 200 may obtain weight information of the inserted object through the weight sensor 214, and may obtain total capacity information of the object corresponding to the image (and container weight information of the object) using the learning model 250.
- The electronic device 200 may determine the remaining amount information of the object based on the weight information and the total capacity information of the object.
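- A minimal sketch of this remaining-amount calculation, assuming the learning model supplies the total capacity and container weight for the identified type (the function and parameter names are illustrative):

```python
def remaining_amount(measured_weight_g: float,
                     total_capacity_g: float,
                     container_weight_g: float = 0.0) -> float:
    """Return the remaining fraction (0.0 .. 1.0) of the object's contents.

    measured_weight_g  - weight reported by the weight sensor for the object
    total_capacity_g   - full weight of the object as predicted by the learning model
    container_weight_g - weight of the empty container predicted by the learning model
    """
    net_full = total_capacity_g - container_weight_g   # contents when full
    net_now = measured_weight_g - container_weight_g   # contents right now
    if net_full <= 0:
        raise ValueError("total capacity must exceed container weight")
    return max(0.0, min(1.0, net_now / net_full))
```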
- The memory 220 may store at least one of the information obtained from the plurality of sensors (eg, image and weight information), the information about the type of the object acquired using the learning model 250 learned through an artificial intelligence algorithm, and the remaining amount information of the object.
- The electronic device 200 may include a plurality of sensors, an output module (eg, the audio output device 155) or a display device (eg, the display device 160, a display) capable of outputting information received from the outside, and an input module (eg, a microphone, a touch screen, etc.) capable of receiving a user input.
- the electronic device 200 may communicate with the external device 260 using the communication module 230.
- the external device 260 may include a portable communication terminal including a communication module 262, an input module 264, and an output module 266 for communicating with the electronic device 200.
- The electronic device 200 may identify the type of an object corresponding to an image related to the inserted object by using the learning model 250 learned through an artificial intelligence algorithm. For example, when the electronic device 200 fails to identify the type of the object using the learning model 250, the electronic device 200 may request input of information regarding the type of the object corresponding to the image through at least one of an output module (not shown) of the electronic device 200 or the output module 266 of the external device 260. Hereinafter, an operation of receiving information input from the outside regarding the type of the object will be described with reference to FIG. 5.
- FIG. 5 is a diagram for describing an information receiving operation of the electronic device 200 according to an embodiment of the present disclosure.
- Referring to FIG. 5, the electronic device 200 recognizes the insertion of an object, obtains an image of the object using the visual sensor 212, and may identify the type of the object corresponding to the image using the learning model 250 trained through an artificial intelligence algorithm. However, as shown in FIG. 5, when identification of the type of the object corresponding to the image using the learning model 250 fails, the electronic device 200 may, in operation 510, request a user to input information regarding the type of the object through the output module.
- the electronic device 200 may also receive the reliability of the identification result from the learning model 250. If the reliability of the identification result is less than or equal to a specified threshold, the electronic device 200 may determine that the identification of the type of the object has failed.
- For example, the electronic device 200 may determine, using the learning model 250, that the inserted object is either 'tomato sauce' or 'dry tomato'; however, if the reliability of the identification result is equal to or less than the specified threshold, the electronic device 200 may output a message requesting input of information on whether the type of the object shown in the image is 'tomato sauce' or 'dry tomato'.
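- The sketch below shows one way such a confidence-gated fallback could look; the predict and prompt_user calls and the 0.8 threshold are assumptions, since the text only refers to a specified threshold and unspecified input/output modules.

```python
CONFIDENCE_THRESHOLD = 0.8  # assumed value; the text only specifies "a specified threshold"

def identify_or_ask(image, model, prompt_user):
    """Identify the object type, falling back to a user prompt when confidence is low."""
    candidates = model.predict(image)   # assumed: [("dry tomato", 0.55), ("tomato sauce", 0.41), ...]
    best_label, best_score = candidates[0]
    if best_score > CONFIDENCE_THRESHOLD or len(candidates) < 2:
        return best_label               # model is confident enough (or has a single candidate)
    # Low confidence: ask the user to choose between the top candidates.
    options = [label for label, _ in candidates[:2]]
    return prompt_user(f"Is the inserted item '{options[0]}' or '{options[1]}'?", options)
```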
- The information input request may be output, for example, through a display and/or a sound output device that is at least one of the output modules (not shown) of the electronic device 200.
- According to another embodiment, the information input request may be transmitted to an external device (eg, a portable communication terminal) preset in the electronic device 200 and output through an output module of the external device.
- The electronic device 200 may receive user input information according to the information input request. For example, the user may input, through an input module (not shown) of the electronic device 200 or the input module 264 of the external device 260, that the image related to the inserted object corresponds to 'dry tomato'.
- The electronic device 200 may determine the type of the object corresponding to the image of the inserted object based on the user input information received from the input module of the electronic device 200 or of the external device 260.
- The electronic device 200 may store, in the memory 220 of the electronic device 200, the type information of the inserted object determined through the user input information.
- In the example of FIG. 5, the electronic device 200 may store the inserted object as 'dry tomato' in the memory 220.
- The electronic device 200 may train the learning model 250 with the type information of the inserted object determined through the user input information. For example, the electronic device 200 may identify the type of the object corresponding to the image based on the user input information received through the input module, and may train the learning model 250 so that it learns the type of the object corresponding to that image. After this training, the electronic device 200 or another electronic device (not shown) using the learning model 250 may identify the type of the object from an image related to the object.
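- One plausible way to fold such user-confirmed labels back into training is sketched below; the LabeledExample buffer and the fine_tune call are assumptions, since the text does not specify a training procedure.

```python
from typing import List, Tuple

LabeledExample = Tuple[bytes, str]   # (image data, user-confirmed label such as "dry tomato")

class FeedbackTrainer:
    """Collects user-confirmed labels and periodically retrains the learning model."""

    def __init__(self, model, batch_size: int = 16):
        self.model = model
        self.batch_size = batch_size
        self.buffer: List[LabeledExample] = []

    def add_user_label(self, image: bytes, label: str) -> None:
        """Store a user-confirmed (image, label) pair and retrain once enough have accumulated."""
        self.buffer.append((image, label))
        if len(self.buffer) >= self.batch_size:
            self.model.fine_tune(self.buffer)   # hypothetical incremental-training API
            self.buffer.clear()
```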
- In the same manner as the learning of the type information of the object in FIG. 5, the electronic device 200 may train the learning model 250, based on user input information, so that the total capacity information or the container weight information of an object corresponding to an image can be acquired.
- the electronic device 200 may use the learning model 250 or train the learning model 250 through the communication module 230.
- According to various embodiments, a plurality of electronic devices may train a learning model learned through an artificial intelligence algorithm based on user information input through at least one of the electronic devices, and, by using the trained learning model, may obtain information about objects inserted into or withdrawn from each of the electronic devices so as to provide inventory information of each electronic device to the user.
- The electronic device 200 may output at least one of the type of the inserted object, an image related to the object, and the remaining amount information of the object through an output module (not shown) (eg, the audio output device 155) or a display device (eg, the display device 160 or a display) of the electronic device 200.
- According to another embodiment, the electronic device 200 may transmit at least one piece of the information to the external device 260 using the communication module 230, and may send a command so that it is output through the output module 266 of the external device 260.
- FIG. 3 is a diagram for describing a method of acquiring insertion and withdrawal information of an object in the electronic device 200 according to various embodiments of the present disclosure.
- the electronic device 200 may identify whether an object is placed inside an electronic device or on an upper portion of the electronic device.
- the electronic device 200 may include at least one shelf inside or at the top, and each of the at least one shelf may include at least one weight sensor 310 and a visual sensor 320.
- The electronic device 200 may obtain insertion or withdrawal information about at least one object 300a, 300b using the at least one weight sensor 310 and the visual sensor 320, based on the order and time at which the at least one object is inserted or withdrawn.
- Reference numeral 301 illustrates a case in which the first object 300a is inserted into the electronic device 200.
- Reference numeral 302 illustrates a case in which a second object 300b is additionally inserted into the electronic device 200; reference numeral 303 illustrates a case in which the first object 300a is withdrawn from the electronic device 200 into which the first object 300a and the second object 300b had been inserted; and reference numeral 304 illustrates a case in which the first object 300a is inserted back into the electronic device 200.
- The electronic device 200 may identify the insertion of the first object 300a using at least one of the plurality of sensors. For example, the electronic device 200 may use the weight sensor 310 to identify a change in the weight of the shelf on which the first object 300a is placed, and may thereby recognize the insertion of the first object 300a. The electronic device 200 may obtain weight information of the first object 300a based on the change in the weight of the shelf. In addition, the electronic device 200 may acquire an image of the first object 300a using the visual sensor 320.
- When the second object 300b is additionally inserted, the electronic device 200 may recognize, based on the weight change of the shelf, that the weight change is caused by the second object 300b, and may obtain weight information of the second object 300b.
- the electronic device 200 may acquire an image of the second object 300b using the visual sensor 320.
- When an object is withdrawn, the electronic device 200 may determine which object was withdrawn through the visual sensor 320, or may determine which object was withdrawn through the plurality of weight sensors 310.
- the electronic device 200 may include a plurality of weight sensors 310 on each shelf of the electronic device 200.
- the electronic device 200 may identify an object corresponding to a position on the shelf through the plurality of weight sensors 310.
- For example, when the first object 300a is withdrawn, the electronic device 200 may recognize that the weight change caused by the withdrawal of the first object 300a is a change in weight corresponding to the first object 300a. To this end, insertion or withdrawal time information of the electronic device 200 may be used, the visual sensor 320 may be used, or the plurality of weight sensors 310 may be used.
- The electronic device 200 may store the changed weight information in the memory.
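- As an illustration of this shelf-level bookkeeping, the sketch below matches a withdrawal to the stored item whose recorded weight is closest to the measured drop; the per-shelf data structure and this matching rule are assumptions, since the text leaves the exact method (time information, the visual sensor, or multiple weight sensors) open.

```python
class Shelf:
    """Tracks items on one shelf of the electronic device via weight changes."""

    def __init__(self):
        self.items = {}          # object_type -> weight in grams

    def on_weight_change(self, delta_g: float, identify_image=None):
        """Handle a weight change reported by the shelf's weight sensor(s)."""
        if delta_g > 0:          # insertion: classify the new item from its image
            object_type = identify_image() if identify_image else "unknown"
            self.items[object_type] = delta_g
            return ("inserted", object_type)
        if not self.items:
            return ("withdrawn", "unknown")
        # Withdrawal: assume the removed item is the one whose stored weight
        # is closest to the measured decrease (delta_g is negative here).
        withdrawn = min(self.items, key=lambda k: abs(self.items[k] + delta_g))
        self.items.pop(withdrawn)
        return ("withdrawn", withdrawn)
```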
- FIG. 4 is a diagram for describing a method of identifying object type and remaining amount information in an electronic device 200 or 300 according to various embodiments of the present disclosure.
- Referring to FIG. 4, the electronic devices 200 and 300 may identify the insertion of an object 250 and may acquire an image of the object 250 using a visual sensor in order to determine information about the object. For example, when the object 250 is inserted, the electronic device may acquire weight information of the object using a weight sensor, but the type of the object 250 may not be identifiable through the memory 220 of the electronic device 200 or 300 alone.
- According to an embodiment, an electronic device may obtain information about an object (eg, the type of the object, the total weight of the object, the container weight of the object, etc.) from an image acquired using the visual sensor, by using a learning model 255 learned through an artificial intelligence algorithm.
- the electronic device may transmit an image of the object 250 to an external server and obtain image identification result information (information of the object) obtained by using the learning model 255.
- For example, as a result of identifying the image, the electronic device may recognize, using the learning model 255, that the type of the inserted object 250 is 'salmon flavored milk', that the total weight of the object 250 is 1030 g, and that the container weight is 30 g.
- The electronic device may store the information about the object 250 in the memory 220 of the electronic device and may identify the remaining amount of the object 250 based on that information. For example, the net full capacity is 1000 g (the total weight of 1030 g minus the 30 g container weight), and the current contents weigh 600 g (the measured weight of 630 g of the inserted object 250 minus the 30 g container weight), so the remaining amount can be identified as 60%. Through this, the electronic device may recognize that the inserted object 250 is salmon flavored milk with 600 g (60%) of the milk remaining.
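- Checking the arithmetic of this example with the values given above:

```python
total_g, container_g, measured_g = 1030, 30, 630
remaining = (measured_g - container_g) / (total_g - container_g)
print(f"{remaining:.0%}")   # (630 - 30) / (1030 - 30) = 600 / 1000 -> 60%
```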
- FIG. 6 is a diagram illustrating an information output method in an electronic device according to various embodiments of the present disclosure.
- the electronic device may identify information about an object introduced into the electronic device or an object drawn out from the electronic device, and output the information to the user through an output module.
- For example, the electronic device may output information about the inserted or withdrawn object through at least one of an audio output device (eg, the audio output device 155) or a display device (eg, the display device 160) as an output module of the electronic device.
- the electronic device may transmit the information to an external device so that the information is output through an output module of the external device.
- the electronic device may display information about an object inserted into the electronic device on a display screen.
- the electronic device may display an image 612 about an object inserted into the electronic device.
- the type, remaining amount information, and weight information of each object corresponding to the image may be displayed as shown in reference numeral 614.
- the electronic device may display an item list of at least one object inserted into the electronic device as shown in reference numeral 616.
- the user may easily grasp the object item inserted into the electronic device through the display screen 616.
- The item list is information about the at least one inserted object and may include at least one of the type of the object, remaining amount information, and weight information.
- the electronic device may acquire analysis information about the at least one inserted object using a learning model learned through an artificial intelligence algorithm and display it in the form of a chart as shown in reference numeral 618.
- For example, the analysis information may be nutritional information about the at least one inserted object.
- According to another embodiment, the electronic device may acquire analysis information about the at least one inserted object using a learning model learned through an artificial intelligence algorithm, and may transmit the acquired analysis information to a preset external device.
- For example, the analysis information may be a shopping suggestion list or a utilization suggestion list (eg, a diet suggestion list 630) obtained by using the learning model trained through the artificial intelligence algorithm, based on the type and remaining amount information of the at least one object inserted into the electronic device.
- the electronic device may generate shopping suggestion list information for the object when the remaining amount is less than or equal to a preset value, based on at least one of the type, remaining amount, and nutritional information of the inserted object.
- the shopping suggestion list information may be transmitted to an external device and displayed on the display module of the external device.
- According to an embodiment, the electronic device may generate a diet suggestion list based on at least one of the type, remaining amount, and nutritional information of the object, and may transmit it to an external device so that it can be displayed through the display module of the external device.
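- A small sketch of the low-stock rule described above; the 20% threshold and the inventory mapping are illustrative assumptions, as the text only refers to a preset value.

```python
RESTOCK_THRESHOLD = 0.2   # assumed preset value (20% remaining)

def shopping_suggestions(inventory: dict) -> list:
    """Return the object types whose remaining ratio has fallen to or below the threshold.

    inventory maps object_type -> remaining ratio in [0.0, 1.0],
    e.g. {"salmon flavored milk": 0.6, "tomato sauce": 0.1}.
    """
    return sorted(item for item, remaining in inventory.items()
                  if remaining <= RESTOCK_THRESHOLD)

# Example: only 'tomato sauce' falls below the threshold and is suggested for restocking.
print(shopping_suggestions({"salmon flavored milk": 0.6, "tomato sauce": 0.1}))
```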
- Electronic devices may be various types of devices.
- the electronic device may include, for example, a portable communication device (eg, a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance device.
- Terms such as 'first' and 'second' may be used merely to distinguish a component from other corresponding components, and do not limit the components in other aspects (eg, order).
- When some (eg, first) component is referred to as "coupled" or "connected" to another (eg, second) component, with or without the term "functionally" or "communicatively", it means that the component can be connected to the other component directly (eg, by wire), wirelessly, or via a third component.
- module may include a unit implemented in hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit.
- the module may be an integral part or a minimum unit or part of the component, which performs one or more functions.
- the module may be implemented in the form of an application-specific integrated circuit (ASIC).
- Various embodiments of this document may be implemented as software (eg, the program 140) including one or more instructions stored in a storage medium (eg, the internal memory 136 or the external memory 138) readable by a machine (eg, the electronic device 101).
- For example, the processor (eg, the processor 120) of the device (eg, the electronic device 101) may call at least one of the one or more instructions stored in the storage medium and execute it. This enables the device to be operated to perform at least one function according to the at least one instruction called.
- the one or more instructions may include code generated by a compiler or code executable by an interpreter.
- the device-readable storage medium may be provided in the form of a non-transitory storage medium.
- Here, 'non-transitory' only means that the storage medium is a tangible device and does not include a signal (eg, an electromagnetic wave); this term does not distinguish between a case in which data is stored semi-permanently in the storage medium and a case in which data is stored temporarily.
- A method according to various embodiments disclosed in the present disclosure may be provided by being included in a computer program product.
- the computer program product may be traded between the seller and the buyer as a product.
- The computer program product may be distributed in the form of a device-readable storage medium (eg, a compact disc read only memory (CD-ROM)), or may be distributed (eg, downloaded or uploaded) online through an application store (eg, Play Store™) or directly between two user devices (eg, smartphones).
- In the case of online distribution, at least a part of the computer program product may be temporarily stored in, or temporarily created in, a device-readable storage medium such as a memory of a manufacturer's server, a server of an application store, or a relay server.
- Each component (eg, a module or a program) of the above-described components may include a single entity or a plurality of entities.
- one or more of the aforementioned components or operations may be omitted, or one or more other components or operations may be added.
- Alternatively or additionally, a plurality of components (eg, modules or programs) may be integrated into a single component. In such a case, the integrated component may perform one or more functions of each of the plurality of components in the same or similar manner as they were performed by the corresponding one of the plurality of components before the integration.
- According to various embodiments, operations performed by a module, a program, or another component may be executed sequentially, in parallel, repeatedly, or heuristically; one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
Abstract
According to various embodiments, an electronic device comprises: a plurality of sensors including a visual sensor and a weight sensor; a memory; and at least one processor, wherein the at least one processor may be configured to: determine whether at least one object has been inserted into the electronic device using at least one of the plurality of sensors; acquire an image of the at least one object using the visual sensor based on the insertion of the at least one object; identify the type of the at least one object corresponding to the image using a learning model trained by means of an artificial intelligence algorithm; obtain remaining amount information of the at least one object by means of the learning model and the weight sensor; and store the type of the object and the remaining amount information of the object in the memory. Other embodiments are possible.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2018-0059746 | 2018-05-25 | ||
| KR1020180059746A KR20190140509A (ko) | 2018-05-25 | 2018-05-25 | 재고 추적 방법 및 장치 (Method and apparatus for inventory tracking) |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019225875A1 (fr) | 2019-11-28 |
Family
ID=68616429
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2019/005187 (WO2019225875A1, Ceased) | Method and apparatus for inventory tracking | 2018-05-25 | 2019-04-30 |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR20190140509A (fr) |
| WO (1) | WO2019225875A1 (fr) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20230066187A (ko) | 2021-11-05 | 2023-05-15 | 주식회사 포에스에스 | IoT 판매기 및 이를 이용한 축산물 거래 시스템 (IoT vending machine and livestock product trading system using the same) |
| KR102588267B1 (ko) * | 2022-08-22 | 2023-10-13 | 최성민 | 물류 관리 시스템 (Logistics management system) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007094655A (ja) * | 2005-09-28 | 2007-04-12 | Nec Corp | 摂取カロリー管理システム、摂取カロリー管理方法、保管庫、及びプログラム (Calorie intake management system, calorie intake management method, storage, and program) |
| KR20150001921A (ko) * | 2013-06-28 | 2015-01-07 | 엘지전자 주식회사 | 전기제품 (Electric product) |
| US20150161871A1 (en) * | 2013-12-06 | 2015-06-11 | Samsung Electronics Co., Ltd. | Method for providing health service and refrigerator therefor |
| KR20170133448A (ko) * | 2015-08-18 | 2017-12-05 | 시아오미 아이엔씨. | 정보 생성 방법 및 디바이스 (Information generation method and device) |
| KR101812524B1 (ko) * | 2016-10-27 | 2017-12-27 | 서영빈 | 인공 지능형 냉장고를 위한 가전 제어용 크라우딩 운영관리 시스템 및 그 구동방법 (Crowding operation management system for home appliance control for an artificial intelligence refrigerator and driving method thereof) |
- 2018-05-25: KR application KR1020180059746A filed (published as KR20190140509A; not active, withdrawn)
- 2019-04-30: PCT application PCT/KR2019/005187 filed (published as WO2019225875A1; not active, ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| KR20190140509A (ko) | 2019-12-20 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19807897; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19807897; Country of ref document: EP; Kind code of ref document: A1 |