WO2018150756A1 - Information processing device, information processing method, and storage medium
- Publication number
- WO2018150756A1 (PCT/JP2017/047343)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- food
- beverage
- information processing
- display
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/12—Hotels or restaurants
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
- Devices such as smartphones and tablet terminals, which display various kinds of information in response to user operations on a touch panel, are widely used.
- For tablet terminals, screen sizes have been increasing, and usage in which a plurality of users operate the device at the same time is being considered.
- In addition, a projector has conventionally been used as a device for displaying information.
- For example, Patent Document 1 discloses a technique for displaying information in accordance with the environment in which the information is to be displayed and the status of the information being displayed.
- In view of this, the present disclosure provides a mechanism that can further improve the quality of services related to food or beverages provided in accordance with user interaction.
- According to the present disclosure, there is provided an information processing apparatus including a control unit that controls display of a display object related to a food or beverage based on information about the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
- According to the present disclosure, there is also provided an information processing method that includes controlling, by a processor, display of a display object related to a food or beverage based on information about the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
- Further, according to the present disclosure, there is provided a storage medium storing a program for causing a computer to function as a control unit that controls display of a display object related to a food or beverage based on information about the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
- As described above, according to the present disclosure, the display of a display object related to a food or beverage (hereinafter also simply referred to as food and drink) is controlled based on information obtained as a result of sensing, which makes it possible to provide a real-time service corresponding to the state of the meal. Furthermore, since the display of the display object is controlled based on setting information associated with the food and drink, a detailed service can be provided for each item of food and drink. Thus, according to the present disclosure, it is possible to provide a real-time and detailed service.
- FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- Here, the system means a configuration for executing a predetermined process; the system as a whole can be regarded as a single device, or it can be regarded as being configured by a plurality of devices.
- The information processing system according to the present embodiment illustrated in FIG. 1 may be configured to execute predetermined processing (for example, processing realized by the functional configuration illustrated in FIG. 4) as the information processing system as a whole, and which of its components is regarded as a single device may be arbitrary.
- an information processing system 100a includes an input unit 110a and an output unit 130a.
- the output unit 130a visually notifies the user of the information by displaying various types of information on the table 140a.
- a projector is used as the output unit 130a.
- the output unit 130a is disposed above the table 140a, for example, spaced from the table 140a by a predetermined distance while being suspended from the ceiling, and projects information on the top surface of the table 140a.
- a method for displaying information on the top surface of the table 140a from above is also referred to as a “projection type”.
- the entire area where information is displayed by the output unit 130a is also referred to as a display screen.
- the output unit 130a displays information presented to the user as the application is executed by the information processing system 100a on the display screen.
- the displayed information is, for example, an operation screen of each application.
- each display area in which the operation screen of such an application is displayed on the display screen is also referred to as a display object.
- the display object may be a so-called GUI (Graphical User Interface) component (widget).
- the output unit 130a may include a lighting device.
- When the output unit 130a includes a lighting device, the information processing system 100a may control the state of the lighting device, such as turning it on and off, based on the content of information input by the input unit 110a and/or the content of information displayed by the output unit 130a.
- the output unit 130a may include a speaker and may output various kinds of information as sound.
- the number of speakers may be one or plural.
- the output unit 130a includes a plurality of speakers, the information processing system 100a may limit the speakers that output sound or adjust the direction in which sound is output.
- the output unit 130a may include a plurality of output devices, and may include, for example, a projector, a lighting device, and a speaker.
- the input unit 110a is a device that inputs operation details of a user who uses the information processing system 100a.
- the input unit 110a is provided above the table 140a, for example, in a state suspended from the ceiling.
- the input unit 110a is provided apart from the table 140a on which information is displayed.
- the input unit 110a may be configured by an imaging device that can capture the top surface of the table 140a, that is, the display screen.
- As the imaging device, a camera that images the table 140a with one lens, a stereo camera that can record depth information by imaging the table 140a with two lenses, or the like can be used.
- When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera can be used as that stereo camera.
- The information processing system 100a analyzes the image (captured image) captured by the camera and can thereby detect an object physically located on the table 140a (hereinafter also referred to as a real object), for example, the position of a user's hand.
- When a stereo camera is used as the input unit 110a, the information processing system 100a analyzes the image captured by the stereo camera and can acquire, in addition to the position information of an object located on the table 140a, depth information of that object (in other words, three-dimensional information).
- the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction and separation of the hand from the table 140a based on the depth information.
- In the following, bringing an operating body such as the user's hand into contact with, or into proximity to, information on the display screen is also simply referred to as "contact".
- In the present embodiment, the position of the operating body, for example the user's hand, on the display screen (that is, the top surface of the table 140a) is detected, and various kinds of information are input based on the detected position of the operating body. That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a display object is detected, an operation input for that display object is performed.
- In the following description, a case where a user's hand is used as the operating body will be described as an example, but the present embodiment is not limited to this example, and various operating members such as a stylus may be used as the operating body.
- the input unit 110a may capture not only the top surface of the table 140a but also a user existing around the table 140a.
- the information processing system 100a can detect the position of the user around the table 140a based on the captured image.
- Furthermore, the information processing system 100a may perform personal recognition of the user by extracting, from the captured image, physical features that can identify the individual, such as the size of the user's face or body.
- the present embodiment is not limited to such an example, and user operation input may be executed by other methods.
- the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by contact of the user's finger or the like with respect to the touch panel.
- the touch panel may be realized by various methods such as a pressure-sensitive type, a capacitance type, and an optical type.
- Alternatively, the input unit 110a may detect a user operation on the top surface of the table 140a by recognizing the spatial position of an object using ultrasonic reflection, or by detecting and analyzing the vibration of an object to determine the contact position between that object and another object.
- The input unit 110a may employ any one of the techniques described above, or any combination thereof, for detecting a user operation on the top surface of the table 140a. Further, the user's operation input may be detected as a gesture captured by the imaging device constituting the input unit 110a.
- the input unit 110a may include a voice input device such as a microphone that picks up sounds produced by the user and environmental sounds of the surrounding environment.
- a microphone array for collecting sound in a specific direction can be suitably used. Further, the microphone array can be configured such that the sound collection direction can be adjusted to an arbitrary direction.
- an operation input may be performed using the collected voice.
- the information processing system 100a may perform individual recognition based on the voice by analyzing the collected voice.
- the input unit 110a may be configured by a remote control device (so-called remote control).
- The remote controller may be one in which a predetermined instruction is input by operating a predetermined button arranged on the remote controller, or one in which the movement or posture of the remote controller is detected by a sensor such as an acceleration sensor or a gyro sensor mounted on the remote controller, and a predetermined instruction is input by the user's operation of moving the remote controller.
- The information processing system 100a may also include, as the input unit 110a, other input devices such as a mouse, keyboard, buttons, switches, and levers (not shown), and user operations may be input through these input devices.
- the configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. Although not shown in FIG. 1, another device may be connected to the information processing system 100a.
- an illumination device for illuminating the table 140a may be connected to the information processing system 100a.
- the information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
- the configuration of the information processing system is not limited to that shown in FIG.
- The information processing system according to the present embodiment only needs to include an output unit that displays various types of information on the display screen and an input unit that can accept at least an operation input for the displayed information; its specific configuration is not limited.
- With reference to FIG. 2 and FIG. 3, other configuration examples of the information processing system according to the present embodiment will be described.
- FIGS. 2 and 3 are diagrams showing other configuration examples of the information processing system according to the present embodiment.
- an output unit 130a is provided below the table 140b.
- the output unit 130a is a projector, for example, and projects information from below toward the top plate of the table 140b.
- the top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b.
- A method in which the output unit 130a projects information from below the table 140b and displays the information on the top surface of the table 140b in this way is also referred to as a "rear projection type".
- the input unit 110b is provided on the top surface (front surface) of the table 140b.
- the input unit 110b is configured by, for example, a touch panel, and the operation input by the user is performed when the touch of the operating body on the display screen on the top surface of the table 140b is detected by the touch panel.
- the configuration of the input unit 110b is not limited to this example, and the input unit 110b may be provided below the table 140b and separated from the table 140b, similarly to the information processing system 100a shown in FIG.
- the input unit 110b is configured by an imaging device, for example, and can detect the position of the operation body on the top surface of the table 140b through a top plate formed of a transparent material.
- a touch panel display is installed on a table with its display screen facing upward.
- the input unit 110c and the output unit 130c can be integrally configured as the touch panel display. That is, various types of information are displayed on the display screen of the display, and the operation input by the user is performed by detecting the touch of the operating body on the display screen of the display by the touch panel.
- an imaging device may be provided above the touch panel display as the input unit 110c. The position of the user around the table can be detected by the imaging device.
- the information processing system according to the present embodiment can be realized by various configurations.
- the present embodiment will be described by taking as an example the configuration of the information processing system 100a in which the input unit 110a and the output unit 130a are provided above the table 140a shown in FIG.
- the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130.
- FIG. 4 is a block diagram illustrating an example of a functional configuration of the information processing system 100 according to the present embodiment.
- the information processing system 100 includes an input unit 110, a processing unit 120, an output unit 130, a storage unit 150, and a communication unit 160 as its functions.
- the input unit 110 is an input interface for inputting various information to the information processing system 100. A user can input various types of information to the information processing system 100 via the input unit 110.
- the input unit 110 corresponds to the input units 110a to 110c shown in FIGS.
- the input unit 110 can include various sensors.
- the input unit 110 performs sensing on the user in the sensing target range, user actions, real objects, and the relationship between these and display objects, generates sensing information indicating the sensing result, and outputs the sensing information to the processing unit 120.
- the sensing target range may not be limited to the top surface of the table 140, and may include, for example, the periphery of the table 140.
- For example, the input unit 110 includes an imaging device and captures images that include the user's body, face, and hands, as well as objects positioned on the top surface of the table 140. Information input via the input unit 110 (for example, information about the captured image) is provided to the processing unit 120.
- the imaging device may be a visible light camera or an infrared camera, for example.
- the input unit 110 may be configured as an imaging device including a function as a depth sensor capable of acquiring depth information such as a stereo camera.
- the depth sensor may be configured separately from the imaging device as a sensor using an arbitrary method such as a time of flight method or a structured light method.
- The input unit 110 may include a touch sensor. In that case, the touch sensor detects touches on the display screen, while detection of a user's hand that is not touching the display screen and of objects on the display screen may be handled by a depth sensor and/or an imaging device that images the display screen from above.
- the input unit 110 includes a sound collection device and collects a user's voice, a sound accompanying a user's operation, an environmental sound, and the like. Information input via the input unit 110 is provided to the processing unit 120 described later, and a user's voice input is recognized, or movement of an object is detected.
- the sound collection device may be an array microphone, and the sound source direction may be detected by the processing unit 120.
- The sound collection device is typically configured by a microphone, but may be configured as a sensor that detects sound by optically sensing vibration.
- the processing unit 120 includes various processors such as a CPU and a DSP, and controls the operation of the information processing system 100 by executing various arithmetic processes. For example, the processing unit 120 processes various types of information obtained from the input unit 110 or the communication unit 160 and stores the information in the storage unit 150 or causes the output unit 130 to output the information.
- the processing unit 120 may be regarded as an information processing apparatus that processes various types of information.
- the processing unit 120 includes a setting unit 121, an acquisition unit 123, a display control unit 125, and a notification unit 127 as its functions. Note that the processing unit 120 may have functions other than these functions.
- each function of the processing unit 120 is realized by a processor constituting the processing unit 120 operating according to a predetermined program.
- the setting unit 121 has a function of setting setting information in association with food and drink.
- The acquisition unit 123 has a function of acquiring information about food and drink obtained as a result of sensing, and the setting information associated with the food and drink.
- The display control unit 125 has a function of controlling display of a display object related to the food and drink based on the acquired information about the food and drink and the setting information.
- The notification unit 127 has a function of notifying an external device of information about the food and drink based on the acquired information about the food and drink and the setting information. Details of the processing by each of these components will be described later.
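- Purely for illustration, and not as part of the disclosure, the division of roles among the setting unit 121, acquisition unit 123, display control unit 125, and notification unit 127 could be organized along the following lines; all class, field, and method names here are hypothetical.

```python
# Illustrative sketch only (not part of the disclosure): one possible way to
# organize the four functional components of the processing unit 120.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class SettingInfo:
    """Setting information associated with one item of food or drink."""
    display_conditions: List[Callable[[dict], bool]] = field(default_factory=list)
    content: dict = field(default_factory=dict)   # e.g. menu data, explanatory text
    style: dict = field(default_factory=dict)     # e.g. position, orientation


class SettingUnit:                 # corresponds to the setting unit 121
    def __init__(self) -> None:
        self._store: Dict[str, SettingInfo] = {}

    def set(self, item_id: str, info: SettingInfo) -> None:
        self._store[item_id] = info

    def get(self, item_id: str) -> SettingInfo:
        return self._store.get(item_id, SettingInfo())


class AcquisitionUnit:             # corresponds to the acquisition unit 123
    def __init__(self, settings: SettingUnit) -> None:
        self._settings = settings

    def acquire(self, sensing_result: dict) -> tuple:
        item_id = sensing_result["item_id"]        # e.g. the recognized dish
        return sensing_result, self._settings.get(item_id)


class DisplayControlUnit:          # corresponds to the display control unit 125
    def control(self, item_info: dict, setting: SettingInfo) -> None:
        for condition in setting.display_conditions:
            if condition(item_info):
                print(f"display object: {setting.content}")  # stand-in for output unit 130


class NotificationUnit:            # corresponds to the notification unit 127
    def notify(self, item_info: dict, setting: SettingInfo) -> None:
        print(f"notify external device about {item_info['item_id']}")
```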
- the output unit 130 is an output interface for notifying the user of various types of information processed by the information processing system 100.
- the output unit 130 includes a display device such as a display, a touch panel, or a projector, and displays various types of information on the display screen under the control of the display control unit 125.
- the output unit 130 corresponds to the output units 130a to 130c shown in FIGS. 1 to 3, and displays a display object on the display screen as described above.
- the present embodiment is not limited to this example, and the output unit 130 may further include an audio output device such as a speaker, and may output various types of information as audio.
- Storage unit 150 is a storage device that temporarily or permanently stores information for the operation of the information processing system 100.
- the storage unit 150 stores setting information.
- the communication unit 160 is a communication interface for transmitting / receiving data to / from an external device by wire / wireless.
- The communication unit 160 communicates with an external device directly or via a network access point using a method such as wireless LAN (Local Area Network), Wi-Fi (Wireless Fidelity, registered trademark), infrared communication, or Bluetooth (registered trademark). For example, the communication unit 160 communicates information with a user device such as the user's smartphone or a wearable device worn by the user.
- the communication unit 160 may acquire information from an SNS (Social Networking Service) or the like by communicating with a server on the Web, for example.
- FIG. 5 is a diagram for explaining an overview of the information processing system 100 according to the present embodiment.
- FIG. 5 shows a scene at an eating and drinking establishment such as a restaurant, bar, or cafe.
- Users (i.e., guests) 10A and 10B are seated at the table 140, and food and drink items 20A, 20B, and 20C are placed on the top surface (i.e., the display screen) of the table 140.
- These food and drink items 20A, 20B, and 20C have been ordered by the user 10A or 10B and served by a member of the restaurant staff.
- The information processing system 100 is installed in such a restaurant, for example. The information processing system 100 senses the eating and drinking of the food and drink 20A, 20B, and 20C by the users 10A and 10B via the input unit 110. Then, as illustrated in FIG. 5, the information processing system 100 displays the display objects 30A and 30B on the top surface of the table 140 via the output unit 130.
- the information processing system 100 displays a display object on the table 140 on which food and drink are arranged by a display device (that is, the output unit 130) installed in a restaurant that provides food and drink.
- the display object may be displayed on a user terminal such as a user's smartphone or wearable device, or may be displayed on a wall or floor of a restaurant.
- FIG. 6 is a diagram illustrating an example of the flow of operation processing of the information processing system 100 according to the present embodiment.
- the information processing system 100 performs sensing using the input unit 110 and acquires information about the foods 20A, 20B, and 20C (step S102).
- the information processing system 100 acquires setting information associated with the foods 20A, 20B, and 20C (step S104).
- the information processing system 100 controls the display of the display object based on the information on the foods 20A, 20B, and 20C and the setting information associated with the foods 20A, 20B, and 20C (step S106).
- For example, the information processing system 100 displays a display object 30A including a beverage menu that prompts the user 10A, who is eating the meat dish 20A, to order an additional beverage.
- The information processing system 100 also displays a display object 30B including a food menu that prompts the user 10B, who is eating the fried shrimp 20B and drinking the beer 20C, to order additional side dishes.
- the information processing system 100 accepts additional orders in response to user operations on these display objects 30A or 30B.
- In this way, by sensing food and drink and displaying display objects according to the setting information for that food and drink, the information processing system 100 can, for example, simplify the user's ordering operation or propose combinations of food and drink. This makes it possible to enrich the user's eating and drinking experience.
- Further, the information processing system 100 controls notification of information to an external device based on the information on the food and drink 20A, 20B, and 20C and the setting information associated with the food and drink 20A, 20B, and 20C (step S108).
- Examples of external devices include various devices in restaurants, terminals held by store clerk, and servers on a network.
- For example, the information processing system 100 transmits, to the store clerk's terminal, an instruction to clear the finished dishes at the timing when the food and drink have been finished, and an instruction to serve the next dish of a course meal.
- the information processing system 100 transmits the order information to the kitchen apparatus at the timing when the additional order is made.
- The information processing system 100 also transmits an accounting instruction to the accounting apparatus of the restaurant at the timing when all the food and drink on the table 140 have been finished. Thereby, the restaurant can provide an appropriate service at an appropriate timing.
- In step S110, the information processing system 100 determines whether or not an end condition is satisfied. If it is determined that the condition is not satisfied, the process returns to step S102; if it is determined that the condition is satisfied, the process ends. As the end condition, for example, a user standing up can be considered.
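- Purely as an illustration of the flow of FIG. 6, the loop of steps S102 to S110 could be sketched as follows; the function names are hypothetical stand-ins for the input unit 110, the processing unit 120, and the output unit 130, and are not part of the disclosure.

```python
# Illustrative sketch of the operation flow of FIG. 6: sense food and drink
# (S102), acquire setting information (S104), control display (S106), control
# notification to external devices (S108), and check the end condition (S110).

def run(sense, get_setting, control_display, control_notification, end_condition_met):
    while True:
        item_info = sense()                       # S102: information about food and drink
        setting = get_setting(item_info)          # S104: setting information for the item
        control_display(item_info, setting)       # S106: control display of display objects
        control_notification(item_info, setting)  # S108: control notification to external devices
        if end_condition_met():                   # S110: e.g. the user has stood up
            break
```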
- Information processing system 100 acquires information about food and drink based on sensing information.
- the information processing system 100 acquires information on food and drink based on sensing information such as a captured image, an infrared image, user voice, or depth information.
- the information processing system may acquire information on food and drink associated with the user.
- The user here is, for example, a person to whom the food or drink is provided (for example, the person who ordered it or the person who eats or drinks it) or a person who provides the food or drink (for example, a person who cooks the food or drink in the kitchen, or a person who carries the food or drink from the kitchen to the table).
- The information about food and drink may be acquired for each item of food and drink, may be acquired collectively for a plurality of items corresponding to one user, or may be acquired collectively for each table.
- Food and drink may also be regarded as ingested items, where an ingested item is a food, a drink, or a combination thereof that is taken orally.
- information on food and drink may include information unique to food and drink.
- Examples of information unique to food and drink include the name of the food or drink, its ingredients, and its cooking method.
- information on food and drink may include information on the state of food and drink.
- Examples of information regarding the state of food and drink include the temperature, remaining amount, and consumption speed of the food and drink.
- information on food and drink may include information on provision of food and drink.
- Examples of information related to the provision of food and drink include the elapsed time since the food or drink was provided, whether or not it is part of a course meal, and related food and drink (for example, other dishes included in the course meal, or a wine offered to accompany the main dish).
- the information regarding food and drink may include information regarding people regarding food and drink.
- Examples of information about persons related to the food and drink include which user the food or drink is associated with (the orderer or the person who eats or drinks it), whether or not that user belongs to a group, the role of the user within the group, the degree of congestion of the store, the number of store staff, and the like.
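- As an illustrative sketch only, the categories of information about food and drink enumerated above (unique information, state, provision, and related persons) could be gathered into a single record such as the following; the field names are hypothetical and not part of the disclosure.

```python
# Illustrative data structure for "information about food and drink".
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class FoodDrinkInfo:
    # information unique to the food or drink
    name: str
    ingredients: List[str] = field(default_factory=list)
    cooking_method: Optional[str] = None
    # information about the state of the food or drink
    temperature_c: Optional[float] = None
    remaining_ratio: float = 1.0          # 1.0 = untouched, 0.0 = finished
    consumption_speed: Optional[float] = None
    # information about provision
    elapsed_since_served_s: float = 0.0
    part_of_course: bool = False
    related_items: List[str] = field(default_factory=list)
    # information about related persons
    orderer_id: Optional[str] = None
    group_id: Optional[str] = None
```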
- The information processing system 100 (for example, the acquisition unit 123) acquires setting information associated with food and drink. For example, the information processing system 100 identifies the food and drink by recognizing it, and acquires the setting information associated with the identified food and drink from the storage unit 150.
- The setting information may include information related to a display execution condition of the display object.
- the information processing system 100 (for example, the display control unit 125) displays a display object when a display execution condition included in the setting information is satisfied.
- the display execution condition includes a condition for starting display and a condition for ending.
- the information processing system 100 displays the display object for a period from when the condition for starting display is satisfied until the condition for ending is satisfied.
- the setting information may include a plurality of display execution conditions. For example, there may be a plurality of display object candidates that can be displayed for one food and drink, and display execution conditions can be set for each candidate. Then, the information processing system 100 displays a display object corresponding to the satisfied display execution condition.
- An example of the setting information related to display execution conditions will be described below.
- The setting information related to a display execution condition may be based on at least one of the items described below.
- the display execution condition may be related to food and drink. An example will be described below.
- the display execution condition may be based on the remaining amount of food and drink. For example, when the remaining amount of the beverage is equal to or less than the threshold value, it may be determined that display of the display object that prompts the additional order should be executed.
- the display execution condition may be based on contact between food and drink and a predetermined real object. For example, when a fork comes into contact with a meat dish, it may be determined that display of a display object that explains the production area of the meat should be executed.
- the display execution condition may be based on the relationship between the food and drink and other food and drink. For example, regarding a plurality of beverages provided to a plurality of users, when the remaining amount of some of the beverages is 20% or less, it may be determined that display of a display object that prompts an additional order should be executed.
- the display execution condition may be based on whether food or drink is provided as a single item or as part of a course meal. For example, it may be determined that different display objects should be displayed for food and drink provided as a single item (for example, in a la carte) and food and drink provided as part of a course meal.
- the display execution condition may be related to time. An example will be described below.
- the display execution condition may be based on time. For example, it may be determined that a different display object should be displayed depending on whether the time belongs to a lunch time zone or a dinner time zone.
- the display execution condition may be based on the elapsed time since the food or drink is provided. For example, when the elapsed time from the provision of food and drink exceeds a threshold, it can be determined that display of a display object that prompts an additional order should be executed.
- the display execution condition may be based on the elapsed time since the user started eating. For example, when the elapsed time since the user started eating for a seat exceeds a threshold value, it may be determined that display of a display object that prompts an additional order should be executed.
- the display execution condition may be related to the user. An example will be described below.
- the display execution condition may be based on user attribute information.
- user attribute information for example, identification information such as a user name, sex, age, occupation, and the like are conceivable. For example, it may be determined that different display objects should be displayed depending on whether the user is male or female.
- the display execution condition may be based on the user's state.
- the user's state for example, the user's biometric information, emotion, seat position, and the like can be considered.
- Examples of the biological information include pulse, body temperature, blood pressure, and complexion. For example, it may be determined that different display objects should be displayed when the user's emotions are running high and when they are calm.
- The display execution condition may be based on the relationship between the user and another user (for example, a companion). For example, when the user is the host of the group, it can be determined that a display object prompting an additional order should be displayed, and when the user is a guest, it can be determined that such a display object should not be displayed.
- the display execution condition may be based on the user's eating history or preference.
- The user's eating history can be acquired, for example, from a database that stores the food and drink the user has consumed in association with the user's identification information.
- the user's preference can be estimated from the user's food history, for example. For example, it may be determined that display object display recommending food and drink that matches the user's preference should be executed.
- The display execution condition may be based on the user's voice. For example, it may be determined that display of a display object recommending food and drink related to the current topic should be executed, based on a voice recognition result of a conversation between the user and another user (for example, a companion) or a store clerk.
- the display execution conditions related to the user described above may be set for one user or may be set for a group. For example, it may be determined that different display objects should be displayed depending on whether the user is seated alone or a group of people is seated.
- The display execution condition may relate to a user operation. For example, when a predetermined operation such as a single tap, double tap, or drag on a displayed display object is detected, it can be determined that display of a display object should be executed.
- the setting information related to the display execution condition may be information indicating the display execution condition itself.
- the information indicating the display execution condition itself is, for example, information indicating a threshold for the remaining amount of beverage.
- the setting information related to the display execution condition may be information for changing the display execution condition itself.
- The information for changing the display execution condition itself is, for example, information that lowers the threshold for the remaining amount of a beverage according to the elapsed time since the meal started.
- In general, the consumption of food and drink is fast immediately after a meal starts and becomes slower as time passes. Based on this, in the first half of the meal a display object that prompts an additional order can be displayed even when the remaining amount of beverage is still large, whereas in the second half of the meal such a display object can be displayed only when the remaining amount of beverage is small.
- the setting information related to the display execution condition may include information indicating a time lag from when the display execution condition is satisfied until the display object is actually displayed.
- the information processing system 100 displays the display object after the set time lag has elapsed after the display execution condition is satisfied.
- For example, the information processing system 100 displays a display object that prompts an additional order as soon as the remaining amount of beverage falls below the threshold in the first half of the meal, and displays the display object that prompts an additional order only after a predetermined time has elapsed since the remaining amount of beverage fell below the threshold in the second half of the meal.
- the same thing can be realized by changing the display execution condition such as lowering the threshold value of the remaining amount of beverage according to the elapsed time since the meal was started.
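- The interplay between a remaining-amount threshold that changes with the elapsed time of the meal and a time lag before display can be illustrated, purely as a sketch, as follows; the concrete thresholds and times are hypothetical and not part of the disclosure.

```python
# Illustrative display execution condition: the remaining-amount threshold is
# lowered as the meal progresses, and an optional time lag is applied before
# the display object prompting an additional order actually appears.

def refill_prompt_due(remaining_ratio: float,
                      minutes_since_meal_start: float,
                      minutes_since_threshold_met: float) -> bool:
    # In the first half of the meal use a generous threshold; later, a stricter one.
    threshold = 0.5 if minutes_since_meal_start < 30 else 0.2
    if remaining_ratio > threshold:
        return False
    # Optional time lag: display immediately early in the meal, wait later on.
    time_lag_minutes = 0.0 if minutes_since_meal_start < 30 else 5.0
    return minutes_since_threshold_met >= time_lag_minutes
```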
- the setting information may include information regarding the contents of the display object.
- The information processing system 100 (for example, the display control unit 125) displays a display object whose contents are based on this setting information.
- the setting information related to the content of the display object can include at least one of image (moving image / still image) data, audio data, and text data.
- The setting information related to the content may include at least one of the items described below.
- the content of the display object may be information on additional orders.
- Examples of the information related to additional orders include an order button, information indicating food and drink candidates that can be ordered, information indicating the order history up to that point (which may include the total amount so far), and information indicating customizable options for an order, such as quantity, size, and other options.
- The display object related to additional orders may be in a format that accepts selection of the food and options desired by the user, or may accept an order for the same food or drink that the user is currently having, that is, a request for a refill. By displaying information regarding additional orders, the user can easily place an additional order.
- the content of the display object may be information related to the elapsed time since the food and drink were provided.
- By displaying the elapsed time since the food and drink were provided, it is possible to motivate the user to eat sooner. This makes it possible for the restaurant to improve seat turnover.
- the content of the display object may be information regarding the temperature of food and drink.
- By displaying the temperature of the food and drink, it is possible to motivate the user to eat it at an appropriate temperature.
- the content of the display object may be information related to the association between the food and drink and the user.
- This makes it possible, for example, to prevent the user from losing track of his or her own drink after changing seats.
- the content of the display object may be information explaining food and drink.
- Examples of information explaining food and drink include the production area, ingredients, allergy information, nutrients, energy content, and how to eat it.
- This allows the user to eat the food in an appropriate way even if, for example, it is the first time the user has eaten it.
- the setting information may include information regarding the display style of the display object.
- The information processing system 100 (for example, the display control unit 125) displays the display object in a display style based on this setting information.
- The setting information regarding the display style may be at least one of the items described below.
- the setting information regarding the display style of the display object may be setting information regarding the display position of the display object.
- Examples of setting information regarding the display position include the position on the display screen at which the display object should be displayed (for example, an absolute position, or a relative position with respect to the food and drink), its range and size, and whether or not overlap with other display objects or real objects is permitted.
- the setting information regarding the display style of the display object may be setting information regarding the display orientation of the display object.
- The setting information regarding the display orientation may include, for example, information indicating the orientation of characters included in the display object; the information processing system 100 can, for example, rotate the display object so that the characters included in it face the user.
- the setting information related to the display style of the display object may be setting information related to animation.
- animation for example, movement, rotation, size change, color change, and the like of the display object can be considered.
- the setting information regarding the display style of the display object may be setting information regarding the degree of detail. For example, a simple display object may be displayed when a sufficient display area cannot be secured, and a detailed display object may be displayed when a sufficient display area is secured. As a simple display object, an icon indicating that there is new information can be considered.
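- As an illustrative sketch only, the choice between a detailed display object and a simple icon depending on the available display area could look like the following; the area values and names are hypothetical and not part of the disclosure.

```python
# Illustrative selection of detail level: show a detailed display object when
# enough free area is available, otherwise show a simple icon indicating that
# new information exists.

def choose_display_object(free_area_cm2: float, detailed_object: dict, icon: dict,
                          min_area_for_detail_cm2: float = 100.0) -> dict:
    if free_area_cm2 >= min_area_for_detail_cm2:
        return detailed_object      # enough space: show the detailed object
    return icon                     # otherwise: show a simple icon
```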
- the setting information related to notification to an external device may include information similar to setting information related to display of a display object.
- the setting information related to the notification to the external device may include information similar to the setting information related to the display execution condition as setting information related to the notification execution condition to the external device.
- the information processing system 100 (for example, the setting unit 121) can set setting information in various ways.
- the information processing system 100 may first set default setting information, and change the setting information by performing machine learning according to the subsequent usage history.
- the default setting information may be learned based on the usage history in other restaurants, for example.
- The information processing system 100 may change the setting information according to input from the restaurant. Specifically, each item of the setting information (for example, the threshold for the remaining amount of beverage related to a display execution condition) may be input by the restaurant. Each item of the setting information may also be changed according to a rough input from the restaurant. For example, the restaurant inputs a flag such as "normal", "early", or "late" that sets the display timing of a display object for a food or drink, and the information processing system 100 then raises or lowers, for example, the threshold for the remaining amount of beverage related to the display execution condition so as to satisfy the request to make the display timing earlier or later (an illustrative sketch of such a flag-based adjustment is given after this part of the description). The flag may be input individually for each food or drink, or may be set in common for a plurality of items, for example for all drinks or for all small dishes.
- the information processing system 100 may flexibly change the setting information according to circumstances for each user or convenience for each restaurant.
- the setting information may be variably set based on at least one of a user or a restaurant that provides food and drink.
- the information processing system 100 can change the setting information according to the past use history for a user who has a past visit history (for example, a regular customer).
- the information processing system 100 can change the setting information according to various seasonal conditions such as recommended items for each season, the purchase status of the day, and sales.
- the information processing system 100 may change the setting information according to the weather or the like.
- Whether or not the setting information can be changed may be set for each item. For example, items that should not be changed by the restaurant side can be fixed.
- A changeable range may also be set for changeable items. For example, for items that have a range preferable to the restaurant side, the change can be kept within that preferable range.
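- As the illustrative sketch referred to above, a flag-based adjustment of the remaining-amount threshold that is kept within a changeable range could look like the following; all numeric values are hypothetical and not part of the disclosure.

```python
# Illustrative adjustment of the remaining-amount threshold from a rough
# "normal" / "early" / "late" flag entered by the restaurant, clamped to a
# changeable range fixed for that item.

def adjust_threshold(base_threshold: float, flag: str,
                     allowed_min: float = 0.1, allowed_max: float = 0.6) -> float:
    offsets = {"normal": 0.0, "early": +0.15, "late": -0.15}
    adjusted = base_threshold + offsets.get(flag, 0.0)
    # keep the change within the range preferable to the restaurant
    return max(allowed_min, min(allowed_max, adjusted))
```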
- The information processing system 100 (for example, the display control unit 125) controls display of a display object related to food and drink based on the information about the food and drink obtained as a result of sensing and the setting information associated with the food and drink.
- the information processing system 100 controls the display object to be displayed, the display start / end timing of the display object, and / or the display style based on the information about the food and drink provided to the user and the setting information.
- the information processing system 100 may control the display of display objects according to the amount of food and drink. For example, the information processing system 100 may make a display object that prompts an additional order stand out as the remaining amount of food or drink decreases.
- FIG. 7 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment.
- As shown in FIG. 7, the information processing system 100 displays a small menu icon 30A while the remaining amount of the beer 20 provided to the user 10 is plentiful, displays a large menu icon 30B when the remaining amount decreases, and displays a refill order button 30C when the beer has been drunk up. By prompting an additional order in this way as the remaining amount of food or drink decreases, it is possible to prevent the user from forgetting to order and to increase sales for the restaurant.
- the information processing system 100 may control the display of display objects according to the consumption pace of food and drink. For example, the information processing system 100 may display a display object that prompts an additional order at an earlier timing as the consumption pace of food and drink is faster and at a later timing as the rate is slower. Specifically, when the time from serving to the end of eating or drinking is short, the information processing system 100 displays a display object that prompts an additional order after a short time from the end of eating or the end of drinking. In addition, when the time from serving to the end of eating or drinking is long, the information processing system 100 displays a display object that prompts an additional order after a long time from the end of eating or drinking. Thereby, it becomes possible to receive an additional order at an appropriate timing according to the user's consumption pace.
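- The remaining-amount-based control of FIG. 7 can be illustrated, purely as a sketch, by the following mapping; the ratio boundaries are hypothetical and not part of the disclosure.

```python
# Illustrative mapping from the remaining amount of a beverage to the display
# objects of FIG. 7 (small menu icon 30A, large menu icon 30B, refill order
# button 30C).

def select_beverage_display_object(remaining_ratio: float) -> str:
    if remaining_ratio > 0.5:
        return "small menu icon (30A)"
    if remaining_ratio > 0.0:
        return "large menu icon (30B)"
    return "refill order button (30C)"
```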
- The information processing system 100 may control the display of display objects according to the order in which the user consumes the food and drink. For example, the information processing system 100 may estimate the user's preferences based on the order in which the food and drink are consumed, in other words, how the food is being reduced or left, and display a display object that prompts an additional order for other food or drink that matches those preferences. Note that the information processing system 100 may acquire profile information from a database to determine whether food has been left because the user does not like it or whether it has been left intentionally.
- the information processing system 100 may control the display of display objects according to the progress of food and beverage provision.
- the progress of food and beverage provision may be, for example, the progress of course cooking (how many of all the products have been provided) or the progress of a plurality of foods and beverages ordered individually.
- the information processing system 100 displays the display object earlier in the first half of the course dish and displays the display object later in the second half of the course dish.
- For example, the information processing system 100 displays a display object that prompts the user to clear the center of the table.
- the information processing system 100 may control the display of display objects according to the correspondence between food and drink and the user.
- the information processing system 100 may display a display object that indicates the correspondence between the food and drink and the user.
- Such a display object may clearly indicate to the user the food and drink that the user ordered. In this case, for example, it is possible to prevent the user from losing sight of his or her own drink after changing seats.
- the display object may clearly indicate to the store clerk which user ordered the food or drink. In that case, the store clerk can know the place where the food or drink is to be served.
- FIG. 8 is a diagram illustrating an example of display control by the information processing system 100 according to the present embodiment.
- As shown in FIG. 8, the information processing system 100 displays a display object 30A indicating that the beer 20A belongs to the user 10A and a display object 30B indicating that the beer 20B belongs to the user 10B.
- The information processing system 100 also displays a display object 30C indicating that the orderer of the dish 20C is the user 10A and that it should be placed in front of the user 10A, and a display object 30D indicating that the orderer of the dish 20D is the user 10B and that it should be placed in front of the user 10B.
- the information processing system 100 may control the display of display objects according to the end of eating and drinking of food and drink.
- For example, the information processing system 100 may display a display object that accepts an evaluation of a food or drink item at the timing when the user finishes eating or drinking it.
- The evaluation may be in a scoring format or in a free-entry questionnaire format. This enables evaluation of each individual food or drink item, which is finer-grained than the per-restaurant evaluation that has conventionally been common.
- the information processing system 100 may post the input evaluation to the user's SNS together with the captured image of the food and drink.
- The information processing system 100 may control the display of a display object according to the elapsed time after the display object is displayed. For example, the information processing system 100 makes the display object less noticeable, for example by reducing its display area over time after the display starts, and finally ends the display.
- the information processing system 100 may control the display of the display object according to the congestion degree of the restaurant.
- The degree of congestion of the restaurant may be the degree of order congestion or the degree of customer congestion. For example, when the degree of order congestion (for example, the order frequency or the number of orders not yet fulfilled) exceeds a threshold, the information processing system 100 delays the display timing of the display object related to additional orders or displays it less conspicuously. As a result, new orders are suppressed and the degree of congestion can be reduced.
- the information processing system 100 can control the display of display objects according to the role of the user in the group. For example, when the user is a group host, the information processing system 100 displays a display object that prompts an additional order and a display object that includes accounting information. Thereby, for example, the user can appropriately treat the guest.
- the information processing system 100 may control the display of the display object according to the user action. For example, the information processing system 100 starts or ends display of the display object when detecting a specific gesture such as shaking a glass.
- The information processing system 100 may control display of the display object according to the user's voice.
- the information processing system 100 can recognize the content of the user's conversation by voice to estimate what the user wants to eat or drink and can control the display of the display object according to the estimation result.
- The information processing system 100 may also control display of a display object according to biometric information. For example, the information processing system 100 displays a display object that recommends warm food when the user's body temperature is lower than a threshold.
- The information processing system 100 (for example, the notification unit 127) controls notification of information about the food and drink to an external device based on the information about the food and drink provided to the user obtained as a result of sensing and the setting information associated with the food and drink. For example, the information processing system 100 controls the information to be notified, the notification timing, and/or the notification format based on the information about the food and drink provided to the user and the setting information.
- the information processing system 100 may notify the external device of order information when an order for a predetermined amount of food or drink is made in units of tables. In other words, the information processing system 100 may wait for notification of order information until there is an order for a predetermined amount of food and drink in units of tables. Thereby, it is possible to improve the work efficiency of the restaurant side.
- the information processing system 100 may issue an alert or stop accepting an order when an order exceeding a threshold set for each user or each table is performed.
- This allows the user, for example, to limit the total bill or the amount of alcohol consumed.
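- Purely as an illustration, the two notification behaviours just described (batching order notifications per table and alerting when an order limit is exceeded) could be sketched as follows; the batch size, limit, and names are hypothetical and not part of the disclosure.

```python
# Illustrative notification control for one table: hold order notifications
# until a predetermined quantity has accumulated, and alert when an order
# limit set for the table would be exceeded.

class TableOrderNotifier:
    def __init__(self, batch_size: int = 3, order_limit: int = 20) -> None:
        self.batch_size = batch_size
        self.order_limit = order_limit
        self.pending: list = []
        self.total_ordered = 0

    def add_order(self, order: str) -> None:
        if self.total_ordered >= self.order_limit:
            print("alert: order limit for this table reached, order not accepted")
            return
        self.total_ordered += 1
        self.pending.append(order)
        if len(self.pending) >= self.batch_size:
            print(f"notify kitchen device: {self.pending}")   # batched notification
            self.pending.clear()
```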
- FIG. 9 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
- the information processing apparatus 900 illustrated in FIG. 9 can realize the information processing system 100 illustrated in FIG. 4, for example.
- Information processing by the information processing system 100 according to the present embodiment is realized by cooperation between software and hardware described below.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
- the information processing apparatus 900 includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913.
- the information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
- the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901 can form the processing unit 120 illustrated in FIG. 4.
- the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
- The host bus 904a is connected to an external bus 904b such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 904.
- The host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately; their functions may be implemented on a single bus.
- The input device 906 is realized by a device through which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever.
- the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
- the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
- A user of the information processing apparatus 900 can input various data and instruct the information processing apparatus 900 to perform processing operations by operating the input device 906.
- the input device 906 can be formed by a device that detects information about the user.
- The input device 906 can include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance sensor, and a force sensor.
- The input device 906 may obtain information related to the state of the information processing device 900, such as the posture and movement speed of the information processing device 900, and information related to the surrounding environment of the information processing device 900, such as the brightness and noise around the information processing device 900.
- The input device 906 may include a GNSS module that receives a GNSS (Global Navigation Satellite System) signal from a GNSS satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device.
- The input device 906 may detect the position through transmission to and reception from Wi-Fi (registered trademark), a mobile phone, PHS, smartphone, or the like, or through near field communication.
- the input device 906 can form, for example, the input unit 110 shown in FIG.
- The output device 907 is formed of a device that can notify the user of the acquired information visually or audibly. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; and printer devices.
- the output device 907 outputs results obtained by various processes performed by the information processing device 900. Specifically, the display device visually displays results obtained by various processes performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
- the output device 907 can form, for example, the output unit 130 shown in FIG.
- the storage device 908 is a data storage device formed as an example of a storage unit of the information processing device 900.
- The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the storage device 908 can form, for example, the storage unit 150 shown in FIG.
- The drive 909 is a reader/writer for storage media and is built into or externally attached to the information processing apparatus 900.
- the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903.
- the drive 909 can also write information to a removable storage medium.
- The connection port 911 is an interface for connecting to an external device, for example, a connection port to an external device capable of transmitting data via USB (Universal Serial Bus).
- the communication device 913 is a communication interface formed by a communication device or the like for connecting to the network 920, for example.
- the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
- The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP, for example.
- the communication device 913 can form, for example, the communication unit 160 illustrated in FIG.
- the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
- The network 920 may include a public line network such as the Internet, a telephone line network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like.
- the network 920 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
- A computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be created and implemented on a PC or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- As described above, the information processing system 100 controls the display of display objects related to food and beverages based on information about the food and beverages provided to the user obtained as a result of sensing and setting information associated with the food and beverages. Thereby, the information processing system 100 can provide, in real time, a fine-tuned service set for each food or beverage according to the state of the meal.
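- As an illustration of this overall control flow only, the sketch below combines sensed information about a served dish with setting information associated with that dish to decide what to display; the data classes, field names, and threshold values are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SensedDishState:
    dish_id: str
    remaining_ratio: float        # 0.0 (empty) .. 1.0 (untouched), assumed to come from image sensing
    minutes_since_served: float

@dataclass
class DishSetting:
    prompt_refill_below: float    # remaining ratio that triggers an additional-order prompt
    best_before_minutes: float    # time after which a "best enjoyed warm" notice is shown

def decide_display(state: SensedDishState, setting: DishSetting):
    """Yield the display actions implied by the sensed state and the per-dish settings."""
    if state.remaining_ratio < setting.prompt_refill_below:
        yield state.dish_id + ": display an additional-order prompt"
    if state.minutes_since_served > setting.best_before_minutes:
        yield state.dish_id + ": display a 'best enjoyed warm' notice"

state = SensedDishState("ramen-01", remaining_ratio=0.15, minutes_since_served=12.0)
setting = DishSetting(prompt_refill_below=0.2, best_before_minutes=10.0)
for action in decide_display(state, setting):
    print(action)
```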
- The information processing system 100 can be applied in any place where food and beverages can be provided, such as an ordinary home or a cafeteria.
- The processing unit 120 and the storage unit 150 may be provided in an apparatus such as a server that is connected to the input unit 110, the output unit 130, and the communication unit 160 via a network or the like.
- When the processing unit 120 and the storage unit 150 are provided in a device such as a server, information obtained by the input unit 110 or the communication unit 160 is transmitted to the device such as the server via the network or the like, the processing unit 120 sets the drawing information, and the information for the output unit 130 to output is sent from the device such as the server to the output unit 130 via the network or the like.
- An information processing apparatus comprising:
- (2) The information processing apparatus according to (1), wherein the setting information relates to a display execution condition for the display object.
- (3) The information processing apparatus according to (2), wherein the display execution condition is based on at least one of the remaining amount of the food or beverage, contact between the food or beverage and a predetermined real object, or the relationship between the food or beverage and another food or beverage.
- (4) The information processing apparatus according to (2) or (3), wherein the display execution condition is based on whether the food or beverage is provided as a single item or as part of a course meal.
- (5) The information processing apparatus according to any one of (2) to (4), wherein the display execution condition is based on at least one of a time, an elapsed time since the food or beverage was provided, or an elapsed time since the user to whom the food or beverage was provided started eating.
- (6) The information processing apparatus according to any one of (1) to (5), wherein the setting information relates to the content of the display object.
- The content includes information on additional orders, information on the elapsed time since the food or beverage was provided, information on the temperature of the food or beverage, and the correspondence between the food or beverage and the user to whom the food or beverage was provided.
- The setting information relates to a display format of the display object.
- The display format is at least one of a display position, a display posture, an animation, and a level of detail of the display object.
- The information processing apparatus according to any one of (1) to (9), wherein the setting information is variably set based on at least one of the user to whom the food or beverage is provided or the restaurant that provides the food or beverage.
- The information processing apparatus according to any one of (1) to (11), wherein the control unit displays the display object that prompts an additional order at an earlier timing as the pace of consumption of the food or beverage is faster, and at a later timing as the pace is slower.
- The information processing apparatus according to any one of (1) to (12), wherein the control unit displays the display object that prompts an additional order of another food or beverage according to the order in which the user to whom the food or beverage is provided eats or drinks the food or beverage.
- (14) The information processing apparatus according to any one of (1) to (13), wherein the control unit displays the display object indicating a correspondence relationship between the food or beverage and the user to whom the food or beverage is provided.
- The information processing apparatus wherein the control unit displays the display object that receives an evaluation of the food or beverage at a timing when the eating or drinking of the food or beverage is completed.
- An information processing method including:
- (20) A storage medium storing a program for causing a computer to function as a control unit that controls display of a display object relating to a food or beverage based on information relating to the food or beverage obtained as a result of sensing and setting information associated with the food or beverage.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Tourism & Hospitality (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Strategic Management (AREA)
- Primary Health Care (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Human Resources & Organizations (AREA)
- Economics (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The problem addressed by the present invention is to provide a technology capable of improving the quality of service relating to food or beverages provided according to a user interaction. The solution according to the invention is an information processing device comprising a control unit that controls the presentation of a display object associated with a food or beverage on the basis of information associated with the food or beverage obtained as a result of sensing and preset information corresponding to the food or beverage.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017-025770 | 2017-02-15 | ||
| JP2017025770 | 2017-02-15 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018150756A1 true WO2018150756A1 (fr) | 2018-08-23 |
Family
ID=63169800
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/047343 Ceased WO2018150756A1 (fr) | 2017-02-15 | 2017-12-28 | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'informations |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018150756A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020116290A1 (fr) * | 2018-12-06 | 2020-06-11 | 株式会社アーティフィス | Dispositif de projection de table |
| JP2020187404A (ja) * | 2019-05-10 | 2020-11-19 | 東京電力ホールディングス株式会社 | 飲食物提供管理システム、飲食物提供管理方法およびプログラム |
| JPWO2021095768A1 (fr) * | 2019-11-15 | 2021-05-20 | ||
| KR20220054230A (ko) * | 2020-10-23 | 2022-05-02 | 주식회사 누비랩 | 식사 모니터링 장치 및 방법 |
| JP2023131225A (ja) * | 2022-03-09 | 2023-09-22 | 株式会社村田製作所 | 料理提供サポートシステム |
| WO2024019099A1 (fr) * | 2022-07-19 | 2024-01-25 | ダイキン工業株式会社 | Dispositif d'estimation d'indice thermique et dispositif de commande d'environnement thermique |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002117186A (ja) * | 2000-10-10 | 2002-04-19 | Soft Service:Kk | アンケート管理装置及びアンケート方法 |
| JP2012108282A (ja) * | 2010-11-17 | 2012-06-07 | Nikon Corp | 電子機器 |
| WO2015098188A1 (fr) * | 2013-12-27 | 2015-07-02 | ソニー株式会社 | Dispositif de commande d'affichage, procédé de commande d'affichage, et programme |
| JP2016194762A (ja) * | 2015-03-31 | 2016-11-17 | ソニー株式会社 | 情報処理システム、情報処理方法及びプログラム |
-
2017
- 2017-12-28 WO PCT/JP2017/047343 patent/WO2018150756A1/fr not_active Ceased
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7193790B2 (ja) | 2018-12-06 | 2022-12-21 | 株式会社アーティフィス | テーブルプロジェクション装置 |
| JPWO2020116290A1 (ja) * | 2018-12-06 | 2021-02-15 | 株式会社アーティフィス | テーブルプロジェクション装置 |
| WO2020116290A1 (fr) * | 2018-12-06 | 2020-06-11 | 株式会社アーティフィス | Dispositif de projection de table |
| CN113170071A (zh) * | 2018-12-06 | 2021-07-23 | 株式会社阿提菲斯 | 桌面投影装置 |
| JP2020187404A (ja) * | 2019-05-10 | 2020-11-19 | 東京電力ホールディングス株式会社 | 飲食物提供管理システム、飲食物提供管理方法およびプログラム |
| JP7346900B2 (ja) | 2019-05-10 | 2023-09-20 | 東京電力ホールディングス株式会社 | 飲食物提供管理システム、飲食物提供管理方法およびプログラム |
| JP7344307B2 (ja) | 2019-11-15 | 2023-09-13 | 株式会社Nttドコモ | 情報処理装置 |
| WO2021095768A1 (fr) * | 2019-11-15 | 2021-05-20 | 株式会社Nttドコモ | Dispositif de traitement d'informations |
| JPWO2021095768A1 (fr) * | 2019-11-15 | 2021-05-20 | ||
| US12211227B2 (en) | 2019-11-15 | 2025-01-28 | Ntt Docomo, Inc. | Information processing apparatus |
| KR20220054230A (ko) * | 2020-10-23 | 2022-05-02 | 주식회사 누비랩 | 식사 모니터링 장치 및 방법 |
| KR102708485B1 (ko) * | 2020-10-23 | 2024-09-23 | 주식회사 누비랩 | 식사 모니터링 장치 및 방법 |
| JP2023131225A (ja) * | 2022-03-09 | 2023-09-22 | 株式会社村田製作所 | 料理提供サポートシステム |
| WO2024019099A1 (fr) * | 2022-07-19 | 2024-01-25 | ダイキン工業株式会社 | Dispositif d'estimation d'indice thermique et dispositif de commande d'environnement thermique |
| JP2024013237A (ja) * | 2022-07-19 | 2024-01-31 | ダイキン工業株式会社 | 温熱指標推定装置、及び、温熱環境制御装置 |
| JP7473721B2 (ja) | 2022-07-19 | 2024-04-23 | ダイキン工業株式会社 | 温熱指標推定装置、及び、温熱環境制御装置 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018150756A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'informations | |
| JP6586758B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
| US20220215258A1 (en) | Devices, Systems, and Methods that Observe and Classify Real-World Activity Relating to an Observed Object, and Track and Disseminate State Relating the Observed Object | |
| KR102777142B1 (ko) | 머신 러닝 분류들에 기초한 디바이스 위치 | |
| US10482551B2 (en) | Systems and methods of automatically estimating restaurant wait times using wearable devices | |
| US10355947B2 (en) | Information providing method | |
| US20120066144A1 (en) | Map guidance for the staff of a service-oriented business | |
| US20170357849A1 (en) | Information processing apparatus, information processing method, and program | |
| KR20160128017A (ko) | 전자 장치, 서버 및 그 제어 방법 | |
| US11307877B2 (en) | Information processing apparatus and information processing method | |
| JP6572629B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
| CN105286423A (zh) | 一种用于智能杯子识别身份的系统及方法 | |
| JP6277582B2 (ja) | 電子機器 | |
| JP6467601B1 (ja) | 食事履歴システム | |
| US20250005688A1 (en) | An electronic device, a system, and a method for controlling a victual ordering system | |
| JP6590005B2 (ja) | 電子機器及びプログラム | |
| CN104347017B (zh) | 一种广告播放的系统及其方法 | |
| CN111093025B (zh) | 一种图像处理方法及电子设备 | |
| JP2014123215A (ja) | 電子機器 | |
| JP2015220596A (ja) | 電子機器 | |
| WO2019171866A1 (fr) | Dispositif de traitement d'informations, procédé pour déterminer le moment de servir un plat, et programme | |
| US12474889B1 (en) | Apparatuses and methods for an interactive device | |
| Wang et al. | GastroConcerto: Towards Designing Auditory Dining System to Enrich Chefs’ Culinary Practices | |
| EP3519916B1 (fr) | Connecteur électrique planaire pour dispositif électronique | |
| WO2014097671A1 (fr) | Appareil électronique |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17896793 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17896793 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |