WO2025204235A1 - Information processing system, control method, and program - Google Patents
- Publication number
- WO2025204235A1 (PCT/JP2025/004764)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cooking
- information
- processing
- user
- recipe
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- This disclosure relates to an information processing system, a control method, and a program.
- Patent Document 1 discloses technology that senses the biometric reactions of a chef when preparing a dish, and then includes the sensed biometric data in recipe data to recreate a dish that suits the eater's preferences.
- This disclosure therefore proposes an information processing system, control method, and program that can assist users in creating dishes.
- an information processing system includes a control unit that performs the following processes: generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to the cooking steps included in the cooking information using a cooking processing learning model; integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
- the present disclosure also provides a control method including: a processor generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to cooking steps included in the cooking information; integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
- FIG. 8 is a flowchart showing an example of the flow of a recipe plan generation process according to the present embodiment.
- FIG. 9 is a diagram showing an example of a display screen on which a recipe plan is displayed according to the present embodiment.
- FIG. 10 is a diagram illustrating a display screen that displays details of a recipe plan.
- FIG. 11 is a diagram illustrating a display screen that displays details of a recipe plan.
- FIG. 12 is a flowchart showing an example of the flow of a process for generating recipe plans with different levels of novelty according to the present embodiment.
- FIG. 13 is a flowchart showing an example of the flow of an adjustment process according to the present embodiment.
- FIG. 14 is a diagram illustrating an example of adjustment according to a user's cooking environment according to the present embodiment.
- Fig. 1 is a diagram showing the overall configuration of a cooking information generation system 1 according to an embodiment of the present disclosure.
- the information processing device 10 is realized by a PC (personal computer), tablet terminal, smartphone, HMD (head-mounted display), etc.
- the information processing device 10 creates new dishes in response to instructions (hereinafter also referred to as suggestions) from a user such as a chef, and presents them to the user.
- the information processing device 10 can reflect the user's suggestions in the creation of the dish as appropriate during the dish creation process.
- as part of creating a dish, the information processing device 10 can create recipe data that includes the cooking steps of the dish.
- the information processing device 10 uses AI to generate a recipe (an example of cooking information), adjusts the recipe in response to user feedback, considers the feasibility of each cooking step included in the recipe (i.e., generates corresponding cooking processing information), and integrates the pieces of cooking processing information to generate integrated cooking processing information (a new recipe, also referred to as an integrated recipe), which is presented to the user.
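- As a rough mental model of this pipeline, the sketch below chains the three stages (recipe generation, per-step feasibility/processing generation, and integration for presentation). It is a minimal illustration only; the class, function, and field names (Recipe, StepProcessing, generate_recipe, etc.) are assumptions and are not defined in this disclosure.

```python
# Minimal sketch of the overall pipeline described above.
# All names and data shapes are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import List


@dataclass
class StepProcessing:
    step: str        # cooking step taken from the recipe
    method: str      # e.g. "laser engraving, power=50, speed=70"
    feasible: bool   # result of the feasibility review
    minutes: float   # estimated cooking time


@dataclass
class Recipe:
    name: str
    ingredients: List[str]
    steps: List[str]
    processings: List[StepProcessing] = field(default_factory=list)


def generate_recipe(instruction: str) -> Recipe:
    """Stand-in for the cooking learning model (recipe/draft generation)."""
    return Recipe(name="draft",
                  ingredients=["mushroom", "cheese"],
                  steps=["remove stems", "engrave pattern", "stuff filling"])


def generate_step_processing(step: str) -> StepProcessing:
    """Stand-in for the cooking processing learning model applied to one step."""
    return StepProcessing(step=step, method="cooking robot", feasible=True, minutes=3.0)


def integrate(recipe: Recipe) -> Recipe:
    """Keep only feasible step processings and attach them to the recipe."""
    recipe.processings = [p for p in (generate_step_processing(s) for s in recipe.steps)
                          if p.feasible]
    return recipe


integrated = integrate(generate_recipe("a spring-like vegetable dish"))
print(integrated)  # the integrated cooking processing information presented to the user
```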
- Cooking processing information is, for example, information on processing related to the appearance of ingredients (laser processing, cutting, heating, presentation, etc.).
- the information processing device 10 may use the results of simulations performed using a camera 20, food ingredient sensor 22, laser processing machine 24, cooking robot 26, cutting machine 28, etc., or the results of actual cooking.
- the camera 20 captures an image of the food ingredient F and outputs the captured image to the information processing device 10, the laser processing machine 24, the cooking robot 26, or the cutting machine 28.
- the food ingredient F may be placed by the cooking robot 26.
- the food ingredient sensor 22 senses various components (e.g., moisture, sugar content, etc.) or conditions (e.g., temperature) of the food ingredient F, and outputs the sensing data to the information processing device 10, etc.
- the laser processing machine 24 can draw letters, patterns, marks, etc. on the food material F by irradiating it with a laser and charring the surface, or can carve letters, patterns, marks, etc. into the food material F with the laser.
- the cooking robot 26 has one or more robotic arms and can grasp the food material F and place it in a predetermined position, add seasonings, pour sauces, and perform various cooking operations using various cooking utensils.
- the cutting machine 28 can cut the food material F using a CNC (Computer Numerical Control) milling machine, laser, etc.
- the laser processing machine 24, cooking robot 26, and cutting machine 28 are digital fabrication devices that can precisely perform cutting, engraving, heating, laminating, and various cooking processes based on digital data.
- the laser processing machine 24, cooking robot 26, and cutting machine 28 are examples of processing devices that perform various cooking processes, and the configuration of the cooking information generation system 1 is not limited to this.
- <<Configuration example of information processing device 10>> FIG. 2 is a block diagram showing an example of the configuration of the information processing device 10 according to the present embodiment.
- the information processing device 10 includes a control unit 110, a communication unit 120, an operation input unit 130, a display unit 140, a storage unit 150, and an audio input/output unit 160.
- the communication unit 120 has a transmitter that transmits data to an external device and a receiver that receives data from the external device.
- the communication unit 120 is communicatively connected to the camera 20 and receives captured images of the food ingredient F from the camera 20.
- the communication unit 120 is also communicatively connected to the food ingredient sensor 22 and receives sensing data of the food ingredient F from the food ingredient sensor 22.
- the communication unit 120 is also communicatively connected to various digital fabrication devices (the laser processing machine 24, the cooking robot 26, and the cutting machine 28) and transmits control signals for controlling various cooking operations of the various digital fabrication devices.
- the communication unit 120 may communicate with an external device or the Internet using, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), a mobile communication network (LTE (Long Term Evolution), 5G (fifth generation mobile communication system)), etc.
- the operation input unit 130 accepts operation input by the user and outputs the input information to the control unit 110.
- the operation input unit 130 may be realized by a mouse, a keyboard, a switch, a button, or the like.
- the display unit 140 displays various operation screens and a predicted image of a processed result, which will be described later.
- the display unit 140 may be a display panel such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display.
- the operation input unit 130 and the display unit 140 may be provided integrally.
- the operation input unit 130 may be a touch sensor stacked on the display unit 140 (for example, a panel display).
- the control unit 110 functions as an arithmetic processing unit and a control unit, and controls the overall operation of the information processing device 10 in accordance with various programs.
- the control unit 110 is realized by electronic circuits such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, etc.
- the control unit 110 may also include a ROM (Read Only Memory) that stores programs to be used, arithmetic parameters, etc., and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- the agent unit 111 functions as an agent capable of interacting with the user, and acquires appropriate information from the interaction with the user.
- Information input by the user to the information processing device 10 may be input through user operation from the operation input unit 130, or may be input through interaction with an agent presented by the agent unit 111.
- the interaction between the user and the agent may be via text or voice.
- the agent unit 111 may be an AI agent that utilizes large language models (LLMs), etc., which allows the user to provide input in natural language.
- FIG. 3 is a diagram for explaining the details of the function of the draft generation unit 112 according to this embodiment.
- the draft generation unit 112 includes an ingredient pairing unit 510 and a recipe draft generation unit 520.
- the agent unit 111 interacts with the user U, and the prompt 41 generated from the user's suggestions 40 is input to the draft generation unit 112.
- the user's suggestions 40 include information about the concept of the new dish they want to make and the ingredients to be used.
- the concept of the dish may include, for example, the impression, atmosphere, style, genre, theme, purpose, and key points of the dish.
- requests regarding the design of the dish and the ingredients to be used may also be input as concepts.
- the ingredient pairing unit 510 outputs combination ingredients that are suitable for use in combination with the ingredients the user wants to use (the ingredients used). For example, the ingredient pairing unit 510 inputs data on the ingredients used into an ingredient pairing learning model, and outputs ingredients that pair well with the ingredients used or that are compatible at the molecular level.
- the ingredient pairing learning model is a learning model generated by learning ingredient combinations using machine learning.
- the recipe draft generation unit 520 inputs the concept, ingredients used, and ingredient combinations into the cooking learning model as input data, and outputs one or more recipe drafts 42, which are recipe information proposals.
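- The flow through the ingredient pairing unit 510 and the recipe draft generation unit 520 can be pictured as the two-stage sketch below. This is a minimal illustration only: the hard-coded pairing scores and the dictionary-shaped drafts stand in for the ingredient pairing learning model and the cooking learning model, and all function names are assumptions.

```python
# Illustrative two-stage flow of the draft generation unit 112 (assumed names).
from typing import Dict, List


def suggest_pairings(used_ingredients: List[str], top_k: int = 3) -> List[str]:
    """Ingredient pairing unit 510: return ingredients that combine well (dummy scores)."""
    candidate_scores: Dict[str, float] = {"cheese": 0.9, "thyme": 0.8, "lemon": 0.6, "nuts": 0.5}
    ranked = sorted(candidate_scores.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, _ in ranked if name not in used_ingredients][:top_k]


def generate_recipe_drafts(concept: str, used_ingredients: List[str], n_drafts: int = 3) -> List[dict]:
    """Recipe draft generation unit 520: produce candidate recipe drafts (stubbed)."""
    pairings = suggest_pairings(used_ingredients)
    return [{"concept": concept,
             "ingredients": used_ingredients + pairings,
             "steps": [f"prepare {i}" for i in used_ingredients + pairings],
             "variant": n}
            for n in range(n_drafts)]


print(generate_recipe_drafts("spring party appetizer", ["mushroom"]))
```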
- the cooking learning model is a model that has learned about cooking in advance through machine learning.
- the cooking learning model can learn images (captured images) of various dishes, corresponding recipe data, etc.
- the recipe data to be learned includes the name of the dish, ingredients, cooking steps, cooking utensils used, cooking time, and tags (for example, attribute information such as recipe category, purpose of cooking, genre, suitable season, suitable occasion, target demographic, and other keywords).
- examples of such tags are as follows. "Suitable seasons" include, for example, spring, summer, autumn, winter, New Year's, cherry blossom viewing, moon viewing, and Christmas. "Suitable occasions" include, for example, entertaining, parties, snacks, late-night snacks, dishes that go well with wine, and anniversaries. "Target demographics" include, for example, athletes, children, adults, and the elderly. "Other keywords" include, for example, gorgeous, colorful, lively, nutritious, low-salt, easy to digest, healthy, relaxing, mood changer, easy, time-saving, nostalgic, Italian, and Chinese.
- the dish draft output from the cooking learning model includes recipe data and corresponding dish images.
- the cooking learning model may include a recipe generation AI that generates recipe data and an image generation AI that generates dish images corresponding to the recipe data.
- the cooking learning models used by the recipe draft generation unit 520 include a user cooking learning model 521 that has learned the user's cooking, and an other user cooking learning model 522 that has learned the cooking of other users (e.g., other chefs). There may be multiple other user cooking learning models 522. For example, there may be an other user cooking learning model that has learned the cooking of a French chef, or an other user cooking learning model that has learned the cooking of a Chinese chef. The other user cooking learning model 522 may also be an other user cooking learning model that has learned the cooking of many chefs regardless of cooking genre.
- the recipe draft generation unit 520 may output a recipe draft using the user's cooking learning model 521, or may output a recipe draft using the other user's cooking learning model 522, or may output a recipe draft using the user's cooking learning model 521 and the other user's cooking learning model 522.
- the user cooking learning model 521 and the other user cooking learning model 522 may be stored in the storage unit 150.
- Figure 3 illustrates the cooking learning models used by the recipe draft generation unit 520 as part of the configuration of the recipe draft generation unit 520, but it is assumed that the recipe draft generation unit 520 calls up and uses each cooking learning model stored in the storage unit 150.
- the recipe draft generation unit 520 may link with an external database as appropriate to acquire external information (for example, information about the dish, information about ingredients, information related to the input concept, various designs, etc.).
- the recipe draft generation unit 520 may use design generation AI to acquire a design based on the concept, input the design data into the cooking learning model, and output a dish image decorated based on that design.
- the display control unit 113 controls the display of images on the display unit 140.
- the display control unit 113 controls the display of various input screens on the display unit 140, such as a screen that prompts dialogue with the AI agent, a screen that presents a recipe draft generated by the draft generation unit 112, a screen that presents new recipe information (integrated cooking and processing information) generated by the integration unit 116 (described below), and a screen that presents a new recipe development story generated by the story generation unit 117 (described below).
- FIG. 4 is a diagram for explaining the generation of cooking information according to this embodiment.
- the adjustment unit 114 adjusts the recipe drafts 42 generated by the draft generation unit 112 and outputs the adjusted recipe drafts to the element review unit 115.
- the adjustment unit 114 may be input with one or more recipe drafts automatically selected from the recipe drafts 42 generated by the draft generation unit 112, or with one or more recipe drafts selected by the user.
- the adjustment unit 114 may assign a score to each recipe draft based on a certain criterion (for example, whether it is close to the characteristics of the user's dish) and select the recipe draft with the highest score.
- the recipe drafts may be sorted based on scores according to various criteria, making it easier for the user to select.
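- A minimal sketch of such scoring and sorting is shown below; the criterion used here (overlap between a draft's ingredients and ingredients the user typically cooks with) is only an assumed example of "closeness to the characteristics of the user's dishes", not a criterion defined in this disclosure.

```python
# Assumed example criterion: ingredient overlap with the user's typical dishes.
from typing import List


def similarity_score(draft_ingredients: List[str], user_typical: List[str]) -> float:
    """Fraction of draft ingredients that also appear in the user's past dishes."""
    if not draft_ingredients:
        return 0.0
    return len(set(draft_ingredients) & set(user_typical)) / len(set(draft_ingredients))


drafts = [
    {"name": "draft A", "ingredients": ["mushroom", "cheese", "thyme"]},
    {"name": "draft B", "ingredients": ["mushroom", "chocolate"]},
]
user_typical = ["mushroom", "cheese", "butter", "thyme"]

for d in drafts:
    d["score"] = similarity_score(d["ingredients"], user_typical)

# Sort so the draft closest to the user's own cooking comes first.
drafts.sort(key=lambda d: d["score"], reverse=True)
print(drafts[0]["name"])
```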
- the adjustment unit 114 may receive new user suggestions (prompts based on these suggestions, also simply referred to as suggestions) obtained from the dialogue between the user and the agent from the agent unit 111.
- the adjustment unit 114 adjusts the recipe draft in accordance with the user suggestions input from the agent unit 111.
- the adjustment unit 114 may use a cooking learning model (user cooking learning model, other user cooking learning model), an ingredient pairing learning model, image generation AI, etc.
- the cooking learning model is stored in the storage unit 150, and the adjustment unit 114 can call it up and use it as needed. Furthermore, this cooking learning model is assumed to be the same as the cooking learning model used in the recipe draft generation unit 520 of the draft generation unit 112.
- adjustments to the recipe draft by the adjustment unit 114 include replacing ingredients used, obtaining new ingredient combinations, changing the cooking utensils used, and modifying the dish image (appearance of the dish).
- the element review unit 115 has the function of dividing the recipe data included in the recipe draft output from the adjustment unit 114 into elements and reviewing them according to the cooking steps of the recipe data, in order to increase the feasibility of the draft dish.
- a draft dish includes recipe data showing the cooking steps and a dish image showing the appearance of the dish, but there are cases where the appearance of the dish shown in a dish image generated by image generation AI (for example, presentation on a dish, decoration) is not realistic.
- in such cases, the element review unit 115 requests adjustment from the adjustment unit 114.
- the adjustment unit 114 adjusts the recipe data (for example, changes to the ingredients used, the cooking utensils used, or the appearance of the dish), and the element review unit 115 reviews the element again.
- the element review unit 115 divides the recipe data 421A' of the dish draft adjusted by the adjustment unit 114 into elements 1 to 3, etc., and generates variations in the process of reviewing each element.
- the integration unit 116 combines each successful element (cooking processing information) to generate new cooking information (integrated cooking processing information). As described above, control optimization and variation generation are performed for each element (cooking process) of the decomposed recipe data, and one or more successful derived elements can be generated for each element.
- the integration unit 116 can appropriately combine the successful derived elements for each element to generate multiple patterns of new recipe data as integrated cooking processing information. In this case, the integration unit 116 can generate new recipe data that meets conditions such as overall cooking time.
- the integration unit 116 also generates dish images corresponding to each new recipe data as integrated cooking processing information.
- the integration unit 116 appropriately combines variations of each element to generate new dish information such as dish A'1 (integration 1) to dish A'3 (integration 3), etc.
- the one or more pieces of integrated cooking and processing information generated by the integration unit 116 are output to the display control unit 113 and displayed on the display unit 140, thereby being presented to the user.
- elements not used in the integration may also be presented as candidates.
- the user may interact with the agent about the presented integrated cooking and processing information and may further input suggestions. For example, a suggestion to change the elements used in the integration to other elements listed as candidates may be input. A change in ingredients or concept, etc. may also be input.
- the story generation unit 117 generates a development story for the creative dish based on information stored in the storage unit 150 as appropriate during the creation process.
- the information stored in the storage unit 150 includes the user's dialogue with the agent (user's suggestions), the initially selected dish draft, the dish drafts that were not selected, and examples of success and failure in considering elements.
- the development story may be generated using text and images.
- the development story may also be a still image with text and images arranged in it, or a video showing the changes in the dish image from the dish draft to the creative dish.
- the generated development story is output to the display control unit 113 and displayed on the display unit 140 for presentation to the user.
- the development story may be used, for example, for advertising, etc.
- control unit 110 has been specifically described above. However, the control unit 110 is not limited to the functional configuration described above and has a tool linkage function. Specifically, the control unit 110 can link with various conventional tools as appropriate when generating cooking information. For example, the control unit 110 may link with 3D CAD when generating a design in the draft generation unit 112 or the element review unit 115. The control unit 110 may also link with tools that control digital fabrication equipment such as the laser processing machine 24, cooking robot 26, and cutting machine 28, and control these devices. The control unit 110 may also link with various AI tools, such as image generation AI.
- the storage unit 150 is realized by a ROM that stores programs and calculation parameters used in the processing of the control unit 110, and a RAM that temporarily stores parameters that change as needed.
- the voice input/output unit 160 has a voice input unit that collects voice and inputs the voice data to the control unit 110, and a voice output unit that outputs voice.
- the voice input unit is realized by, for example, a microphone and collects user voice.
- the voice output unit is realized by, for example, a speaker and outputs agent voice.
- the configuration of the information processing device 10 has been specifically described above. Note that the configuration of the information processing device 10 according to the present disclosure is not limited to the example shown in FIG. 2.
- the information processing device 10 may be realized by multiple devices.
- at least some of the functions of the information processing device 10 may be realized by a server on the Internet.
- each functional component of the control unit 110 may be provided on a server.
- each component of the information processing device 10 may be realized by an information processing system consisting of a user terminal and a server.
- FIG. 5 is a flowchart showing an example of the overall flow of the cooking information generation system 1 according to this embodiment.
- the information processing device 10 performs an input process for user basic information (step S103).
- the user basic information may be input by user operation from the operation input unit 130, or may be input through interaction between the user and an agent presented by the agent unit 111.
- the information processing device 10 generates one or more recipe drafts using the draft generation unit 112 in response to the user's suggestions regarding the new creative dish, which are obtained from the dialogue between the agent and the user (step S106).
- the information processing device 10 causes the adjustment unit 114 to perform adjustment processing on the recipe draft (step S115).
- the information processing device 10, using the element review unit 115, divides the recipe data included in the recipe draft into elements corresponding to the cooking process, and performs a process to review the feasibility of each element (step S118). Specifically, one or more pieces of cooking and processing information are generated for each element.
- FIG. 6 is a flowchart showing an example of the flow of the basic information input process according to this embodiment.
- the cooking information generation system 1 is an AI tool that assists users in creating new dishes.
- the system executes the basic information input process described below. This system may be implemented, for example, as an application installed on the information processing device 10.
- the user's cooking environment may also include various cooking appliances (e.g., a laser processing machine 24, a cutting machine 28, a heating device (e.g., a pot, a microwave, a frying pan, etc.)), a camera 20, an ingredient sensor 22, a robot (e.g., a cooking robot 26), etc. that can be controlled by this system.
- the display control unit 113 presents an input prompting screen on the display unit 140 that prompts the user to input information about the dish (basic information), and accepts input of the basic information by the user (step S203).
- the basic information may be input through dialogue with an agent presented by the agent unit 111.
- the agent unit 111 displays the agent on the display unit 140 via the display control unit 113, and performs control to start dialogue with the user via the voice input/output unit 160.
- control unit 110 stores the entered user's basic information in the storage unit 150 (step S206).
- control unit 110 begins the learning process of the user's cooking information.
- the control unit 110 prepares the data to be used for learning. Specifically, the control unit 110 estimates the contents of the dish (step S209) and the tableware used (step S212) based on the input dish image and recipe data.
- FIG. 7 is a diagram for explaining the estimation of dish contents based on the user's dish information according to this embodiment.
- the input user's dish information 44 includes a dish image 441 and recipe data 442.
- the control unit 110 analyzes the dish image 441 to perform image recognition (recognition of the objects shown in the image) and estimates the dish contents by referring to the recipe data. Specifically, the control unit 110 associates each dish item shown in the dish image 441 with the recipe data.
- association information 45 with the recipe data can be obtained for each image-recognized dish item, such as "Recipe 2: Fried shrimp,” “Recipe 3: Meat,” and “Recipe 4: Side dish.”
- the control unit 110 also estimates the tableware used (i.e., the tableware owned by the user) from the image-recognized dish portion, such as "round plate, medium size.”
- the control unit 110 uses the user's cooking information (recipe data, dish images) and the estimated dish contents and tableware used obtained from the dish information as input data (learning data) to learn about the user's cooking and generate a user cooking learning model 521 (step S215).
- the input data is not limited to the above example.
- recipe data may include information on ingredients and cooking utensils used, but if calorie information is not included, the control unit 110 may access an external ingredient database to obtain calorie information as external information and add it to the recipe data as input data.
- recipe data and dish images may have recipe categories (purpose of cooking, genre, target demographic, etc.) added as tags, and such recipe categories are also input for learning.
- the learning method is not particularly limited, but machine learning such as reinforcement learning is one example.
- the generated user cooking learning model 521 is stored in the storage unit 150.
- the draft generation unit 112 can generate a recipe draft based on the user's cooking (for example, in line with the characteristics of the user's cooking) in response to input data (user's suggestions (purpose of cooking, genre, ingredients, etc.)).
- FIG. 8 is a flowchart showing an example of the flow of a recipe plan generation process according to this embodiment.
- the agent unit 111 of the information processing device 10 inputs the user's suggestions, acquired from a dialogue with the user, into the draft generation unit 112 (step S303).
- the user's suggestions are matters (requests) related to the dish the user wants to make, and in this embodiment, these are assumed to be the concept and ingredients to be used.
- the agent unit 111 may input the suggestions (concept, ingredients to be used) obtained by analyzing the user's dialogue (content converted into text by voice recognition or text entered by the user) using natural language processing directly into the draft generation unit 112 as a prompt, or may input a prompt generated by supplementing information as appropriate.
- the contents of the input data (prompt) input to the draft generation unit 112 can be, for example, "purpose of cooking (Christmas dish, party dish, etc.), ingredients (potatoes, cheese, seafood, etc.), genre (appetizer, stew, Chinese food, etc.), nutrition (low calorie, high calorie, etc.), target demographic (children, athletes, etc.), cooking equipment used (oven cooking, 3D printer, etc.), design (geometric design, gentle design, flashy design, etc.)".
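- The input data above could, for example, be represented as a simple structure like the following; the field names are assumptions chosen only to mirror the example values.

```python
# Assumed representation of the prompt passed to the draft generation unit 112.
prompt = {
    "purpose": "Christmas dish",
    "ingredients": ["potatoes", "cheese", "seafood"],
    "genre": "appetizer",
    "nutrition": "low calorie",
    "target": "children",
    "equipment": ["oven", "3D printer"],
    "design": "geometric design",
}
print(prompt)
```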
- the agent unit 111 may supplement the details of the concept input by the user through dialogue, or may supplement the information by eliciting necessary information by asking questions such as "What kind of customer base are you looking for?" The agent unit 111 may also randomly fill in any missing information.
- the draft generation unit 112 uses the ingredient pairing learning model in the ingredient pairing unit 510 to output ingredients (combined ingredients) that are suitable for combination with the ingredients specified by the user (included in the above-mentioned suggestions) (step S306).
- based on the input suggestions and combined ingredients, the recipe plan generation unit 520 of the draft generation unit 112 then generates a recipe plan using the cooking learning model (step S309).
- the cooking learning model includes the user's cooking learning model and other users' cooking learning models.
- the recipe plan generation unit 520 can generate recipe plans with different levels of novelty using the cooking learning models as appropriate. This recipe plan generation process will be described later with reference to Figure 12.
- the draft generation unit 112 may generate multiple pieces of input data by randomly combining the information contained in the suggestions and the combined ingredients. This makes it possible to generate recipe plans that correspond to various combinations based on the user's suggestions, etc.
- the generated recipe plans are expected to include recipe data and dish images.
- a user learning model that has learned the recipe data and dish images (captured images) may generate dish images that correspond to the recipe data.
- the dish images may be generated by an image generation AI according to the generated recipe data.
- the recipe plan generation unit 112 may generate an appropriate prompt from the results obtained by text analysis of the recipe data, and input this into the image generation AI to obtain dish images.
- the generated recipe plan is displayed on the display unit 140 by the display control unit 113 and presented to the user (step S312).
- FIG. 9 is a diagram showing an example of a display screen on which a recipe plan is displayed according to this embodiment.
- a large number of recipe plans are displayed on display screen 600.
- thumbnails of the dish images included in the recipe plan are displayed.
- details of the recipe plan, i.e., the contents of the recipe, are displayed.
- FIGS. 10 and 11 are diagrams illustrating the display screen that displays details of a recipe plan.
- the screen switches to a recipe plan details display screen 610a, as shown on the left of FIG. 10.
- the details display screen 610a displays recipe data 611a, a dish image 612a, and a suggested items display area 613a.
- the recipe data 611a includes ingredients (foodstuffs, seasonings), cooking steps, etc.
- the user can review the presented recipe and food images, and at this point make further suggestions and revise the presented dish draft based on the initial user suggestions.
- An example of how to input suggestions is explained below.
- the agent unit 111 will recognize this and the recognized suggestion, "Make it into a spring-like fruit,” will be displayed in the suggestion display area 613a.
- the agent unit 111 may also mark the relevant part of the recipe that corresponds to the part selected by the user (e.g., "Decorate the cake with melon").
- the agent unit 111 will recognize this and display the recognized suggestion, "Add more decorations to make it more colorful,” in the suggestion display area 613c.
- the agent unit 111 may also mark the relevant part of the recipe that corresponds to the part selected by the user (for example, "Place the cake on the plate”).
- the draft generation unit 112 can use a cooking learning model as appropriate to control the generation rate of recipe drafts with different levels of novelty, depending on the level of novelty of the recipe desired by the user.
- Figure 12 is a flowchart showing an example of the process flow for generating recipe plans with different levels of novelty according to this embodiment.
- the draft generation unit 112 checks the novelty level of the dish desired by the user (step S323).
- the novelty level of the dish desired by the user may be obtained by the agent unit 111 from a conversation with the user. It may also be input by the user during the initial basic information input process.
- the plan generation unit 112 determines the ratio of the numbers of recipe plans A to D to be generated depending on the novelty level (step S326).
- Recipe plans A to D are plans with different levels of recipe novelty, with the relationship being: novelty level a of recipe plan A < novelty level b of recipe plan B < novelty level c of recipe plan C < novelty level d of recipe plan D.
- the plan generation unit 112 performs control so that the higher the desired novelty level, the more recipe plans with a higher novelty level are generated.
- the draft generation unit 112 sets the information (prompt) about the concept and ingredients (including combination ingredients) indicated by the user as input data for the cooking learning model (step S329).
- the recipe plan generation unit 112 generates a recipe plan A using the user cooking learning model (step S332).
- the recipe plan generation unit 112 also generates a recipe plan B by using the other user cooking learning model to change the style (here, the appearance of the dish) of the recipe plan generated using the user cooking learning model (step S335).
- "changing the style of the dish” primarily means changing the appearance of the dish, such as the presentation, arrangement, and decoration of the dish. More specifically, the dish image included in the recipe plan is changed. Even with the same recipe, the impression of the dish can change if the presentation, decoration, tableware used, etc. are different.
- the appearance of such dishes also reflects the chef's characteristics, and by changing it using another user's cooking model, it becomes possible to create a new dish (that is less similar to the user's dish).
- the recipe plan generation unit 112 also generates a recipe plan C by using the user cooking learning model to change the style of the recipe plan generated using the other user cooking learning model (step S338).
- the recipe plan generation unit 112 also generates a recipe plan D using the other user cooking learning model (step S341).
- the recipe plan generation unit 112 may generate recipe plans A to D according to the ratio determined in step S326 above.
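- A hedged sketch of how the ratio determined in step S326 might translate into generation counts is shown below; the ratio table itself is invented for illustration and is not taken from this disclosure.

```python
# Assumed ratio table mapping a desired novelty level to plan proportions.
from typing import Dict

RATIOS: Dict[str, Dict[str, float]] = {
    # plan A relies only on the user's model; plan D relies only on other users' models
    "low":    {"A": 0.6, "B": 0.2, "C": 0.1, "D": 0.1},
    "medium": {"A": 0.3, "B": 0.3, "C": 0.2, "D": 0.2},
    "high":   {"A": 0.1, "B": 0.2, "C": 0.3, "D": 0.4},
}


def plan_counts(novelty: str, total: int = 10) -> Dict[str, int]:
    """Convert the ratio for the requested novelty level into per-plan counts."""
    return {plan: round(total * r) for plan, r in RATIOS[novelty].items()}


print(plan_counts("high"))  # e.g. {'A': 1, 'B': 2, 'C': 3, 'D': 4}
```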
- FIG. 13 is a flowchart showing an example of the flow of adjustment processing according to this embodiment.
- the adjustment unit 114 uses the ingredient pairing learning model to output new ingredients that are suitable for pairing with the changed ingredients in response to the ingredient change (step S409). If the ingredient change is not a specific ingredient request (for example, "I want to change to a spring-like fruit,” “I want to use ingredients with a good texture,” “I want to increase the variety of vegetables,” etc.), the adjustment unit 114 may determine ingredients using an appropriate database or AI (such as a pre-trained learning model for determining ingredients) in response to an abstract request such as "spring-like fruit.”
- the adjustment unit 114 uses the ingredients used, the combined ingredients, and the user's new comments as input data and performs an adjustment process on the recipe data in the draft dish using the cooking learning model (step S412). Comments other than changes to ingredients include, for example, comments on the cooking process or cooking contents.
- the adjustment unit 114 also performs an adjustment process on the appearance of the dish using the cooking learning model in accordance with the user's new comments (for example, comments on the layout, design, etc. of the dish) (step S415). Specifically, the adjustment unit 114 performs a conversion process on the dish image.
- the adjustment unit 114 may also convert the dish image to correspond to the revised recipe data.
- the adjustment unit 114 adjusts the recipe data to accommodate the user's cooking environment (step S418).
- the adjustment unit 114 assigns cooking robots, laser processing machines, etc. to the cooking steps of a recipe based on the information about the user's cooking environment input as basic information.
- Figure 14 is a diagram showing an example of adjustment according to the user's cooking environment according to this embodiment. Here, we will explain using the recipe for "Mushroom Farci" generated based on the user's input that "I want to make a beautiful stuffed dish using vegetables.”
- the cooking steps of the recipe are "1. Remove the stems from the mushrooms, 2. Add a pattern to the surface of the mushrooms, 3. Stuff the mushrooms with a filling (cheese, nuts, etc.), 4. Arrange on a plate and garnish with leaves," and if the user's cooking environment is equipped with a cooking robot, laser processing machine, etc., the adjustment unit 114 assigns the cooking robot, etc. to the cooking steps.
- the recipe data 620 is modified to "1. The robot cuts off the stems from the mushrooms, 2. The robot engraves a pattern on the surface of the mushrooms with a laser, 3. The robot stuffs the mushrooms with a filling (cheese, nuts, etc.), 4. The robot arranges them on a plate and garnishes with leaves."
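- The device assignment of step S418 can be pictured as mapping each cooking step to an available device; the keyword heuristic below is an assumption for illustration, not the assignment method defined in this disclosure.

```python
# Assumed keyword-to-device heuristic for assigning equipment to cooking steps.
DEVICE_FOR_KEYWORD = {
    "remove": "cooking robot",
    "stuff": "cooking robot",
    "arrange": "cooking robot",
    "pattern": "laser processing machine",
    "cut": "cutting machine",
}


def assign_devices(steps, available_devices):
    """Prefix each step with an available device, or mark it as manual."""
    adjusted = []
    for step in steps:
        device = next((d for kw, d in DEVICE_FOR_KEYWORD.items()
                       if kw in step.lower() and d in available_devices), None)
        adjusted.append(f"[{device}] {step}" if device else f"[manual] {step}")
    return adjusted


steps = ["Remove the stems from the mushrooms",
         "Add a pattern to the surface of the mushrooms",
         "Stuff the mushrooms with a filling",
         "Arrange on a plate and garnish with leaves"]
print(assign_devices(steps, {"cooking robot", "laser processing machine"}))
```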
- FIG. 15 is a flowchart showing an example of the flow of the element review process according to this embodiment.
- the feasibility of cooking is reviewed for each element of the recipe draft generated by the draft generation unit 112 or the recipe draft adjusted by the adjustment unit 114.
- the element review unit 115 of the information processing device 10 considers each cooking step of the recipe data included in the recipe draft as an element (step S503).
- Fig. 16 is a flowchart showing the flow of the examination process of element 1.
- the element review unit 115 checks whether the method for using a robot to cut off mushroom stems has already been learned (step S523). If it has already been learned (step S523/Yes), processing proceeds to step S532.
- the element review unit 115 obtains the cooking time required (step S532). For example, the element review unit 115 obtains the cooking time required for the successful stem-removal method.
- the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S538).
- the adjustment unit 114 may make adjustments, such as changing the recipe to one where removing mushroom stems is a task that is difficult for the cooking robot 26 to perform and therefore a task that must be done by a human, changing the recipe to one that does not require removing the stems, or changing the ingredients to ones whose stems are easier to remove.
- if there is a successful cooking and processing plan for the cooking step of element 1, the processing ends. In the integration process described below, the successful cooking and processing plans are used when integrating the elements.
- Fig. 17 is a flowchart showing the flow of the examination process of element 2.
- the element review unit 115 checks whether engraving a mushroom with a laser cutter (laser processing machine 24) has already been learned (step S543). If it has already been learned (step S543/Yes), processing proceeds to step S567.
- if learning has not been performed (step S543/No), the element review unit 115 generates n pattern patterns (step S546). For example, the element review unit 115 generates pattern variations based on the food image output from the adjustment unit 114 using an algorithm such as style conversion.
- the element review unit 115 defines m sets of laser processing parameters (step S549).
- the element review unit 115 sets the power and speed ranges for laser control as "power: minimum 40, maximum 60" and “speed: minimum 50, maximum 90,” and defines m combinations of power and speed.
- the element review unit 115 may refer to processing examples of similar ingredients, or may define parameters starting with low power parameters for safety reasons.
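- The m parameter sets of step S549 could, for example, be enumerated as a grid over the stated ranges, ordered so that low-power settings come first for safety. The step sizes below are assumptions.

```python
# Enumerate assumed combinations of laser power (40 to 60) and speed (50 to 90).
from itertools import product

POWERS = range(40, 61, 10)   # 40, 50, 60
SPEEDS = range(50, 91, 20)   # 50, 70, 90

# Sort by power so the lowest-power parameters are tried first.
parameter_sets = sorted(product(POWERS, SPEEDS), key=lambda ps: ps[0])
for power, speed in parameter_sets:
    print({"power": power, "speed": speed})
```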
- the element review unit 115 selects a set of pattern pattern and processing parameters (step S552).
- the element review unit 115 uses the robot (cooking robot 26) to set the mushroom (in a location where it can be processed using a laser cutter) (step S555), and controls the laser cutter to process (carve) it (step S558).
- the element review unit 115 photographs the processed food material using the camera 20 and acquires the image (step S561).
- steps S552 to S561 are performed for all combinations of pattern patterns and processing parameters (step S562). This results in a large number of processing results (images of processed ingredients), such as those shown in Table 2 below.
- the element review unit 115 learns the relationships between the pattern pattern, the processing parameters, and the processed image based on these, and creates a learning model for predicting the processing result from the pattern pattern and processing parameters (step S564). Even with the same pattern pattern, the processing result can be clear, crushed, or thin, depending on the processing parameters. This is because the processing reaction may not be linear depending on the components of the ingredient, and phenomena such as the ingredient expanding or crumbling may occur.
- a learning model (an example of a cooking processing learning model) is used to predict the processing result from the pattern pattern and processing parameters, making it possible to consider many combinations of pattern patterns and processing parameters.
- the learning model can be generated by machine learning such as reinforcement learning, for example.
- the element review unit 115 evaluates each cooking and processing plan (a combination of pattern pattern and processing parameters: an example of cooking and processing information) based on the degree of match (whether the pattern is distorted by processing) between the pattern pattern generated in element 2 and the processing result (predicted by the learning model) (step S567).
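- The evaluation of step S567 can be sketched as comparing the intended pattern with the result predicted for each parameter set. In the toy example below the patterns are tiny binary arrays, the predictor is a trivial stand-in for the cooking processing learning model, and the degree of match is computed as intersection-over-union; all of these are assumptions for illustration.

```python
# Toy evaluation of cooking/processing plans by match degree (assumed method).
import numpy as np


def predict_result(pattern: np.ndarray, power: int, speed: int) -> np.ndarray:
    """Stand-in predictor: high power at low speed 'spreads' the engraving."""
    result = pattern.copy()
    if power >= 55 and speed <= 60:
        result |= np.roll(pattern, 1, axis=1)  # crude widening of the engraved line
    return result


def match_degree(intended: np.ndarray, predicted: np.ndarray) -> float:
    """Intersection-over-union between intended and predicted engraving."""
    inter = np.logical_and(intended, predicted).sum()
    union = np.logical_or(intended, predicted).sum()
    return float(inter) / union if union else 1.0


pattern = np.zeros((8, 8), dtype=int)
pattern[3, 2:6] = 1  # a simple line pattern

plans = [(40, 90), (60, 50)]  # candidate (power, speed) pairs
scores = {p: match_degree(pattern, predict_result(pattern, *p)) for p in plans}
print(scores)  # higher-scoring plans are preferred in the integration step
```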
- the element review unit 115 also obtains the cooking time required (for each combination pattern) (step S570).
- if there is no cooking and processing plan with a rating above a certain level (step S573/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S576).
- the adjustment unit 114 changes the ingredients or the pattern, for example. Note that if there is a cooking and processing plan for the cooking step of element 2 that has a rating above a certain level (step S573/No), the process ends. In the integration process described below, cooking and processing plans that have a rating above a certain level are used when integrating elements.
- Fig. 18 is a flowchart showing the flow of the examination process of element 3.
- if the user has instructed to restrict the filling (step S583/Yes), the element review unit 115 sets restrictions on the filling material (step S586).
- Restrictions from the user may be extracted from a conversation between the agent and the user, such as "I want to make a dish for customers with allergies" or "I want to reduce the calories.” Restrictions on the filling material may include, for example, controlling the ingredients and quantity. If the user has not instructed to restrict the filling (step S583/No), the process proceeds to step S589.
- the element review unit 115 creates multiple patterns of ingredients to be mixed as fillings (step S589).
- the element review unit 115 imports the 3D model of the mixed material into a simulator, and uses the simulator to learn how the robot hand fills mushrooms with the filling material according to the mixing pattern (step S592).
- the generated learning model is an example of a cooking and processing learning model.
- the learning model can be generated by machine learning, such as reinforcement learning.
- the element review unit 115 obtains the cooking time required (step S595).
- if there is no successful cooking/processing plan for the cooking step of element 3 (step S598/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S601).
- the element review unit 115 can use the generated learning model to consider the feasibility (whether or not it will be successful) of each mixing pattern. Note that if there is a successful cooking/processing plan among the cooking steps for element 3 (step S598/No), the processing ends. In the integration process described below, the successful cooking/processing plan is used when integrating the elements.
- Fig. 19 is a flowchart showing the flow of the examination process of element 4.
- the element review unit 115 generates variations of food images in which food is served on tableware that the user normally uses (step S603). Information about the tableware used by the user has already been acquired during the basic information input process.
- the element review unit 115 generates a leaf arrangement pattern (step S609).
- the element review unit 115 generates an arrangement pattern by, for example, combining multiple types of leaves.
- the element review unit 115 uses a simulator to learn possible arrangements for the robot hand (cooking robot 26) (step S612).
- the generated learning model is an example of a cooking and processing learning model.
- the learning model can be generated by machine learning, such as reinforcement learning.
- the element review unit 115 deletes placement patterns that are difficult to realize (step S615).
- the element review unit 115 can use the generated learning model to consider the feasibility (whether or not each placement pattern will be successful) of each placement pattern.
- the element review unit 115 generates food images by combining variations of the above food image of food served on tableware with leaf layouts of each learned arrangement pattern (specifically, arrangement patterns that are likely to be realized by the robot hand) (step S618).
- if there is no successful cooking and processing plan among the cooking process plans (arrangement patterns: an example of cooking and processing information) for element 4 (step S624/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S627). If there is a successful cooking and processing plan (step S624/No), the process ends. In the integration process described below, the successful cooking and processing plan is used when integrating the elements.
- in the element review process, more specific variations of the cooking process (cooking processing information) are created based on each cooking step of the recipe data in the dish draft, and whether the cooking process will succeed is considered. Furthermore, in training the cooking processing learning models used in the review of each element described above, the information processing device 10 can further improve the accuracy of the generated learning models by including failure cases in the learning data. Note that the element review unit 115 may present multiple pieces of cooking processing information (or processing results) for each element to the user as candidates, allowing the user to select the cooking processing information to be included in the integration.
- <Integration processing> FIG. 20 is a flowchart showing an example of the flow of the integration process according to this embodiment.
- the integration unit 116 combines the elements reviewed by the element review unit 115 to generate integrated cooking and processing information.
- the integration unit 116 changes the default conditions in accordance with the user instructions (step S706).
- user instructions may include conditions such as a cooking time of 15 minutes or less and an evaluation rank of 3 or higher for element 2.
- user instructions may include the user's selection of cooking and processing information to be included in the integration.
- if there are no user instructions (step S703/No), the integration unit 116 sets the default conditions (step S709).
- the integration unit 116 generates combination patterns of each element that satisfy the conditions (step S712).
- the integration unit 116 generates an integrated recipe and corresponding dish image as integrated cooking and processing information based on each combination pattern (step S715).
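- Steps S712 to S715 can be sketched as enumerating combinations of the surviving candidates per element and filtering them by the conditions. The candidate lists and ratings below are invented for illustration, and the thresholds mirror the example conditions mentioned above (cooking time of 15 minutes or less, evaluation rank of 3 or higher for element 2).

```python
# Enumerate element combinations and keep those satisfying the conditions.
from itertools import product

# Candidate plans per element as (description, rating, minutes); invented values.
candidates = {
    "element1": [("robot stem removal", 4, 2.0)],
    "element2": [("pattern X / power 40", 5, 4.0), ("pattern Y / power 60", 2, 3.0)],
    "element3": [("cheese filling", 4, 5.0), ("nut filling", 3, 6.0)],
    "element4": [("leaf arrangement A", 4, 3.0)],
}

MAX_MINUTES = 15.0
MIN_ELEMENT2_RATING = 3

integrated_recipes = []
for combo in product(*candidates.values()):
    total_minutes = sum(c[2] for c in combo)
    element2_rating = combo[1][1]
    if total_minutes <= MAX_MINUTES and element2_rating >= MIN_ELEMENT2_RATING:
        integrated_recipes.append({"steps": [c[0] for c in combo],
                                   "minutes": total_minutes})

for r in integrated_recipes:
    print(r)  # each entry is one pattern of integrated cooking/processing information
```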
- if a sorting condition is specified, the display control unit 113 sorts the dish images according to the condition (step S806). For example, the images may be sorted in order of cooking time.
- if no condition is specified, the display control unit 113 sorts the dish images, for example, in the order in which they were created (step S809).
- the display control unit 113 then displays a list of food images (step S812).
- a specific example is as shown in Figure 21.
- if a dish image is selected by the user (step S815/Yes), the display control unit 113 displays details of the selected dish (step S818).
- the element information display 652 displays not only the cooking and processing patterns that were adopted, but also the cooking and processing patterns that were not adopted. For example, if "Cooking Process 2" of element 2 is selected from the elements of the dish "Mushroom Farci” selected in the dish image 651, the "Pattern Pattern” and “Processing Parameter” blocks are displayed. If “Pattern Pattern” is selected, the detailed display 653 displays a large number of pattern patterns (examples of cooking and processing information) generated by element review in the element review unit 115, including those that were not adopted in the dish, as shown in FIG. 23.
- the user can check the pattern patterns that were derived and generated in the element review unit 115 but not adopted, the numerous processing parameters that were not adopted, and the processing results according to the processing parameters. While checking these, the user can make more specific suggestions such as, "The shape will be a little distorted, but I think this color (burnt color) would be good," allowing for an efficient review cycle.
- the user can select any cooking and processing information from the many pieces of cooking and processing information presented, and instruct the system to generate integrated cooking and processing information that includes the selected cooking and processing information.
- This embodiment assumes an iterative process in which corrections are made each time the user makes a new suggestion, resulting in a final proposal.
- the control unit 110 then summarizes the user's suggestions and presents them to the user so that the user can confirm them, and begins corrections (step S833). Specifically, as shown in FIG. 5, the adjustment process in the adjustment unit 114, the review process in the element review unit 115, and the integration process by the integration unit 116 are performed again as appropriate, and the corrected cooking and processing information is presented.
- the various learning models used in the cooking information generation system 1 according to this embodiment may be realized by LLMs (large-scale language models). That is, the above-described user cooking learning model 521, other users cooking learning model 522, ingredient pairing learning model, and cooking processing learning model may be configured to use LLMs as knowledge databases.
- the integration unit 116 can also set personalization conditions, which are conditions for changing the content of elements (cooking and processing information) for each individual receiving the food, in accordance with input from the user during the integration process. For example, the integration unit 116 sets personalization conditions that determine the type, amount, arrangement, or processing state of ingredients in specified elements, based on each individual's preferences or calorie intake. The integration unit 116 generates an integrated recipe that meets the personalization conditions set by the user. Furthermore, during cooking, the display control unit 113 may rearrange the configuration of corresponding elements in accordance with the personalization conditions specified by the user and present the integrated recipe.
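- Applying personalization conditions during integration could look like the following sketch; the condition format (per-person exclusions and portion scaling) is an assumed illustration, not the format defined in this disclosure.

```python
# Assumed per-person conditions applied to one element (the filling) during integration.
personalization = {
    "guest_A": {"exclude": {"nuts"}, "portion_scale": 1.0},   # allergy
    "guest_B": {"exclude": set(), "portion_scale": 0.7},      # lower calorie intake
}

base_filling = [("cheese", 30), ("nuts", 10), ("herbs", 5)]   # (ingredient, grams)

for guest, cond in personalization.items():
    filling = [(name, round(grams * cond["portion_scale"]))
               for name, grams in base_filling if name not in cond["exclude"]]
    print(guest, filling)
```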
- a food image is displayed on the screen as a method of presenting the generated cooking and processing information to the user, but this embodiment is not limited to this, and the information processing device 10 may actually cook a high-ranked dish using a cooking robot 26 or the like and present it to the user.
- the story generation unit 117 can also generate a development story.
- various data is accumulated in the storage unit 150, such as dialogue between the agent and the user, adjustments made, and element considerations.
- the story generation unit 117 picks out catchphrases and preferences that reflect the user's characteristics from the data accumulated in the storage unit 150, extracts unusual corrections made in response to the user's suggestions, re-learning, and episodes of failure and success (such as failed/successful cooking and processing methods), compiles a development story, and outputs it as text or images.
- when the adjustment unit 114 adjusts the "Mushroom Farci" recipe to suit the user's cooking environment, if it determines that cooking step 3 (element 3: filling) and cooking step 4 (element 4: plating) are difficult for the cooking robot in the user's environment to handle, or if it determines that manual intervention is better, it may make the recipe one that incorporates manual intervention.
- Consideration of recipe elements when manual intervention is used is carried out, for example, as follows:
- the information processing device 10 may select personnel by acquiring the profiles (areas of expertise, etc.) and schedule information (free time) of assignable staff.
- in considering element 3, the element review unit 115 generates n filling mixture patterns and has the assigned staff member create (cook) them. At this time, the cooking time can be obtained using a timer or the like.
- the element review unit 115 may also have the staff member think up filling mixture patterns. When cooking manually, people tend to proceed without recording the cooking process, so the agent unit 111 interacts with the staff member and hears from them about mixing points, recommended mixing patterns (rankings), etc., and stores them.
- the element review unit 115 may also take images of the fillings cooked by the staff member and create a list in order of the staff member's ranking. The staff member may evaluate whether they can create the fillings according to each mixing pattern, and also evaluate the appearance and taste of the created fillings, and input this information into the information processing device 10.
- information about the work of each staff member is also stored in the storage unit 150 and can be incorporated into the development story generated by the story generation unit 117, making it possible to visualize each staff member's contributions.
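The sketch below illustrates the staff-assisted review described in the preceding items, under assumptions: the Staff and ManualTrial structures, the free-time-based selection, and the score-based ranking are hypothetical stand-ins, not the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Staff:
    name: str
    expertise: List[str]     # e.g. ["filling", "plating"]
    free_minutes: int        # free time available according to the schedule

@dataclass
class ManualTrial:
    pattern_id: int
    staff: str
    cooking_minutes: float   # measured with a timer
    feasible: bool           # could the staff member actually make this pattern?
    appearance_score: int    # 1..5, entered by the staff member
    taste_score: int         # 1..5, entered by the staff member

def assign_staff(staff_pool: List[Staff], element: str, needed_minutes: int) -> Optional[Staff]:
    """Pick a staff member whose expertise covers the element and who has enough free time."""
    candidates = [s for s in staff_pool if element in s.expertise and s.free_minutes >= needed_minutes]
    return max(candidates, key=lambda s: s.free_minutes, default=None)

def rank_trials(trials: List[ManualTrial]) -> List[ManualTrial]:
    """Keep only feasible patterns and order them by the staff member's evaluation."""
    ok = [t for t in trials if t.feasible]
    return sorted(ok, key=lambda t: t.appearance_score + t.taste_score, reverse=True)

if __name__ == "__main__":
    pool = [Staff("A", ["filling"], 90), Staff("B", ["plating"], 30)]
    print(assign_staff(pool, "filling", 60))
    trials = [ManualTrial(1, "A", 25, True, 3, 4), ManualTrial(2, "A", 20, False, 2, 2)]
    print([t.pattern_id for t in rank_trials(trials)])
```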
- one or more computer programs can be created to cause the hardware, such as the CPU, ROM, and RAM, built into the information processing device 10 to perform the functions of the information processing device 10.
- a computer-readable storage medium storing the one or more computer programs may also be provided.
- the present technology can also be configured as follows. (1) An information processing system including a control unit that performs: a process of generating cooking information using a cooking learning model based on instruction information input by a user; a process of generating, using a cooking processing learning model, cooking processing information corresponding to the cooking steps included in the cooking information; a process of integrating the cooking processing information to generate integrated cooking processing information; and a process of presenting the generated integrated cooking processing information. (2) The information processing system according to (1), wherein the control unit generates a plurality of pieces of cooking processing information for the cooking process and determines feasible cooking processing information using the cooking processing learning model. (3) The information processing system according to (1) or (2), wherein the cooking processing learning model is generated by reinforcement learning of simulation results based on the multiple pieces of cooking processing information generated for the cooking process.
- the control unit uses an ingredient pairing learning model to generate combined ingredients suitable for combination with the ingredients, based on the information on the ingredients included in the instruction information.
- the control unit uses the concept, the ingredients, and the combined ingredients as input data and generates the cooking information using at least one of a first cooking learning model that has learned the user's cooking and a second cooking learning model that has learned the cooking of other users.
- (10) The information processing system according to (9), wherein the control unit performs: a process of generating cooking information by changing, using the second cooking learning model, the style of the cooking design in the cooking information generated using the first cooking learning model; and a process of generating cooking information by changing, using the first cooking learning model, the style of the cooking design in the cooking information generated using the second cooking learning model.
- (11) The information processing system according to (9) or (10), wherein the control unit determines the proportion of the number of pieces of cooking information generated using each cooking learning model according to the set level of newness of the cooking.
- (12) The information processing system according to any one of (1) to (11), wherein the control unit generates multiple pieces of cooking information, presents them to the user, and sends the cooking information selected by the user to subsequent processing.
- the control unit includes failure case data in learning data when learning the cooking learning model or the cooking processing learning model.
- the cooking and processing information is information about processing related to the appearance of ingredients.
- the control unit sets personalization conditions, which are conditions for changing the cooking and processing information according to the preferences or calorie intake of the individual receiving the food, based on user input, changes the cooking and processing information to satisfy the personalization conditions, and then generates the integrated cooking and processing information.
- A control method including: a processor generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to the cooking steps included in the cooking information; integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
- (19) A program that causes a computer to function as a control unit that performs: a process of generating cooking information using a cooking learning model based on instruction information input by a user; a process of generating cooking processing information corresponding to the cooking steps included in the cooking information; a process of integrating the cooking processing information to generate integrated cooking processing information; and a process of presenting the generated integrated cooking processing information.
Landscapes
- Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
本開示は、情報処理システム、制御方法、およびプログラムに関する。 This disclosure relates to an information processing system, a control method, and a program.
近年、調理ロボットを用いて様々な調理工程を自動化する技術が開示されている。例えば、下記特許文献1では、調理人が料理を作るときの生体反応をセンシングし、センシングされた生体データをレシピデータに含めて、食べる人の好みに応じた料理を再現する技術が開示されている。 In recent years, technologies have been disclosed that use cooking robots to automate various cooking processes. For example, Patent Document 1 below discloses technology that senses the biometric reactions of a chef when preparing a dish, and then includes the sensed biometric data in recipe data to recreate a dish that suits the eater's preferences.
しかしながら、調理人や料理研究家等のユーザが新たな料理を創作する際の支援は十分ではなかった。 However, there was a lack of support for users such as chefs and culinary experts when creating new dishes.
そこで、本開示では、ユーザの料理創作を支援することが可能な情報処理システム、制御方法、およびプログラムを提案する。 This disclosure therefore proposes an information processing system, control method, and program that can assist users in creating dishes.
本開示によれば、ユーザにより入力された指示情報に基づいて、料理学習モデルを用いて料理情報を生成する処理と、調理加工学習モデルを用いて、前記料理情報に含まれる調理工程に対応する調理加工情報を生成し、前記調理加工情報を統合して調理加工統合情報を生成する処理と、前記生成した調理加工統合情報を提示する処理と、を行う制御部を備える、情報処理システムが提供される。 According to the present disclosure, an information processing system is provided that includes a control unit that performs the following processes: generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to the cooking steps included in the cooking information using a cooking processing learning model, integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
また、本開示によれば、プロセッサが、ユーザにより入力された指示情報に基づいて、料理学習モデルを用いて料理情報を生成することと、前記料理情報に含まれる調理工程に対応する調理加工情報を生成することと、前記調理加工情報を統合して調理加工統合情報を生成することと、前記生成した調理加工統合情報を提示することと、を含む、制御方法が提供される。 The present disclosure also provides a control method including: a processor generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to cooking steps included in the cooking information; integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
また、本開示によれば、コンピュータを、ユーザにより入力された指示情報に基づいて、料理学習モデルを用いて料理情報を生成する処理と、前記料理情報に含まれる調理工程に対応する調理加工情報を生成し、前記調理加工情報を統合して調理加工統合情報を生成する処理と、前記生成した調理加工統合情報を提示する処理と、を行う制御部として機能させる、プログラムが提供される。 Furthermore, according to the present disclosure, a program is provided that causes a computer to function as a control unit that performs the following processes: generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to the cooking steps included in the cooking information and integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
以下に添付図面を参照しながら、本開示の好適な実施の形態について詳細に説明する。なお、本明細書及び図面において、実質的に同一の機能構成を有する構成要素については、同一の符号を付することにより重複説明を省略する。 Preferred embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. Note that in this specification and drawings, components having substantially the same functional configuration will be assigned the same reference numerals, and redundant explanations will be omitted.
また、説明は以下の順序で行うものとする。
1.本開示の一実施形態による料理情報生成システムの概要
2.情報処理装置10の構成例
3.動作処理
3-1.基本情報の入力処理
3-2.料理素案の生成処理
3-3.調整処理
3-4.要素検討処理
3-5.統合処理
3-6.提示処理
4.その他
5.補足
The explanation will be given in the following order:
1. Overview of a Recipe Information Generation System According to an Embodiment of the Present Disclosure
2. Configuration Example of Information Processing Device 10
3. Operation Processing
3-1. Basic Information Input Processing
3-2. Recipe Draft Generation Processing
3-3. Adjustment Processing
3-4. Element Review Processing
3-5. Integration Processing
3-6. Presentation Processing
4. Others
5. Supplementary Information
<<1.本開示の一実施形態による料理情報生成システムの概要>>
図1を参照して、本開示の一実施形態による料理情報生成システムの概要について説明する。図1は、本開示の一実施形態による料理情報生成システム1の全体構成を示す図である。
<<1. Overview of a Recipe Information Generation System According to an Embodiment of the Present Disclosure>>
An overview of a cooking information generation system according to an embodiment of the present disclosure will be described with reference to Fig. 1. Fig. 1 is a diagram showing the overall configuration of a cooking information generation system 1 according to an embodiment of the present disclosure.
図1に示すように、本実施形態による料理情報生成システム1は、情報処理装置10、カメラ20、食材センサ22、レーザー加工機24、調理ロボット26、および切削加工機28を含む。 As shown in FIG. 1, the cooking information generation system 1 according to this embodiment includes an information processing device 10, a camera 20, an ingredient sensor 22, a laser processing machine 24, a cooking robot 26, and a cutting machine 28.
情報処理装置10は、ネットワーク30を介してカメラ20、食材センサ22、レーザー加工機24、調理ロボット26、および切削加工機28と通信接続し得る。なお、カメラ20、食材センサ22、レーザー加工機24、調理ロボット26、および切削加工機28は、調理工程に対応する調理加工情報の実現可能性を検討するために用いられる構成の一例であり、本実施形態はこれに限定されない。 The information processing device 10 can be connected to and communicate with the camera 20, food sensor 22, laser processing machine 24, cooking robot 26, and cutting machine 28 via network 30. Note that the camera 20, food sensor 22, laser processing machine 24, cooking robot 26, and cutting machine 28 are examples of configurations used to consider the feasibility of cooking and processing information corresponding to cooking processes, and this embodiment is not limited to this.
情報処理装置10は、PC(パーソナルコンピュータ)、タブレット端末、スマートフォン、HMD(Head Mounted Display)等により実現される。情報処理装置10は、調理人等のユーザからの指示(以下、指摘事項とも称する)に応じて、新たな料理の創作を行い、ユーザに提示する。情報処理装置10は、料理の創作過程において、適宜、ユーザからの指摘事項を創作料理に反映させ得る。本実施形態において、情報処理装置10は、料理の創作として、より具体的には、料理の調理工程を含むレシピデータを生成し得る。 The information processing device 10 is realized by a PC (personal computer), tablet terminal, smartphone, HMD (head-mounted display), etc. The information processing device 10 creates new dishes in response to instructions (hereinafter also referred to as suggestions) from a user such as a chef, and presents them to the user. The information processing device 10 can reflect the user's suggestions in the creation of the dish as appropriate during the dish creation process. In this embodiment, the information processing device 10 can create recipe data that includes the cooking steps of the dish as the creation of the dish.
ここで、既存のレシピを学習した学習モデルを用いてAI(Artificial Intelligence)により新たなレシピを生成することが従来も行われているが、料理は複数のレシピを単に切り貼りしただけでは成立しない。料理の創作には、調理過程で生じる食材の変化、安全性、デザイン性など様々な要素が関係する。また、画像生成AIを用いて単に新たな料理画像を生成しても、実際の食材を使った実現可能なレシピの提案まではできない。 Although it has been common practice to use AI (Artificial Intelligence) to generate new recipes using learning models that have learned from existing recipes, cooking is not simply a matter of cutting and pasting multiple recipes. Creating a dish involves a variety of factors, such as changes in ingredients that occur during the cooking process, safety, and design. Furthermore, simply generating new dish images using image generation AI does not go so far as to suggest feasible recipes using actual ingredients.
そこで、本開示では、ユーザの料理創作を支援する仕組みを提案する。具体的には、情報処理装置10において、AIを用いたレシピ(料理情報の一例)の生成と、ユーザからの指摘事項に応じたレシピの調整と、レシピに含まれる各調理工程の実現可能性の検討、すなわち対応する調理加工情報を生成し、各調理加工情報を統合して調理加工統合情報(新たなレシピであって、統合レシピとも称する)を生成し、ユーザに提示する。調理加工情報とは、例えば、食材の外観に関する加工の情報(レーザー加工、切削、加熱、盛り付け等)である。 This disclosure therefore proposes a system to assist users in creating dishes. Specifically, the information processing device 10 uses AI to generate recipes (an example of cooking information), adjusts the recipe in response to user feedback, and considers the feasibility of each cooking step included in the recipe (i.e., generates corresponding cooking processing information), and integrates each piece of cooking processing information to generate integrated cooking processing information (a new recipe, also referred to as an integrated recipe), which is presented to the user. Cooking processing information is, for example, information on processing related to the appearance of ingredients (laser processing, cutting, heating, presentation, etc.).
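The proposed flow can be summarized with the sketch below. It is illustrative only: the functions generate_drafts, adjust, review_element, and integrate are toy stand-ins for the draft generation, adjustment, element review, and integration described in this disclosure, not real APIs of the system.

```python
from typing import Dict, List

def generate_drafts(instruction: str) -> List[dict]:
    """Stand-in for draft generation: returns toy recipe drafts with a score."""
    return [{"name": f"draft-{i}", "steps": ["prep", "cook", "plate"], "score": i} for i in range(3)]

def adjust(draft: dict, note: str) -> dict:
    """Stand-in for the adjustment step: records the user's new suggestion on the draft."""
    return {**draft, "adjusted_for": note}

def review_element(step: str) -> List[str]:
    """Stand-in for per-element review: returns feasible processing variants for one step."""
    return [f"{step}-variant-{k}" for k in range(2)]

def integrate(variants: Dict[str, List[str]]) -> dict:
    """Stand-in for integration: picks one feasible variant per step."""
    return {"integrated_steps": [v[0] for v in variants.values()]}

def create_new_dish(instruction: str, note: str) -> dict:
    """One pass of the proposed flow: draft -> select -> adjust -> per-element review -> integrate."""
    drafts = generate_drafts(instruction)
    chosen = max(drafts, key=lambda d: d["score"])   # here: automatic selection by score
    adjusted = adjust(chosen, note)
    variants = {s: review_element(s) for s in adjusted["steps"]}
    return integrate(variants)

if __name__ == "__main__":
    print(create_new_dish("autumn mushroom dish", "make the topping fruit"))
```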
情報処理装置10は、調理加工情報を生成する際に、カメラ20、食材センサ22、レーザー加工機24、調理ロボット26、および切削加工機28等を用いてシミュレーションを行った結果や、実際に調理した結果を用い得る。 When generating cooking and processing information, the information processing device 10 may use the results of simulations performed using a camera 20, food ingredient sensor 22, laser processing machine 24, cooking robot 26, cutting machine 28, etc., or the results of actual cooking.
例えば、カメラ20は、食材Fを撮像し、撮像画像を情報処理装置10、レーザー加工機24、調理ロボット26、または切削加工機28に出力する。食材Fは、調理ロボット26により配置されてもよい。食材センサ22は、食材Fの各種成分(例えば、水分、糖分等)、または状態(例えば、温度)等をセンシングし、センシングデータを情報処理装置10等に出力する。 For example, the camera 20 captures an image of the food ingredient F and outputs the captured image to the information processing device 10, the laser processing machine 24, the cooking robot 26, or the cutting machine 28. The food ingredient F may be placed by the cooking robot 26. The food ingredient sensor 22 senses various components (e.g., moisture, sugar content, etc.) or conditions (e.g., temperature) of the food ingredient F, and outputs the sensing data to the information processing device 10, etc.
レーザー加工機24は、食材Fにレーザーを照射して焦がすことで文字、模様、マーク等を食材Fに描いたり、レーザーにより食材Fを削って文字、模様、マーク等を食材Fに彫刻したりすることができる。調理ロボット26は、1以上のロボットアームを有し、食材Fを掴んで所定の位置に配置したり、調味料を掛けたり、ソースを注いだり、各種調理器具を利用して様々な調理を行い得る。切削加工機28は、CNC(Computer Numerical Control)ミリングマシンやレーザー等で食材Fの切削を行い得る。 The laser processing machine 24 can draw letters, patterns, marks, etc. on the food material F by irradiating it with a laser and charring it, or can carve letters, patterns, marks, etc. into the food material F by using a laser to carve the food material F. The cooking robot 26 has one or more robotic arms and can grasp the food material F and place it in a predetermined position, add seasonings, pour sauces, and perform various cooking operations using various cooking utensils. The cutting machine 28 can cut the food material F using a CNC (Computer Numerical Control) milling machine, laser, etc.
レーザー加工機24、調理ロボット26、および切削加工機28は、デジタルデータに基づいて、切削、彫刻、加熱、積層、および各種調理工程等を緻密に行い得るデジタルファブリケーション機器である。レーザー加工機24、調理ロボット26、および切削加工機28は、各種調理加工を行う加工機器の一例であって、料理情報生成システム1の構成はこれに限定されない。 The laser processing machine 24, cooking robot 26, and cutting machine 28 are digital fabrication devices that can precisely perform cutting, engraving, heating, laminating, and various cooking processes based on digital data. The laser processing machine 24, cooking robot 26, and cutting machine 28 are examples of processing devices that perform various cooking processes, and the configuration of the cooking information generation system 1 is not limited to this.
以上、本開示の一実施形態による料理情報生成システム1の概要について説明した。続いて、本実施形態による料理情報の生成を行う情報処理装置10について図面を参照して具体的に説明する。 The above provides an overview of the cooking information generation system 1 according to one embodiment of the present disclosure. Next, the information processing device 10 that generates cooking information according to this embodiment will be described in detail with reference to the drawings.
<<2.情報処理装置10の構成例>>
図2は、本実施形態による情報処理装置10の構成の一例を示すブロック図である。図2に示すように、情報処理装置10は、制御部110、通信部120、操作入力部130、表示部140、記憶部150、および音声入出力部160を含む。
<<2. Configuration example of information processing device 10>>
2 is a block diagram showing an example of the configuration of the information processing device 10 according to the present embodiment. As shown in FIG. 2, the information processing device 10 includes a control unit 110, a communication unit 120, an operation input unit 130, a display unit 140, a storage unit 150, and an audio input/output unit 160.
(通信部120)
通信部120は、外部装置にデータを送信する送信部と、外部装置からデータを受信する受信部を有する。例えば通信部120は、カメラ20と通信接続し、カメラ20から食材Fの撮像画像を受信する。また、通信部120は、食材センサ22と通信接続し、食材センサ22から食材Fのセンシングデータを受信する。また、通信部120は、各種デジタルファブリケーション機器(レーザー加工機24、調理ロボット26、および切削加工機28)と通信接続し、各種デジタルファブリケーション機器における各種調理動作を制御するための制御信号を送信する。
(Communication unit 120)
The communication unit 120 has a transmitter that transmits data to an external device and a receiver that receives data from the external device. For example, the communication unit 120 is communicatively connected to the camera 20 and receives captured images of the food ingredient F from the camera 20. The communication unit 120 is also communicatively connected to the food ingredient sensor 22 and receives sensing data of the food ingredient F from the food ingredient sensor 22. The communication unit 120 is also communicatively connected to various digital fabrication devices (the laser processing machine 24, the cooking robot 26, and the cutting machine 28) and transmits control signals for controlling various cooking operations of the various digital fabrication devices.
通信部120は、例えば有線/無線LAN（Local Area Network）、Wi-Fi（登録商標）、Bluetooth（登録商標）、携帯通信網（LTE（Long Term Evolution）、5G（第5世代の移動体通信方式））等を用いて、外部装置またはインターネットと通信接続してもよい。 The communication unit 120 may communicate with an external device or the Internet using, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), a mobile communication network (LTE (Long Term Evolution), 5G (fifth-generation mobile communication system)), etc.
(操作入力部130および表示部140)
操作入力部130は、ユーザによる操作入力を受け付け、入力情報を制御部110に出力する。操作入力部130は、マウス、キーボード、スイッチ、またはボタン等により実現されてもよい。また、表示部140は、各種操作画面や、後述する加工結果予測画像を表示する。表示部140は、液晶ディスプレイ(LCD:Liquid Crystal Display)、有機EL(Electro Luminescence)ディスプレイなどの表示パネルであってもよい。操作入力部130および表示部140は、一体化して設けられてもよい。操作入力部130は、表示部140(例えばパネルディスプレイ)に積層されるタッチセンサであってもよい。
(Operation Input Unit 130 and Display Unit 140)
The operation input unit 130 accepts operation input by the user and outputs the input information to the control unit 110. The operation input unit 130 may be realized by a mouse, a keyboard, a switch, a button, or the like. The display unit 140 displays various operation screens and a predicted image of a processed result, which will be described later. The display unit 140 may be a display panel such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display. The operation input unit 130 and the display unit 140 may be provided integrally. The operation input unit 130 may be a touch sensor stacked on the display unit 140 (for example, a panel display).
(制御部110)
制御部110は、演算処理装置および制御装置として機能し、各種プログラムに従って情報処理装置10内の動作全般を制御する。制御部110は、例えばCPU(Central Processing Unit)、GPU(Graphics Processing Unit)、マイクロプロセッサ等の電子回路によって実現される。また、制御部110は、使用するプログラムや演算パラメータ等を記憶するROM(Read Only Memory)、および適宜変化するパラメータ等を一時記憶するRAM(Random Access Memory)を含んでいてもよい。
(Control unit 110)
The control unit 110 functions as an arithmetic processing unit and a control unit, and controls the overall operation of the information processing device 10 in accordance with various programs. The control unit 110 is realized by electronic circuits such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, etc. The control unit 110 may also include a ROM (Read Only Memory) that stores programs to be used, arithmetic parameters, etc., and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
また、制御部110は、エージェント部111、素案生成部112、表示制御部113、調整部114、要素検討部115、統合部116、およびストーリー生成部117としても機能し得る。 The control unit 110 can also function as an agent unit 111, a draft generation unit 112, a display control unit 113, an adjustment unit 114, an element review unit 115, an integration unit 116, and a story generation unit 117.
エージェント部111は、ユーザとの対話が可能なエージェントとして機能し、ユーザとの対話から適切な情報を取得する。情報処理装置10へのユーザによる情報入力は、操作入力部130からのユーザ操作による入力の他、エージェント部111により提示されるエージェントとの対話による入力であってもよい。ユーザとエージェントとの対話は、テキストで行われてもよいし、音声により行われてもよい。 The agent unit 111 functions as an agent capable of interacting with the user, and acquires appropriate information from the interaction with the user. Information input by the user to the information processing device 10 may be input through user operation from the operation input unit 130, or may be input through interaction with an agent presented by the agent unit 111. The interaction between the user and the agent may be via text or voice.
より具体的には、エージェント部111は、大規模言語モデル(LLM:Large language Models)等を活用したAIエージェントであってもよい。これにより、ユーザは、自然言語入力を行い得る。 More specifically, the agent unit 111 may be an AI agent that utilizes large language models (LLMs), etc. This allows the user to input natural language.
エージェント部111は、ユーザとの対話から、創作する料理に関するユーザの指摘(要望)を取得する。例えば、ユーザは、新しく作りたい料理のコンセプトや、使いたい食材などについてエージェントと対話する。エージェント部111は、ユーザとの対話に基づいて、ユーザの指摘に応じたプロンプトを生成し、出力する。プロンプトの生成には、AIが用いられてもよい。より具体的には、プロンプト生成について学習した学習モデルが用いられてもよい。以下、エージェント部111から出力されるユーザの指摘に対応するプロンプトについて、指摘事項とも称する。 The agent unit 111 acquires the user's suggestions (requests) regarding the dish to be created from a dialogue with the user. For example, the user may interact with the agent about the concept of a new dish they want to make, the ingredients they want to use, etc. The agent unit 111 generates and outputs prompts in response to the user's suggestions based on the dialogue with the user. AI may be used to generate the prompts. More specifically, a learning model that has learned about prompt generation may be used. Hereinafter, the prompts that correspond to the user's suggestions and are output from the agent unit 111 are also referred to as suggestions.
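A minimal sketch of turning the dialogue into a suggestion prompt, under assumptions: the complete(...) callable below stands in for whichever language model is actually used, and the wrapper text is only an example of the exchange shape (dialogue in, structured suggestion out), not a disclosed prompt format.

```python
from typing import Callable, List

def build_suggestion_prompt(dialogue: List[str]) -> str:
    """Wrap the raw user/agent dialogue in an instruction asking for a structured suggestion."""
    joined = "\n".join(dialogue)
    return (
        "Extract the user's requests about the new dish (concept, ingredients, constraints) "
        "as a short instruction for a recipe generator.\n\nDialogue:\n" + joined
    )

def dialogue_to_suggestion(dialogue: List[str], complete: Callable[[str], str]) -> str:
    """Produce the suggestion text that is passed on to draft generation."""
    return complete(build_suggestion_prompt(dialogue))

if __name__ == "__main__":
    # A toy completion function so the sketch runs without any external model.
    fake_llm = lambda prompt: "Concept: autumn appetizer; ingredients: mushrooms; topping: fruit."
    talk = [
        "User: I want an autumn appetizer with mushrooms.",
        "Agent: Any topping preference?",
        "User: Fruit.",
    ]
    print(dialogue_to_suggestion(talk, fake_llm))
```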
素案生成部112は、料理素案(料理情報の一例)を生成する機能を有する。素案生成部112は、機械学習により生成した学習モデルを用いて、エージェント部111から入力されたユーザの指摘事項(プロンプト)に応じて、1以上の料理素案を生成する。料理素案には、料理のレシピデータが少なくとも含まれる。また、料理素案には、料理画像が含まれていてもよい。料理素案の生成について、以下図3を参照して詳細に説明する。 The recipe plan generation unit 112 has the function of generating a recipe plan (an example of recipe information). Using a learning model generated by machine learning, the recipe plan generation unit 112 generates one or more recipe plans in response to user suggestions (prompts) input from the agent unit 111. A recipe plan includes at least recipe data for the dish. The recipe plan may also include images of the dish. The generation of a recipe plan will be explained in detail below with reference to Figure 3.
図3は、本実施形態による素案生成部112の機能の詳細について説明するための図である。図3に示すように、素案生成部112は、食材ペアリング部510および料理素案生成部520を含む。 Figure 3 is a diagram for explaining the details of the function of the recipe plan generation unit 112 according to this embodiment. As shown in Figure 3, the recipe plan generation unit 112 includes an ingredient pairing unit 510 and a recipe plan generation unit 520.
素案生成部112には、エージェント部111がユーザUと対話し、ユーザの指摘事項40から生成したプロンプト41が入力される。ユーザの指摘事項40には、新たに作りたい料理のコンセプトや使用食材の情報が含まれる。料理のコンセプトとは、例えば、料理の印象、雰囲気、スタイル、ジャンル、テーマ、目的、重点項目等が想定される。また、料理のデザイン性や使用食材に関する注文も、コンセプトとして入力され得る。 The agent unit 111 interacts with the user U, and the prompt 41 generated from the user's suggestions 40 is input to the draft generation unit 112. The user's suggestions 40 include information about the concept of the new dish they want to make and the ingredients to be used. The concept of the dish may include, for example, the impression, atmosphere, style, genre, theme, purpose, and key points of the dish. In addition, requests regarding the design of the dish and the ingredients to be used may also be input as concepts.
下記表1に、コンセプトの一例を示す。コンセプトは、料理の全体的なイメージでもよいし、デザイン性や食材に関するものであってもよい。 Table 1 below shows some examples of concepts. The concept can be the overall image of the dish, or it can be related to the design or ingredients.
食材ペアリング部510は、ユーザの使いたい食材(使用食材)と組み合わせて使用することが適切な組み合わせ食材を出力する。例えば、食材ペアリング部510は、使用食材のデータを食材ペアリング学習モデルに入力し、使用食材と良く組み合わされる食材や、分子レベルで適合性がよい食材を出力する。食材ペアリング学習モデルは、機械学習により食材の組み合わせを学習して生成された学習モデルである。 The ingredient pairing unit 510 outputs combination ingredients that are suitable for use in combination with the ingredients the user wants to use (the ingredients used). For example, the ingredient pairing unit 510 inputs data on the ingredients used into an ingredient pairing learning model, and outputs ingredients that pair well with the ingredients used or that are compatible at the molecular level. The ingredient pairing learning model is a learning model generated by learning ingredient combinations using machine learning.
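The pairing step can be pictured as a top-k lookup over a learned compatibility score. The sketch below uses made-up example scores; pair_ingredients is a hypothetical stand-in for the ingredient pairing learning model, not its actual interface.

```python
from typing import Dict, List, Tuple

# Made-up compatibility scores (0..1); in the described system a learned model supplies these.
COMPATIBILITY: Dict[Tuple[str, str], float] = {
    ("mushroom", "thyme"): 0.92,
    ("mushroom", "cream"): 0.88,
    ("mushroom", "strawberry"): 0.15,
    ("shrimp", "lemon"): 0.90,
}

def pair_ingredients(used: str, top_k: int = 2) -> List[str]:
    """Return the top-k ingredients that combine well with the ingredient the user wants to use."""
    scored = [(b, s) for (a, b), s in COMPATIBILITY.items() if a == used]
    scored.sort(key=lambda x: x[1], reverse=True)
    return [name for name, _ in scored[:top_k]]

if __name__ == "__main__":
    print(pair_ingredients("mushroom"))   # ['thyme', 'cream']
```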
料理素案生成部520は、コンセプトと、使用食材と、組み合わせ食材と、を入力データとして料理学習モデルに入力し、料理情報の一案である料理素案42を1以上出力する。料理学習モデルは、機械学習により料理について予め学習したモデルである。料理学習モデルは、各種料理の画像(撮像画像)、対応するレシピデータ等を学習し得る。学習するレシピデータには、料理名、材料、調理工程、使用調理器具、調理時間、およびタグ(例えば、レシピカテゴリーとして、料理の目的、ジャンル、適した季節、適した場面、ターゲット層、その他キーワード等の属性情報)等が含まれる。 The recipe draft generation unit 520 inputs the concept, ingredients used, and ingredient combinations into the cooking learning model as input data, and outputs one or more recipe drafts 42, which are recipe information proposals. The cooking learning model is a model that has learned about cooking in advance through machine learning. The cooking learning model can learn images (captured images) of various dishes, corresponding recipe data, etc. The recipe data to be learned includes the name of the dish, ingredients, cooking steps, cooking utensils used, cooking time, and tags (for example, attribute information such as recipe category, purpose of cooking, genre, suitable season, suitable occasion, target demographic, and other keywords).
タグ付けされるレシピカテゴリーの具体例を挙げる。「適した季節」としては、例えば、春夏秋冬、正月、花見、月見、クリスマス等が挙げられる。「適した場面」としては、例えば、おもてなし、パーティー、おつまみ、夜食、ワインに合う、記念日等が挙げられる。「ターゲット層」としては、例えば、アスリート向け、子供向け、大人向け、老人向け等が挙げられる。「その他キーワード」としては、例えば、華やか、彩り、わいわい、滋養、減塩、消化によい、ヘルシー、リラックス、気分転換、簡単、時短、懐かしい、イタリアン、中華等が挙げられる。料理学習モデルから出力される料理素案は、レシピデータと、対応する料理画像を含む。料理学習モデルは、レシピデータを生成するレシピ生成AIと、レシピデータに対応する料理画像を生成する画像生成AIを含み得る。 Specific examples of recipe categories that can be tagged include: "Suitable seasons" include, for example, spring, summer, autumn, winter, New Year's, cherry blossom viewing, moon viewing, and Christmas. "Suitable occasions" include, for example, entertaining, parties, snacks, late-night snacks, goes well with wine, and anniversaries. "Target demographics" include, for example, athletes, children, adults, and the elderly. "Other keywords" include, for example, gorgeous, colorful, lively, nutritious, low-salt, easy to digest, healthy, relaxing, mood changer, easy, time-saving, nostalgic, Italian, and Chinese. The dish draft output from the cooking learning model includes recipe data and corresponding dish images. The cooking learning model may include a recipe generation AI that generates recipe data and an image generation AI that generates dish images corresponding to the recipe data.
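For illustration, the recipe data and tags listed above might be held in a structure like the following; the field names are assumptions derived from the items named in this paragraph (dish name, ingredients, cooking steps, utensils, cooking time, category tags, and a dish image), not a defined data format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class RecipeDraft:
    dish_name: str
    ingredients: List[str]
    steps: List[str]                                  # cooking steps, one element each
    utensils: List[str]
    cooking_minutes: int
    tags: List[str] = field(default_factory=list)     # e.g. season, occasion, target, keywords
    image_path: str = ""                              # dish image produced by the image generation AI

example = RecipeDraft(
    dish_name="Mushroom Farci",
    ingredients=["mushroom", "minced pork", "thyme"],
    steps=["prepare caps", "mix filling", "fill", "bake", "plate"],
    utensils=["oven", "piping bag"],
    cooking_minutes=45,
    tags=["autumn", "entertaining", "gorgeous"],
)

if __name__ == "__main__":
    print(example)
```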
料理素案生成部520が使用する料理学習モデルには、ユーザの料理を学習したユーザ料理学習モデル521と、他のユーザ(例えば他の調理人)の料理を学習した他ユーザ料理学習モデル522が含まれる。他ユーザ料理学習モデル522は、複数あってもよい。例えば、フランス料理の調理人の料理を学習した他ユーザ料理学習モデルや、中華料理の調理人の料理を学習した他ユーザ料理学習モデル等があってもよい。また、他ユーザ料理学習モデル522は、料理のジャンルに関わらず多数の調理人の料理を学習した他ユーザ料理学習モデルであってもよい。 The cooking learning models used by the recipe draft generation unit 520 include a user cooking learning model 521 that has learned the user's cooking, and an other user cooking learning model 522 that has learned the cooking of other users (e.g., other chefs). There may be multiple other user cooking learning models 522. For example, there may be an other user cooking learning model that has learned the cooking of a French chef, or an other user cooking learning model that has learned the cooking of a Chinese chef. The other user cooking learning model 522 may also be an other user cooking learning model that has learned the cooking of many chefs regardless of cooking genre.
料理素案生成部520は、ユーザ料理学習モデル521を用いて料理素案を出力してもよいし、他ユーザ料理学習モデル522を用いて料理素案を出力したり、ユーザ料理学習モデル521および他ユーザ料理学習モデル522を用いて料理素案を出力したりしてもよい。なお、ユーザ料理学習モデル521および他ユーザ料理学習モデル522は、記憶部150に格納され得る。図3では、説明の都合上、料理素案生成部520が使用する料理学習モデルを料理素案生成部520の構成として図示しているが、料理素案生成部520が、記憶部150に記憶されている各料理学習モデルを呼び出して使用していることを想定する。 The recipe draft generation unit 520 may output a recipe draft using the user's cooking learning model 521, or may output a recipe draft using the other user's cooking learning model 522, or may output a recipe draft using the user's cooking learning model 521 and the other user's cooking learning model 522. The user's cooking learning model 521 and the other user's cooking learning model 522 may be stored in the memory unit 150. For ease of explanation, Figure 3 illustrates the cooking learning models used by the recipe draft generation unit 520 as the configuration of the recipe draft generation unit 520, but it is assumed that the recipe draft generation unit 520 calls up and uses each cooking learning model stored in the memory unit 150.
また、料理素案生成部520は、料理情報の生成において、適宜外部のデータベースと連携し、外部情報(例えば料理に関する情報、食材の情報、入力されたコンセプトに関連する情報、様々なデザイン等)を取得してもよい。また、料理素案生成部520は、デザイン生成AIを用いて、コンセプトに応じたデザインを取得し、料理学習モデルにデザインデータを入力し、そのデザインに基づいた装飾が施された料理画像を出力させてもよい。 Furthermore, when generating dish information, the recipe draft generation unit 520 may link with an external database as appropriate to acquire external information (for example, information about the dish, information about ingredients, information related to the input concept, various designs, etc.). Furthermore, the recipe draft generation unit 520 may use design generation AI to acquire a design based on the concept, input the design data into the cooking learning model, and output a dish image decorated based on that design.
表示制御部113は、表示部140への画像の表示制御を行う。表示制御部113は、AIエージェントとの対話を促す画面等の各種入力画面や、素案生成部112により生成された料理素案の提示画面、後述する統合部116により生成された新たな料理情報(調理加工統合情報)を提示する提示画面、後述するストーリー生成部117により生成された新たな料理の開発ストーリーを提示する提示画面等を表示部140に表示する制御を行う。 The display control unit 113 controls the display of images on the display unit 140. The display control unit 113 controls the display of various input screens on the display unit 140, such as a screen that prompts dialogue with the AI agent, a screen that presents a recipe draft generated by the draft generation unit 112, a screen that presents new recipe information (integrated cooking and processing information) generated by the integration unit 116 (described below), and a screen that presents a new recipe development story generated by the story generation unit 117 (described below).
以下に説明する調整部114、要素検討部115、および統合部116については、図4も参照しながら説明する。図4は、本実施形態による料理情報の生成について説明するための図である。 The adjustment unit 114, element review unit 115, and integration unit 116 will be explained below with reference to Figure 4. Figure 4 is a diagram for explaining the generation of cooking information according to this embodiment.
調整部114は、素案生成部112により生成された料理素案42について調整を行い、調整した料理素案を要素検討部115に出力する。調整部114には、素案生成部112により生成された料理素案42のうち自動選択された1以上の料理素案が入力されてもよいし、ユーザにより選択された1以上の料理素案が入力されてもよい。自動選択の方法としては、例えば、調整部114は、ある基準(例えばユーザの料理の特徴に近いか)で各料理素案にスコアをつけて、スコアが高い料理素案を選択するようにしてもよい。また、ユーザが選択する際にも、各種の基準に応じたスコアに基づいて料理素案がソートできるようにして、ユーザが選択し易いようにしてもよい。 The adjustment unit 114 adjusts the recipe drafts 42 generated by the draft generation unit 112 and outputs the adjusted recipe drafts to the element review unit 115. The adjustment unit 114 may be input with one or more recipe drafts automatically selected from the recipe drafts 42 generated by the draft generation unit 112, or with one or more recipe drafts selected by the user. As a method of automatic selection, for example, the adjustment unit 114 may assign a score to each recipe draft based on a certain criterion (for example, whether it is close to the characteristics of the user's dish) and select the recipe draft with the highest score. Furthermore, when the user makes a selection, the recipe drafts may be sorted based on scores according to various criteria, making it easier for the user to select.
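The automatic selection mentioned above can be sketched as scoring each draft against the user's typical dishes. The similarity function below is a deliberately crude stand-in (ingredient overlap); in the described system the criterion would come from the user cooking learning model.

```python
from typing import Dict, List, Set

def similarity_to_user_style(draft_ingredients: Set[str], user_favorites: Set[str]) -> float:
    """Crude stand-in: fraction of the draft's ingredients that also appear in the user's past dishes."""
    if not draft_ingredients:
        return 0.0
    return len(draft_ingredients & user_favorites) / len(draft_ingredients)

def pick_draft(drafts: List[Dict], user_favorites: Set[str]) -> Dict:
    """Score every draft and return the highest-scoring one (automatic selection)."""
    return max(drafts, key=lambda d: similarity_to_user_style(set(d["ingredients"]), user_favorites))

if __name__ == "__main__":
    drafts = [
        {"name": "A", "ingredients": ["mushroom", "cream", "thyme"]},
        {"name": "B", "ingredients": ["shrimp", "lemon"]},
    ]
    print(pick_draft(drafts, user_favorites={"mushroom", "thyme", "butter"})["name"])   # A
```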
図4に示す例では、ユーザにより選択された料理素案42Aが調整部114に入力される。料理素案42Aには、レシピデータ421Aと料理画像422Aが含まれる。 In the example shown in FIG. 4, a recipe draft 42A selected by the user is input to the adjustment unit 114. The recipe draft 42A includes recipe data 421A and a dish image 422A.
調整部114には、ユーザとエージェントとの対話から取得された、ユーザからの新たな指摘事項(に基づくプロンプト。単に指摘事項とも称する。)が、エージェント部111から入力されてもよい。調整部114は、エージェント部111から入力されたユーザの指摘事項に応じて、料理素案を調整する。この際、調整部114は、料理学習モデル(ユーザ料理学習モデル、他ユーザ料理学習モデル)、食材ペアリング学習モデル、画像生成AI等を用いてもよい。料理学習モデルは記憶部150に記憶され、調整部114は必要に応じて記憶部150から料理学習モデルを呼び出して使用し得る。また、かかる料理学習モデルは、素案生成部112の料理素案生成部520で使用される料理学習モデルと同じものを想定する。 The adjustment unit 114 may receive new user suggestions (prompts based on these suggestions, also simply referred to as suggestions) obtained from the dialogue between the user and the agent from the agent unit 111. The adjustment unit 114 adjusts the recipe draft in accordance with the user suggestions input from the agent unit 111. In doing so, the adjustment unit 114 may use a cooking learning model (user cooking learning model, other user cooking learning model), an ingredient pairing learning model, image generation AI, etc. The cooking learning model is stored in the memory unit 150, and the adjustment unit 114 can call up and use the cooking learning model from the memory unit 150 as needed. Furthermore, it is assumed that such a cooking learning model is the same as the cooking learning model used in the recipe draft generation unit 520 of the draft generation unit 112.
また、調整部114は、ユーザの調理環境情報に応じて料理素案を調整する環境調整部としても機能する。また、調整部114は、必要に応じて外部のデータベース等から外部情報を取得し、外部情報を料理素案の調整に用いてもよい。 The adjustment unit 114 also functions as an environment adjustment unit that adjusts the recipe draft in accordance with the user's cooking environment information. The adjustment unit 114 may also acquire external information from an external database, etc., as needed, and use the external information to adjust the recipe draft.
調整部114による料理素案の調整とは、具体的には、使用食材の差し替えや、新たな組み合わせ食材の取得、使用調理器具の変更、および料理画像(料理外観)の修正等が想定される。 Specific examples of adjustments to the recipe draft by the adjustment unit 114 include replacing ingredients used, obtaining new ingredient combinations, changing the cooking utensils used, and modifying the dish image (appearance of the dish).
例えば、ユーザが料理素案を選択した際、「これいいね。でもトッピングはフルーツの方がいいかな」とエージェントと会話した場合、エージェント部111により、ユーザとの会話に基づく指摘事項として「トッピングをフルーツに変更」といった情報が入力され、調整部114は、トッピングの食材をフルーツに変更するようレシピデータを調整する。 For example, if a user selects a recipe draft and converses with the agent saying, "This is good, but maybe fruit would be better as a topping," the agent unit 111 will input information such as "change the topping to fruit" as a suggestion based on the conversation with the user, and the adjustment unit 114 will adjust the recipe data to change the topping ingredient to fruit.
また、調整部114は、ユーザの調理環境に合わせてレシピデータ内の調理器具の変更を行い得る。例えば、「じゃがいもを鍋で10分蒸す」というレシピデータに対し、ユーザの調理環境で一般的に行われている方法「電子レンジ」を使用するレシピデータ、具体的には、「じゃがいもを電子レンジで600W3分蒸す」というレシピデータに修正する。調整部114は、調理器具を鍋から電子レンジに変更した際、電子レンジの適切な設定情報(レンジ出力、時間)を、外部のデータベース等から取得した外部情報(茹で時間と電子レンジでの加熱時間との対応表等)に基づいて生成してもよい。 The adjustment unit 114 may also change the cooking utensils in the recipe data to suit the user's cooking environment. For example, recipe data that says "steam potatoes in a pot for 10 minutes" may be modified to recipe data that uses a "microwave oven," a method commonly used in the user's cooking environment, specifically, recipe data that says "steam potatoes in a microwave oven at 600W for 3 minutes." When the cooking utensil is changed from a pot to a microwave oven, the adjustment unit 114 may generate appropriate setting information for the microwave oven (microwave output, time) based on external information (such as a correspondence table between boiling time and microwave heating time) obtained from an external database, etc.
このようにして、調整部114により、料理素案が、よりユーザの意図や環境に近付くよう調整される。 In this way, the adjustment unit 114 adjusts the recipe plan to better match the user's intentions and environment.
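The pot-to-microwave adjustment described above amounts to a lookup in a correspondence table plus a rewrite of the step text. The table values below are illustrative only (they merely reproduce the 10-minute example from this description); a real system would take them from an external database as stated.

```python
from typing import Dict, Tuple

# Illustrative correspondence: steaming time in a pot -> microwave setting (watts, minutes).
POT_TO_MICROWAVE: Dict[int, Tuple[int, int]] = {
    5: (600, 2),
    10: (600, 3),
    15: (600, 5),
}

def convert_steaming_step(ingredient: str, pot_minutes: int) -> str:
    """Rewrite a 'steam in a pot' step as a microwave step using the correspondence table."""
    watts, minutes = POT_TO_MICROWAVE.get(pot_minutes, (600, max(1, pot_minutes // 3)))
    return f"Steam {ingredient} in a microwave at {watts} W for {minutes} minutes."

if __name__ == "__main__":
    print(convert_steaming_step("potatoes", 10))
    # Steam potatoes in a microwave at 600 W for 3 minutes.
```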
要素検討部115は、調整部114から出力された料理素案について、実現可能性を高めるため、料理素案に含まれるレシピデータの調理工程に応じてレシピデータを各要素に分けて検討する機能を有する。すなわち、料理素案の段階では、本当にこのレシピで現実的に料理が作れるのか、料理画像に示されるような料理外観に仕立てることは現実的に可能であるのか、について確証がない。料理素案には、調理工程を示すレシピデータと、料理の外観を示す料理画像が含まれるが、例えば画像生成AIにより生成された料理画像で示される料理の外観(例えば、器への盛り付け、装飾)が現実的ではない場合もある。 The element review unit 115 has the function of dividing the recipe data included in the recipe draft output from the adjustment unit 114 into elements and reviewing them according to the cooking steps of the recipe data, in order to increase the feasibility of the draft dish. In other words, at the draft dish stage, there is no certainty as to whether the dish can actually be made using this recipe, or whether it is realistically possible to create the appearance of the dish shown in the dish image. A draft dish includes recipe data showing the cooking steps and a dish image showing the appearance of the dish, but there are cases where the appearance of the dish shown in a dish image generated by image generation AI (for example, presentation on a dish, decoration) is not realistic.
そこで、要素検討部115は、要素毎に実現可能性の検討、具体的には、使用予定の調理ロボット等の制御最適化や、バリエーションの生成(要素の拡張)を行い、要素毎に、実現可能性のある、すなわちその調理工程が成功する調理加工情報を生成する。要素検討では、実際に調理ロボットで調理を行ってその結果を分析したり、シミュレーションにより調理が行われたりする。要素検討部115は、要素毎に1以上の調理加工情報を生成し得る。 The element review unit 115 therefore reviews the feasibility of each element, specifically optimizing the control of the cooking robot to be used and generating variations (expanding the elements), and generates cooking and processing information for each element that is feasible, i.e., that will result in the cooking process being successful. Element review involves actually cooking with a cooking robot and analyzing the results, or cooking is performed through simulation. The element review unit 115 can generate one or more pieces of cooking and processing information for each element.
成功結果が得られない要素(調理工程)がある場合、要素検討部115は、調整部114に対して調整要求を行う。この場合、調整部114においてレシピデータの調整(例えば、使用食材の変更、使用調理器具の変更、料理外観の変更等)が行われ、再度、要素検討部115で要素検討が行われる。 If there is an element (cooking step) that does not produce a successful result, the element review unit 115 requests adjustment from the adjustment unit 114. In this case, the adjustment unit 114 adjusts the recipe data (for example, changes to the ingredients used, the cooking utensils used, or the appearance of the dish), and the element review unit 115 reviews the element again.
図4に示すように、要素検討部115は、調整部114により調整された料理素案のレシピデータ421A’を要素1~3・・・に分けて、各要素の検討の過程でバリエーションを生成する。 As shown in Figure 4, the element review unit 115 divides the recipe data 421A' of the dish draft adjusted by the adjustment unit 114 into elements 1 to 3, etc., and generates variations in the process of reviewing each element.
統合部116は、成功した各要素(調理加工情報)を組み合わせて新たな料理情報(調理加工統合情報)を生成する。上述したように、分解されたレシピデータの各要素(調理工程)について、各々制御最適化やバリエーションの生成が行われ、要素毎に1以上の成功派生要素が生成され得る。統合部116は、要素毎の成功派生要素を適宜組み合わせて、複数パターンの新レシピデータを、調理加工統合情報として生成し得る。この際、統合部116は、全体の調理時間など、条件を満たす新レシピデータを生成し得る。また、統合部116は、調理加工統合情報として、各新レシピデータに対応する料理画像も生成する。 The integration unit 116 combines each successful element (cooking processing information) to generate new cooking information (integrated cooking processing information). As described above, control optimization and variation generation are performed for each element (cooking process) of the decomposed recipe data, and one or more successful derived elements can be generated for each element. The integration unit 116 can appropriately combine the successful derived elements for each element to generate multiple patterns of new recipe data as integrated cooking processing information. In this case, the integration unit 116 can generate new recipe data that meets conditions such as overall cooking time. The integration unit 116 also generates dish images corresponding to each new recipe data as integrated cooking processing information.
図4に示すように、統合部116は、各要素のバリエーションを適宜組み合わせて、料理A’1(統合1)~料理A’3(統合3)・・・といった新たな料理情報を生成する。 As shown in Figure 4, the integration unit 116 appropriately combines variations of each element to generate new dish information such as dish A'1 (integration 1) to dish A'3 (integration 3), etc.
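Combining one successful variant per element under a whole-dish constraint such as total cooking time can be sketched as a small search. The Variant structure and the exhaustive enumeration below are illustrative assumptions; the integration unit 116 is not limited to this approach.

```python
from dataclasses import dataclass
from itertools import product
from typing import Dict, List

@dataclass
class Variant:
    element: str
    label: str
    minutes: float   # time this processing variant takes

def integrate(variants_per_element: Dict[str, List[Variant]],
              max_total_minutes: float) -> List[List[Variant]]:
    """Enumerate combinations (one variant per element) whose total time meets the constraint."""
    combos = []
    for combo in product(*variants_per_element.values()):
        if sum(v.minutes for v in combo) <= max_total_minutes:
            combos.append(list(combo))
    return combos

if __name__ == "__main__":
    variants = {
        "filling": [Variant("filling", "pattern-1", 20), Variant("filling", "pattern-3", 12)],
        "plating": [Variant("plating", "robot", 5), Variant("plating", "manual", 8)],
    }
    for combo in integrate(variants, max_total_minutes=20):
        print([v.label for v in combo])
```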
統合部116により生成された1以上の調理加工統合情報は、表示制御部113に出力され表示部140に表示されることで、ユーザに提示される。調理加工統合情報が表示される際、統合に用いられた要素(調理加工情報)の他、統合に用いられなかった要素も候補として提示され得る。ユーザは、提示された調理加工統合情報についてエージェントと対話を行い、さらに指摘事項を入力してもよい。例えば、統合に用いる要素を候補として挙げられている他の要素に変更する指摘が入力されてもよい。また、食材の変更またはコンセプトの変更等が入力されてもよい。 The one or more pieces of integrated cooking and processing information generated by the integration unit 116 are output to the display control unit 113 and displayed on the display unit 140, thereby being presented to the user. When the integrated cooking and processing information is displayed, in addition to the elements (cooking and processing information) used in the integration, elements not used in the integration may also be presented as candidates. The user may interact with the agent about the presented integrated cooking and processing information and may further input suggestions. For example, a suggestion to change the elements used in the integration to other elements listed as candidates may be input. A change in ingredients or concept, etc. may also be input.
エージェント部111は、新たな指摘事項を、調整部114、要素検討部115、または統合部116に入力し、適宜調理加工統合情報の修正を行わせる。 The agent unit 111 inputs the new findings into the adjustment unit 114, element review unit 115, or integration unit 116, causing them to revise the integrated cooking and processing information as appropriate.
新たな料理の創作過程においてユーザの要望が当初の内容から変化する場合も想定される。本実施形態では、料理情報生成の開始時のみならず、料理素案を選択する際や、調理加工統合情報が提示された際など、創作過程において、適宜、新たにユーザが指摘事項を入力し得る。これにより、システムがユーザと協業してユーザの意図により近い創作料理を提案し、ユーザの料理創作を支援することが可能となる。 It is also possible that the user's requests may change from their original content during the process of creating a new dish. In this embodiment, the user can input new suggestions as needed during the creation process, not only at the start of dish information generation, but also when selecting a dish draft or when integrated cooking and processing information is presented. This allows the system to work together with the user to suggest creative dishes that are closer to the user's intentions, and support the user's dish creation.
ストーリー生成部117は、創作過程において適宜記憶部150に保存された情報に基づいて、創作料理の開発ストーリーを生成する。記憶部150に保存された情報には、ユーザのエージェントとの対話(ユーザの指摘事項)、最初に選択された料理素案、選択されなかった料理素案、要素検討における成功/失敗事例等が含まれる。開発ストーリーは、テキストおよび画像を用いて生成されてもよい。また、開発ストーリーは、テキストおよび画像が配置された静止画像であってもよいし、料理素案から創作料理に至るまでの料理画像の変化を示す動画であってもよい。生成された開発ストーリーは、表示制御部113に出力され、表示部140に表示されてユーザに提示される。開発ストーリーは、例えば広告等に活用されてもよい。 The story generation unit 117 generates a development story for the creative dish based on information stored in the storage unit 150 as appropriate during the creation process. The information stored in the storage unit 150 includes the user's dialogue with the agent (user's suggestions), the initially selected dish draft, the dish drafts that were not selected, and examples of success and failure in considering elements. The development story may be generated using text and images. The development story may also be a still image with text and images arranged in it, or a video showing the changes in the dish image from the dish draft to the creative dish. The generated development story is output to the display control unit 113 and displayed on the display unit 140 for presentation to the user. The development story may be used, for example, for advertising, etc.
以上、制御部110の機能構成について具体的に説明した。なお、制御部110は、上述した機能構成に限定されず、ツール連携機能を有する。具体的には、制御部110は、料理情報の生成において、適宜、従来の各種ツールと連携し得る。例えば、制御部110は、素案生成部112や要素検討部115においてデザインを生成する際に、3DCADと連携して生成してもよい。また、制御部110は、レーザー加工機24、調理ロボット26、および切削加工機28等のデジタルファブリケーション機器を制御するツールと連携して、これらの機器の制御を行ってもよい。また、制御部110は、画像生成AI等の各種AIツールと連携してもよい。 The functional configuration of the control unit 110 has been specifically described above. However, the control unit 110 is not limited to the functional configuration described above and has a tool linkage function. Specifically, the control unit 110 can link with various conventional tools as appropriate when generating cooking information. For example, the control unit 110 may link with 3D CAD when generating a design in the draft generation unit 112 or the element review unit 115. The control unit 110 may also link with tools that control digital fabrication equipment such as the laser processing machine 24, cooking robot 26, and cutting machine 28, and control these devices. The control unit 110 may also link with various AI tools, such as image generation AI.
(記憶部150)
記憶部150は、制御部110の処理に用いられるプログラムや演算パラメータ等を記憶するROM、および適宜変化するパラメータ等を一時記憶するRAMにより実現される。
(Storage unit 150)
The storage unit 150 is realized by a ROM that stores programs and calculation parameters used in the processing of the control unit 110, and a RAM that temporarily stores parameters that change as needed.
例えば、記憶部150には、素案生成部112や調整部114で用いられる料理学習モデルおよび食材ペアリング学習モデルが格納されていてもよい。また、記憶部150には、ユーザの基本情報(ユーザが登録したレシピデータ、調理環境情報など)が格納されていてもよい。また、記憶部150には、素案生成部112で生成された料理素案や、調整部114で調整された料理素案、要素検討部115における要素検討結果等、新たな料理の創作過程で出力された各種情報が格納されていてもよい。 For example, the storage unit 150 may store a cooking learning model and an ingredient pairing learning model used by the draft generation unit 112 and the adjustment unit 114. The storage unit 150 may also store basic information about the user (such as recipe data registered by the user and cooking environment information). The storage unit 150 may also store various information output during the process of creating a new dish, such as the dish draft generated by the draft generation unit 112, the dish draft adjusted by the adjustment unit 114, and the results of element review by the element review unit 115.
(音声入出力部160)
音声入出力部160は、音声を収音して音声データを制御部110に入力する音声入力部と、音声の出力を行う音声出力部を有する。音声入力部は、例えばマイクロホンにより実現され、ユーザ音声を収音する。また、音声出力部は、例えばスピーカーにより実現され、エージェント音声を出力する。
(Audio input/output unit 160)
The voice input/output unit 160 has a voice input unit that collects voice and inputs the voice data to the control unit 110, and a voice output unit that outputs voice. The voice input unit is realized by, for example, a microphone and collects user voice. The voice output unit is realized by, for example, a speaker and outputs agent voice.
以上、情報処理装置10の構成について具体的に説明した。なお、本開示による情報処理装置10の構成は図2に示す例に限定されない。例えば、情報処理装置10は、複数の装置により実現されてもよい。また、情報処理装置10の少なくとも一部の機能を、インターネット上のサーバで実現してもよい。例えば、制御部110の各機能構成を、サーバに設けてもよい。すなわち、情報処理装置10の各構成は、ユーザ端末とサーバから成る情報処理システムにより実現されてもよい。 The configuration of the information processing device 10 has been specifically described above. Note that the configuration of the information processing device 10 according to the present disclosure is not limited to the example shown in FIG. 2. For example, the information processing device 10 may be realized by multiple devices. Furthermore, at least some of the functions of the information processing device 10 may be realized by a server on the Internet. For example, each functional component of the control unit 110 may be provided on a server. In other words, each component of the information processing device 10 may be realized by an information processing system consisting of a user terminal and a server.
<<3.動作処理>>
続いて、本実施形態による料理情報生成システム1の動作処理について図5を参照して説明する。
<<3. Operation Processing>>
Next, the operation process of the cooking information generation system 1 according to this embodiment will be described with reference to FIG.
図5は、本実施形態による料理情報生成システム1の全体の流れの一例を示すフローチャートである。 Figure 5 is a flowchart showing an example of the overall flow of the cooking information generation system 1 according to this embodiment.
図5に示すように、まず、情報処理装置10は、ユーザ基本情報の入力処理を行う(ステップS103)。ユーザ基本情報の入力は、操作入力部130からユーザ操作により入力されてもよいし、エージェント部111により提示されるエージェントとユーザとの対話により入力されてもよい。 As shown in FIG. 5, first, the information processing device 10 performs an input process for user basic information (step S103). The user basic information may be input by user operation from the operation input unit 130, or may be input through interaction between the user and an agent presented by the agent unit 111.
次に、情報処理装置10は、エージェントとユーザとの対話から取得される、新たな創作料理に関するユーザの指摘事項に応じて、素案生成部112により1以上の料理素案を生成する(ステップS106)。 Next, the information processing device 10 generates one or more recipe drafts using the draft generation unit 112 in response to the user's suggestions regarding the new creative dish, which are obtained from the dialogue between the agent and the user (step S106).
次いで、情報処理装置10は、表示部140において、ユーザに1以上の料理素案を提示する(ステップS109)。 Next, the information processing device 10 presents one or more recipe ideas to the user on the display unit 140 (step S109).
次に、情報処理装置10は、ユーザによる料理素案の選択を受け付け、さらに、ユーザからの新たな指摘事項を取得する(ステップS112)。 Next, the information processing device 10 accepts the user's selection of a recipe draft and further acquires any new suggestions from the user (step S112).
次いで、情報処理装置10は、調整部114により、料理素案の調整処理を行う(ステップS115)。 Next, the information processing device 10 causes the adjustment unit 114 to perform adjustment processing on the recipe draft (step S115).
次に、情報処理装置10は、要素検討部115により、料理素案に含まれるレシピデータを調理工程に応じた要素に分けて、要素毎に実現可能性を検討する処理を行う(ステップS118)。具体的には、要素毎に、1以上の調理加工情報の生成が行われる。 Next, the information processing device 10, using the element review unit 115, divides the recipe data included in the recipe draft into elements corresponding to the cooking process, and performs a process to review the feasibility of each element (step S118). Specifically, one or more pieces of cooking and processing information are generated for each element.
次いで、情報処理装置10は、統合部116により、生成した各要素(各調理加工情報)を統合する統合処理を行い(ステップS121)、統合処理により生成された調理加工統合情報(新たな料理情報)を表示部140に表示してユーザに提示する処理を行う(ステップS124)。 Next, the information processing device 10 performs an integration process using the integration unit 116 to integrate the generated elements (each piece of cooking and processing information) (step S121), and then displays the integrated cooking and processing information (new cooking information) generated by the integration process on the display unit 140 to present it to the user (step S124).
次に、提示した調理加工統合情報についてユーザから新たな指摘事項の入力があった場合(ステップS127/Yes)、修正要と判断し、上記ステップS115(調整処理)、ステップS118(要素検討処理)、またはステップS121(統合処理)に適宜戻り、修正が行われる。いずれの段階から修正するかは、新たな指摘事項の内容に応じて制御部110により適宜判断される。 Next, if the user inputs new comments about the presented integrated cooking and processing information (step S127/Yes), it is determined that corrections are required, and the process returns to step S115 (adjustment processing), step S118 (element review processing), or step S121 (integration processing) as appropriate, and corrections are made. The control unit 110 determines at which stage to start corrections depending on the content of the new comments.
そして、情報処理装置10は、ユーザからの指摘事項がない場合(ステップS127/No)、提示した調理加工統合情報を登録する処理を行う(ステップS130)。情報処理装置10は、提示した複数の調理加工統合情報からユーザにより選択された調理加工統合情報を登録するようにしてもよい。 If there are no user-specified items (step S127/No), the information processing device 10 performs a process of registering the presented integrated cooking and processing information (step S130). The information processing device 10 may also register the integrated cooking and processing information selected by the user from the multiple pieces of presented integrated cooking and processing information.
なお、図5に示す動作処理は一例であって、必ずしも図5に示す全てのステップが図5に示す順で行われなくともよい。例えば、情報処理装置10は、ステップS109およびS112をスキップし、制御部110により料理素案を任意に選択するようにしてもよい。また、情報処理装置10は、ステップS115に示す調整処理の後に、調整した料理素案をユーザに提示し、新たな指摘事項の入力があった場合は再度調整処理を行うようにしてもよい。また、情報処理装置10は、S118に示す要素検討処理の後に、検討結果をユーザに提示し、新たな指摘事項の入力があった場合は再度要素検討処理を行うようにしてもよい。また、情報処理装置10は、ステップS121の前に、複数の調理加工情報(または加工結果)を候補として提示してユーザの選択を受け付け、ステップS121では、ユーザの選択に応じて選択された調理加工情報を含むよう統合を行ってもよい。 Note that the operational processing shown in FIG. 5 is an example, and all steps shown in FIG. 5 do not necessarily have to be performed in the order shown in FIG. 5. For example, the information processing device 10 may skip steps S109 and S112 and have the control unit 110 arbitrarily select a recipe draft. Furthermore, after the adjustment processing shown in step S115, the information processing device 10 may present the adjusted recipe draft to the user, and perform the adjustment processing again if new suggestions are input. Furthermore, after the element review processing shown in S118, the information processing device 10 may present the review results to the user, and perform the element review processing again if new suggestions are input. Furthermore, before step S121, the information processing device 10 may present multiple cooking processing information (or processing results) as candidates and accept the user's selection, and in step S121, perform integration to include the cooking processing information selected in accordance with the user's selection.
以下、図5に示す各処理について詳細に説明する。 The following describes each process shown in Figure 5 in detail.
<3-1.基本情報の入力処理>
図6は、本実施形態による基本情報入力処理の流れの一例を示すフローチャートである。本実施形態による料理情報生成システム1では、本システム(ユーザの新しい料理の創作を支援するAIツール)を起動した際に、まず、ユーザの基本情報が入力済みか否かを確認し、必要な基本情報が入力されていない場合は、以下に説明する基本情報の入力処理を実行する。なお、本システムは、例えば情報処理装置10にインストールされているアプリケーションにより実行され得る。また、ユーザの調理環境には、本システムにより制御可能な各種調理機器(例えば、レーザー加工機24、切削加工機28、加熱装置(鍋、電子レンジ、フライパン等))、カメラ20、食材センサ22、ロボット(例えば、調理ロボット26)等が備わっている。
<3-1. Input process of basic information>
FIG. 6 is a flowchart showing an example of the flow of the basic information input process according to this embodiment. When the cooking information generation system 1 according to this embodiment (an AI tool that assists users in creating new dishes) is launched, it first checks whether the user's basic information has already been entered. If the necessary basic information has not been entered, the system executes the basic information input process described below. This system may be executed, for example, by an application installed on an information processing device 10. The user's cooking environment may also include various cooking appliances (e.g., a laser processing machine 24, a cutting machine 28, a heating device (e.g., a pot, a microwave, a frying pan, etc.)), a camera 20, an ingredient sensor 22, a robot (e.g., a cooking robot 26), etc. that can be controlled by this system.
図6に示すように、情報処理装置10において、表示制御部113は、ユーザの料理に関する情報(基本情報)の入力を促す入力誘導画面を表示部140に提示し、ユーザによる基本情報の入力を受け付ける(ステップS203)。基本情報の入力は、エージェント部111により提示されるエージェントとの対話により行われてもよい。例えばエージェント部111は、表示制御部113を介して表示部140にエージェントを表示させ、音声入出力部160を介してユーザとの対話を開始する制御を行う。 As shown in FIG. 6, in the information processing device 10, the display control unit 113 presents an input prompting screen on the display unit 140 that prompts the user to input information about the dish (basic information), and accepts input of the basic information by the user (step S203). The basic information may be input through dialogue with an agent presented by the agent unit 111. For example, the agent unit 111 displays the agent on the display unit 140 via the display control unit 113, and performs control to start dialogue with the user via the voice input/output unit 160.
必要なユーザの基本情報とは、例えば、ユーザの得意料理、専門の料理(フランス料理等)、過去の料理情報(レシピデータ、料理画像(撮像画像))、調理環境の情報(主に使用する器具、特別な調理器具等)である。過去の料理情報に含まれるレシピや料理画像には、タグとしてレシピカテゴリー(クリスマス向けといった料理の目的や、アスリート向けといったターゲット層の情報等)が付加されていてもよい。過去の料理情報は、例えばユーザが利用しているレシピ管理システムから取得されてもよいし、ユーザによって指定されたデータベースから取得されてもよい。 The necessary basic user information includes, for example, the user's favorite dishes, specialty dishes (French cuisine, etc.), past cooking information (recipe data, cooking images (captured images)), and cooking environment information (main utensils used, special cooking utensils, etc.). Recipes and cooking images included in past cooking information may be tagged with recipe categories (such as the purpose of the cooking, such as for Christmas, or information about the target demographic, such as for athletes). Past cooking information may be obtained, for example, from a recipe management system used by the user, or from a database specified by the user.
Next, the control unit 110 stores the entered basic user information in the storage unit 150 (step S206).
Next, the control unit 110 begins the process of learning the user's cooking information. First, the control unit 110 prepares the data to be used for learning. Specifically, based on the input dish images and recipe data, the control unit 110 estimates the contents of the dishes (step S209) and the tableware used (step S212).
FIG. 7 is a diagram for explaining the estimation of dish contents and the like based on the user's cooking information according to this embodiment. As shown in FIG. 7, the input user's cooking information 44 includes a dish image 441 and recipe data 442. The control unit 110 analyzes the dish image 441 to perform image recognition (recognition of the objects shown in the image) and estimates the dish contents by referring to the recipe data. Specifically, the control unit 110 associates each dish item shown in the dish image 441 with the recipe data. In the example shown in FIG. 7, association information 45 with the recipe data can be obtained for each image-recognized dish item, such as "Recipe 2: Fried shrimp," "Recipe 3: Meat," and "Recipe 4: Side dish." The control unit 110 also estimates the tableware used (that is, the tableware owned by the user), for example "round plate, medium size," from the image-recognized tableware portions.
Next, the control unit 110 learns the user's cooking, using as input data (learning data) the user's cooking information (recipe data and dish images) and the estimation results obtained from that information (the dish contents and tableware used described above), and generates a user cooking learning model 521 (step S215). Note that the input data is not limited to the above example. For example, the recipe data includes information such as the ingredients and cooking utensils used; if calorie information is not included, the control unit 110 may access an external food composition database, obtain calorie information as external information, and add it to the recipe data as input data. Also, as described above, some recipe data and dish images are tagged with recipe categories (purpose of the dish, genre, target demographic, etc.), so such recipe categories are also input and used for learning. The learning method is not particularly limited, but machine learning such as reinforcement learning is assumed. The generated user cooking learning model 521 is stored in the storage unit 150.
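As a non-limiting illustration of how such learning data might be assembled, the following Python sketch collects one recipe/image pair together with the estimation results into a single training record. The class and field names are assumptions introduced for this illustration and are not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingRecord:
    recipe_text: str                       # ingredients and cooking steps
    dish_image_path: str                   # captured image of the finished dish
    estimated_contents: list               # e.g. ["fried shrimp", "meat", "side dish"]
    estimated_tableware: list              # e.g. ["round plate, medium size"]
    recipe_category: Optional[str] = None  # tag such as "for Christmas" or "for athletes"
    calories: Optional[int] = None         # supplemented from an external database if missing

def build_training_record(recipe, image_path, contents, tableware, lookup_calories=None):
    """Combine one recipe/image pair with the estimation results of steps S209 and S212."""
    calories = recipe.get("calories")
    if calories is None and lookup_calories is not None:
        # Supplement missing nutrition data from an external food composition database (assumption).
        calories = lookup_calories(recipe["ingredients"])
    return TrainingRecord(
        recipe_text=recipe["text"],
        dish_image_path=image_path,
        estimated_contents=contents,
        estimated_tableware=tableware,
        recipe_category=recipe.get("category"),
        calories=calories,
    )
```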
In this way, by preparing a learning model that has learned the user's cooking, the draft generation unit 112 can generate, in response to input data (the user's suggestions (purpose of the dish, genre, ingredients, etc.)), a recipe draft that is based on the user's cooking (for example, in line with the characteristics of the user's cooking).
Although the generation of the user cooking learning model 521 has been described here, in this embodiment an other-user cooking learning model 522, generated by learning the cooking of many users (chefs, etc.), may also be prepared in advance. The learning data used for the other-user cooking learning model 522 is not particularly limited; it may be obtained, for example, from a recipe management system (application) used by many chefs, or from a database storing chefs' recipe collections.
<3-2. Recipe draft generation process>
FIG. 8 is a flowchart showing an example of the flow of the recipe draft generation process according to this embodiment.
As shown in FIG. 8, first, the agent unit 111 of the information processing device 10 inputs the user's suggestions, acquired through dialogue with the user, into the draft generation unit 112 (step S303). The user's suggestions are matters (requests) concerning the dish the user wants to make; in this embodiment, they are assumed to be the concept and the ingredients to be used. The agent unit 111 may input the suggestions (concept, ingredients to be used), obtained by analyzing the user's dialogue (content transcribed by speech recognition or text entered by the user) with natural language processing, directly into the draft generation unit 112 as a prompt, or may input a prompt generated by supplementing the information as appropriate.
The input data (prompt) given to the draft generation unit 112 may contain, for example, the purpose of the dish (Christmas dish, party dish, etc.), ingredients (potatoes, cheese, seafood, etc.), genre (appetizer, stew, Chinese food, etc.), nutrition (low calorie, high calorie, etc.), target demographic (children, athletes, etc.), cooking equipment to be used (oven cooking, 3D printer, etc.), and design (geometric design, gentle design, flashy design, etc.). The agent unit 111 may elicit and supplement details of the concept input by the user through dialogue, or may elicit and supplement necessary information with questions such as "What kind of customers are you expecting?" The agent unit 111 may also fill in missing information at random.
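As a non-limiting illustration, the following Python sketch shows one way such a prompt could be assembled from the user's suggestions, with missing fields filled in at random as described above. The field names and default values are assumptions introduced for this illustration.

```python
import random

# Hypothetical field names; the description above only lists the kinds of information a prompt may carry.
PROMPT_FIELDS = ["purpose", "ingredients", "genre", "nutrition", "target", "equipment", "design"]

DEFAULTS = {
    "purpose": ["Christmas dish", "party dish"],
    "genre": ["appetizer", "stew", "Chinese food"],
    "nutrition": ["low calorie", "high calorie"],
    "target": ["children", "athletes"],
    "equipment": ["oven", "3D printer"],
    "design": ["geometric", "gentle", "flashy"],
}

def build_prompt(suggestions):
    """Turn the user's suggestions into a prompt string, filling missing fields at random."""
    filled = dict(suggestions)
    for name in PROMPT_FIELDS:
        if name not in filled and name in DEFAULTS:
            filled[name] = random.choice(DEFAULTS[name])  # random supplement of missing information
    return ", ".join(f"{k}: {v}" for k, v in filled.items())

# Example: only the concept and ingredients came from the dialogue.
print(build_prompt({"purpose": "Christmas dish", "ingredients": "potato, cheese"}))
```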
Next, the draft generation unit 112 uses the ingredient pairing learning model in the ingredient pairing unit 510 to output ingredients (combined ingredients) that are suitable for combination with the ingredients specified by the user (included in the above suggestions) (step S306).
Next, based on the input suggestions and the combined ingredients, the draft generation unit 112 causes the recipe draft generation unit 520 to generate recipe drafts using the cooking learning models (step S309). As described above, the cooking learning models include the user cooking learning model and the other-user cooking learning model. The recipe draft generation unit 520 can generate recipe drafts with different novelty levels by using the cooking learning models as appropriate. This recipe draft generation process will be described later with reference to FIG. 12.
Note that the suggestions may contain many items, an item may contain multiple pieces of content, and there may be multiple combined ingredients. The draft generation unit 112 may generate many sets of input data by randomly combining the pieces of information contained in the suggestions and the combined ingredients. This makes it possible to generate recipe drafts corresponding to various combinations based on the user's suggestions and the like. Each generated recipe draft is assumed to include recipe data and a dish image. In the user learning model that has learned recipe data and dish images (captured images), a dish image corresponding to the recipe data can be generated. Alternatively, the dish image may be generated by an image-generation AI in accordance with the generated recipe data. For example, the draft generation unit 112 may generate a suitable prompt from the result of text analysis of the recipe data and input it into the image-generation AI to obtain the dish image.
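A non-limiting Python sketch of such random combination of suggestion items and combined ingredients into many input-data variations is shown below; the data shapes are assumptions for illustration.

```python
import itertools
import random

def enumerate_inputs(suggestions, paired_ingredients, max_inputs=20):
    """Randomly combine each suggestion value with each candidate combined ingredient
    to produce many input-data variations for the cooking learning models (sketch)."""
    keys = list(suggestions.keys())
    combos = []
    for values in itertools.product(*(suggestions[k] for k in keys)):
        for extra in paired_ingredients:
            combo = dict(zip(keys, values))
            combo["combined_ingredient"] = extra
            combos.append(combo)
    random.shuffle(combos)          # random selection among the possible combinations
    return combos[:max_inputs]

inputs = enumerate_inputs(
    {"purpose": ["Christmas dish"], "ingredients": ["potato", "seafood"],
     "design": ["geometric", "flashy"]},
    paired_ingredients=["cheese", "herbs"])
print(len(inputs), inputs[0])
```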
The generated recipe drafts are displayed on the display unit 140 by the display control unit 113 and presented to the user (step S312).
FIG. 9 is a diagram showing an example of a display screen on which recipe drafts are displayed according to this embodiment. As shown in FIG. 9, a large number of recipe drafts are displayed on the display screen 600. Here, the dish images included in the recipe drafts are displayed as thumbnails. When the user taps any dish image, the details of that recipe draft, that is, the contents of the recipe, are displayed.
FIGS. 10 and 11 are diagrams for explaining display screens that display the details of a recipe draft. When the user selects any dish image from the recipe draft list screen shown in FIG. 9, the screen switches to a recipe draft detail display screen 610a, as shown on the left of FIG. 10. The detail display screen 610a displays recipe data 611a, a dish image 612a, and a suggestion display area 613a. The recipe data 611a includes the ingredients (foodstuffs, seasonings), the cooking steps, and the like.
The user can review the presented recipe and dish image and, even at this point, make further suggestions about the recipe draft presented in response to the user's initial suggestions, thereby revising the recipe draft. An example of how suggestions are input is described below.
For example, as shown on the right of FIG. 10, when the user selects the fruit in the dish image 612b by a touch operation (for example, by circling it) and says, "I want to make this melon into a more spring-like fruit," the agent unit 111 recognizes this, and the recognized suggestion "make it a spring-like fruit" is displayed in the suggestion display area 613a. The agent unit 111 may also mark the part of the recipe corresponding to the part selected by the user (for example, "Decorate the cake with melon").
Next, as shown on the left of FIG. 11, when the user selects the plate in the dish image 612c by a touch operation and says, "I'd like to add more decoration around here to make it more gorgeous," the agent unit 111 recognizes this, and the recognized suggestion "add more decoration to make it more gorgeous" is displayed in the suggestion display area 613c. The agent unit 111 may also mark the part of the recipe corresponding to the part selected by the user (for example, "Place the cake on the plate").
Then, in the information processing device 10, the adjustment unit 114 adjusts the recipe draft in accordance with the newly input suggestions. Details of the adjustment will be described later with reference to FIG. 13. The right of FIG. 11 shows a display screen displaying the recipe draft adjusted in accordance with the suggestions input by the user. The display screen 610d displays recipe data 611d changed by the adjustment, a dish image 612d newly generated in accordance with the adjustment, and an explanation area 613d explaining the content of the adjustment. In the recipe data 611d, the changed parts may be marked.
(Generating recipe drafts with different novelty levels)
Next, the generation of recipe drafts using the cooking learning models in the draft generation unit 112 will be described in more detail. As described above, the draft generation unit 112 can generate recipe drafts by using the user cooking learning model and the other-user cooking learning model as appropriate. When the user cooking learning model is used, a recipe draft that is in line with (similar to) the characteristics of the user's cooking is expected to be generated, and when the other-user cooking learning model is used, a recipe draft that differs from (is not similar to) the characteristics of the user's cooking, that is, a recipe draft that is at least more novel to the user, is expected to be generated.
The draft generation unit 112 can control the generation ratio of recipe drafts with different novelty levels by using the cooking learning models as appropriate, in accordance with the acquired novelty level of the dish desired by the user.
FIG. 12 is a flowchart showing an example of the flow of the process for generating recipe drafts with different novelty levels according to this embodiment.
As shown in FIG. 12, first, the draft generation unit 112 checks the novelty level of the dish desired by the user (step S323). The novelty level of the dish desired by the user may be obtained by the agent unit 111 from a conversation with the user, or may be input by the user during the initial basic information input process.
Next, the draft generation unit 112 determines the ratio of the numbers of recipe drafts A to D to be generated in accordance with the novelty level (step S326). Recipe drafts A to D are drafts with different novelty levels, and the relationship is: novelty level a of recipe draft A < novelty level b of recipe draft B < novelty level c of recipe draft C < novelty level d of recipe draft D. The draft generation unit 112 performs control such that the higher the requested novelty level, the more recipe drafts with higher novelty levels are generated.
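As a non-limiting illustration, the following Python sketch maps a requested novelty level to generation ratios for recipe drafts A to D. The weighting scheme itself is an assumption for illustration; the embodiment only specifies that a higher requested novelty level should yield more high-novelty drafts.

```python
def draft_ratio(novelty_level):
    """Map a novelty level in [0, 1] to generation ratios for drafts A-D
    (A = most similar to the user's own cooking, D = most novel)."""
    anchors = (0.0, 1 / 3, 2 / 3, 1.0)          # one anchor per draft type (assumption)
    weights = [max(0.0, 1 - abs(novelty_level - a)) for a in anchors]
    total = sum(weights)
    return dict(zip("ABCD", (w / total for w in weights)))

def draft_counts(novelty_level, total_drafts=40):
    """Convert the ratios into concrete numbers of drafts to generate."""
    return {k: round(v * total_drafts) for k, v in draft_ratio(novelty_level).items()}

print(draft_counts(0.8))   # a high requested novelty level skews the counts toward drafts C and D
```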
Next, the draft generation unit 112 sets the information (prompt) about the concept and ingredients (including the combined ingredients) indicated by the user as input data for the cooking learning models (step S329).
Next, the draft generation unit 112 generates recipe draft A using the user cooking model (step S332).
The draft generation unit 112 also generates recipe draft B by using the other-user cooking model to change the style (here, the appearance or look of the dish) of a recipe draft generated using the user cooking model (step S335). In this specification, "changing the style of a dish" mainly means changing the plating, arrangement, decoration, and the like of the dish, that is, its appearance or look. More specifically, the dish image included in the recipe draft is changed. Even with the same recipe, the impression of the dish can differ if the plating, decoration, tableware used, and so on are different. The chef's characteristics also appear in such aspects of a dish's appearance, and by changing them using the other-user cooking model, it becomes possible to create a new dish (one with low similarity to the user's own cooking).
The draft generation unit 112 also generates recipe draft C by using the user cooking model to change the style of a recipe draft generated using the other-user cooking model (step S338).
The draft generation unit 112 also generates recipe draft D using the other-user cooking model (step S341).
In the generation of recipe drafts A to D described above, a large number of each of the recipe drafts A to D is generated. The draft generation unit 112 may generate recipe drafts A to D in accordance with the ratio determined in step S326 above.
Then, the draft generation unit 112 outputs recipe drafts A to D in accordance with the respective determined ratios (a number of recipe drafts corresponding to each ratio) (step S344). Each recipe draft is displayed on the display unit 140 by the display control unit 113.
The generation of recipe drafts according to the novelty level has been described above. Note that the "novelty level of the dish desired by the user" is not necessarily required, and the control unit 110 may set the novelty level arbitrarily.
<3-3. Adjustment process>
Next, the adjustment process according to this embodiment will be described in detail. In the adjustment process, adjustments are made to a recipe draft. In this embodiment, the user can input new suggestions after viewing the presented recipe drafts. The user's ideas may change during the creative process, and this embodiment realizes collaboration between the user and the AI by allowing the user to input new suggestions about a recipe draft that was generated using the cooking learning models in response to the suggestions the user initially input. The information processing device 10 generates (revises) the recipe draft again using the cooking learning models in accordance with the new suggestions. Note that by generating and presenting many recipe drafts in response to the initial suggestions and having the user select at least one of them, it is possible to narrow the candidates down to those that better match the user's intention. This also reduces the amount of computation in the subsequent processing.
FIG. 13 is a flowchart showing an example of the flow of the adjustment process according to this embodiment.
As shown in FIG. 13, when the user makes a new suggestion about the recipe draft (step S403/Yes) and it is a change of ingredient (step S406/Yes), the adjustment unit 114 uses the ingredient pairing learning model to newly output ingredients suitable for combination with the changed ingredient (step S409). When the ingredient change does not specify a concrete ingredient (for example, "I want to change it to a spring-like fruit," "I also want to use ingredients with a firm texture," or "I want to increase the variety of vegetables"), the adjustment unit 114 may determine the ingredients in response to such an abstract suggestion as "a spring-like fruit" by appropriately using a database or AI (such as a pre-trained learning model for determining ingredients).
Next, the adjustment unit 114 uses the ingredients to be used, the combined ingredients, and the user's new suggestions as input data and adjusts the recipe data of the recipe draft using the cooking learning models (step S412). Suggestions other than ingredient changes include, for example, suggestions about the cooking steps or the cooking content. The adjustment unit 114 also adjusts the appearance of the dish using the cooking learning models in accordance with the user's new suggestions (for example, suggestions about the layout or design of the dish) (step S415). Specifically, the adjustment unit 114 performs conversion processing on the dish image. The adjustment unit 114 may also convert the dish image so that it corresponds to the revised recipe data.
Finally, the adjustment unit 114 adjusts the recipe data so that it corresponds to the user's cooking environment (step S418). For example, based on the information about the user's cooking environment input as basic information, the adjustment unit 114 assigns a cooking robot, a laser processing machine, or the like to each cooking step of the recipe. FIG. 14 is a diagram showing an example of adjustment according to the user's cooking environment according to this embodiment. Here, the explanation uses the recipe for "mushroom farci" generated from the user's input "I want to make a beautiful stuffed dish using vegetables."
As shown in the recipe data 620 in the upper part of FIG. 14, the cooking steps of the recipe are "1. Remove the stems from the mushrooms; 2. Put a pattern on the surface of the mushrooms; 3. Stuff the mushrooms with filling (cheese, nuts, etc.); 4. Arrange on a plate and garnish with leaves." When the user's cooking environment is equipped with a cooking robot, a laser processing machine, and the like, the adjustment unit 114 assigns the cooking robot and other devices to the cooking steps. Specifically, as shown in the recipe data 621 in the lower part of FIG. 14, the recipe data 620 is revised to "1. The robot cuts off the stems of the mushrooms; 2. Engrave a pattern on the surface of the mushrooms with a laser; 3. The robot stuffs the mushrooms with filling (cheese, nuts, etc.); 4. The robot arranges them on a plate and garnishes them with leaves."
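As a non-limiting illustration of this device assignment, the following Python sketch maps cooking steps to available devices by keyword. The keyword table and device names are assumptions introduced for this illustration.

```python
# Rough sketch of step S418: assign available devices to cooking steps by keyword matching.
DEVICE_RULES = [
    ("engrave", "laser processing machine"),
    ("pattern", "laser processing machine"),
    ("remove", "cooking robot"),
    ("stuff", "cooking robot"),
    ("arrange", "cooking robot"),
]

def assign_devices(steps, available):
    """Prefix each step with a device from the user's environment when a rule matches."""
    adjusted = []
    for step in steps:
        device = next((dev for kw, dev in DEVICE_RULES
                       if kw in step.lower() and dev in available), None)
        adjusted.append(f"{device}: {step}" if device else step)
    return adjusted

steps = ["Remove the stems from the mushrooms",
         "Engrave a pattern on the surface of the mushrooms",
         "Stuff the mushrooms with filling",
         "Arrange on a plate and garnish with leaves"]
print(assign_devices(steps, {"cooking robot", "laser processing machine"}))
```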
<3-4. Element review process>
Next, the element review process will be described. FIG. 15 is a flowchart showing an example of the flow of the element review process according to this embodiment. Here, the feasibility of cooking is reviewed for each element of the recipe draft generated by the draft generation unit 112 or adjusted by the adjustment unit 114.
As shown in FIG. 15, first, the element review unit 115 of the information processing device 10 treats each cooking step of the recipe data included in the recipe draft as an element (step S503).
Next, the element review unit 115 reviews the feasibility of the cooking step for each element (step S506). In reviewing an element, its content is derived (made concrete) (for example, for "cut the carrot into a flower shape," variations of the flower shape are generated), and then a cooking simulation is performed or the cooking is actually performed using a cooking robot or the like. Specific examples of such element review will be described later with reference to FIGS. 16 to 19.
Next, if there is an element for which all attempts were unsuccessful, the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S512), and the element review process according to this embodiment is performed again.
Next, specific examples of element review will be described. For example, in the case of the "mushroom farci" recipe shown in FIG. 14, the recipe data 621 is divided, according to the cooking steps, into four elements (elements 1 to 4): "1. The robot cuts off the stems of the mushrooms," "2. Engrave a pattern on the surface of the mushrooms with a laser," "3. The robot stuffs the mushrooms with filling (cheese, nuts, etc.)," and "4. The robot arranges them on a plate and garnishes them with leaves." The review of each of these elements is described in order below.
(Review of element 1)
The review of the cooking step of element 1, "The robot cuts off the stems of the mushrooms," will be described with reference to FIG. 16. FIG. 16 is a flowchart showing the flow of the review process for element 1.
As shown in FIG. 16, first, the element review unit 115 checks whether the method by which the robot cuts off the mushroom stems has already been learned (step S523). If it has already been learned (step S523/Yes), the process proceeds to step S532.
If it has not yet been learned (step S523/No), the element review unit 115 inputs a 3D model of a mushroom into a simulator (step S526) and learns how to remove the mushroom stem with the robot hand (cooking robot 26) in the user's environment (step S529). The 3D model of the mushroom may be obtained from an external source, or the mushroom may actually be 3D-captured with the camera 20. For example, by using a simulator corresponding to the robot hand in the user's environment and learning how to remove the mushroom stem through machine learning such as reinforcement learning, the element review unit 115 can search more efficiently for ways to remove the mushroom stem (simulation optimization). The learning model generated here (an example of a cooking processing learning model) can be used when the removal of the mushroom stem is simulated again.
Next, the element review unit 115 obtains the required cooking time (step S532). For example, the element review unit 115 obtains the cooking time required for a successful stem-removal method.
Next, if, as a result of the simulation, various cooking processing plans (cooking processing information) have been examined for the cooking step of element 1, "remove the mushroom stems," in the user's cooking environment but all of them were unsuccessful (step S535/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S538). The adjustment unit 114 may make adjustments such as changing the recipe so that removing the mushroom stems, which is difficult for the cooking robot 26, is performed by a human, changing to a recipe that does not require removing the stems, or changing to an ingredient whose stem is easier to remove. Note that if there is a successful cooking processing plan among the plans for the cooking step of element 1 (step S535/No), the process ends. In the integration process described later, the successful cooking processing plan is used when the elements are integrated.
(Review of element 2)
The review of the cooking step of element 2, "Engrave a pattern on the surface of the mushrooms with a laser," will be described with reference to FIG. 17. FIG. 17 is a flowchart showing the flow of the review process for element 2.
As shown in FIG. 17, first, the element review unit 115 checks whether engraving mushrooms with the laser cutter (laser processing machine 24) has already been learned (step S543). If it has already been learned (step S543/Yes), the process proceeds to step S567.
If it has not yet been learned (step S543/No), the element review unit 115 generates n pattern designs (step S546). For example, the element review unit 115 generates pattern variations based on the dish image output from the adjustment unit 114, using an algorithm such as style transfer.
Next, the element review unit 115 defines m sets of laser processing parameters (step S549). For example, the element review unit 115 sets ranges for the laser control power and speed, such as "power: minimum 40, maximum 60" and "speed: minimum 50, maximum 90," and defines m combinations of power and speed. The element review unit 115 may refer to processing examples of similar ingredients, or may define the parameters starting from low power in consideration of safety.
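A non-limiting Python sketch of defining such a parameter grid is shown below. Only the power and speed ranges come from the example above; the grid spacing and ordering are assumptions for illustration.

```python
import itertools

def laser_parameter_sets(power_range=(40, 60), speed_range=(50, 90), m=6):
    """Sketch of step S549: define m (power, speed) pairs within the given ranges,
    ordered from the lowest power first in consideration of safety."""
    powers = [power_range[0], sum(power_range) // 2, power_range[1]]   # 40, 50, 60
    speeds = [speed_range[0], speed_range[1]]                          # 50, 90
    pairs = sorted(itertools.product(powers, speeds))                  # low power first
    return pairs[:m]

print(laser_parameter_sets())
# [(40, 50), (40, 90), (50, 50), (50, 90), (60, 50), (60, 90)]
```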
Next, the element review unit 115 selects a pair of a pattern design and processing parameters (step S552).
Next, the element review unit 115 has the robot (cooking robot 26) set a mushroom in a position where it can be processed by the laser cutter (step S555), and controls the laser cutter to process (engrave) it (step S558).
Next, the element review unit 115 photographs the processed ingredient with the camera 20 and acquires an image (step S561).
The processing shown in steps S552 to S561 above is performed for all combinations of pattern designs and processing parameters (step S562). As a result, a large number of processing results (captured images of the processed ingredients) are obtained, for example as shown in Table 2 below.
When all combinations have been processed (step S562/Yes), the element review unit 115 learns the relationships among the pattern designs, the processing parameters, and the post-processing images, and creates a learning model for predicting the processing result from a pattern design and processing parameters (step S564). Even with the same pattern design, the processing result differs depending on the processing parameters: it may be crisp, crushed, or faint. This is because the processing reaction is not linear depending on the composition of the ingredient, and phenomena such as the ingredient swelling or crumbling occur. In this embodiment, by using the learning model (an example of a cooking processing learning model) to predict the processing result from the pattern design and processing parameters, it becomes possible to examine a large number of combinations of pattern designs and processing parameters. The learning model can be generated by machine learning such as reinforcement learning.
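As a non-limiting illustration of predicting a processing result from a pattern design and processing parameters, the following Python sketch uses a simple nearest-neighbour lookup over recorded trials as a stand-in for a trained model; the class, the quality scores, and the trial values are assumptions for illustration.

```python
import math

class EngravingResultPredictor:
    """Toy stand-in for the cooking processing learning model described above:
    predicts an engraving quality score for (pattern design, power, speed) from
    the nearest previously observed trial. A real system would instead train a
    model on the captured post-processing images."""

    def __init__(self):
        self.trials = []  # list of (pattern, power, speed, score) tuples

    def record(self, pattern, power, speed, score):
        self.trials.append((pattern, power, speed, score))

    def predict(self, pattern, power, speed):
        candidates = [t for t in self.trials if t[0] == pattern] or self.trials
        nearest = min(candidates, key=lambda t: math.hypot(t[1] - power, t[2] - speed))
        return nearest[3]

predictor = EngravingResultPredictor()
predictor.record("rose", 40, 50, 0.9)     # crisp result at low power
predictor.record("rose", 60, 50, 0.3)     # crushed / burnt result at high power
print(predictor.predict("rose", 55, 55))  # closer to the high-power trial -> low score (0.3)
```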
Next, the element review unit 115 evaluates each cooking processing plan (a combination of a pattern design and processing parameters: an example of cooking processing information) by the degree of match between the pattern design generated from the pattern of element 2 and the processing result predicted by the learning model, that is, by whether the pattern is preserved without being distorted by the processing (step S567).
The element review unit 115 also obtains the required cooking time (for each combination) (step S570).
Then, if all of the evaluations made in step S567 above are low (step S573/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S576). The adjustment unit 114, for example, changes the ingredient or the pattern. Note that if, among the cooking processing plans for the cooking step of element 2, there is a plan whose evaluation is above a certain level (step S573/No), the process ends. In the integration process described later, cooking processing plans whose evaluation is above a certain level are used when the elements are integrated.
(Review of element 3)
The review of the cooking step of element 3, "The robot stuffs the mushrooms with filling (cheese, nuts, etc.)," will be described with reference to FIG. 18. FIG. 18 is a flowchart showing the flow of the review process for element 3.
As shown in FIG. 18, first, when there is a restriction instruction from the user regarding the filling to be stuffed into the mushrooms (step S583/Yes), the element review unit 115 sets restrictions on the filling material (step S586). The restriction instruction from the user may be extracted from the conversation between the agent and the user, such as "I want to make a dish for customers with allergies" or "I want to keep the calories down." The restrictions on the filling material are assumed to be, for example, control of the ingredients or the amount. If there is no restriction instruction from the user (step S583/No), the process proceeds to step S589.
Next, the element review unit 115 creates multiple patterns of ingredients to be mixed as the filling material (step S589).
Next, the element review unit 115 imports a 3D model of the mixed material into a simulator and uses the simulator to learn how the robot hand fills the mushrooms with the filling material according to each mixing pattern (step S592). The learning model generated here is an example of a cooking processing learning model. The learning model can be generated by machine learning such as reinforcement learning.
Next, the element review unit 115 obtains the required cooking time (step S595).
Then, if all of them are unsuccessful (step S598/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S601). The element review unit 115 can use the learning model generated above to examine the feasibility (whether it will succeed) of each mixing pattern. Note that if there is a successful cooking processing plan among the plans for the cooking step of element 3 (step S598/No), the process ends. In the integration process described later, the successful cooking processing plan is used when the elements are integrated.
(Review of element 4)
The review of the cooking step of element 4, "The robot arranges them on a plate and garnishes them with leaves," will be described with reference to FIG. 19. FIG. 19 is a flowchart showing the flow of the review process for element 4.
As shown in FIG. 19, first, the element review unit 115 generates variations of dish images in which the dish is served on tableware that the user normally uses (step S603). The information about the tableware used by the user has already been acquired in the basic information input process.
Next, when leaves or flowers are to be arranged on the plate, the element review unit 115 generates variations of 3D models of the items to be arranged (step S606). For example, in the case of leaves, since many kinds of leaves are often combined for decoration, 3D models of many kinds of leaves are generated. The variations of 3D models may be generated by 3D CAD or other tools linked with the system. The case of decorating with leaves is described below.
Next, the element review unit 115 generates leaf arrangement patterns (step S609). The element review unit 115 generates the arrangement patterns by, for example, combining multiple kinds of leaves.
Next, the element review unit 115 uses a simulator to learn the arrangements that are possible with the robot hand (cooking robot 26) (step S612). The learning model generated here is an example of a cooking processing learning model. The learning model can be generated by machine learning such as reinforcement learning.
Next, the element review unit 115 deletes arrangement patterns that are difficult to realize (step S615). The element review unit 115 can use the learning model generated above to examine the feasibility (whether it will succeed) of each arrangement pattern.
Next, the element review unit 115 generates dish images by combining the above variations of dish images in which the dish is served on tableware with the leaf layout of each learned arrangement pattern (specifically, each arrangement pattern that is highly likely to be realizable with the robot hand) (step S618).
The element review unit 115 also obtains the required cooking time (for each arrangement pattern) (step S621).
Then, if all of them are unsuccessful (that is, if there is no feasible arrangement pattern) (step S624/Yes), the element review unit 115 requests the adjustment unit 114 to adjust the recipe data (step S627). Note that if there is a successful cooking processing plan among the plans (arrangement patterns: an example of cooking processing information) for the cooking step of element 4 (step S624/No), the process ends. In the integration process described later, the successful cooking processing plan is used when the elements are integrated.
Specific examples of the review of each element have been described above. In this way, in the element review process, more concrete variations of each cooking step (cooking processing information) are created based on the cooking steps of the recipe data in the recipe draft, and whether each cooking step will succeed is examined. Furthermore, in training the cooking processing learning models used in the review of each element described above, the information processing device 10 can further increase the accuracy of the generated learning models by including failure cases in the learning data. Note that the element review unit 115 may present multiple pieces of cooking processing information (or processing results) for each element to the user as candidates so that the user can select the cooking processing information to be included in the integration.
<3-5. Integration process>
FIG. 20 is a flowchart showing an example of the flow of the integration process according to this embodiment. The integration unit 116 combines the elements reviewed by the element review unit 115 to generate integrated cooking processing information.
As shown in FIG. 20, first, when there is an instruction from the user regarding the integration conditions (step S703/Yes), the integration unit 116 changes the default conditions in accordance with the user's instruction (step S706). Examples of user instructions include conditions such as a cooking time of 15 minutes or less and an evaluation rank of 3 or higher for element 2. Another example of a user instruction is the user's selection of the cooking processing information to be included in the integration.
On the other hand, if there is no instruction from the user regarding the integration conditions (step S703/No), the integration unit 116 sets the default conditions (step S709).
Next, the integration unit 116 generates combination patterns of the elements that satisfy the conditions (step S712).
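As a non-limiting illustration of step S712, the following Python sketch enumerates the element combinations that satisfy a total cooking-time condition and per-element evaluation-rank conditions. The candidate data structure and condition names are assumptions introduced for this illustration.

```python
import itertools

def combine_elements(candidates, max_total_minutes=15, min_rank=None):
    """Build every combination of per-element cooking processing plans that
    satisfies the (default or user-specified) integration conditions."""
    min_rank = min_rank or {}
    element_ids = sorted(candidates)
    results = []
    for combo in itertools.product(*(candidates[e] for e in element_ids)):
        if sum(plan["minutes"] for plan in combo) > max_total_minutes:
            continue  # violates the cooking-time condition
        if any(plan.get("rank", 5) < min_rank.get(e, 0)
               for e, plan in zip(element_ids, combo)):
            continue  # violates a per-element evaluation-rank condition
        results.append(dict(zip(element_ids, combo)))
    return results

candidates = {
    1: [{"name": "robot stem removal", "minutes": 2}],
    2: [{"name": "rose pattern, power 40", "minutes": 6, "rank": 4},
        {"name": "rose pattern, power 60", "minutes": 4, "rank": 2}],
}
print(len(combine_elements(candidates, max_total_minutes=15, min_rank={2: 3})))  # -> 1
```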
Next, based on each combination pattern, the integration unit 116 generates an integrated recipe and a corresponding dish image as integrated cooking processing information (step S715).
Finally, the integration unit 116 adjusts the procedural efficiency and safety of the integrated recipe (step S718). For example, in the case of the "mushroom farci" recipe, if the mixed filling needs to rest in the refrigerator for a while depending on the filling ingredients, the integration unit 116 can shorten the overall cooking time by adjusting the procedure so that this step is performed first. In consideration of food safety, the integration unit 116 also adds a step of temporarily placing ingredients that require temperature control in the refrigerator while the cooking robot is performing time-consuming processing or arrangement. Knowledge of procedural efficiency and safety may be obtained through the cooking learning or in cooperation with an external database or the like.
<3-6. Presentation process>
Next, the process of presenting the integrated cooking processing information according to this embodiment will be described. The integrated recipe and the corresponding dish image integrated by the integration unit 116 are an example of integrated cooking processing information.
As one example of presenting the integrated cooking processing information, the display control unit 113 may display each dish image as a thumbnail. FIG. 21 is a diagram showing an example of the presentation of integrated cooking processing information according to this embodiment. As shown in FIG. 21, the dish images are displayed as thumbnails on a presentation screen 640.
The details of the presentation of the integrated cooking processing information are described below. FIG. 22 is a flowchart showing an example of the flow of the process for presenting the integrated cooking processing information according to this embodiment.
As shown in FIG. 22, first, when the user has specified a sort condition (step S803/Yes), the display control unit 113 sorts the dish images according to that condition (step S806), for example in order of cooking time.
On the other hand, when the user has not specified a sort condition (step S803/No), the display control unit 113 sorts the dish images, for example, in order of creation (step S809).
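A non-limiting Python sketch of this sorting is shown below; the field names are assumptions introduced for this illustration.

```python
def sort_dish_images(dishes, sort_key=None):
    """Sketch of steps S806/S809: sort by a user-specified condition (here, cooking time)
    or, when none is given, by creation order."""
    if sort_key == "cooking_time":
        return sorted(dishes, key=lambda d: d["total_minutes"])
    return sorted(dishes, key=lambda d: d["created_at"])

dishes = [{"name": "A", "total_minutes": 12, "created_at": 2},
          {"name": "B", "total_minutes": 8, "created_at": 1}]
print([d["name"] for d in sort_dish_images(dishes, "cooking_time")])  # ['B', 'A']
```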
Next, the display control unit 113 displays a list of the dish images (step S812). A specific example is as shown in FIG. 21.
Next, when the user selects a dish image (step S815/Yes), the display control unit 113 displays the details of the selected dish (step S818).
Next, when a component (ingredient) of the dish included in the dish image is pointed to on the detail display screen for the selected dish (step S821/Yes), the display control unit 113 displays the details of that component (step S824).
FIG. 23 is a diagram showing an example of the detail display screen for a selected dish according to this embodiment. As shown in FIG. 23, the detail display screen 650 displays a dish image 651, an element information display 652 for the selected dish, and a detail display 653 of the cooking processing patterns.
The element information display 652 displays not only the cooking processing patterns that were adopted but also those that were not adopted. For example, when "cooking step 2" of element 2 is selected from among the elements of the dish "mushroom farci" selected in the dish image 651, blocks for "pattern design" and "processing parameters" are displayed. When "pattern design" is selected, the detail display 653 shows, as illustrated in FIG. 23, the many pattern designs (an example of cooking processing information) generated in the element review by the element review unit 115, including those that were not adopted for the dish.
When "processing parameters" is selected, the detail display 653 shows the many processing parameters (an example of cooking processing information) generated in the element review by the element review unit 115, including those that were not adopted for the dish. At this time, the post-processing image for each processing parameter of each pattern design (the processing result predicted by the learning model) may also be displayed. The user can confirm what processing result is obtained when a given pattern design is processed with given processing parameters.
In this way, the user can check the pattern designs that were derived and generated by the element review unit 115 but not adopted, the many processing parameters that were not adopted, the processing results corresponding to the processing parameters, and so on. While checking these, the user can make more specific suggestions such as "the shape is slightly distorted, but this color (degree of browning) is what I want," and the review cycle can be iterated efficiently. That is, the user can select any cooking processing information from among the many pieces of cooking processing information presented and instruct the system to generate integrated cooking processing information that includes the selected cooking processing information. This embodiment assumes an iterative process in which revision is repeated each time the user makes a new suggestion until a final proposal is reached.
Specifically, as shown in FIG. 22, when the user inputs a suggestion (step S827/Yes), the agent unit 111 acquires the user's new suggestion through dialogue between the agent and the user (step S830).
Next, the control unit 110 summarizes the user's suggestions and presents them to the user for confirmation, and starts the revision (step S833). Specifically, as shown in FIG. 5, the adjustment process by the adjustment unit 114, the review process by the element review unit 115, and the integration process by the integration unit 116 are performed again as appropriate, and the revised cooking processing information is presented.
Specific examples of new user suggestions and the corresponding revisions include, for example, the following.
<<4. Other>>
The various learning models used in the cooking information generation system 1 according to this embodiment may be realized by an LLM (large language model). That is, the user cooking learning model 521, the other-user cooking learning model 522, the ingredient pairing learning model, and the cooking processing learning model described above may be configured to use an LLM as a knowledge database.
Furthermore, in training the various learning models according to this embodiment, the information processing device 10 may use failure cases for learning. This makes it possible to increase the accuracy of the generated learning models.
The integration unit 116 can also set personalization conditions, which are conditions for changing the content of an element (cooking processing information) for each individual who is served the dish, in accordance with input from the user in the integration process. For example, the integration unit 116 sets personalization conditions that determine the kind, amount, arrangement, or processing state of the ingredients in a specified element according to each individual's preferences, calorie intake, or the like. The integration unit 116 generates an integrated recipe that satisfies the personalization conditions set by the user. At cooking time, the display control unit 113 may also present the integrated recipe with the configuration of the corresponding elements rearranged according to the personalization conditions specified by the user.
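As a non-limiting illustration of applying such a personalization condition to one element, the following Python sketch scales the filling amount to a calorie cap and removes excluded ingredients. The field names and adjustment rules are assumptions introduced for this illustration.

```python
def apply_personalization(element_plan, person):
    """Adjust one element (e.g. the filling) for an individual diner based on
    a calorie cap and excluded ingredients (sketch only)."""
    plan = dict(element_plan)
    if person.get("max_kcal") and plan["kcal"] > person["max_kcal"]:
        scale = person["max_kcal"] / plan["kcal"]
        plan["amount_g"] = round(plan["amount_g"] * scale)  # reduce the amount to fit the cap
        plan["kcal"] = person["max_kcal"]
    plan["ingredients"] = [i for i in plan["ingredients"]
                           if i not in person.get("excluded", [])]
    return plan

filling = {"ingredients": ["cheese", "nuts"], "amount_g": 60, "kcal": 240}
print(apply_personalization(filling, {"max_kcal": 180, "excluded": ["nuts"]}))
# {'ingredients': ['cheese'], 'amount_g': 45, 'kcal': 180}
```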
In the embodiment described above, as shown in FIGS. 21 and 23, a dish image is displayed on the screen as the method of presenting the generated cooking processing information to the user. However, the present embodiment is not limited to this; the information processing device 10 may actually cook a highly ranked dish using the cooking robot 26 or the like and present it to the user.
In this embodiment, the story generation unit 117 can also generate a development story. During the process of creating a dish, various data accumulate in the storage unit 150, such as the dialogue between the agent and the user, the adjustments made, and the element review results. When the dish is completed (finalized with no new comments), the story generation unit 117 picks out characteristic phrases and preferences of the user from the data accumulated in the storage unit 150, extracts unusual corrections made in response to the user's comments, instances of re-training, and episodes of failure and success (such as failed or successful cooking processing methods), compiles these into a development story, and outputs it as text or images.
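By way of non-limiting illustration, compiling a development story from the accumulated log could be sketched as follows; the record types and selection heuristics shown are assumptions introduced for explanation.

```python
# Minimal sketch (assumed record format) of compiling a development story from the
# accumulated data: a characteristic user phrase, notable corrections, and
# failure/success episodes.
from collections import Counter

def compile_story(log: list[dict]) -> str:
    phrases = Counter(e["text"] for e in log if e["type"] == "user_utterance")
    catchphrase, _ = phrases.most_common(1)[0]
    corrections = [e["text"] for e in log if e["type"] == "correction"]
    episodes = [e["text"] for e in log if e["type"] in ("failure", "success")]
    lines = [f'A phrase the creator kept coming back to: "{catchphrase}".']
    lines += [f"Notable correction: {c}" for c in corrections]
    lines += [f"Episode: {e}" for e in episodes]
    return "\n".join(lines)

log = [
    {"type": "user_utterance", "text": "a bit more browning"},
    {"type": "user_utterance", "text": "a bit more browning"},
    {"type": "correction", "text": "filling ratio changed after tasting feedback"},
    {"type": "failure", "text": "first sear left the caps undercooked"},
    {"type": "success", "text": "two-stage roast gave the target color"},
]
print(compile_story(log))
```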
In this embodiment, user comments on the initial recipe draft are taken into account, but the adjustment, element review, and integration are performed automatically and mechanically before the result is presented to the user. Dishes, however, are often complex preparations using many ingredients, and a cook's intuition is also important for producing better-tasting food. Therefore, as a way for people and AI tools to collaborate more closely in creating new dishes, manually collected data may be used as appropriate.
For example, when the adjustment unit 114 adjusts the "Mushroom Farci" recipe to the user's cooking environment, if it determines that cooking step 3 (element 3: filling) and cooking step 4 (element 4: plating) are difficult for the cooking robot in the user's environment to handle, or that manual work is preferable, the recipe may be made one that incorporates manual work. Element review for a recipe that incorporates manual work is carried out, for example, as follows.
When reviewing a cooking step (element) that uses manual work, actual personnel are assigned and the review is carried out. The information processing device 10 may select personnel by acquiring the profiles (areas of expertise, etc.) and schedule information (available time) of assignable staff.
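By way of non-limiting illustration, selecting staff from profiles and schedule information could be sketched as follows; the data fields (expertise, free_minutes) are assumptions.

```python
# Minimal sketch (assumed data shapes) of selecting staff for a manual element from
# profiles (areas of expertise) and schedule information (available time).
def select_staff(staff: list[dict], skill: str, needed_minutes: int):
    candidates = [s for s in staff
                  if skill in s["expertise"] and s["free_minutes"] >= needed_minutes]
    # Prefer the person with the most spare time for the assignment.
    return max(candidates, key=lambda s: s["free_minutes"], default=None)

staff = [
    {"name": "A", "expertise": {"plating", "pastry"}, "free_minutes": 40},
    {"name": "B", "expertise": {"filling", "plating"}, "free_minutes": 90},
]
print(select_staff(staff, skill="filling", needed_minutes=60))  # -> staff B
```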
In reviewing element 3, the element review unit 115 generates n filling mixture patterns and has the assigned staff prepare (cook) them. The cooking time is captured using a timer or the like. The element review unit 115 may also have the staff devise filling mixture patterns themselves. Because people who cook tend to proceed without recording the cooking process, the agent unit 111 converses with the staff, hears their mixing tips, recommended mixture patterns (rankings), and so on, and stores them. The element review unit 115 may also capture images of the fillings the staff prepared and create a list ordered by the staff's ranking. The staff may evaluate whether a filling can actually be prepared according to each mixture pattern, as well as the appearance and taste of the prepared fillings, and input the results into the information processing device 10.
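By way of non-limiting illustration, the record kept for such a manual element review (n patterns, timer-based cooking time, and the staff's ranking and evaluation) could be organized as in the following sketch; the structure and field names are assumptions.

```python
# Minimal sketch (assumed structure) of recording a manual element review: n mixture
# patterns, the measured cooking time, and the staff's ranking and evaluation, so the
# process is not lost even though a person does the cooking.
import time

def review_manual_element(patterns: list[str], cook, evaluate):
    records = []
    for p in patterns:
        start = time.monotonic()
        result = cook(p)                      # staff prepares the filling
        elapsed = time.monotonic() - start    # timer-based cooking time
        records.append({"pattern": p, "result": result,
                        "cook_time_s": round(elapsed, 2), **evaluate(p)})
    # List ordered by the staff's ranking (1 = best).
    return sorted(records, key=lambda r: r["rank"])

patterns = ["mushroom+cream", "mushroom+walnut", "mushroom+herbs"]
records = review_manual_element(
    patterns,
    cook=lambda p: f"filling({p})",
    evaluate=lambda p: {"rank": patterns.index(p) + 1, "taste": 4, "appearance": 5},
)
print([r["pattern"] for r in records])
```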
In reviewing element 4, the element review unit 115 generates n plating patterns and has the assigned staff prepare (plate) them. The element review unit 115 may also have the staff devise plating patterns, for example by letting the staff choose the plate on which the dish is served. Capturing the cooking time, interviewing the staff, and recording the cooking process are handled in the same way as in the review of element 3 above. The staff may evaluate whether the dish can be plated according to each plating pattern and input the results into the information processing device 10.
Elements reviewed manually in this way are also used by the integration unit 116.
The cooking process of each staff member is also stored in the storage unit 150 and can be incorporated into the development story generated by the story generation unit 117, making the staff's contributions visible.
<<5. Supplementary Information>>
Although the preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, the present technology is not limited to such examples. It is clear that a person skilled in the art of the present disclosure can conceive of various modified or altered examples within the scope of the technical ideas described in the claims, and it is understood that these also naturally fall within the technical scope of the present disclosure.
The above-described embodiments can be used, for example, by chefs and culinary experts when creating new dishes.
In addition, one or more computer programs can be created to cause the hardware, such as the CPU, ROM, and RAM, built into the information processing device 10 to perform the functions of the information processing device 10. Also provided is a computer-readable storage medium storing the one or more computer programs.
Furthermore, the effects described in this specification are merely descriptive or exemplary and are not limiting. In other words, the technology according to the present disclosure may achieve other effects, apparent to those skilled in the art from the description herein, in addition to or in place of the above-mentioned effects.
The present technology can also be configured as follows.
(1)
An information processing system comprising a control unit that performs:
a process of generating cooking information using a cooking learning model based on instruction information input by a user;
a process of generating, using a cooking processing learning model, cooking processing information corresponding to the cooking steps included in the cooking information;
a process of integrating the cooking processing information to generate integrated cooking processing information; and
a process of presenting the generated integrated cooking processing information.
(2)
The information processing system according to (1), wherein the control unit generates a plurality of pieces of cooking processing information for the cooking process and determines feasible cooking processing information using the cooking processing learning model.
(3)
The information processing system according to (1) or (2), wherein the cooking and processing learning model is generated by reinforcement learning of simulation results based on multiple cooking and processing information generated for the cooking process.
(4)
The information processing system described in (1) or (2), wherein the cooking processing learning model is generated by reinforcement learning of cooking results in which cooking processing indicated by multiple cooking processing information generated for the cooking process is performed using a cooking robot.
(5)
The information processing system described in any one of (1) to (4), wherein the control unit presents multiple cooking processing information as candidates and generates the integrated cooking processing information to include cooking processing information selected by the user.
(6)
The information processing system according to any one of (1) to (5), wherein the instruction information includes a concept for the dish and information on ingredients desired to be used in the dish.
(7)
The information processing system according to (6), wherein the concept of the dish includes information on the purpose, impression, atmosphere, style, genre, theme, objective, key points, target demographic, design, or layout of the dish.
(8)
The information processing system described in (6), wherein the control unit uses an ingredient pairing learning model to generate combined ingredients suitable for combination with the ingredients based on the information of the ingredients included in the instruction information.
(9)
The information processing system described in (8) above, wherein the control unit uses the concept, the ingredients, and the combined ingredients as input data and generates the cooking information using at least one of a first cooking learning model that has learned the user's cooking and a second cooking learning model that has learned the cooking of other users.
(10)
The information processing system according to (9) above, wherein the control unit can perform:
a process of generating cooking information in which the style of the dish design in cooking information generated using the first cooking learning model is changed using the second cooking learning model; and
a process of generating cooking information in which the style of the dish design in cooking information generated using the second cooking learning model is changed using the first cooking learning model.
(11)
The information processing system described in (9) or (10), wherein the control unit determines the proportion of the number of pieces of cooking information generated using each cooking learning model according to the set level of newness of the cooking.
(12)
The information processing system according to any one of (1) to (11), wherein the control unit generates multiple pieces of cooking information, presents them to the user, and sends the cooking information selected by the user to subsequent processing.
(13)
The information processing system according to any one of (1) to (12), wherein the control unit adjusts the cooking information in accordance with user comments and then sends the information to subsequent processing.
(14)
The information processing system according to any one of (1) to (13), wherein the control unit adjusts the cooking information according to the user's cooking environment and then sends it to subsequent processing.
(15)
The information processing system according to any one of (1) to (14), wherein the control unit includes failure case data in learning data when learning the cooking learning model or the cooking processing learning model.
(16)
The information processing system according to any one of (1) to (15), wherein the cooking and processing information is information about processing related to the appearance of ingredients.
(17)
The information processing system according to any one of (1) to (16), wherein the control unit sets, based on user input, a personalization condition that is a condition for changing the cooking processing information according to the preferences or calorie intake of an individual who will be served the dish, changes the cooking processing information so as to satisfy the personalization condition, and then generates the integrated cooking processing information.
(18)
A control method in which a processor performs:
generating cooking information using a cooking learning model based on instruction information input by a user;
generating cooking processing information corresponding to the cooking steps included in the cooking information;
integrating the cooking processing information to generate integrated cooking processing information; and
presenting the generated integrated cooking processing information.
(19)
A program for causing a computer to function as a control unit that performs:
a process of generating cooking information using a cooking learning model based on instruction information input by a user;
a process of generating cooking processing information corresponding to the cooking steps included in the cooking information;
a process of integrating the cooking processing information to generate integrated cooking processing information; and
a process of presenting the generated integrated cooking processing information.
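By way of non-limiting illustration, the control method of (18) could be read as the following sketch; all function names and the example parameter values are assumptions introduced for explanation.

```python
# Minimal sketch (assumed names) of the control method in (18): generate cooking
# information, generate per-step cooking processing information, integrate, present.
def generate_cooking_info(instruction: dict) -> dict:
    # Stand-in for the cooking learning model.
    return {"name": "Mushroom Farci", "steps": ["stuff", "fill", "plate"]}

def generate_processing_info(step: str) -> dict:
    # Stand-in for the cooking processing learning model.
    return {"step": step, "parameters": {"temperature_c": 180, "minutes": 12}}

def integrate_processing(infos: list[dict]) -> dict:
    return {"integrated_steps": infos}

def present(integrated: dict) -> None:
    for i, s in enumerate(integrated["integrated_steps"], 1):
        print(f"{i}. {s['step']}: {s['parameters']}")

def control_method(instruction: dict) -> None:
    cooking_info = generate_cooking_info(instruction)
    processing = [generate_processing_info(s) for s in cooking_info["steps"]]
    present(integrate_processing(processing))

control_method({"concept": "rustic autumn starter", "ingredients": ["mushroom"]})
```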
REFERENCE SIGNS LIST
10 Information processing device
110 Control unit
111 Agent unit
112 Draft generation unit
113 Display control unit
114 Adjustment unit
115 Element review unit
116 Integration unit
117 Story generation unit
120 Communication unit
130 Operation input unit
140 Display unit
150 Storage unit
160 Voice input/output unit
Claims (19)
An information processing system comprising a control unit that performs: a process of generating cooking information using a cooking learning model based on instruction information input by a user; a process of generating, using a cooking processing learning model, cooking processing information corresponding to the cooking steps included in the cooking information; a process of integrating the cooking processing information to generate integrated cooking processing information; and a process of presenting the generated integrated cooking processing information.
The information processing system according to claim 9, wherein the control unit can perform: a process of generating cooking information in which the style of the dish design in cooking information generated using the first cooking learning model is changed using the second cooking learning model; and a process of generating cooking information in which the style of the dish design in cooking information generated using the second cooking learning model is changed using the first cooking learning model.
A control method in which a processor performs: generating cooking information using a cooking learning model based on instruction information input by a user; generating cooking processing information corresponding to the cooking steps included in the cooking information; integrating the cooking processing information to generate integrated cooking processing information; and presenting the generated integrated cooking processing information.
A program for causing a computer to function as a control unit that performs: a process of generating cooking information using a cooking learning model based on instruction information input by a user; a process of generating cooking processing information corresponding to the cooking steps included in the cooking information; a process of integrating the cooking processing information to generate integrated cooking processing information; and a process of presenting the generated integrated cooking processing information.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024-055451 | 2024-03-29 | ||
| JP2024055451 | 2024-03-29 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025204235A1 true WO2025204235A1 (en) | 2025-10-02 |
Family
ID=97220040
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/004764 Pending WO2025204235A1 (en) | 2024-03-29 | 2025-02-13 | Information processing system, control method, and program |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025204235A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020075302A (en) * | 2018-11-05 | 2020-05-21 | ソニー株式会社 | Cooking robot, cooking robot control device and control method |
| WO2021024830A1 (en) * | 2019-08-08 | 2021-02-11 | ソニー株式会社 | Information processing device, information processing method, cooking robot, cooking method, and cooking instrument |
| WO2021024829A1 (en) * | 2019-08-08 | 2021-02-11 | ソニー株式会社 | Information processing device, information processing method, cooking robot, cooking method, and cookware |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25777904; Country of ref document: EP; Kind code of ref document: A1 |