
WO2019106998A1 - Information processing device, client device, and program - Google Patents

Information processing device, client device, and program

Info

Publication number
WO2019106998A1
WO2019106998A1 (PCT/JP2018/038803; JP2018038803W)
Authority
WO
WIPO (PCT)
Prior art keywords
information
skin
user
tactile
field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/038803
Other languages
English (en)
Japanese (ja)
Inventor
直輝 齋藤
尚美 北村
孝平 松森
めぐみ 関野
雄一郎 森
泰規 風間
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shiseido Co Ltd
Original Assignee
Shiseido Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed (Darts-ip: https://patents.darts-ip.com/?family=66664882&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO2019106998(A1)). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Shiseido Co Ltd
Priority to JP2019557064A (patent JP7208153B2)
Publication of WO2019106998A1
Anticipated expiration
Priority to JP2023000344A (patent JP7720873B2)
Legal status: Ceased

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/22: Social work or social welfare, e.g. community support activities or counselling services

Definitions

  • the present invention relates to an information processing apparatus, a client apparatus, and a program.
  • cosmetics to be used for skin care are preferably selected according to the condition of the skin.
  • a salesperson with expertise on skin identifies the physical characteristics of the customer's skin (e.g., skin irregularities and hardness) by direct contact with the customer's skin.
  • the salesperson proposes to the customer advice (for example, a method of skin care and cosmetics to be used for skin care) according to the identified skin condition.
  • the customer purchases cosmetics according to the condition of his or her skin according to the salesperson's suggestion.
  • Patent Document 1 discloses a technique of determining a skin condition according to a palpation result in order to identify a skin condition based on the touch of the skin.
  • Japanese Patent No. 5357363
  • However, the skin palpation result of Patent Document 1 depends on the degree of expertise and experience of the examiner performing the palpation, so the determination results vary from one examiner to another. Therefore, when the technique of Patent Document 1 is used at a cosmetics retailer, the advice provided by the salesperson may not be suitable for the skin of the subject.
  • An object of the present invention is to prevent variations in the determination result of the condition of the skin based on the touch of the skin.
  • One aspect of the present invention is an information processing apparatus comprising: a means for acquiring tactile information indicating physical characteristics related to the tactile sensation of the user's skin; and a means for presenting at least one of first information on the skin condition of the user based on the tactile information and second information on the skin condition of the user based on tactile log information including a plurality of pieces of tactile information.
  • FIG. 21 is a schematic view showing a configuration of a touch sensor of a modification 6;
  • FIG. 1 is a block diagram showing the configuration of the information processing system of this embodiment.
  • the information processing system 1 includes a client device 10, a server 30, a tactile sensor 40, a log device 50, and a prediction information providing server 70.
  • the client device 10, the server 30, and the prediction information providing server 70 are connected via a network (for example, the Internet or an intranet) NW.
  • the client device 10 is an example of an information processing device that transmits a request to the server 30.
  • the client device 10 is, for example, a smartphone, a tablet terminal, or a personal computer.
  • the server 30 is an example of an information processing apparatus that provides the client apparatus 10 with a response to the request sent from the client apparatus 10.
  • the server 30 is, for example, a web server.
  • the tactile sensor 40 is configured to measure information related to physical characteristics related to the touch of human skin.
  • the log device 50 acquires at least one of environmental information on the environment in which the user spends time, behavior information on the behavior of the user, and mind-body information on the mind and body of the user.
  • the log device 50 is, for example, a wearable device that can be worn by the user.
  • the log device 50 is connected to the client device 10 by wire or wirelessly.
  • the prediction information providing server 70 is an example of an information processing apparatus that provides prediction information indicating a future prediction.
  • the prediction information providing server 70 provides, for example, the following information.
  • Tactile prediction information indicating a prediction of the user's future skin tactile sensation
  • Behavior prediction information indicating a prediction of the user's future behavior (for example, a schedule)
  • Mind-body prediction information indicating a prediction of the user's future mind and body (for example, a predicted date in the sex cycle)
  • the client device 10 includes a storage device 11, a processor 12, an input / output interface 13, a communication interface 14, and a GPS module 15.
  • the storage device 11 is configured to store programs and data.
  • the storage device 11 is, for example, a combination of a read only memory (ROM), a random access memory (RAM), and a storage (for example, a flash memory or a hard disk).
  • the programs include, for example, the following programs.
  • OS (Operating System) program
  • Application program (for example, a web browser)
  • the data includes, for example, the following data.
  • Databases referenced in information processing, and data obtained by executing information processing (that is, execution results of information processing)
  • the processor 12 is configured to realize the function of the client device 10 by activating a program stored in the storage device 11.
  • the processor 12 is an example of a computer.
  • the input/output interface 13 is configured to receive a user's instruction from an input device connected to the client device 10, obtain information from the tactile sensor 40, and output information to an output device connected to the client device 10.
  • the input device is, for example, a keyboard, a pointing device, a touch panel, or a combination thereof.
  • the output device is, for example, a display.
  • Communication interface 14 is configured to control communication between client device 10 and server 30.
  • the GPS module 15 is configured to acquire position information of the client device 10 by communicating with a GPS (Global Positioning System) satellite.
  • the server 30 includes a storage device 31, a processor 32, an input/output interface 33, and a communication interface 34.
  • the storage device 31 is configured to store programs and data.
  • the storage device 31 is, for example, a combination of a ROM, a RAM, and a storage (for example, a flash memory or a hard disk).
  • the programs include, for example, the following: an OS program and an application program that executes information processing.
  • the data includes, for example, the following: databases referenced in information processing and execution results of information processing.
  • the processor 32 is configured to realize the function of the server 30 by activating a program stored in the storage device 31.
  • the processor 32 is an example of a computer.
  • the input / output interface 33 is configured to receive a user's instruction from an input device connected to the server 30, and to output information to an output device connected to the server 30.
  • the input device is, for example, a keyboard, a pointing device, a touch panel, or a combination thereof.
  • the output device is, for example, a display.
  • Communication interface 34 is configured to control communication between server 30 and client device 10.
  • FIG. 2 is a schematic view showing the configuration of the tactile sensor of FIG.
  • the first example of the tactile sensor 40 includes a sensor unit 40 a, a base portion 40 b, and a controller 40 c.
  • the sensor unit 40a can be attached to the finger FIN of the user.
  • the sensor unit 40a includes an acceleration sensor 40aa and a pair of strain gauges 40ab.
  • the acceleration sensor 40aa is configured, for example, to measure at least one of the following characteristics: the vibration characteristic of the skin (as an example, the vibration frequency) and the unevenness characteristic of the skin (as an example, a value estimated from the vibration frequency).
  • the user applies a force in a predetermined direction (for example, a downward (Y-direction) pressing force or a lateral (X-direction) shear force) to the base portion 40b with the finger FIN, and the sensor unit 40a measures the resulting skin characteristics.
  • the skin characteristics to be measured include, for example, at least one of the following: the contact characteristic of the skin, the deformation characteristic of the skin, and the friction characteristic of the skin.
  • the controller 40c is configured to calculate the friction characteristic of the skin by applying the deformation characteristic of the skin measured by the strain gauge 40ab to a predetermined estimation formula.
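  • As an illustration of how such an estimation formula might look, the sketch below maps the deformation characteristic measured by the strain gauges 40ab to an estimated friction characteristic with an assumed linear model; the function name and coefficients are hypothetical and not taken from the patent.

```python
# Minimal sketch (assumed linear model): estimate the skin friction characteristic
# from the deformation characteristic measured by the strain gauges 40ab.
# The coefficients a and b are hypothetical placeholders, not values from the patent.

def estimate_friction(deformation: float, a: float = 0.8, b: float = 0.05) -> float:
    """Return an estimated friction characteristic from a deformation measurement."""
    return a * deformation + b

if __name__ == "__main__":
    print(estimate_friction(1.2))  # e.g. a deformation of 1.2 (arbitrary units)
```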
  • FIG. 3 is an explanatory view of the outline of the present embodiment.
  • the tactile sensor 40 acquires tactile information indicating physical characteristics related to the touch of the skin of the user U.
  • the tactile sensation information is information indicating physical characteristics related to the tactile sensation of the user U's skin.
  • the client device 10 acquires tactile sensation information from the tactile sensation sensor 40 and transmits the tactile sensation information to the server 30.
  • the server 30 estimates the skin condition based on the tactile sensation information acquired from the client device 10.
  • the skin condition is a condition of the skin estimated from physical characteristics relating to the touch of the skin of the user U.
  • the client device 10 acquires the estimation result of the skin condition from the server 30 and presents to the user U at least one of two types of information based on that estimation result: real-time information (an example of the "first information") and one-time information (an example of the "second information").
  • the estimation result of the skin condition can be obtained based on the tactile sensation information acquired by the tactile sensation sensor 40.
  • by using the tactile sensor 40, it is possible to prevent variation in the determination result of the skin condition based on the touch of the skin.
  • FIG. 4 is a diagram showing the data structure of the user information database of the present embodiment.
  • the user information database of FIG. 4 stores information on the user (hereinafter referred to as “user information”).
  • the user information database includes a "user ID" field, a "user name" field, a "user attribute" field, and an "estimation formula" field. Each field is associated with each other.
  • the “user ID” field stores a user ID for identifying the user.
  • the “user name” field stores information (for example, text) indicating the user name.
  • the “user attribute” field stores information on the attributes of the user (hereinafter referred to as “user attribute information”).
  • the user attribute information is information arbitrarily determined by the user.
  • the "user attribute” field includes a “sex” field and an “age” field.
  • the “sex” field stores information indicating the gender of the user.
  • the “age” field stores information indicating the user's age.
  • the “estimation formula” field stores an estimation formula for estimating the skin condition of the user.
  • the estimation formula includes a coefficient for each factor that affects the skin of the user (for example, tactile information acquired by the tactile sensor 40).
  • the estimation formula is prepared for each index of skin condition of the user (hereinafter referred to as “skin index”). That is, the coefficients included in the estimation formula are different for each skin index.
  • the skin index is a quantitative value representing the skin condition of the user.
  • the skin index is, for example, at least one of the following: horny layer moisture, texture, skin color, smoothness of the skin, skin transparency, skin lightening, presence of rough skin, degree of skin inflammation, degree of skin fineness, skin firmness, and softness of the skin.
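  • The patent does not disclose the estimation formula itself; the following sketch only illustrates the structure described above, in which each skin index has its own set of coefficients applied to the factors affecting the skin (here, the tactile features). All names and coefficient values are hypothetical.

```python
# Sketch of a per-index estimation formula: each skin index has its own coefficients,
# one per factor affecting the skin (here, the tactile features). Values are hypothetical.

TACTILE_FEATURES = ["vibration", "unevenness", "contact", "deformation", "friction"]

# One coefficient set (plus intercept) per skin index, mirroring the description of the
# "estimation formula" field of the user information database.
ESTIMATION_FORMULAS = {
    "horny_layer_moisture": {"intercept": 20.0, "vibration": 0.1, "unevenness": -0.3,
                             "contact": 0.2, "deformation": 0.5, "friction": -0.4},
    "skin_firmness":        {"intercept": 35.0, "vibration": -0.2, "unevenness": -0.1,
                             "contact": 0.6, "deformation": -0.3, "friction": 0.2},
}

def estimate_skin_index(index_name: str, tactile_info: dict) -> float:
    """Apply the coefficients of one skin index to the tactile information."""
    formula = ESTIMATION_FORMULAS[index_name]
    score = formula["intercept"]
    for feature in TACTILE_FEATURES:
        score += formula[feature] * tactile_info.get(feature, 0.0)
    return score

if __name__ == "__main__":
    sample = {"vibration": 12.0, "unevenness": 3.5, "contact": 1.8,
              "deformation": 2.2, "friction": 0.9}
    print(estimate_skin_index("horny_layer_moisture", sample))
```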
  • FIG. 5 is a diagram showing the data structure of the tactile sensation log information database of the present embodiment.
  • the tactile sensation log information database of FIG. 5 stores tactile sensation log information.
  • the tactile sensation log information is information on the history of tactile sensation information acquired from the tactile sensation sensor 40.
  • the tactile sensation log information database includes a "tactile sensation log ID" field, a "date and time” field, and a "tactile sensation information” field. Each field is associated with each other.
  • the tactile sensation log information database is associated with the user ID.
  • the "tactile sensation log ID" field stores a tactile sensation log ID for identifying the tactile sensation information constituting the tactile sensation log information.
  • the "tactile sensation information" field includes a "vibration" field, a "concave/convex" field, a "contact" field, a "deformation" field, and a "friction" field.
  • the “vibration” field stores vibration characteristic information (for example, a vibration frequency measured by the acceleration sensor 40 aa) regarding the vibration characteristic of the skin.
  • the “concave / convex” field stores concave / convex characteristic information (for example, a value estimated from a vibration frequency measured by the acceleration sensor 40 aa) regarding the concave / convex characteristic of the skin.
  • the "contact” field stores contact characteristic information (for example, a value measured by the strain gauge 40ab) regarding the contact characteristic of the skin.
  • the “deformation” field stores deformation characteristic information (for example, the amount of deformation measured by the strain gauge 40 ab) regarding the deformation characteristic of the skin.
  • the “friction” field stores friction characteristic information (for example, the friction calculated by the base 40b) regarding the friction characteristic of the skin.
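  • For reference, one record of the tactile sensation log information database could be represented as follows; this is an illustrative data structure mirroring the fields described above, with assumed types, not code from the patent.

```python
# Illustrative representation of one record of the tactile sensation log information
# database (FIG. 5). Field names mirror the description above; types are assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TactileLogRecord:
    tactile_log_id: str       # "tactile sensation log ID" field
    acquired_at: datetime     # "date and time" field
    vibration: float          # vibration frequency (acceleration sensor 40aa)
    unevenness: float         # value estimated from the vibration frequency
    contact: float            # value measured by the strain gauge 40ab
    deformation: float        # deformation amount measured by the strain gauge 40ab
    friction: float           # friction characteristic of the skin

record = TactileLogRecord("TLOG001", datetime(2018, 10, 18, 9, 0),
                          vibration=12.0, unevenness=3.5, contact=1.8,
                          deformation=2.2, friction=0.9)
```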
  • FIG. 6 is a diagram showing the data structure of the action log information database of the present embodiment.
  • the action log information database of FIG. 6 stores information indicating a history of action information (hereinafter, referred to as “action log information”).
  • the action log information is information obtained from the log device 50, information determined according to a user instruction (for example, a user's answer to a questionnaire or a user's voluntary input), or a combination thereof.
  • the action log information database includes an "action log ID" field, a "date and time" field, an "action" field, a "start time" field, an "end time" field, a "calorie change" field, a "position information" field, and an "environment information" field. Each field is associated with each other.
  • the action log information database is associated with the user ID.
  • the "action log ID" field stores an action log ID for identifying the action information constituting the action log information.
  • the "action” field stores information on the user's action.
  • the action of the user includes at least one of the following: meals (e.g., the contents of the meal), exercise (e.g., the exercise event), sleep (e.g., the number of turns during sleep), and care actions (for example, whether morning cleansing was carried out, whether skin care was carried out, the content of the care, and information on the product used for the care).
  • the “start time” field stores information indicating the start time of the action.
  • the "calorie change” field stores information indicating the consumed calories or the consumed calories according to the behavior.
  • the “position information” field stores the position information acquired by the GPS module 15.
  • the "environment information” field stores environment information on the environment at the position indicated by the position information.
  • the environmental information includes at least one of the following: temperature, humidity, and UV exposure.
  • FIG. 7 is a diagram showing the data structure of the mind-body log information database of the present embodiment.
  • the mind-body log information database of FIG. 7 stores information indicating the history of the user's mind-body information (hereinafter referred to as "mind-body log information").
  • the mind-body log information is information determined according to information acquired from the log device 50, a user instruction (for example, a user's answer to a questionnaire), or a combination thereof.
  • the mind-body log information database includes a "mind-body log ID" field, a "date and time” field, a "pulse value” field, a "sex cycle” field, and a “stress” field. Each field is associated with each other.
  • the mind-body log information database is associated with the user ID.
  • a mind-body log ID that identifies mind-body information that constitutes mind-body log information is stored in the "mind-body log ID" field.
  • the “date and time” field stores information indicating the date and time when the mind-body information was acquired.
  • the "pulse value” field stores the user's pulse value.
  • the pulse value is, for example, information acquired from the log device 50.
  • the “sex cycle” field stores information (one example of hormone balance information) indicating the sex cycle.
  • the "stress" field stores stress information indicating an index of stress.
  • the stress information indicates, for example, the strength of stress, a factor of stress, the type of stress, or a combination thereof.
  • the stress information is determined by the pulse value, the sexual cycle, or a combination thereof.
  • FIG. 8 is a diagram showing the data structure of the skin evaluation log information database of the present embodiment.
  • the skin evaluation log information database of FIG. 8 stores information (hereinafter, referred to as “skin evaluation log information”) indicating a history of qualitative evaluation (hereinafter, referred to as “skin evaluation”) related to a skin condition.
  • the skin evaluation log information database includes a "skin evaluation log ID” field, a "date and time” field, and a "skin score” field. Each field is associated with each other.
  • the skin evaluation log information database is associated with the user ID.
  • the "skin evaluation log ID" field stores a skin evaluation log ID for identifying the skin evaluation constituting the skin evaluation log information.
  • the “date and time” field stores information indicating the date and time when the skin evaluation was generated.
  • the “skin score” field stores a skin score obtained from the tactile information and the estimation formula.
  • the “skin score” field includes a "first skin score” field and a "second skin score” field.
  • a first skin score (an example of a first skin index) is stored in the "first skin score” field.
  • the first skin score indicates the current skin condition (for example, when the tactile sensation information is acquired) estimated from the tactile sensation log information.
  • the "second skin score” field stores a second skin score (an example of a second skin index).
  • the second skin score indicates a skin condition in the future (for example, one week after the day touch information is acquired) estimated from touch log information and touch prediction information.
  • FIG. 9 is a diagram showing the data structure of the content matching table of the present embodiment.
  • the content matching table of FIG. 9 has a data structure indicating the relationship between the skin evaluation and the content of two types of information (real time information and one time information) to be presented to the user.
  • the content matching table includes a “reference score” field, a “real time content” field, and a “one time content” field.
  • the “reference score” field stores a reference score for identifying real-time content or one-time content.
  • the "real-time content” field stores a content ID for identifying real-time content constituting real-time information.
  • Real-time content is, for example, text, image, voice, URL (Uniform Resource Locator), or a combination thereof.
  • Real-time content includes, for example, the following: a message prompting the user to take a care action immediately, advice on how to perform care immediately, advice on a product suitable for the care (e.g., the URL of a website for purchasing the product), and the first skin score showing the past skin condition.
  • the "one-time content” field stores a content ID for identifying one-time content that constitutes one-time information.
  • One-time content is, for example, text, image, voice, URL, or a combination thereof.
  • One-time content includes, for example, the following: advice to take a care action before going to bed, advice on how to perform care before going to bed, advice on products suitable for the care (e.g., the URL of a website for purchasing the relevant goods), and the second skin score showing the future skin condition.
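  • A minimal sketch of how content could be selected against a content matching table such as FIG. 9 is shown below. The two score ranges and content IDs (REAL 001 for scores 20 to 39, ONE 003 for scores 60 to 89) follow the example given later for step S302; the remaining entries and the table layout are assumptions.

```python
# Sketch of content selection against a content matching table (FIG. 9).
# The two (range, content ID) pairs below follow the examples given for step S302;
# the remaining structure is an assumption.

CONTENT_TABLE = [
    # (min_score, max_score, real_time_content_id, one_time_content_id)
    (20, 39, "REAL001", "ONE001"),
    (60, 89, "REAL003", "ONE003"),
]

def select_content(first_skin_score: float, second_skin_score: float):
    """Return (real-time content ID, one-time content ID) matching the reference scores."""
    real_id = next((r for lo, hi, r, _ in CONTENT_TABLE if lo <= first_skin_score <= hi), None)
    one_id = next((o for lo, hi, _, o in CONTENT_TABLE if lo <= second_skin_score <= hi), None)
    return real_id, one_id

print(select_content(30, 60))  # ('REAL001', 'ONE003'), as in the example of step S302
```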
  • FIG. 10 is a diagram showing the data structure of the task information database of the present embodiment.
  • the task information database of FIG. 10 stores information (hereinafter referred to as “task information”) regarding the care action to be performed by the user.
  • the task information database includes a "task ID” field, a "reference content” field, a "registration date” field, and a "end date” field. Each field is associated with each other.
  • the task information database is associated with the user ID.
  • the “task ID” field stores a task ID for identifying task information.
  • the “reference content” field stores a content ID identifying reference content (real-time content or one-time content) which is a reference for care action.
  • FIG. 11 is a sequence diagram of a process of presenting content according to the present embodiment.
  • FIGS. 12 to 14 show examples of screens displayed in the information processing of FIG. 11.
  • the client device 10 executes acquisition of tactile sensation information (S100). Specifically, the processor 12 acquires tactile sensation information from the tactile sensation sensor 40 at regular time intervals. The processor 12 associates the tactile sensation information with the information indicating the execution date and time of step S100 (that is, the date and time when the tactile sensation information is acquired), and stores the information in the storage device 11 in association with each other. The processor 12 displays the screen P100 (FIG. 12) on the display.
  • the screen P100 includes a display object A100 and a button object B100.
  • the display object A100 displays tactile information (for example, vibration frequency, unevenness amount, contact force, deformation amount, and estimated value of friction) acquired from the tactile sensor 40.
  • the button object B100 is an object that receives a user instruction to start a content request (S101).
  • After step S100, the client device 10 executes the content request (S101). Specifically, when the user operates the button object B100, the processor 12 transmits content request data to the server 30.
  • the content request data includes the following information: the user ID, the tactile information obtained in step S100 (that is, the tactile information included in the display object A100), and information indicating the execution date and time of step S100.
  • After step S101, the server 30 executes the database update (S300) based on the content request data.
  • the processor 32 adds a new record to the tactile sensation log information database (FIG. 5) associated with the user ID included in the content request data.
  • the following information is stored in each field of the new record.
  • the server 30 executes skin condition estimation (S301). Specifically, the processor 32 refers to the user information database (FIG. 4) to specify the estimation formula associated with the user ID included in the content request data.
  • the processor 32 identifies the records of the tactile sensation log information database whose "date and time" field value falls within a certain period (for example, one week) going back from the execution date of step S100 (hereinafter referred to as "tactile reference records").
  • the processor 32 calculates the average value of the values of each subfield ("vibration" field, "concave" field, "contact" field, "deformation" field, and "friction" field) of the "tactile information" field of the tactile reference records. The processor 32 then calculates the first skin score by applying the calculated average values to the identified estimation formula.
  • the first skin score is calculated for each of the following indicators: the moisture content of the horny layer, texture, skin color, skin condition, skin roughness, the level of skin inflammation, the degree of fine lines of the skin, and the softness of the skin.
  • the processor 32 acquires tactile sensation prediction information from the prediction information providing server 70.
  • the processor 32 calculates the second skin score by applying the average value of the values of each sub-field of the “touch information” field of the touch reference record and the touch prediction information to the identified estimation formula.
  • the second skin score is calculated for each index similar to the first skin score.
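  • The sketch below mirrors the data flow described for step S301: average the tactile reference records within the look-back period, apply an estimation formula (stubbed here) to obtain the first skin score, and additionally feed the tactile prediction information to obtain the second skin score. All helper names and the placeholder formula are assumptions.

```python
# Sketch of step S301 as described: average the tactile reference records within the
# look-back period, then apply the (user-specific) estimation formula. The estimation
# formula itself is stubbed; only the data flow follows the description.
from datetime import datetime, timedelta

FEATURES = ["vibration", "unevenness", "contact", "deformation", "friction"]

def tactile_reference_records(log, executed_at, window=timedelta(days=7)):
    """Records whose 'date and time' falls within the window before step S100's execution."""
    return [r for r in log if executed_at - window <= r["datetime"] <= executed_at]

def average_features(records):
    return {f: sum(r[f] for r in records) / len(records) for f in FEATURES}

def apply_estimation_formula(avg, prediction=None):
    """Hypothetical stand-in for the per-user estimation formula."""
    score = sum(avg.values())                 # placeholder combination of averaged features
    if prediction is not None:                # second skin score also uses prediction info
        score += sum(prediction.values())
    return score

log = [{"datetime": datetime(2018, 10, 15), "vibration": 12, "unevenness": 3,
        "contact": 2, "deformation": 2, "friction": 1}]
avg = average_features(tactile_reference_records(log, datetime(2018, 10, 18)))
first_skin_score = apply_estimation_formula(avg)
second_skin_score = apply_estimation_formula(avg, prediction={"vibration_trend": 0.5})
print(first_skin_score, second_skin_score)
```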
  • the processor 32 adds a new record to the skin evaluation log information database (FIG. 8) associated with the user ID included in the content request data.
  • the following information is stored in each field of the new record.
  • "Skin evaluation log ID" field: a new skin evaluation log ID
  • "Date and time" field: information indicating the execution date and time of step S100
  • "First skin score" field: the first skin score
  • "Second skin score" field: the second skin score
  • the server 30 executes content selection (S302). Specifically, the processor 32 specifies the content ID corresponding to the first skin score calculated in step S301 with reference to the “reference score” field of the content matching table (FIG. 9). For example, in the case of the first skin score “30”, the content ID “REAL 001” associated with the reference score “20 to 39” is specified. The processor 32 specifies the content ID corresponding to the second skin score calculated in step S301 with reference to the “reference score” field. For example, in the case of the second skin score “60”, the content ID “ONE 003” associated with the reference score “60 to 89” is specified.
  • the server 30 executes a content response (S303). Specifically, the processor 32 transmits content response data to the client device 10.
  • The content response data includes the following information: the first skin score and the second skin score calculated in step S301, the tactile information referenced to calculate the first skin score in step S301, the tactile log information referenced to calculate the second skin score in step S301, and the content ID identified in step S302 together with the content identified by that content ID (real-time content or one-time content).
  • the client device 10 executes presentation of content (S102) based on the content response data. Specifically, the processor 12 displays the screen P102 (FIG. 13) on the display.
  • the screen P102 includes display objects A102a to A102b and button objects B102a to B102d.
  • the display object A102a displays a graph corresponding to at least a part of the tactile log information included in the content response data (for example, a graph showing temporal changes in the vibration, unevenness, contact, and deformation characteristics).
  • the display object A 102 b displays a first skin score and a second skin score included in the content response data.
  • the button object B102a is an object for receiving a user instruction to display the real time content screen P104a (FIG. 14A).
  • the button object B102b is an object that receives a user instruction to display the one-time content screen P104b (FIG. 14B).
  • the button object B 102 c is an object that receives a user instruction of task presentation (S 111 in FIG. 15).
  • the button object B 102 d is an object for receiving a user instruction for accessing a shopping site for purchasing a care item useful when performing care in accordance with real-time content or one-time content.
  • the processor 12 displays a screen P104a (FIG. 14A) on the display.
  • the screen P104a includes a display object A104a and a button object B104a.
  • the display object A 104 a displays real-time content identified by the content ID included in the content response data.
  • the button object B 104 a is an object that receives a user instruction to notify the server 30 that the care indicated by the real-time content displayed on the display object A 104 a is ended.
  • the processor 12 displays a screen P104b (FIG. 14B) on the display.
  • the screen P104b includes a display object A104b and a button object B104b.
  • the display object A 104 b displays one-time content identified by the content ID included in the content response data.
  • the button object B 104 b is an object that receives a user instruction to notify the server 30 that the care indicated by the one-time content displayed on the display object A 104 b is ended.
  • After step S102, the client device 10 executes a task update request (S103).
  • The task update request data includes the following information: the user ID, the content ID of the real-time content displayed on the display object A104a, and information indicating the execution date and time of step S103.
  • Alternatively, the task update request data includes the following information: the user ID and the content ID of the one-time content displayed on the display object A104b.
  • the server 30 executes the database update (S304). Specifically, the processor 32 adds a new record to the task information database (FIG. 10) associated with the user ID included in the task update request data. The following information is stored in each field of the new record: "Task ID" field: a new task ID; "Content ID" field: the content ID included in the task update request data; "Registration date" field: information indicating the execution date and time of step S103 included in the task update request data; "End date and time" field: the code "NOT". As a result, the task specified by the user on the screen P104a or P104b (FIG. 14) is recorded in the server 30.
  • FIG. 15 is a sequence diagram of task update processing according to the present embodiment.
  • FIG. 16 is a view showing an example of a screen displayed in the information processing of FIG.
  • the client device 10 executes a task presentation request (S110). Specifically, when the user designates button object B 102 c (FIG. 13), processor 12 transmits task presentation request data to server 30.
  • the task presentation request data includes a user ID.
  • the server 30 executes a task presentation response (S310).
  • the processor 32 refers to the task information database (FIG. 10) associated with the user ID included in the task presentation request data, identifies the records in which the code "NOT" is stored in the "end date and time" field (that is, records storing task information on unfinished tasks), and specifies the task ID and the content ID of each such record.
  • the processor 32 transmits task presentation response data to the client device 10.
  • the task presentation response data includes the following information: the task IDs of the unfinished tasks and the content IDs associated with those task IDs.
  • After step S310, the client device 10 executes the task presentation (S111). Specifically, the processor 12 displays the screen P110 (FIG. 16) on the display based on the task presentation response data.
  • the screen P110 includes display objects A110a to A110b and button objects B110a to B110b.
  • the display objects A110a to A110b display the content identified by the content IDs included in the task presentation response data.
  • the button objects B110a to B110b are objects for receiving user instructions for notifying the server 30 of the end of the task corresponding to the content displayed on the display objects A110a to A110b.
  • the task IDs associated with the content IDs for identifying the content displayed on the display objects A110a to A110b are assigned to the button objects B110a to B110b, respectively.
  • the client device 10 executes a task update request (S112). Specifically, when the user designates the button object B 110 a (FIG. 16), the processor 12 sends task update request data to the server 30.
  • The task update request data includes the following information: the user ID, the task ID assigned to the button object B110a designated by the user, and information indicating the execution date and time of step S112.
  • After step S112, the server 30 executes the database update (S311).
  • the processor 32 specifies a task information database (FIG. 10) associated with the user ID included in the task update request data.
  • the processor 32 specifies a record including the task ID included in the task update request data among the specified task information databases.
  • the processor 32 stores information indicating the execution date and time of step S112 included in the task update request data in the “end date and time” field of the identified record.
  • the processor 32 specifies the action log information database (FIG. 6) associated with the user ID included in the task update request data.
  • the processor 32 adds a new record to the identified action log information database. The following information is stored in each field of the new record: "Action log ID" field: a new action log ID; "Date and time" field: information indicating the execution date and time of step S112 included in the task update request data; "Action" field: the task ID included in the task update request data.
  • FIG. 17 is a flowchart of the process of correcting the estimation formula of this embodiment.
  • the process of FIG. 17 is executed after the user ID of the user is registered in a dedicated measuring device (for example, a device disposed in a cosmetics retailer).
  • the server 30 executes acquisition of the measurement result (S320). Specifically, when the user measures the reference value REF related to the skin condition using a dedicated measuring device (for example, a device placed at a cosmetics retailer), the processor 32 acquires the user ID and the reference value REF from the measuring device.
  • the server 30 then calculates a divergence value D (S321). Specifically, the processor 32 calculates the divergence value D between the reference value REF acquired in step S320 and the skin score obtained from the estimation formula, according to Expression 1. If the divergence value D calculated in step S321 is less than the predetermined threshold Th (S322-NO), the process of FIG. 17 ends.
  • the server 30 executes correction of the estimation formula (S323). Specifically, the processor 32 changes the coefficient of the estimation equation (FIG. 4) associated with the user ID acquired in step S320 so that the divergence value D becomes smaller than the predetermined threshold value Th. Thereby, the estimation equation is adjusted for each user.
  • the process of FIG. 17 is performed for each skin index.
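  • As an illustration of the correction in step S323, the sketch below assumes that the divergence value D is the absolute difference between the reference value REF and the estimated skin score, and nudges the formula's intercept until D falls below the threshold Th; the update rule and the linear form are assumptions, not the patent's Expression 1.

```python
# Sketch of the estimation-formula correction (S321-S323), under the assumption that
# D = |REF - estimated score| and that the formula is linear in the tactile features.
# The simple proportional update below is illustrative, not the method of the patent.

def estimate(coeffs, features):
    return coeffs["intercept"] + sum(coeffs[k] * v for k, v in features.items())

def correct_formula(coeffs, features, ref, threshold, max_iter=100):
    """Adjust the intercept until the divergence value D drops below the threshold Th."""
    for _ in range(max_iter):
        divergence = abs(ref - estimate(coeffs, features))   # assumed definition of D
        if divergence < threshold:                           # S322: D < Th -> done
            break
        coeffs["intercept"] += 0.5 * (ref - estimate(coeffs, features))  # S323: shrink D
    return coeffs

coeffs = {"intercept": 10.0, "vibration": 0.2, "friction": -0.1}
features = {"vibration": 12.0, "friction": 0.9}
print(correct_formula(coeffs, features, ref=55.0, threshold=0.5))
```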
  • the first modification is an example in which the process of presenting content (FIG. 11) is executed using a combination of tactile log information (FIG. 5) and action log information (FIG. 6).
  • the estimation formula of the first modification includes a coefficient for the user's action and a parameter determined according to the action information.
  • In step S301, the processor 32 identifies the records of the action log information database (FIG. 6) associated with the user ID included in the content request data whose "date and time" field value falls within a certain period (for example, one week) going back from the execution date of step S100 (hereinafter referred to as "action reference records").
  • the processor 32 calculates the first skin score by applying to the estimation formula the average value calculated from the tactile reference records and at least one of the "action" field, the "start time" field, the "end time" field, the "calorie change" field, and the "position information" field of the action reference records. The first skin score indicates the current skin condition estimated from the tactile log information and the action log information.
  • the processor 32 acquires behavior prediction information from the prediction information providing server 70.
  • the processor 32 calculates the second skin score by applying to the estimation formula the average value calculated from the tactile reference records, at least one of the "action" field, the "start time" field, the "end time" field, the "calorie change" field, and the "position information" field of the action reference records, and the behavior prediction information.
  • the second skin score indicates a future skin condition estimated from tactile sensation log information, action log information, and action prediction information.
  • Real-time content and one-time content further include advice on the user's behavior.
  • the skin condition is estimated based on the tactile sensation log information and the action log information.
  • the second modification is an example in which the process of presenting content (FIG. 11) is executed using a combination of tactile sensation log information (FIG. 5) and mind-body log information (FIG. 7).
  • the estimation formula of the modification 2 includes a coefficient for the user's mind and body and a parameter determined according to the mind-body log information.
  • In step S301, the processor 32 identifies the records of the mind-body log information database (FIG. 7) associated with the user ID included in the content request data whose "date and time" field value falls within a certain period (for example, one week) going back from the execution date of step S100 (hereinafter referred to as "mind-body reference records").
  • the processor 32 calculates the first skin score by applying to the estimation formula at least one of the "pulse value" field, the "sex cycle" field, and the "stress" field of the mind-body reference records and the average value calculated from the tactile reference records.
  • the first skin score indicates the current skin condition estimated from tactile sensation log information and mind-body log information.
  • the processor 32 acquires mind-body prediction information from the prediction information providing server 70.
  • the processor 32 calculates the second skin score by applying to the estimation formula at least one of the "pulse value" field, the "sex cycle" field, and the "stress" field of the mind-body reference records, the average value calculated from the tactile reference records, and the mind-body prediction information.
  • the second skin score indicates a future skin condition estimated from tactile sensation log information, mind-body log information, and mind-body prediction information.
  • Real-time content and one-time content further include the following information.
  • Advice on the user's hormone balance, and advice on the user's stress.
  • the skin condition is estimated based on the combination of the tactile sensation log information and the mind-body log information.
  • Modification 3 is an example of estimating the future skin condition in consideration of the user's care behavior.
  • the estimation formula of the modification 3 includes a coefficient for the user's care behavior and a parameter determined according to the task information.
  • In step S301, after calculating the average value, the processor 32 refers to the task information database (FIG. 10) associated with the user ID included in the content request data and identifies the records within a fixed period (for example, one week before the execution date of step S301) (hereinafter referred to as "reference records").
  • the processor 32 calculates the second skin score by applying the information on the specified reference record (that is, the information on the care behavior in a fixed period) and the calculated average value to the estimation formula.
  • the future skin condition is estimated in consideration of the history of the user's care behavior. This makes it possible to present future skin conditions suited to each user.
  • Modification 4 is an example in which the skin score is output to an external device other than the client device 10.
  • the server 30 is connected to a cosmetic production device that produces a cosmetic based on the skin score.
  • a plurality of cartridges are arranged in the cosmetic production device.
  • Each cartridge contains a raw material to be a component of a cosmetic, or a mixture of a plurality of raw materials (hereinafter referred to as "cosmetics").
  • the raw material is, for example, a liquid, a powder, a solid, or a combination thereof.
  • After step S301, the processor 32 transmits the first skin score and the second skin score to the cosmetic production device.
  • the cosmetic production device determines the usage amount of the raw material or cosmetic contained in each cartridge based on at least one of the first skin score and the second skin score transmitted from the server 30. The cosmetic production device then extracts the determined amount of raw material or cosmetic from each cartridge and provides it to the user.
  • According to the fourth modification, it is possible to provide the user with a cosmetic produced using raw materials or cosmetics according to the physical characteristics related to the touch of the user's skin.
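  • The patent does not specify how the cosmetic production device maps skin scores to usage amounts; the sketch below shows one hypothetical mapping in which each cartridge's dispense amount is linearly interpolated from the skin scores received from the server 30. The cartridge names and bounds are invented for illustration.

```python
# Hypothetical sketch of Modification 4: choose per-cartridge dispense amounts from
# the skin scores sent by the server 30. The cartridges, bounds, and weighting are
# assumptions for illustration only.

CARTRIDGES = {
    # cartridge: (amount in grams at score 0, amount in grams at score 100)
    "moisturizing_base": (2.0, 0.5),
    "oil_control_agent": (0.2, 1.5),
}

def dispense_amounts(first_skin_score: float, second_skin_score: float) -> dict:
    """Linearly interpolate each cartridge amount from the average of both scores."""
    score = max(0.0, min(100.0, (first_skin_score + second_skin_score) / 2))
    return {name: round(lo + (hi - lo) * score / 100.0, 2)
            for name, (lo, hi) in CARTRIDGES.items()}

print(dispense_amounts(first_skin_score=30, second_skin_score=60))
```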
  • Modification 5 is a variation of the advice included in at least one of the real-time content and the one-time content.
  • At least one of the real-time content and the one-time content of Modification 5 further includes an advice based on the condition of the skin.
  • the server 30 further estimates the condition of the skin in step S301 (FIG. 11).
  • the server 30 further calculates the first skin score and the second skin score related to the index of the skin condition.
  • the condition of the skin is, for example, skin balance.
  • Skin balance is an index related to the health of the skin.
  • the skin balance is derived from measurements on the condition of the skin (for example, the stratum corneum, the epidermis, the dermis, and the condition of blood flow).
  • step S302 the server 30 further selects content according to the condition of the skin.
  • the content according to the condition of the skin includes advice based on the condition of the skin.
  • step S102 the client device 10 further presents the user U with an advice based on the condition of the skin.
  • Modification 6 is a modification of the touch sensor 40.
  • FIG. 18 is a schematic view showing the configuration of the touch sensor of the modification 6.
  • a first example of the touch sensor 40 of the modification 6 is the adhesion sensor 41 of FIG. 18A.
  • the adhesive feeling sensor 41 includes a base member 41a and a plurality of contacts 41b.
  • Each contact 41 b can extend and contract with respect to the base member 41 a.
  • the plurality of contacts 41b are arranged in a matrix on the base member 41a.
  • An adhesive substance may be applied to the surfaces of the plurality of contacts 41b.
  • the plurality of contacts 41 b are configured to detect the physical quantity of the skin SK in contact.
  • the adhesion sensor 41 generates an electric signal indicating at least one of the following parameters based on the physical quantities detected by the contacts 41b.
  • Adhesiveness of the skin (e.g., a value estimated from the spatiotemporal change of the force generated when the skin SK is released from the contacts 41b)
  • Distribution of contact force (for example, the distribution of the contact force generated when the skin SK is pressed by the contacts 41b)
  • Contact area (for example, a value estimated from the number of contacts 41b touched when the skin SK is pressed by the contacts 41b)
  • a second example of the touch sensor 40 of the modification 6 is the softness sensor 42 of FIG. 18B.
  • the softness sensor 42 includes a piezo vibrator 42a and a load cell 42b.
  • the piezo vibrator 42a acquires the following biological information when pressed against the user's skin SK: the viscoelasticity of the skin (as an example, a value estimated from the amount of depression of the skin SK measured by the piezo vibrator 42a, the depression speed, and the reaction force) and an indicator of the flexibility of the surface of the skin SK (as an example, the amount of frequency change of the surface of the skin SK measured by the piezo vibrator 42a).
  • the load cell 42 b is configured to measure the contact characteristic when the piezo vibrator 42 a is pressed against the skin SK.
  • the measured contact property is an indicator of the overall flexibility of the skin SK.
  • the contact force includes, for example, at least one of the following: the maximum value of the contact force, hysteresis, stress relaxation, and waveform characteristics.
  • the piezo vibrator 42a can be omitted.
  • the load cell 42b is configured to measure the contact force when pressed directly against the skin SK.
  • the third example of the touch sensor 40 of the modification 6 is a temperature sensor (not shown).
  • the temperature sensor is configured to measure the temperature of the skin.
  • A seventh modification will be described. Modification 7 is an example of a method in which the prediction information providing server 70 generates tactile prediction information.
  • the prediction information providing server 70 of FIG. 1 stores a formula for calculating a predicted value of tactile sensation.
  • This formula is a function of at least one of the following information: the tactile log information, the action log information, and the mind-body log information.
  • the prediction information providing server 70 refers to at least one of the following databases associated with the user ID: the tactile sensation log information database (FIG. 5), the action log information database (FIG. 6), and the mind-body log information database (FIG. 7).
  • the prediction information providing server 70 calculates the predicted value of the tactile sensation by applying the information of the referred database to the calculation formula.
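  • Modification 7 describes the prediction formula only as a function of the tactile, action, and mind-body logs; as one hypothetical instance of such a function, the sketch below extrapolates a simple linear trend over recent tactile log values.

```python
# Hypothetical sketch of Modification 7: predict a future tactile value as a function of
# the tactile log (here, a simple linear trend over the most recent values). Using the
# action log and mind-body log as additional inputs would follow the same pattern.

def predict_tactile(values, days_ahead=7):
    """Extrapolate the per-day change of the logged values days_ahead into the future."""
    if len(values) < 2:
        return values[-1] if values else None
    daily_change = (values[-1] - values[0]) / (len(values) - 1)
    return values[-1] + daily_change * days_ahead

vibration_log = [12.0, 12.4, 12.3, 12.9]   # e.g. one vibration value per day
print(predict_tactile(vibration_log))       # predicted vibration one week ahead
```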
  • The first aspect of the present embodiment is an information processing apparatus comprising: a means for acquiring tactile information indicating physical characteristics related to the tactile sensation of the user's skin (for example, the processor 32 that executes the process of step S300); and a means for presenting at least one of first information on the skin condition of the user based on the tactile information and second information on the skin condition of the user based on tactile log information including a plurality of pieces of tactile information (for example, the processor 32 that executes the process of step S303).
  • the condition of the skin is determined based on the physical property related to the touch of the skin.
  • The second aspect of the present embodiment is an information processing apparatus in which at least one of the first information and the second information includes at least one of advice on a method of skin care for the user, advice on a product suitable for the user's skin care, advice on the user's hormone balance, and advice on the user's stress.
  • Advice regarding at least one of a method of skin care, a product suitable for skin care, hormone balance, and stress is presented. This allows the user to carry out skin care according to more detailed advice.
  • The third aspect of the present embodiment is an information processing apparatus comprising: a means for estimating a skin index of the user's skin condition based on the tactile log information (for example, the processor 32 that executes the process of step S301); and a means for presenting the estimated skin index (for example, the processor 32 that executes the process of step S303).
  • a skin indicator of a skin condition based on physical characteristics related to touch of the skin is presented. This allows the user to know the correct skin condition.
  • The fourth aspect of the present embodiment is an information processing apparatus comprising: a means for acquiring tactile information indicating physical characteristics related to the tactile sensation of the user's skin (for example, the processor 32 that executes the process of step S300); a means for estimating a skin index of the skin condition based on tactile log information including a plurality of pieces of tactile information (for example, the processor 32 that executes the process of step S301); and a means for presenting the estimated skin index (for example, the processor 32 that executes the process of step S303).
  • a skin index based on physical characteristics related to touch of the skin is presented. This allows the user to know the correct skin condition.
  • The fifth aspect of the present embodiment is an information processing apparatus further comprising a means for acquiring action log information indicating a history of action information related to the user's actions, wherein the means for estimating estimates the skin index based on the tactile log information and the action log information.
  • A skin index based on a combination of the physical characteristics related to the touch of the skin and an action log is presented. This allows the user to know the skin condition based on both the physical characteristics related to the touch of his or her skin and his or her actions.
  • The sixth aspect of the present embodiment is an information processing apparatus further comprising a means for acquiring mind-body log information on the user's mind and body, wherein the means for estimating estimates the skin index based on the tactile log information and the mind-body log information.
  • A skin index based on a combination of the physical characteristics related to the touch of the skin and a mind-body log is presented. This allows the user to know the skin condition based on both the physical characteristics related to the touch of his or her skin and his or her mind and body.
  • The seventh aspect of the present embodiment is an information processing apparatus further comprising a means for acquiring tactile prediction information indicating physical characteristics related to the future tactile sensation of the skin, wherein the means for estimating estimates the skin index based on the tactile log information and the tactile prediction information.
  • A skin condition based on a log and a prediction of the physical characteristics related to the touch of the skin is presented. This allows the user to know the condition of the skin based on the past and future states of the physical characteristics related to the touch of his or her skin.
  • The eighth aspect of the present embodiment is an information processing apparatus in which the acquiring means acquires the tactile information from the tactile sensor 40 configured to measure the tactile information.
  • The ninth aspect of the present embodiment is an information processing apparatus in which the tactile information includes at least one of the vibration characteristic, the unevenness characteristic, the contact characteristic, the deformation characteristic, the temperature, and the friction characteristic of the skin.
  • Another aspect of the present embodiment is a client device 10 connectable to an information processing apparatus (for example, the server 30), comprising a display and a means (for example, the processor 12 that executes the process of step S102) for displaying on the display the first information and the second information presented by the information processing apparatus (for example, the server 30).
  • The storage device 11 may be connected to the client device 10 via the network NW.
  • The storage device 31 may be connected to the server 30 via the network NW.
  • Each step of the information processing described above can be executed by either the client device 10 or the server 30.
  • The tactile log information and the action log information may be acquired by a module incorporated in the client device 10.
  • The tactile log information, the action log information, and the mind-body log information may be acquired from an external server different from the server 30.
  • In step S301, the estimation of one of the first skin score and the second skin score may be omitted.
  • 1: information processing system, 10: client device, 11: storage device, 12: processor, 13: input/output interface, 14: communication interface, 15: GPS module, 30: server, 31: storage device, 32: processor, 33: input/output interface, 34: communication interface, 40: tactile sensor, 41: adhesive sensor, 42: soft sensor, 50: log device, 70: predictive information providing server

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Pathology (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Theoretical Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Biophysics (AREA)
  • General Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Strategic Management (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Human Resources & Organizations (AREA)
  • Child & Adolescent Psychology (AREA)
  • Veterinary Medicine (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • User Interface Of Digital Computer (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention concerns an information processing device provided with: a means for acquiring tactile information indicating physical properties relating to the tactile sensation of a user's skin; and a means for presenting first information concerning the condition of the user's skin based on the tactile information and/or second information concerning the condition of the user's skin based on tactile log information that includes multiple instances of the tactile information.
PCT/JP2018/038803 2017-11-30 2018-10-18 Information processing device, client device, and program Ceased WO2019106998A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019557064A JP7208153B2 (ja) 2017-11-30 2018-10-18 Information processing device, client device, information processing method, and program
JP2023000344A JP7720873B2 (ja) 2017-11-30 2023-01-05 Information processing device, client device, program, and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017231081 2017-11-30
JP2017-231081 2017-11-30

Publications (1)

Publication Number Publication Date
WO2019106998A1 true WO2019106998A1 (fr) 2019-06-06

Family

ID=66664882

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/038803 Ceased WO2019106998A1 (fr) 2017-11-30 2018-10-18 Information processing device, client device, and program

Country Status (3)

Country Link
JP (2) JP7208153B2 (fr)
TW (1) TW201931276A (fr)
WO (1) WO2019106998A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019106998A1 (fr) 2017-11-30 2019-06-06 Shiseido Co Ltd Information processing device, client device, and program
CN113689250A (zh) * 2020-05-19 2021-11-23 隆鼎国际私人有限公司 Cosmetics cloud platform and cosmetics preparation system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001198091A (ja) * 2000-01-17 2001-07-24 Kao Corp Diagnostic device
JP2002224049A (ja) * 2001-02-06 2002-08-13 Tadashi Goino Portable terminal device, advice system, skin diagnosis evaluation method, skin diagnosis evaluation program, makeup advice provision method, and makeup advice provision program
JP2015204033A (ja) * 2014-04-15 2015-11-16 Toshiba Corp Health information service system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002041959A (ja) * 2000-07-28 2002-02-08 Fuji Xerox Co Ltd Skin care method advice system and skin care method acquisition system
JP2005052212A (ja) * 2003-08-05 2005-03-03 Axiom Co Ltd Skin sensor
JP4609171B2 (ja) * 2005-04-18 2011-01-12 Panasonic Electric Works Co Ltd Skin moisture content measuring device
JP2008242963A (ja) * 2007-03-28 2008-10-09 Fujifilm Corp Health analysis display method and health analysis display device
JP5534799B2 (ja) * 2009-12-22 2014-07-02 Schott Moritex Corp Skin characteristic measuring device and program
JP2012130580A (ja) * 2010-12-22 2012-07-12 Shiseido Co Ltd Skin tactile sensation evaluation method and skin tactile sensation evaluation system
JP2013013628A (ja) * 2011-07-05 2013-01-24 Ands Corporation Skin condition measuring device and measuring method
JP2013117941A (ja) * 2011-10-31 2013-06-13 Sony Corp Constitution determination device, constitution determination method, health support device, health support method, program, terminal device, and health support system
TWI556116B (zh) * 2012-02-15 2016-11-01 Hitachi Maxell Skin condition analysis and analysis information management system, skin condition analysis and analysis information management method, and data management server
WO2015029452A1 (fr) * 2013-08-30 2015-03-05 Nutrition Act Co Ltd Analyzer, analysis method, program, and skin sample collection kit
WO2019106998A1 (fr) 2017-11-30 2019-06-06 Shiseido Co Ltd Information processing device, client device, and program

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPWO2022030300A1 (fr) * 2020-08-03 2022-02-10
JP7517429B2 (ja) 2020-08-03 2024-07-17 Murata Manufacturing Co Ltd Skin condition estimation method, skin condition estimation device, and skin condition estimation system
JPWO2022118890A1 (fr) * 2020-12-02 2022-06-09
WO2022118890A1 (fr) * 2020-12-02 2022-06-09 Shiseido Co Ltd Method for generating data on the tactile perception of human skin, device for generating data on the tactile perception of human skin, method for evaluating the tactile perception of human skin, device for evaluating the tactile perception of human skin, device for presenting the tactile perception of human skin, and method for presenting the tactile perception of human skin
JP7679929B2 (ja) 2020-12-02 2025-05-20 Shiseido Co Ltd Human skin tactile sensation data generation method, human skin tactile sensation data generation device, human skin tactile sensation evaluation method, human skin tactile sensation evaluation device, human skin tactile sensation presentation device, and human skin tactile sensation presentation method

Also Published As

Publication number Publication date
TW201931276A (zh) 2019-08-01
JP2023052208A (ja) 2023-04-11
JPWO2019106998A1 (ja) 2020-12-10
JP7720873B2 (ja) 2025-08-08
JP7208153B2 (ja) 2023-01-18

Similar Documents

Publication Publication Date Title
JP7720873B2 (ja) Information processing device, client device, program, and system
Marino Impacts of using passive back assist and shoulder assist exoskeletons in a wholesale and retail trade sector environment
US20140224552A1 (en) Body weight management device
Boeselt et al. Validity and usability of physical activity monitoring in patients with chronic obstructive pulmonary disease (COPD)
JP5736823B2 (ja) Body weight management device
Rodić et al. Adoption intention of an IoT based healthcare technologies in rehabilitation process
JPWO2018116703A1 (ja) Display control device, display control method, and computer program
JP7686024B2 (ja) Information processing device, client device, and program
KR20180087876A (ko) Management server, and skin care system and method including the same
JP7442503B2 (ja) Method for providing cosmetics customized for a customer
KR20120006767A (ko) Mattress model selection system
Ahmad et al. Predicting the load constant of the revised NIOSH lifting equation based on demographics
JP2018190176A (ja) Image display device, skin condition support system, image display program, and image display method
JP2024029883A (ja) Health management device, health management method, and program
JP7220108B2 (ja) Information processing device and control method for information processing device
JP7437549B2 (ja) Image display device
JP2019028691A (ja) Information output system, information output program, and information output method relating to skin condition care
Bandara et al. Are scrutiny and mistrust related? An eye-tracking study
US20200060609A1 (en) Method and device for ascertaining a skin condition
JP2015029609A (ja) Preference evaluation method, preference evaluation device, and program
Fekete Advanced Statistical Methods in MCID/MID Research
WO2025041729A1 (fr) Skin condition estimation method, skin condition estimation device, skin condition estimation system, and program
JP2018108182A (ja) Skin condition support system
WO2023017654A1 (fr) Prediction device, prediction method, and prediction program
WO2023237749A1 (fr) Pain evaluation apparatus and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18884287

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019557064

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18884287

Country of ref document: EP

Kind code of ref document: A1