WO2005016137A1 - Image display system, image display device, and image display method - Google Patents
- Publication number
- WO2005016137A1 (PCT/JP2004/011707)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- image display
- biological information
- subject
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
- A61B5/744—Displaying an avatar, e.g. an animated cartoon character
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0242—Operational features adapted to measure environmental factors, e.g. temperature, pollution
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S128/00—Surgery
- Y10S128/92—Computer assisted medical diagnostics
- Image display system, image display device, and image display method
- The present invention relates to an image display system and an image display method for displaying the state of a subject on a display device in a visible form, and to an image display device for displaying the biological information of a subject at a remote location in a visible form.
- A human's heart rate increases when he or she is nervous and stabilizes when he or she is calm.
- The heart rate is the contraction rhythm of the heart and is a parameter indicating a person's condition.
- Environmental information quantitatively describes the environment surrounding a person, such as changes in temperature and wind strength.
- Environmental information, like biological information, is used to know the state of a person.
- As disclosed in a Japanese Patent Application Laid-Open publication, there is an apparatus that measures the electrocardiogram of a subject in a bathtub and generates an image in response.
- In this apparatus, an electrocardiogram is measured as biological information, and the temperature of the bathtub is measured as environmental information.
- During measurement, an image that changes with the measured biological information is presented to the subject, who can measure the electrocardiogram while enjoying the image.
- However, the device disclosed in the above document is for accurately measuring an electrocardiogram, and the type of biological information measured and the purpose of use are limited.
- Since this device is intended for the subject to check his or her own ECG, it does not notify anyone other than the subject of the subject's condition.
- the biological information and the environmental information are indices indicating the emotion and physical condition of the user, the surrounding environment, and the like.
- Biological information is numerical information, and experience and time are needed to understand the meaning of the numbers.
- Observing such raw information is also a burden for the viewer.
- In another method, a video camera is attached to a part of the subject's body or installed in a corner of a room to capture images of the subject. With this method the condition of the subject can be seen at a glance, but it violates the subject's privacy.
- An object of the present invention is to provide a novel image display system and image display method, which can solve the problems of the conventional technology described above, for displaying the state of a subject on a display device in a visible form, and to provide an image display device for displaying the biological information of a subject at a remote location in a visible form.
- the image display system arranges the biological information measuring device and the image display device at two different points, and the biological information measuring device transmits the measured biological information to the image display device via the network.
- the image display device generates and displays an image expressing the state of the subject based on the biological information.
- the image display device receives the biological information transmitted by the biological information measuring device, and generates and displays an image expressing the state of the subject based on the biological information.
- An image display method transmits biological information of a subject remotely, and generates an image expressing the state of the subject based on the biological information.
- the present invention it is possible to generate an image based on biological information and environmental information of a subject, and display the state of the subject on an image display device located at a different point from the subject.
- The image generated by the present invention is not a realistic image of the state of the subject at a remote location. Because it is an image generated based on biological and environmental information, unlike precise medical applications, it not only describes the subject in a vague manner but also lets the user enjoy the generated image, which improves entertainment value.
- the state of the subject can be displayed at a point different from that of the subject, and the state of the subject can be easily grasped from a remote location.
- the image display device does not violate the privacy of the subject because it represents the vague state of the subject.
- FIG. 1 is a diagram showing a system configuration of an image display system according to the present invention.
- FIG. 2 is a block diagram showing a mobile terminal constituting the image display system according to the present invention.
- FIG. 3 is a block diagram showing an image display device constituting the image display system according to the present invention.
- FIG. 4 is a diagram showing a process of generating different images from the same biological information and environmental information.
- FIG. 5 is a diagram showing a system configuration of another example of the image display system according to the present invention.
- FIG. 6 is a diagram schematically showing an example in which a positional relationship is reflected on an image.
- FIG. 7 is a diagram schematically showing an example in which a data synchronization relationship is reflected on an image.
- FIG. 8 is a diagram schematically showing an example in which the relationship between the emotions of the subject is reflected in an image.
- FIG. 9 is a diagram showing a configuration of multiplexed data.
- FIG. 10 is a diagram showing an attachment position of a stimulus presentation device.
- FIG. 11 is a diagram showing a state in which a viewer touches a touch panel.
- FIG. 12 is a diagram showing an example of a correspondence table.
- FIG. 13 is a diagram schematically showing the process from when the viewer touches the screen to when the touched area is specified.
- FIG. 14 is a diagram schematically showing presentation of a tactile sensation from an image display device to a stimulus presentation device.
- FIG. 15 is a diagram showing a system configuration of still another example of the image display system according to the present invention.
- FIG. 16 is a diagram showing a system configuration of still another example of the image display system according to the present invention.
- An image display system includes a biological information measuring device.
- The biological information measuring device measures the biological information of a person (hereinafter referred to as a subject) and the environmental information around the subject, and outputs the measured biological and environmental information to an image display device located at a remote place.
- the image display device generates and displays an image expressing the state of the subject based on the biological information and the environment information.
- the state of a human being in a remote place can be transmitted by transmitting biological information via a network.
- As shown in FIG. 1, an image display system 1 includes an electronic device 10 that is connectable to a network and has a function of measuring biometric information, and an image display device 20 that receives and displays the biological information of the user of the electronic device 10.
- the electronic device 10 and the image display device 20 are connected to each other via a network 100.
- the Internet is used as the network 100.
- The electronic device 10 is preferably a device that is always carried, such as a mobile phone or a portable personal digital assistant (PDA).
- the device is provided with a biological sensor at a position where biological information of a user of the device can be efficiently acquired.
- the biometric sensors include those built into the mobile phone 10 and those separated from the mobile phone 10.
- The biosensors separated from the mobile phone 10 include device-contact sensors 11a, provided at contact portions between the human body and a chair, bed, or other electronic device, and human-body-contact sensors 11b, which can be attached directly to various parts of the human body to detect biometric data.
- the biometric sensor includes a video camera 11c for capturing an image of the subject and a microphone for collecting the voice of the subject.
- The mobile terminal 10 shown in FIG. 2 includes a biological sensor 11 for measuring biological information, an environmental information sensor 12 for measuring environmental information, a ROM (Read Only Memory) 13 for recording programs and setting information, a RAM serving as temporary storage, a CPU 15, and a communication interface 16.
- These blocks are connected via a bus 18.
- the biological sensor 11 measures biological information.
- Biological information is information that quantitatively indicates the movement of organs that make up the human body, such as blood pressure, pulse, and brain waves.
- the biological sensor 11 is provided on the surface of the portable terminal 10 or inside the portable terminal 10.
- a thermometer, a pulse meter, a perspiration meter, and the like are provided on a grip portion of the mobile terminal 10.
- An accelerometer, a vibration meter, and the like are provided inside the mobile terminal 10.
- a respirometer is provided in a microphone portion of the mobile terminal 10.
- the biological sensor 11 is also provided at a position different from the mobile terminal 10, such as a part of a user's body, furniture, and a part of a room. By attaching biometric sensors at various positions, various types of biological information can be measured.
- The biological sensors 11b attached to a part of the user's body include blood flow meters, electroencephalographs, eye movement sensors, electrocardiographs, vibrating gyroscopes, acceleration sensors, skin temperature sensors, body motion acceleration sensors, skin conductivity sensors, pulse meters, sphygmomanometers, respiration sensors, pupil diameter sensors, tilt sensors, blood oxygen saturation sensors, and others.
- A blood flow meter emits infrared light toward the human body and measures cerebral blood flow and the oxygen concentration in the blood from the reflected light.
- An electroencephalograph measures electroencephalogram components such as spike waves and beta waves based on the currents flowing in the brain.
- The eye movement sensor is attached to the head and measures the vibration frequency components of the eyes based on electric potentials at the head.
- An electrocardiograph measures the heart rate based on the current transmitted by the myocardium.
- the vibrating gyroscope measures chest movement and respiratory rate based on angular velocity.
- the skin temperature sensor measures body temperature.
- The skin conductivity sensor measures the amount of perspiration based on the electrical resistance of the skin.
- the respiration sensor is wrapped around the user's abdomen and chest to detect pressure fluctuations corresponding to the respiration.
- the tilt sensor measures the body position based on the tilt of each part of the body.
- The biological sensors 11a provided on furniture or the floor include thermography devices, body motion meters, respirometers, pulse meters, and the like.
- the biological sensor 11 installed on a sofa or a bed extracts a pulse, a breath, and a body motion based on a pressure change pattern by a human body transmitted through an elastic body of the sofa or the bed.
- a temperature distribution of a human body is measured by an infrared sensor.
- The biometric sensors 11c that collect images and sound of the subject include a video camera and a microphone. From the images captured by the video camera, human movements, changes in facial expression, and eye movements can be measured; the microphone collects the human voice.
- These biological sensors 11 transmit the measured biological information to the mobile terminal 10 via infrared or other wireless links.
- The environmental information sensor 12 is a sensor that measures environmental information around the subject.
- the environmental information sensor 12 includes a light meter, a gas sensor, a thermometer, a barometer, an altimeter, a GPS (Global Positioning System), and the like.
- the light meter measures the brightness around the subject
- the gas sensor measures the odor.
- GPS measures the latitude and longitude of the position where the subject is located based on radio waves from satellites.
- the mobile terminal 10 can also acquire environmental information via the network 100.
- Environmental information obtained via the network 100 includes weather, moon age, snowfall, rainfall, air pollution, wind speed, and the like.
- the communication interface 16 transmits the biological information measured by the biological sensor 11 and the environmental information measured by the environmental information sensor 12 to the image display device 20.
- the transmission processing of the communication interface 16 is controlled by the CPU 15.
- the CPU 15 executes a program for transmitting the biological information and the environmental information in the background, and outputs an instruction signal to the communication interface 16 by using a timer value as a trigger.
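The timer-triggered background transmission described above might be sketched as follows. This is an illustrative Python sketch only: the sensor readings, field names, subject ID, and tick count are invented for illustration and are not part of the patent.

```python
import json
import time

def read_biological():           # stand-in for biological sensor 11
    return {"pulse": 72, "skin_temp_c": 36.4}

def read_environmental():        # stand-in for environmental sensor 12
    return {"lux": 320, "temp_c": 24.1}

def make_sample(subject_id, now=None):
    """Bundle one timer tick's measurements into a transmittable record."""
    return {
        "id": subject_id,
        "timestamp": now if now is not None else time.time(),
        "biological": read_biological(),
        "environmental": read_environmental(),
    }

def run(transmit, subject_id="A", interval_s=5.0, ticks=3):
    """Timer-driven loop: each expiry of the timer triggers one send."""
    for _ in range(ticks):
        transmit(json.dumps(make_sample(subject_id, now=0)))
        # time.sleep(interval_s)  # real terminal would wait; omitted here

sent = []
run(sent.append)
print(len(sent))  # three samples handed to the communication interface
```

In a real terminal, `transmit` would hand each record to the communication interface 16 rather than append it to a list.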
- the image display device 20 generates an image expressing the state of the subject based on the biological information and the environmental information received from the mobile terminal 10.
- the image display device 20 may be a device having a display screen and an information processing unit, such as a television, a mobile phone, or a personal computer, or may be a dedicated device.
- FIG. 3 shows the internal configuration of the image display device 20.
- The image display device 20 includes an input unit 21 for receiving key input from the user, an audio output unit 22 for outputting audio, a display unit 23 for displaying images, a ROM 24 for recording programs and setting information, a RAM 25 serving as a work area for the CPU 29, a device driver 26 for reading information recorded on a recording medium, a communication interface 27 for performing data communication according to a predetermined communication protocol, and an image storage unit 28 for storing images. These blocks are connected via a bus 200.
- The CPU 29 estimates the rough emotion and movement of the subject based on the biological information, and generates an image expressing the state of the subject.
- the generated image is an abstraction and symbolization of the subject's state. That is, the image generated by the image display device 20 expresses rough emotions and actions of the subject, but does not realistically describe the subject.
- the present invention is characterized in that a vague image of a person to be measured is generated and displayed casually.
- the image storage unit 28 stores a program for generating an image from biological information and environmental information, and a plurality of fish images expressing the state of the subject.
- Examples of the image of the fish include an image of the fish feeding, an image of the fish swimming vigorously, and an image of the sleeping fish.
- the image storage unit 28 stores a plurality of background images. Background images include clean water, muddy water, fast flowing water, and underwater at night.
- the CPU 29 also estimates the state of the person to be measured based on biological information and environmental information, and selects an image of a fish expressing this state. A method for estimating the state will be described.
- the state of the subject includes emotions such as anger and grief, sensations such as pleasure, and movements such as eating, moving, and sleeping.
- Human motion can be inferred, for example, based on images taken by a video camera.
- the video camera is installed at the head of the person to be measured or at one corner of the room.
- the CPU 15 can estimate an object existing around the subject and an operation of the subject based on an image captured by the video camera.
- The movement of the subject can also be estimated from the position of the subject. For example, a subject is more likely to be working when in an office, ill when in a hospital, exercising when in a gym, and eating when in a restaurant. Since it is difficult to identify the action from position information alone, the CPU 15 estimates the action of the subject by combining the position with the subject's biological and environmental information.
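A minimal sketch of this kind of combined estimation, assuming an invented location table, heart-rate thresholds, and action labels (none of which appear in the patent):

```python
# Location (environmental information) gives a prior guess, which a
# biological cue (heart rate) then refines.
LOCATION_PRIOR = {
    "office": "working",
    "hospital": "ill",
    "gym": "exercising",
    "restaurant": "eating",
}

def estimate_action(location, heart_rate):
    guess = LOCATION_PRIOR.get(location, "unknown")
    # Position alone is ambiguous: a high heart rate in the office
    # suggests exertion rather than desk work.
    if guess == "working" and heart_rate > 120:
        guess = "exercising"
    if guess == "unknown" and heart_rate < 55:
        guess = "sleeping"
    return guess

print(estimate_action("gym", 130))    # exercising
print(estimate_action("office", 70))  # working
```

A real system would fold in many more cues (voice, motion, time of day) in the same rule-refinement style.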
- the motion of the subject can be estimated from the voice around the subject. In this case, it is possible to guess what is present in the surroundings from the sound quality and pitch of the collected sound, and to guess the movement of the subject by text mining the human voice.
- the CPU 29 estimates the emotion, sensation, and movement of the subject. Then, the CPU 29 generates an image according to the state of the subject.
- As image generation methods, there is a method of selecting an image stored in the image storage unit 28 according to a table, stored in the ROM 24, that associates states of the subject with images, and a method of generating an object that outputs an image in response to an input state such as an emotion, sensation, or movement. With either method, the generated image evokes the state of the subject.
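The table-lookup method could be sketched like this; the state names follow the fish images mentioned earlier, but the file names and the fallback image are assumptions:

```python
# Mapping from estimated subject state to a stored fish image.
STATE_TO_IMAGE = {
    "eating":   "fish_feeding.png",
    "moving":   "fish_swimming.png",
    "sleeping": "fish_sleeping.png",
}

def select_image(state, default="fish_idle.png"):
    # Unknown states fall back to a neutral image so something is
    # always shown on the display unit.
    return STATE_TO_IMAGE.get(state, default)

print(select_image("eating"))  # fish_feeding.png
```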
- the CPU 29 controls fine movements such as the movement of fish, the movement of water, the number and size of bubbles in water, and the degree of turbidity of water.
- the control of the movement may include an element that is not correlated with the biological information or the environmental information.
- the CPU 29 may automatically generate an image related to biological information or environmental information. The automatic generation of an image is performed when the subject refuses to measure the biological information, when the biological information cannot be measured, or when the biological information and the environmental information cannot be received due to a transmission path failure.
- the CPU 29 displays the image on the display unit 23.
- The display unit 23 displays an image representing the state of the subject, but this image expresses the subject's state vaguely rather than realistically. Because the image is abstract, it does not deeply intrude on the subject's privacy even if it is always displayed, and it does not make long-term observation a burden. If the image depicts a living thing or a landscape, the device can be placed on a living-room wall or on furniture like a photo stand or painting. Further, when the image is displayed on the mobile terminal 10, the state of the subject can be observed at any time.
- The image displayed on the display unit 23 is not limited to fish; it may show other animals such as dogs and cats, or computer graphics, and sounds sampled or synthesized by the audio output unit 22 may be output at the same time.
- the display image can be selected.
- the image storage unit 28 stores various images and sounds.
- the image display device 20 can change the display content according to the selection of the display image by the user.
- FIG. 4 shows an example of a screen 23a displaying a fish and a screen 23b displaying a cat. These images have different display contents, but are generated based on the same biological information and environmental information.
- the displayed image may be a realistic image like a photo or a deformed image like an animation. In any case, any image may be used as long as it is easy to recall the state of the subject.
- The display images may be recorded in the image storage unit 28 in advance, or may be obtained from the outside via the network 100 or a recording medium. If display images not stored in the image storage unit 28 are acquired from outside, the variety of displayable images increases, and a business of selling images can be developed.
- the image display system 1 notifies the user located at a remote place of the state of the subject.
- The image indicating the state of the subject is a vague image, and rough changes in the subject can be reported without causing discomfort to the subject or the viewer.
- This image display system 1 can be used for watching over elderly people living in remote places or children who are far from home. Further, in the image display system 1, a change in a person's state can be represented by a generated image instead of an actual image.
- This image display system expresses the situation of the person to be measured in an image and notifies a user who is at a remote place, but this image becomes an ornamental image and is used for entertainment.
- In the example of FIG. 5, the biological and environmental information of the subjects is accumulated in a server 30 from a plurality of mobile terminals 10.
- the server 30 multiplexes the biological information and the environmental information transmitted from the mobile terminal 10 and transfers the multiplexed information to the image display device 20.
- the server 30 is not required when information on the subject is accumulated on one image display device 20.
- the image display device 20 reflects the relationship between the subjects on the image.
- the relationship of the subject includes a positional relationship, a data synchronization relationship, a relationship of the subject's environment, and a relationship of the subject's emotions.
- Position information is measured by GPS. FIG. 6 shows an example in which the positional relationship is reflected in an image.
- the image display device 20 reflects the positional relationship on the change of the image according to a predetermined rule. These rules are stored in the form of a program in the ROM 24 or the RAM 25 of the image display device 20.
- the CPU 29 reads the program and changes the image.
- Each subject is associated with an object, here a jellyfish, and the objects of subjects who are close to each other are displayed close together.
- the size of the object is determined from the distance between the subject and the image display device 20.
- the object of the subject existing at a position close to the image display device 20 is displayed large and the object of the subject existing at a position far from the image display device 20 is displayed small.
- The jellyfish corresponding to subjects A and B are arranged at close positions, and the jellyfish corresponding to subject C is arranged at a remote position.
- The jellyfish corresponding to subjects A and B are displayed large, and the jellyfish corresponding to subject C is displayed small.
- the image display device 20 applies the position information of the subject to a simple rule to reflect the relationship between the subjects and the relationship between the subject and the image display device 20 on the image.
- the spontaneous movement of the subject is an independent event, but a relationship is created by applying the rules.
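One possible form of such a positional rule is sketched below; the coordinates, the inverse-distance size formula, and its constants are illustrative assumptions, not from the patent:

```python
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def object_size(subject_pos, display_pos, max_size=100.0, scale=0.01):
    """Farther subjects get smaller jellyfish (inverse-distance rule)."""
    d = distance(subject_pos, display_pos)
    return max_size / (1.0 + scale * d)

display = (0.0, 0.0)
near, far = (100.0, 0.0), (5000.0, 0.0)
# Nearby subjects are rendered larger than distant ones.
print(object_size(near, display) > object_size(far, display))  # True
```

The same distances can drive on-screen placement, so that independent movements of the subjects acquire a visible relationship through the rule.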
- the image display device 20 reflects the data synchronization relationship in image generation.
- the data referred to is data generated or changed in a predetermined cycle such as breathing, heartbeat, walking rhythm, and motion.
- When such data happen to be synchronized, the image display device 20 generates an image indicating that the data are synchronized.
- The left part of FIG. 7 shows that the heart rates of subjects D and E are synchronized, and the right part of FIG. 7 shows the jellyfish corresponding to subjects D and E dancing in coordination.
- Here, a rule that takes the heart rates as input and generates a dancing image is used.
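A simple way such synchrony could be detected is sketched below; the tolerance and window length are invented thresholds, not values from the patent:

```python
def synchronized(rates_a, rates_b, tolerance=3, window=5):
    """Two heart-rate series count as synchronized when their most
    recent samples stay within `tolerance` bpm of each other."""
    a, b = rates_a[-window:], rates_b[-window:]
    if len(a) != len(b) or not a:
        return False
    return all(abs(x - y) <= tolerance for x, y in zip(a, b))

d = [70, 71, 72, 72, 71]
e = [71, 71, 73, 72, 70]
f = [88, 92, 95, 90, 93]
print(synchronized(d, e))  # True  -> show the jellyfish dancing together
print(synchronized(d, f))  # False
```

A detector like this is what would trigger the switch to the "dancing" image for the matching subjects.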
- the environmental relationship is, for example, a difference in environmental information.
- Environmental information such as brightness, temperature, atmospheric pressure, altitude, and weather can be measured by the environmental information sensor 12.
- For example, the image display device 20 makes the displayed jellyfish more active as the temperature difference between the subjects' surroundings increases, and displays large swelling waves or turbid water when the weather at a subject's location is bad.
- the image display device 20 estimates the emotion and mood of the subject from the biological information and environmental information of the subject, and reflects the relationship between the emotions of the subject on the image. The method of inferring emotions is as described above.
- FIG. 8 is a schematic diagram in which the feeling of pleasure is reflected in the image.
- The image display device 20 classifies the subjects into groups according to whether each subject is in a pleasant or unpleasant mood.
- The jellyfish corresponding to subjects belonging to the pleasant group are made to move together or approach one another.
- The jellyfish corresponding to subjects belonging to the unpleasant group are made hostile: they are separated from each other or made aggressive.
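The grouping rule might be sketched like this; the mood labels and behavior names are illustrative assumptions:

```python
def group_by_mood(moods):
    """moods: dict mapping subject -> 'pleasant' | 'unpleasant'."""
    groups = {"pleasant": [], "unpleasant": []}
    for subject, mood in moods.items():
        groups[mood].append(subject)
    return groups

def behavior_for(group_name):
    # Pleasant-group jellyfish cluster together; unpleasant ones repel.
    return "approach_each_other" if group_name == "pleasant" else "keep_apart"

moods = {"A": "pleasant", "B": "pleasant", "C": "unpleasant"}
groups = group_by_mood(moods)
print(groups["pleasant"])          # ['A', 'B']
print(behavior_for("unpleasant"))  # keep_apart
```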
- When the image display device 20 receives data of a plurality of subjects, it generates an image from the relationships between the subjects.
- the data input here is biological information and environmental information of the subject.
- the image display device 20 stores a program for obtaining the relationship between the subjects from the input data and reflecting the relationship on the image.
- the relationship between the input data and the method of reflecting on the image are not particularly limited.
- the present invention proposes a process of obtaining a relationship from data and reflecting the relationship in an image. Next, a method of transmitting data of a plurality of subjects will be described.
- the server 30 multiplexes the data transmitted from the mobile terminal 10.
- the multiplexed information is composed of a plurality of packets 40 as shown in FIG.
- Each packet 40 includes a communication header 41 and a data part 42.
- the communication header 41 stores communication control information such as addresses of a transmission destination and a transmission source.
- the data section 42 includes a data header 43 and a data storage section 44.
- the data header 43 includes an ID 45 of a subject, a time stamp 46 for time synchronization, an information category code 47 indicating a category of information, and an information type code 48 indicating a type of information.
- the information category code 47 is a code indicating whether the data stored in the data storage unit 44 is biological information or environmental information. This code can be extended if new information categories are measured in the future.
- the information type code 48 is a code indicating what the information stored in the data storage unit 44 is.
- the data storage unit 44 stores the actual value of the item indicated by the information type code 48. For example, if the information type code 48 is "pulse”, a numerical value such as "72" is entered, and if the information type code 48 is "weather”, a numerical value or a character string representing "sunny” is entered. This field has a variable length, and a numerical value indicating the length of the data storage unit itself is placed at the head of the data storage unit 44.
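The data part described above might be serialized as follows; the field widths, code values, and byte order are assumptions for illustration only, since the patent does not fix them:

```python
import struct

CATEGORY_BIO, CATEGORY_ENV = 1, 2   # hypothetical category codes

def pack_data_part(subject_id, timestamp, category, type_code, payload):
    """Data header (subject ID, time stamp, category code, type code)
    followed by a variable-length field whose length is at its head."""
    body = payload.encode("utf-8")
    return struct.pack(">IIBBH", subject_id, timestamp,
                       category, type_code, len(body)) + body

def unpack_data_part(buf):
    subject_id, ts, cat, typ, n = struct.unpack_from(">IIBBH", buf)
    body = buf[struct.calcsize(">IIBBH"):][:n].decode("utf-8")
    return {"id": subject_id, "timestamp": ts,
            "category": cat, "type": typ, "value": body}

pkt = pack_data_part(7, 1000, CATEGORY_BIO, 3, "72")  # e.g. pulse = 72
print(unpack_data_part(pkt)["value"])  # 72
```

Placing the length at the head of the variable-length field, as the text describes, is what lets a receiver walk a stream of such data parts without fixed record sizes.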
- the image display device 20 demultiplexes the multiplexed data based on the ID of the subject. Then, each subject's information is arranged in chronological order based on the time stamp.
- the image display device 20 generates an image indicating the state of each subject and causes the generated image to be displayed on the display unit.
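The separation step on the image display device 20 side can be sketched as below: records are grouped per subject ID and each subject's stream is then sorted by time stamp. The dict representation of a record (with `id` and `ts` keys) is an assumption for illustration.

```python
from collections import defaultdict

def demultiplex(records):
    """Group multiplexed records by subject ID, then sort each
    subject's stream chronologically by time stamp."""
    per_subject = defaultdict(list)
    for rec in records:
        per_subject[rec["id"]].append(rec)
    for recs in per_subject.values():
        recs.sort(key=lambda r: r["ts"])
    return dict(per_subject)

# Records from two subjects arrive interleaved and out of order:
mixed = [{"id": 2, "ts": 5}, {"id": 1, "ts": 3}, {"id": 1, "ts": 1}]
streams = demultiplex(mixed)
# streams[1] is now in chronological order (ts 1, then ts 3)
```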
- as an example of image generation, suppose that three subjects carry mobile terminals 10a, 10b, and 10c, respectively, and that the three mobile terminals 10a, 10b, and 10c transmit biological information and environmental information to the server 30.
- the server 30 multiplexes the received biological information and environmental information and transmits the result to the image display device 20.
- the image display device 20 generates images indicating the states of three persons to be measured, and displays the generated images on one display unit 23.
- instead of arranging the server 30 as a relay device for data transmission, the mobile terminal 10 and the image display device 20 may exchange data directly, without the server 30.
- the configuration of the packet and the code of the data identifier generated by the server 30 are not limited to those described above.
- each subject is represented here by a pseudo-creature, but a single object or living thing may instead be represented based on the biological information and environmental information of the several subjects. For example, the flight of one airplane can be represented by assigning the movements of its engine, tail wing, and main wing to different subjects. Alternatively, a single ball could be expressed, with its color, size, and the way it bounces conveying the subjects' states.
- the viewer of the image and the person to be measured interact with each other.
- the viewer stimulates the skin sensation (particularly, tactile sensation) of the subject.
- the biological information of the subject changes due to the stimulus, and the change also changes the image showing the subject.
- a feedback loop is formed between the subject and the viewer.
- a stimulus presentation device 90 for stimulating the skin is attached to the subject.
- FIG. 10 shows the mounting position of the stimulus presentation device 90.
- the stimulus presentation device 90 is attached to one or more points on the body of the subject.
- the stimulus presentation device 90 converts a tactile signal input from a remote place into a physical stimulus.
- the tactile signal is input from the mobile terminal 10.
- the portable terminal 10 and the stimulus presentation device 90 are connected by wire or wirelessly.
- the stimulus presentation device 90 presents a tactile stimulus to the subject using at least one of vibration by an actuator (motor) and electrical stimulation such as that used in a low-frequency therapy device. Note that the stimulus can also be presented by the vibration function of the portable terminal 10, without providing a dedicated stimulus presentation device 90.
- the display panel 23 of the image display device 20 is provided with a touch panel 91.
- the touch panel 91 senses an input based on a change in resistance or capacitance, and may use a piezoelectric element such as PVDF (polyvinylidene fluoride).
- the subject is associated with an object (here, jellyfish).
- the display unit 23 displays the same number of jellyfish as there are subjects. Each jellyfish is drawn on its own layer, and by overlaying the layers, multiple jellyfish are displayed on one display unit. As shown in FIG. 11, when the user touches the touch panel 91, the image display device 20 obtains the coordinates P of the touched point.
- the image display device 20 compares the coordinates of each object with the point P and determines whether the point P is contained in the object. This determination is made sequentially, starting from the innermost layer. After comparing up to the foremost layer, the image display device 20 treats the object last determined to contain the point P as the touched object.
- when the touched object is rotating, the image display device 20 normalizes its orientation. Having identified the touched object, the image display device 20 then determines which part of the object the user touched. To make this determination, the object is divided into areas.
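The back-to-front hit test described above can be sketched as follows: layers are checked from the innermost to the foremost, and the last object found to contain the point P wins, so frontmost objects take precedence. Axis-aligned bounding boxes stand in for the actual object outlines here, which is an illustrative simplification.

```python
def contains(obj, p):
    """Point-in-object test using an axis-aligned bounding box."""
    (x, y), (x0, y0, x1, y1) = p, obj["bbox"]
    return x0 <= x <= x1 and y0 <= y <= y1

def hit_test(layers, p):
    """layers[0] is the innermost (rearmost) layer.
    The last (frontmost) object containing p is returned."""
    touched = None
    for layer in layers:            # iterate rear to front
        for obj in layer:
            if contains(obj, p):
                touched = obj       # a later hit overrides an earlier one
    return touched

# Two overlapping jellyfish on separate layers; b is in front of a:
layers = [[{"name": "jellyfish_a", "bbox": (0, 0, 100, 100)}],
          [{"name": "jellyfish_b", "bbox": (50, 50, 150, 150)}]]
# A touch at (60, 60) falls inside both, so the frontmost (b) is selected.
```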
- the correspondence table 92 shown in FIG. 12 records the coordinates of the area and the name of the corresponding part that gives the tactile stimulus.
- Fig. 13 schematically shows the process from when the viewer touches the screen to when the area is specified. Area a corresponds to the jellyfish's head, area b to its chest, area c to its abdomen, and area d to its feet. In this example, the point P touched by the viewer is included in area d, that is, the upper part of the feet.
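A minimal sketch of the correspondence table 92 and the area lookup: each entry pairs an area's coordinates with the body part that receives the tactile stimulus. The rectangle coordinates (in a normalized object space) and the exact table layout are assumptions for illustration.

```python
# Assumed form of the correspondence table 92: area coordinates plus the
# name of the corresponding part that receives the tactile stimulus.
CORRESPONDENCE_TABLE = [
    {"area": "a", "rect": (0.0, 0.0, 1.0, 0.2), "part": "head"},
    {"area": "b", "rect": (0.0, 0.2, 1.0, 0.4), "part": "chest"},
    {"area": "c", "rect": (0.0, 0.4, 1.0, 0.6), "part": "abdomen"},
    {"area": "d", "rect": (0.0, 0.6, 1.0, 1.0), "part": "feet"},
]

def lookup_part(p):
    """Return the part name whose area contains the (normalized) point p."""
    x, y = p
    for row in CORRESPONDENCE_TABLE:
        x0, y0, x1, y1 = row["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return row["part"]
    return None   # the touch fell outside every recorded area

# A touch low on the object maps to the jellyfish's feet.
part = lookup_part((0.5, 0.7))
```

Because the table travels with the image data, the same lookup works unchanged when the displayed creature is swapped for another.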
- the correspondence table 92 is preferably managed together with the image data displayed on the image display device 20 equipped with the touch panel 91.
- when the image data is downloaded, the correspondence table is also downloaded at the same time.
- each part that receives a tactile stimulus is given a unique code, and the code for a given part remains the same whether the displayed image is a jellyfish or a cat.
- the head is defined as 0x01
- the chest is defined as 0x02
- each code of a corresponding part is a power of two (2^n), so codes can be combined, making it possible to drive multiple tactile stimulation devices at the same time.
- there are only five types of corresponding parts here, but more can be added as long as the codes do not overlap. Since each code must be unique, a new code must not duplicate one that is already in use.
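The power-of-two coding above allows several part codes to be combined into one command, e.g. by bitwise OR. Only the head (0x01) and chest (0x02) codes are given in the text; the remaining three values below are assumed for illustration.

```python
# Part codes: each is a power of two so they can be OR-ed together.
# Head and chest come from the text; the other three are assumptions.
HEAD, CHEST, ABDOMEN, FOOT_UPPER, FOOT_LOWER = 0x01, 0x02, 0x04, 0x08, 0x10
ALL_PARTS = (HEAD, CHEST, ABDOMEN, FOOT_UPPER, FOOT_LOWER)

def combine(parts):
    """OR several part codes into a single combined code."""
    code = 0
    for p in parts:
        code |= p
    return code

def decode(code):
    """Recover the individual part codes from a combined code."""
    return [p for p in ALL_PARTS if code & p]

# Address the head and the upper foot in one command:
cmd = combine([HEAD, FOOT_UPPER])   # 0x01 | 0x08 == 0x09
```

Since no two distinct subsets of powers of two OR to the same value, the receiving side can always recover exactly which devices to drive.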
- FIG. 14 schematically shows transmission of a tactile sensation from the image display device 20 to the stimulus presentation device 90.
- the stimulus type is a code that represents the type of stimulus such as the vibration pattern of the actuator or electrical stimulation.
- by changing the vibration frequency of the actuator, the vibration rhythm, and the pattern of electrical stimulation, stimulation patterns other than a single fixed pattern can be presented.
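A tactile command traveling from the image display device 20 toward the stimulus presentation device 90 could carry the subject ID, the part code, the stimulus type, and a pattern selector, for example as below. The message layout and field widths are hypothetical; the patent does not specify the wire format.

```python
import struct

# Hypothetical tactile command:
#   subject ID    : 2 bytes
#   part code     : 1 byte (power-of-two codes, combinable)
#   stimulus type : 1 byte (e.g. 0x01 = actuator vibration, 0x02 = electrical)
#   pattern index : 1 byte (selects frequency/rhythm pattern)

def build_tactile_command(subject_id, part_code, stim_type, pattern) -> bytes:
    return struct.pack(">HBBB", subject_id, part_code, stim_type, pattern)

def parse_tactile_command(buf: bytes):
    return struct.unpack(">HBBB", buf)

# "Vibrate the upper foot of subject 1 with pattern 3":
cmd = build_tactile_command(1, 0x08, 0x01, 3)
```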
- the image display system 3 may be configured to distribute biological information and environmental information of a specific subject to an unspecified number of image display devices 20.
- the image display system 3 comprises a biological information measuring device 50 for measuring the biological information and environmental information of the subject, a server 60 for transmitting the biological information and environmental information to the many image display devices 20, and the image display devices 20 for generating images based on the biological information and environmental information.
- the biological information measuring device 50 has substantially the same configuration as the portable terminal 10 described above.
- since the image display system 3 distributes an individual's private information to a large number of people, it is desirable that the biological information measuring device 50 not be carried around like a mobile terminal but instead be installed in a space with a high degree of publicity.
- the system may also be used to distribute the biological information and environmental information of musicians and athletes. By distributing the biological and environmental information of athletes, for example, a viewer who cannot watch the actual game, such as while at work, can still grasp the general flow of the game and observe changes in the athletes' emotions.
- an image display device 20 that generates an image based on previously measured biological information and environmental information, rather than reproducing the measured biological information and environmental information in real time, will be described.
- the already measured biological information and environmental information are recorded on the biological information storage server 70 on the network 100, or on a recording medium 80 such as a CD-ROM or a semiconductor memory.
- the image display device 20 generates an image based on these biological information and environmental information. This allows the user to enjoy the same video over and over again.
- biological information and environmental information can also be measured in advance and displayed later, when there is enough time. For example, the biological information of a deceased person can be recorded while they are alive, and their state during life can then be reproduced as appropriate with a symbolic image.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Biophysics (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Psychiatry (AREA)
- Developmental Disabilities (AREA)
- Educational Technology (AREA)
- Hospice & Palliative Care (AREA)
- Psychology (AREA)
- Child & Adolescent Psychology (AREA)
- Social Psychology (AREA)
- Databases & Information Systems (AREA)
- Data Mining & Analysis (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2004264761A AU2004264761A1 (en) | 2003-08-19 | 2004-08-13 | Image display system, image display device, image display method |
| EP04771674A EP1656880A4 (fr) | 2003-08-19 | 2004-08-13 | Systeme, dispositif et procede d'affichage d'images |
| HK07102749.3A HK1095506B (en) | 2003-08-19 | 2004-08-13 | Image display system, image display device, image display method |
| US10/567,930 US7731656B2 (en) | 2003-08-19 | 2004-08-13 | Image display system, image display device, image display method |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2003295482 | 2003-08-19 | ||
| JP2003-295482 | 2003-08-19 | ||
| JP2004136918A JP3931889B2 (ja) | 2003-08-19 | 2004-04-30 | 画像表示システム、画像表示装置、画像表示方法 |
| JP2004-136918 | 2004-04-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2005016137A1 true WO2005016137A1 (fr) | 2005-02-24 |
Family
ID=34197166
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2004/011707 Ceased WO2005016137A1 (fr) | 2003-08-19 | 2004-08-13 | Systeme, dispositif et procede d'affichage d'images |
Country Status (7)
| Country | Link |
|---|---|
| US (1) | US7731656B2 (fr) |
| EP (3) | EP2298156A1 (fr) |
| JP (1) | JP3931889B2 (fr) |
| KR (1) | KR101072561B1 (fr) |
| AU (1) | AU2004264761A1 (fr) |
| SG (1) | SG142298A1 (fr) |
| WO (1) | WO2005016137A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1912558A4 (fr) * | 2005-07-26 | 2010-01-20 | Vivometrics Inc | Interfaces informatiques comportant des avatars guides physiologiquement |
| JP2018018492A (ja) * | 2016-07-15 | 2018-02-01 | パナソニックIpマネジメント株式会社 | コンテンツ提示のための情報処理装置、情報処理装置の制御方法、及び制御プログラム |
| JP2019198531A (ja) * | 2018-05-17 | 2019-11-21 | Cyberdyne株式会社 | 生体情報計測装置及び生体情報計測方法 |
Families Citing this family (73)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2006314078A (ja) * | 2005-04-06 | 2006-11-16 | Sony Corp | 撮像装置、音声記録装置および音声記録方法 |
| US8083589B1 (en) * | 2005-04-15 | 2011-12-27 | Reference, LLC | Capture and utilization of real-world data for use in gaming systems such as video games |
| US20060277466A1 (en) * | 2005-05-13 | 2006-12-07 | Anderson Thomas G | Bimodal user interaction with a simulated object |
| JP4640953B2 (ja) * | 2005-05-30 | 2011-03-02 | 日本電信電話株式会社 | 心理状態分析装置 |
| JP3762966B1 (ja) * | 2005-07-20 | 2006-04-05 | クオリティ株式会社 | 体調管理用携帯端末装置および体調管理プログラム |
| WO2007030275A2 (fr) | 2005-09-02 | 2007-03-15 | Emsense Corporation | Dispositif et procede pour detecter une activite electrique dans des tissus |
| JP5388580B2 (ja) * | 2005-11-29 | 2014-01-15 | ベンチャー ゲイン リミテッド ライアビリティー カンパニー | ヒトの健康に関する残差ベースの管理 |
| KR100759806B1 (ko) * | 2005-12-08 | 2007-09-20 | 한국전자통신연구원 | 열 스트레스 관리 시스템 및 이를 이용한 열 스트레스 관리 방법 |
| JP4615474B2 (ja) * | 2006-04-07 | 2011-01-19 | 株式会社エヌ・ティ・ティ・ドコモ | 通信端末、ユーザデータ移動システム及びユーザデータ移動方法 |
| JP5092357B2 (ja) * | 2006-11-07 | 2012-12-05 | ソニー株式会社 | 撮像表示装置、撮像表示方法 |
| JP5023663B2 (ja) | 2006-11-07 | 2012-09-12 | ソニー株式会社 | 撮像装置、撮像方法 |
| KR100948050B1 (ko) * | 2006-11-23 | 2010-03-19 | 주식회사 메디슨 | 휴대용 초음파 시스템 |
| JP4961984B2 (ja) * | 2006-12-07 | 2012-06-27 | ソニー株式会社 | 画像表示システム、表示装置、表示方法 |
| US8652040B2 (en) | 2006-12-19 | 2014-02-18 | Valencell, Inc. | Telemetric apparatus for health and environmental monitoring |
| US8157730B2 (en) * | 2006-12-19 | 2012-04-17 | Valencell, Inc. | Physiological and environmental monitoring systems and methods |
| US8230457B2 (en) | 2007-03-07 | 2012-07-24 | The Nielsen Company (Us), Llc. | Method and system for using coherence of biological responses as a measure of performance of a media |
| US9215996B2 (en) | 2007-03-02 | 2015-12-22 | The Nielsen Company (Us), Llc | Apparatus and method for objectively determining human response to media |
| US8473044B2 (en) | 2007-03-07 | 2013-06-25 | The Nielsen Company (Us), Llc | Method and system for measuring and ranking a positive or negative response to audiovisual or interactive media, products or activities using physiological signals |
| US8782681B2 (en) | 2007-03-08 | 2014-07-15 | The Nielsen Company (Us), Llc | Method and system for rating media and events in media based on physiological data |
| US8764652B2 (en) * | 2007-03-08 | 2014-07-01 | The Nielson Company (US), LLC. | Method and system for measuring and ranking an “engagement” response to audiovisual or interactive media, products, or activities using physiological signals |
| US8926432B2 (en) * | 2007-03-12 | 2015-01-06 | Performance Designed Products Llc | Feedback controller |
| US20090233710A1 (en) * | 2007-03-12 | 2009-09-17 | Roberts Thomas J | Feedback gaming peripheral |
| JP4367663B2 (ja) * | 2007-04-10 | 2009-11-18 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
| JP4506795B2 (ja) | 2007-08-06 | 2010-07-21 | ソニー株式会社 | 生体運動情報表示処理装置、生体運動情報処理システム |
| US8221290B2 (en) | 2007-08-17 | 2012-07-17 | Adidas International Marketing B.V. | Sports electronic training system with electronic gaming features, and applications thereof |
| US8702430B2 (en) | 2007-08-17 | 2014-04-22 | Adidas International Marketing B.V. | Sports electronic training system, and applications thereof |
| US8251903B2 (en) | 2007-10-25 | 2012-08-28 | Valencell, Inc. | Noninvasive physiological analysis using excitation-sensor modules and related devices and methods |
| US8072432B2 (en) * | 2008-01-15 | 2011-12-06 | Sony Ericsson Mobile Communications Ab | Image sense tags for digital images |
| US20100152620A1 (en) * | 2008-12-12 | 2010-06-17 | Immersion Corporation | Method and Apparatus for Providing A Haptic Monitoring System Using Multiple Sensors |
| US9727139B2 (en) | 2008-12-12 | 2017-08-08 | Immersion Corporation | Method and apparatus for providing a haptic monitoring system using multiple sensors |
| US8788002B2 (en) | 2009-02-25 | 2014-07-22 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
| US8700111B2 (en) | 2009-02-25 | 2014-04-15 | Valencell, Inc. | Light-guiding devices and monitoring devices incorporating same |
| US9750462B2 (en) | 2009-02-25 | 2017-09-05 | Valencell, Inc. | Monitoring apparatus and methods for measuring physiological and/or environmental conditions |
| JP5493455B2 (ja) * | 2009-05-01 | 2014-05-14 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
| JP5493456B2 (ja) * | 2009-05-01 | 2014-05-14 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム |
| WO2010136913A1 (fr) | 2009-05-28 | 2010-12-02 | Koninklijke Philips Electronics, N.V. | Appareil et procédés d'agencement d'éléments multimédia dans un espace physique sur la base de profils personnels |
| US20110045736A1 (en) * | 2009-08-20 | 2011-02-24 | Charles Randy Wooten | Effect Generating Device in Response to User Actions |
| US20110140915A1 (en) * | 2009-10-14 | 2011-06-16 | Energy Focus, Inc. | Distributed Personalized Energy and Carbon Accounting and Feedback System |
| WO2011076243A1 (fr) * | 2009-12-21 | 2011-06-30 | Fundacion Fatronik | Système et méthode de supervision du bien-être affectif |
| HUP1000229A2 (en) * | 2010-04-27 | 2012-11-28 | Christian Berger | Equipment for registering the excitement level as well as determining and recording the excitement condition of human beings |
| JP5714411B2 (ja) * | 2010-05-17 | 2015-05-07 | 株式会社光吉研究所 | 行動分析方法および行動分析装置 |
| GB201009379D0 (en) * | 2010-06-04 | 2010-07-21 | Univ Edinburgh | Method, apparatus, computer program and system for measuring oscillatory motion |
| KR101890717B1 (ko) * | 2010-07-20 | 2018-08-23 | 삼성전자주식회사 | 생체 정보를 활용한 가상 세계 조작 장치 및 방법 |
| JP5996542B2 (ja) | 2010-10-07 | 2016-09-21 | フォルシア・オートモーティブ・シーティング・リミテッド・ライアビリティ・カンパニーFaurecia Automotive Seating, Llc | 座席構造および環境構成を向上させるために乗員の身体についての詳細を取得、解析、および使用するシステム、方法、および構成要素 |
| US8888701B2 (en) | 2011-01-27 | 2014-11-18 | Valencell, Inc. | Apparatus and methods for monitoring physiological data during environmental interference |
| JP5747645B2 (ja) * | 2011-05-09 | 2015-07-15 | 株式会社ニコン | 電子装置、信号振り分け方法およびプログラム |
| US9427191B2 (en) | 2011-07-25 | 2016-08-30 | Valencell, Inc. | Apparatus and methods for estimating time-state physiological parameters |
| EP4461222A3 (fr) | 2011-08-02 | 2025-01-22 | Yukka Magic LLC | Systèmes et procédés de réglage de filtre variable par rétroaction de mesure de fréquence cardiaque |
| CN103917993B (zh) * | 2011-11-09 | 2018-05-15 | 皇家飞利浦有限公司 | 使用生物传感器来经由数据网络服务分享情绪 |
| US9292858B2 (en) | 2012-02-27 | 2016-03-22 | The Nielsen Company (Us), Llc | Data collection system for aggregating biologically based measures in asynchronous geographically distributed public environments |
| US9451303B2 (en) | 2012-02-27 | 2016-09-20 | The Nielsen Company (Us), Llc | Method and system for gathering and computing an audience's neurologically-based reactions in a distributed framework involving remote storage and computing |
| US8989835B2 (en) | 2012-08-17 | 2015-03-24 | The Nielsen Company (Us), Llc | Systems and methods to gather and analyze electroencephalographic data |
| JP6041613B2 (ja) * | 2012-10-11 | 2016-12-14 | 株式会社Nttドコモ | コンテクスト情報記憶装置、コンテクスト情報記憶方法、及びコンテクスト情報記憶プログラム |
| WO2014116924A1 (fr) | 2013-01-28 | 2014-07-31 | Valencell, Inc. | Dispositifs de surveillance physiologique disposant d'éléments de détection découplés des mouvements du corps |
| CN103152385B (zh) * | 2013-01-29 | 2019-01-04 | 王玉娇 | 关联应用的触发、实现和执行方法及相关设备 |
| US9320450B2 (en) | 2013-03-14 | 2016-04-26 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
| KR101499858B1 (ko) * | 2013-12-20 | 2015-03-09 | 연세대학교 산학협력단 | 심리안정을 유도하는 표시장치 |
| US9622702B2 (en) | 2014-04-03 | 2017-04-18 | The Nielsen Company (Us), Llc | Methods and apparatus to gather and analyze electroencephalographic data |
| US20160029898A1 (en) | 2014-07-30 | 2016-02-04 | Valencell, Inc. | Physiological Monitoring Devices and Methods Using Optical Sensors |
| DE102014215211A1 (de) * | 2014-08-01 | 2016-02-04 | Art + Com Ag | Automatisches Erzeugen von visuellen Stimuli |
| WO2016022295A1 (fr) | 2014-08-06 | 2016-02-11 | Valencell, Inc. | Modules à capteurs physiologiques optiques avec réduction du bruit de signal |
| CN104320536B (zh) * | 2014-09-26 | 2016-06-15 | 来安县新元机电设备设计有限公司 | 一种隐私保护的方法及系统 |
| US9794653B2 (en) | 2014-09-27 | 2017-10-17 | Valencell, Inc. | Methods and apparatus for improving signal quality in wearable biometric monitoring devices |
| US10186014B2 (en) | 2015-01-06 | 2019-01-22 | Samsung Electronics Co., Ltd. | Information display method and electronic device for supporting the same |
| US10945618B2 (en) | 2015-10-23 | 2021-03-16 | Valencell, Inc. | Physiological monitoring devices and methods for noise reduction in physiological signals based on subject activity type |
| US10610158B2 (en) | 2015-10-23 | 2020-04-07 | Valencell, Inc. | Physiological monitoring devices and methods that identify subject activity type |
| CN105468144B (zh) * | 2015-11-17 | 2019-02-12 | 小米科技有限责任公司 | 智能设备控制方法及装置 |
| JP2017181593A (ja) * | 2016-03-28 | 2017-10-05 | 旭化成エレクトロニクス株式会社 | 表示装置 |
| JP6781979B2 (ja) * | 2016-05-20 | 2020-11-11 | 美貴子 隈元 | 心理状態評価用プログラム及び心理状態評価装置 |
| US10966662B2 (en) | 2016-07-08 | 2021-04-06 | Valencell, Inc. | Motion-dependent averaging for physiological metric estimating systems and methods |
| JP6409028B2 (ja) * | 2016-07-13 | 2018-10-17 | デザミス株式会社 | 牛の活動状態管理システム |
| CN107320114B (zh) * | 2017-06-29 | 2020-12-25 | 京东方科技集团股份有限公司 | 基于脑电波检测的拍摄处理方法、系统及其设备 |
| JP2019053460A (ja) * | 2017-09-14 | 2019-04-04 | 大日本印刷株式会社 | 色変更装置及びプログラム |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10234030A (ja) * | 1997-02-19 | 1998-09-02 | Anima Denshi Kk | 監視モニタシステム |
| WO2001052718A2 (fr) | 2000-01-19 | 2001-07-26 | Healthetech, Inc. | Dispositif de surveillance du regime et de l'activite |
| JP2002034936A (ja) * | 2000-07-24 | 2002-02-05 | Sharp Corp | 通信装置および通信方法 |
| JP2002314715A (ja) * | 2001-04-18 | 2002-10-25 | Noboru Akasaka | 緊急対応方法及び緊急対応システム |
| JP2004049309A (ja) * | 2002-07-16 | 2004-02-19 | National Trust:Kk | 被看護・被介護者の監視システム |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5956484A (en) * | 1995-12-13 | 1999-09-21 | Immersion Corporation | Method and apparatus for providing force feedback over a computer network |
| US5907291A (en) | 1997-06-05 | 1999-05-25 | Vsm Technology Inc. | Multi-patient monitoring apparatus and method |
| JPH114820A (ja) | 1997-06-18 | 1999-01-12 | Ee D K:Kk | 健康管理装置 |
| US6755783B2 (en) * | 1999-04-16 | 2004-06-29 | Cardiocom | Apparatus and method for two-way communication in a device for monitoring and communicating wellness parameters of ambulatory patients |
| EP1265524A2 (fr) * | 1999-10-08 | 2002-12-18 | Healthetech, Inc. | Systeme integre de gestion des echanges caloriques |
| US7171331B2 (en) * | 2001-12-17 | 2007-01-30 | Phatrat Technology, Llc | Shoes employing monitoring devices, and associated methods |
| JP4395687B2 (ja) * | 2000-12-20 | 2010-01-13 | ソニー株式会社 | 情報処理装置 |
| TW510789B (en) * | 2001-03-01 | 2002-11-21 | Sanyo Electric Co | Massage machine and physiological quantity measuring device used in the same |
| JP2002282227A (ja) | 2001-03-23 | 2002-10-02 | Osaka Gas Co Ltd | 生体情報計測装置 |
| DE10129662A1 (de) * | 2001-06-20 | 2003-01-09 | Philips Corp Intellectual Pty | Kommunikationssystem mit Systemkomponenten zur Feststellung der Urheberschaft eines Kommunikationsbeitrages |
| CN1420466A (zh) | 2001-11-20 | 2003-05-28 | 罗一峰 | 可视音乐脑电生物反馈法 |
| JP2003295482A (ja) | 2002-04-01 | 2003-10-15 | Seiko Epson Corp | 像担持体およびその製造方法 |
| US6902513B1 (en) * | 2002-04-02 | 2005-06-07 | Mcclure Daniel R. | Interactive fitness equipment |
| EP1521615A4 (fr) * | 2002-06-11 | 2010-11-03 | Jeffrey A Matos | Systeme de reanimation cardiaque |
| US6817979B2 (en) * | 2002-06-28 | 2004-11-16 | Nokia Corporation | System and method for interacting with a user's virtual physiological model via a mobile terminal |
| JP2004136918A (ja) | 2002-10-17 | 2004-05-13 | Toppan Printing Co Ltd | ストロー突刺し性を有する蓋材 |
| US20040179037A1 (en) * | 2003-03-03 | 2004-09-16 | Blattner Patrick D. | Using avatars to communicate context out-of-band |
-
2004
- 2004-04-30 JP JP2004136918A patent/JP3931889B2/ja not_active Expired - Fee Related
- 2004-08-13 US US10/567,930 patent/US7731656B2/en not_active Expired - Fee Related
- 2004-08-13 EP EP10013873A patent/EP2298156A1/fr not_active Withdrawn
- 2004-08-13 SG SG200802647-8A patent/SG142298A1/en unknown
- 2004-08-13 WO PCT/JP2004/011707 patent/WO2005016137A1/fr not_active Ceased
- 2004-08-13 AU AU2004264761A patent/AU2004264761A1/en not_active Abandoned
- 2004-08-13 KR KR1020067002623A patent/KR101072561B1/ko not_active Expired - Fee Related
- 2004-08-13 EP EP04771674A patent/EP1656880A4/fr not_active Withdrawn
- 2004-08-13 EP EP10013874.2A patent/EP2298157B1/fr not_active Expired - Lifetime
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH10234030A (ja) * | 1997-02-19 | 1998-09-02 | Anima Denshi Kk | 監視モニタシステム |
| WO2001052718A2 (fr) | 2000-01-19 | 2001-07-26 | Healthetech, Inc. | Dispositif de surveillance du regime et de l'activite |
| JP2002034936A (ja) * | 2000-07-24 | 2002-02-05 | Sharp Corp | 通信装置および通信方法 |
| JP2002314715A (ja) * | 2001-04-18 | 2002-10-25 | Noboru Akasaka | 緊急対応方法及び緊急対応システム |
| JP2004049309A (ja) * | 2002-07-16 | 2004-02-19 | National Trust:Kk | 被看護・被介護者の監視システム |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP1656880A4 |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1912558A4 (fr) * | 2005-07-26 | 2010-01-20 | Vivometrics Inc | Interfaces informatiques comportant des avatars guides physiologiquement |
| EP2484276A3 (fr) * | 2005-07-26 | 2012-08-29 | adidas AG | Interfaces d'ordinateur comprenant des avatars guidés physiologiquement |
| EP2484278A3 (fr) * | 2005-07-26 | 2012-08-29 | adidas AG | Interfaces d'ordinateur comprenant des avatars guidés physiologiquement |
| EP2484277A3 (fr) * | 2005-07-26 | 2012-08-29 | adidas AG | Interfaces d'ordinateur comprenant des avatars guidés physiologiquement |
| US8790255B2 (en) | 2005-07-26 | 2014-07-29 | Adidas Ag | Computer interfaces including physiologically guided avatars |
| JP2018018492A (ja) * | 2016-07-15 | 2018-02-01 | パナソニックIpマネジメント株式会社 | コンテンツ提示のための情報処理装置、情報処理装置の制御方法、及び制御プログラム |
| JP2019198531A (ja) * | 2018-05-17 | 2019-11-21 | Cyberdyne株式会社 | 生体情報計測装置及び生体情報計測方法 |
| JP7132568B2 (ja) | 2018-05-17 | 2022-09-07 | Cyberdyne株式会社 | 生体情報計測装置及び生体情報計測方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20060217598A1 (en) | 2006-09-28 |
| KR101072561B1 (ko) | 2011-10-11 |
| JP2005095570A (ja) | 2005-04-14 |
| US7731656B2 (en) | 2010-06-08 |
| AU2004264761A1 (en) | 2005-02-24 |
| SG142298A1 (en) | 2008-05-28 |
| EP1656880A4 (fr) | 2009-05-06 |
| EP1656880A1 (fr) | 2006-05-17 |
| KR20060058105A (ko) | 2006-05-29 |
| EP2298157B1 (fr) | 2014-04-09 |
| JP3931889B2 (ja) | 2007-06-20 |
| HK1095506A1 (zh) | 2007-05-11 |
| EP2298156A1 (fr) | 2011-03-23 |
| EP2298157A1 (fr) | 2011-03-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP3931889B2 (ja) | 画像表示システム、画像表示装置、画像表示方法 | |
| CN101198277B (zh) | 用于生理学和心理生理学监控的系统 | |
| US10835168B2 (en) | Systems and methods for estimating and predicting emotional states and affects and providing real time feedback | |
| US12285256B2 (en) | Systems and methods for estimating and predicting emotional states and affects and providing real time feedback | |
| JP6268193B2 (ja) | 脈波測定装置、携帯機器、医療機器システム、及び生体情報コミュニケーションシステム | |
| US11699524B2 (en) | System for continuous detection and monitoring of symptoms of Parkinson's disease | |
| CN109620185A (zh) | 基于多模态信息的自闭症辅助诊断系统、设备及介质 | |
| KR20160029375A (ko) | 사용자의 생체신호를 모니터링 및 분석하는 장치 및 방법 | |
| JP2011120917A (ja) | 人間の生理学的情報及びコンテキスト情報の検知装置 | |
| JP2009015449A (ja) | 生体情報共有システム、生体情報表現装置、生体情報表現方法 | |
| JP2009502335A (ja) | 生理的にガイドされるアバターを含むコンピュータ・インタフェース | |
| WO2015190042A1 (fr) | Dispositif d'évaluation d'activité, dispositif de traitement d'évaluation, et programme | |
| US11741851B2 (en) | Cognitive aid device and method for assisting | |
| EP3549630A1 (fr) | Dispositif de commande de sortie, procédé de commande de sortie, et programme | |
| McDaniel et al. | Therapeutic haptics for mental health and wellbeing | |
| CN100482149C (zh) | 图像显示系统、图像显示装置及图像显示方法 | |
| JP2004537343A (ja) | 個人情報配信システム | |
| Reis et al. | An imaginary friend that connects with the user's emotions | |
| HK1095506B (en) | Image display system, image display device, image display method | |
| JP5105779B2 (ja) | 情報選択システム | |
| JP4961172B2 (ja) | 情報選択システム | |
| Chickella et al. | 1.2. PERVASIVE AND PERSUASIVE TECHNOLOGIES IN MOTIVATING PEOPLE TO PHYSICAL ACTIVITIES OUTDOOR IN FITNESS PARKS | |
| JP2025115113A (ja) | 情報処理装置、評価方法、および制御プログラム | |
| JP2023080965A (ja) | 感情推定システム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 200480023584.9 Country of ref document: CN |
|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2004264761 Country of ref document: AU |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1020067002623 Country of ref document: KR |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2006217598 Country of ref document: US Ref document number: 10567930 Country of ref document: US |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2004771674 Country of ref document: EP |
|
| ENP | Entry into the national phase |
Ref document number: 2004264761 Country of ref document: AU Date of ref document: 20040813 Kind code of ref document: A |
|
| WWP | Wipo information: published in national office |
Ref document number: 2004264761 Country of ref document: AU |
|
| WWP | Wipo information: published in national office |
Ref document number: 2004771674 Country of ref document: EP |
|
| WWP | Wipo information: published in national office |
Ref document number: 1020067002623 Country of ref document: KR |
|
| WWP | Wipo information: published in national office |
Ref document number: 10567930 Country of ref document: US |