WO2017043610A1 - Information processing apparatus, method, and computer program - Google Patents
Information processing apparatus, method, and computer program
- Publication number
- WO2017043610A1 (PCT/JP2016/076531)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vibration
- corrected
- vibration data
- user
- processing apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- A—HUMAN NECESSITIES
- A41—WEARING APPAREL
- A41D—OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
- A41D1/00—Garments
- A41D1/002—Garments adapted to accommodate electronic equipment
-
- A—HUMAN NECESSITIES
- A41—WEARING APPAREL
- A41D—OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
- A41D1/00—Garments
- A41D1/02—Jackets
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/218—Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/23—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
- A63F13/235—Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/28—Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
- A63F13/285—Generating tactile feedback signals via the game input device, e.g. force feedback
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/31—Communication aspects specific to video games, e.g. between several handheld game devices at close range
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/54—Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/577—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M19/00—Current supply arrangements for telephone systems
- H04M19/02—Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone
- H04M19/04—Current supply arrangements for telephone systems providing ringing current or supervisory tones, e.g. dialling tone or busy tone the ringing-current being generated at the substations
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2250/00—Miscellaneous game characteristics
- A63F2250/16—Use of textiles
- A63F2250/166—Garments
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/30—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
- A63F2300/308—Details of the user interface
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/014—Force feedback applied to GUI
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
Definitions
- The present disclosure relates to an information processing apparatus, a method, and a computer program.
- Notification from an application may be provided by sound or vibration.
- Patent Document 1 discloses an information processing apparatus in which such notification from an application is performed by sound or vibration.
- The client device disclosed in Patent Document 1 is a terminal including an imaging unit and notifies the user by vibration.
- The present disclosure therefore proposes an information processing apparatus, method, and computer program that correct vibration data in accordance with the contact state between the information processing apparatus and the user.
- According to the present disclosure, there is provided an information processing apparatus comprising: a corrected vibration data generation unit that generates corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected based on information from a detection unit that detects a contact state of the vibration device; and a vibration signal generation unit that generates a vibration signal from the corrected vibration data.
- According to the present disclosure, there is also provided a method comprising: generating, by a processor, corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected based on information from a detection unit that detects a contact state of the vibration device; and generating, by the processor, a vibration signal based on the corrected vibration data.
- According to the present disclosure, there is further provided a computer program that causes a processor to generate corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected based on information from a detection unit that detects a contact state of the vibration device, and to generate a vibration signal based on the corrected vibration data.
- FIG. 1 is a diagram illustrating an example of a wristband type wearable terminal.
- FIG. 2 is a diagram illustrating an example of the relationship between human sensory sensitivity to vibration and the pressing pressure of the vibration device against the user.
- FIG. 3 is a diagram illustrating an example of an appearance of a wristband type wearable terminal according to an embodiment of the present disclosure.
- FIG. 4 is a cross-sectional view illustrating a state in which the wristband type wearable terminal according to the embodiment of the present disclosure is mounted.
- FIG. 5 is a block diagram illustrating a configuration of a wristband type wearable terminal according to an embodiment of the present disclosure.
- FIG. 6 is a flowchart illustrating an example of processing performed in the wristband type wearable terminal according to the embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of an appearance of a jacket-type wearable terminal according to an embodiment of the present disclosure.
- FIG. 8 is a diagram illustrating a positional relationship between the vibration device and the pressure sensor of the jacket-type wearable terminal according to the embodiment of the present disclosure.
- FIG. 9 is a block diagram illustrating an exemplary configuration of a jacket-type wearable terminal according to an embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating examples of virtual objects and listeners generated in the game machine.
- FIG. 11 is a flowchart illustrating an example of processing performed between the jacket-type wearable terminal and the game machine in the embodiment of the present disclosure.
- FIG. 12 is a diagram showing the perceived intensity of vibration at each part of the human body.
- FIG. 13 is a diagram illustrating an example of a wearing position of the wearable terminal according to the embodiment of the present disclosure.
- FIG. 14 is a block diagram illustrating an example of another configuration of the wearable terminal according to the embodiment of the present disclosure.
- FIG. 15 is a table showing a relationship between correction information and parts stored in the storage unit of the wearable terminal in the embodiment of the present disclosure.
- FIG. 16 is a flowchart illustrating another example of processing performed in the wearable terminal in the embodiment of the present disclosure.
- FIG. 17 is a diagram illustrating an example of a vibration stand including the vibration device according to the embodiment of the present disclosure.
- FIG. 18 is a diagram illustrating an example of the structure of a vibration plate provided in the vibration stand illustrated in FIG. 17.
- FIG. 19 is a diagram illustrating an example of a vibrating device according to an embodiment of the present disclosure.
- FIG. 20 is a cross-sectional view illustrating a cross-section of the vibration device according to the embodiment of the present disclosure.
- FIG. 21 is a diagram illustrating a mounting method of the vibration device according to the embodiment of the present disclosure.
- FIG. 22 is a diagram illustrating a relationship between the vibration intensity and the resonance frequency of the vibration device according to the embodiment of the present disclosure.
- FIG. 23 is a diagram illustrating another example of an event in which vibration feedback occurs in the embodiment of the present disclosure.
- FIG. 24 is a diagram illustrating a method of expressing the shape and texture of a virtual object by vibration.
- FIG. 25 is a diagram illustrating a method of expressing the shape and texture of a virtual object by vibration.
- FIG. 26 is a diagram illustrating a method of expressing the shape and texture of a virtual object by vibration.
- FIG. 27 is a diagram illustrating a method of expressing the shape and texture of a virtual object by vibration.
- FIG. 28 is a diagram illustrating a method of expressing the shape and texture of a virtual object by vibration.
- FIG. 1 shows an example of a wearable terminal 10, presented for comparison with the information processing apparatus of the present disclosure.
- The wearable terminal 10 shown in FIG. 1 is a wristband type and is worn by winding a band around the user's arm or the like.
- The wearable terminal 10 shown in FIG. 1 may have a pedometer function and, for example, notifies the user by vibration that a predetermined number of steps has been reached. For this purpose, the wearable terminal 10 includes a vibration device 12 for applying vibration to the user.
- In this way, the user may be notified by vibration from the wearable terminal 10.
- However, the contact state between the user and the wearable terminal 10 changes according to the usage scene.
- For example, the wearable terminal 10 is pressed strongly against the user when the band is wound tightly, and weakly when the band is wound loosely.
- Accordingly, the vibration device 12 of the wearable terminal 10 may be pressed against the user either strongly or weakly.
- The user's sensitivity to vibration varies depending on the pressure with which the vibration device 12 is pressed against the user.
- FIG. 2 is a diagram showing an example of the relationship between human sensory sensitivity to vibration and the pressing pressure of the vibration device.
- As the pressing pressure of the vibration device 12 increases, the perceived intensity of vibration increases. Therefore, even if vibrations of the same magnitude are generated, the user feels the vibration strongly when the wearable terminal 10 is worn tightly by fastening the band firmly, and feels it weakly when the terminal is worn loosely by fastening the band loosely.
- Note that the relationship between the pressing pressure of the vibration device 12 against a person and the perceived intensity of vibration may not be the proportional relationship shown in FIG. 2.
- For example, when the pressing pressure is very large, the vibration of the vibration device 12 is restricted by the contact surface between the vibration device 12 and the person. The vibration generated by the vibration device 12 is then weakened, and as a result the perceived intensity may fall, unlike the example shown in FIG. 2.
- In the present embodiment, the vibration data is therefore corrected to keep the perceived intensity of vibration constant, taking into account differences in the pressing pressure of the vibration device 12 against the user's body.
- FIG. 3 is a diagram illustrating an example of the appearance of the wristband type wearable terminal 100 according to the present embodiment.
- The wristband type wearable terminal 100 is worn by winding a band around the user's arm or the like.
- The wristband type wearable terminal 100 may have a pedometer function.
- The wristband type wearable terminal 100 notifies the user by vibration that a predetermined number of steps has been reached, and is therefore provided with a vibration device 102 for applying vibration to the user.
- The wristband type wearable terminal 100 also includes a pressure sensor 104 for detecting the contact state between the vibration device 102 and the user's arm. By detecting the pressure, the pressure sensor 104 detects how hard the wearable terminal 100, and hence the vibration device 102, is pressed against the user.
- Detecting the contact state between the wearable terminal 100 and the user with the pressure sensor 104 is thus synonymous with detecting the contact state between the vibration device 102 and the user.
- The pressure sensor 104 is an example of a detection unit that detects the contact state of the vibration device 102.
- The vibration device 102 may be a device including a vibrator such as an eccentric motor in which a weight of uneven shape is attached to the rotation shaft of the motor.
- Alternatively, the vibration device 102 may be a device including a vibrator such as a voice coil motor, a piezoelectric actuator, or an electromagnetic linear actuator.
- The pressure sensor 104 may be a sensor including a pressure-sensitive element that converts pressure into an electric signal, such as a piezo element, or a sensor that converts pressure into an electric signal using a capacitor whose capacitance changes according to the pressure.
- The pressure sensor 104 may also be a film type pressure sensor.
- FIG. 4 is a cross-sectional view of the wearable terminal 100 described above worn on the user's arm 900.
- The wearable terminal 100 is worn by a band being wound around the user's arm 900.
- As shown in FIG. 4, the pressure sensor 104 is preferably disposed between the vibration device 102 and the user's arm 900 when the wearable terminal 100 is worn on the arm. This is because the pressing pressure of the vibration device 102 against the user is best measured at a position close to the contact point between the vibration device 102 and the user's arm 900.
- In addition, the part of the user's arm 900 closest to the vibration device 102 senses its vibration most strongly, which is another reason the pressure sensor 104 is arranged between the vibration device 102 and the user's arm 900.
- The appearance of the wearable terminal 100 of the present embodiment and the positional relationship between the pressure sensor 104 and the vibration device 102 when the terminal is worn have been described above. The internal configuration of the wearable terminal 100 of this embodiment is described below.
- FIG. 5 is a block diagram showing the internal configuration of the wearable terminal 100 of the present embodiment.
- As shown in FIG. 5, the wearable terminal 100 further includes a processing unit 106, a corrected vibration data generation unit 108, and a vibration signal generation unit 110.
- The processing unit 106 performs processing for an application included in the wearable terminal 100, for example an application having a pedometer function. The processing unit 106 also generates vibration data for the vibration device 102 based on an instruction from the application; for example, it may generate vibration data when the measured number of steps reaches a predetermined number.
- The corrected vibration data generation unit 108 generates corrected vibration data in which the intensity of the vibration data generated by the processing unit 106 for the vibration device 102 is corrected based on information from the pressure sensor 104.
- Specifically, the corrected vibration data generation unit 108 increases the intensity of the vibration data when the pressure detected by the pressure sensor 104 is small, and weakens it when the detected pressure is large. For example, the corrected vibration data generation unit 108 may generate the corrected vibration data by multiplying the vibration data by a coefficient equal to the reciprocal of the vibration sensitivity value shown in FIG. 2. Generating the corrected vibration data in this way makes it possible to keep the intensity at which the user perceives the vibration the same.
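- As a concrete illustration of the correction described above, the following Python sketch scales the vibration samples by the reciprocal of a sensitivity value looked up from the detected pressure. The sketch is not from the patent itself: the sensitivity table, function names, and clipping behavior are assumptions added for illustration.

```python
# Minimal sketch of reciprocal-of-sensitivity correction (cf. FIG. 2).
# The pressure-to-sensitivity table below is illustrative only.
PRESSURE_TO_SENSITIVITY = [
    (0.0, 0.4),   # (pressing pressure in kPa, relative perceived sensitivity)
    (5.0, 0.7),
    (10.0, 1.0),
    (20.0, 1.5),
]

def sensitivity_for(pressure: float) -> float:
    """Linearly interpolate the sensitivity curve at the given pressure."""
    points = PRESSURE_TO_SENSITIVITY
    if pressure <= points[0][0]:
        return points[0][1]
    for (p0, s0), (p1, s1) in zip(points, points[1:]):
        if pressure <= p1:
            t = (pressure - p0) / (p1 - p0)
            return s0 + t * (s1 - s0)
    return points[-1][1]

def correct_vibration_data(samples: list[float], pressure: float) -> list[float]:
    """Strengthen the data when the detected pressure (hence sensitivity)
    is low, weaken it when the pressure is high, and clip to [-1, 1]."""
    gain = 1.0 / sensitivity_for(pressure)
    return [max(-1.0, min(1.0, s * gain)) for s in samples]
```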
- The vibration signal generation unit 110 generates a vibration signal for driving the vibration device 102 based on the corrected vibration data generated by the corrected vibration data generation unit 108.
- For example, the vibration signal generation unit 110 generates a vibration signal, which is an analog signal, by performing D/A conversion or the like on the corrected vibration data.
- FIG. 6 is a flowchart showing the processing performed in the wristband type wearable terminal 100.
- First, the processing unit 106 generates vibration data for driving the vibration device 102 based on an instruction from the application. For example, in an application having a pedometer function, the processing unit 106 may generate vibration data to notify the user that a predetermined number of steps has been reached.
- Next, the pressure sensor 104 detects the contact state between the wearable terminal 100 and the user by detecting the pressure, and sends information on the detected pressure to the corrected vibration data generation unit 108.
- In step S104, the corrected vibration data generation unit 108 generates corrected vibration data based on the vibration data received from the processing unit 106 and the pressure information received from the pressure sensor 104.
- Specifically, the corrected vibration data generation unit 108 increases the intensity of the vibration data when the pressure detected by the pressure sensor 104 is small, and weakens it when the detected pressure is large.
- Next, the vibration signal generation unit 110 receives the corrected vibration data from the corrected vibration data generation unit 108 and generates a vibration signal by performing processing such as D/A conversion.
- Finally, the vibration device 102 vibrates based on the vibration signal generated by the vibration signal generation unit 110.
- As described above, the vibration device 102 vibrates based on corrected vibration data that reflects the pressure information detected by the pressure sensor 104, so vibration of the same perceived intensity is generated for the user regardless of the contact state of the wearable terminal 100.
- FIG. 7 is a diagram illustrating an example of the appearance of the jacket-type wearable terminal 100.
- The jacket-type wearable terminal 100 may be used, for example, to feed back to the user vibration generated in game software or the like.
- The jacket-type wearable terminal 100 may include a plurality of vibration devices 102a to 102f, which may be arranged symmetrically as shown in FIG. 7. As shown in FIG. 7, the vibration devices 102a and 102d may be disposed on the chest, the vibration devices 102b and 102e on the upper abdomen, and the vibration devices 102c and 102f on the lower abdomen.
- FIG. 8 is a side view of the jacket-type wearable terminal 100 of FIG. 7 as worn by the user.
- Like the wristband type, the jacket-type wearable terminal 100 has a pressure sensor 104 for detecting the contact state between each vibration device 102 and the user.
- The pressure sensor 104 detects how strongly the vibration device 102 is pressed against the user by detecting the pressure.
- As shown in FIG. 8, the pressure sensor 104 is preferably disposed between the vibration device 102 and the user when the wearable terminal 100 is worn.
- FIG. 9 is a block diagram showing an example of the configuration of the jacket-type wearable terminal 100.
- The jacket-type wearable terminal 100 may be used to feed back to the user vibration generated in game software or the like. The jacket-type wearable terminal 100 according to the present embodiment is therefore connected to the game machine 200 and receives corrected vibration data from the game machine 200.
- Unlike the wristband-type configuration shown in FIG. 5, the jacket-type wearable terminal 100 shown in FIG. 9 receives vibration data from the game machine 200 and therefore does not have to generate vibration data itself. As shown in FIG. 9, the jacket-type wearable terminal 100 instead includes a communication unit 112 for receiving vibration data from the game machine 200.
- The wearable terminal 100 also transmits information on the pressure detected by the pressure sensor 104 to the game machine 200 via the communication unit 112.
- The communication unit 112 may be a short-range wireless communication interface such as Bluetooth (registered trademark).
- The communication unit 112 is not limited to the above interface and may be another short-range wireless communication interface such as ZigBee (registered trademark).
- The game machine 200 includes a communication unit 202, a processing unit 204, and a corrected vibration data generation unit 206.
- The communication unit 202 is used to exchange information with the wearable terminal 100.
- The processing unit 204 executes processing for game software.
- For example, the processing unit 204 may process information related to a virtual space based on the game software, as shown in FIG. 10.
- As shown in FIG. 10, a virtual object 300 is arranged in the virtual space based on the game software, and listeners 302a to 302f for detecting contact with the virtual object 300 or the arrival of sound are arranged on the virtual object 300.
- When another virtual object 300 contacts one of the listeners 302a to 302f in the virtual space, the processing unit 204 generates vibration data based on information about the contact.
- Each of the listeners 302a to 302f corresponds to one of the vibration devices 102a to 102f of the jacket-type wearable terminal 100. Therefore, for example, when another virtual object 300 comes into contact with the listener 302a, the processing unit 204 generates vibration data for vibrating the vibration device 102a.
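- The listener-to-device correspondence can be pictured with the short Python sketch below. The identifiers mirror FIGS. 7 and 10, but the mapping structure, the waveform shape, and the function name are assumptions added for illustration, not details from the patent.

```python
# Hypothetical mapping from listeners on the virtual object (FIG. 10)
# to vibration devices on the jacket-type wearable terminal (FIG. 7).
LISTENER_TO_DEVICE = {
    "302a": "102a", "302b": "102b", "302c": "102c",
    "302d": "102d", "302e": "102e", "302f": "102f",
}

def on_contact(listener_id: str, impact_strength: float) -> tuple[str, list[float]]:
    """When another virtual object touches a listener, produce vibration
    data addressed to the corresponding vibration device."""
    device_id = LISTENER_TO_DEVICE[listener_id]
    # A short decaying pulse scaled by the impact strength; the patent
    # does not specify a waveform, so this envelope is an assumption.
    samples = [impact_strength * (0.9 ** n) for n in range(32)]
    return device_id, samples
```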
- The corrected vibration data generation unit 206 of the game machine 200 corrects the vibration data generated by the processing unit 204 based on the information detected by the pressure sensor 104 of the wearable terminal 100, thereby generating corrected vibration data.
- FIG. 11 is a flowchart showing the processing performed between the jacket-type wearable terminal 100 and the game machine 200.
- First, the pressure sensor 104 detects the contact state between the wearable terminal 100 and the user by detecting the pressure, and sends information on the detected pressure to the processing unit 106. Next, in step S202, the processing unit 106 sends the pressure information received from the pressure sensor 104 to the game machine 200 via the communication unit 112.
- Next, the processing unit 204 of the game machine 200 generates vibration data based on an instruction from the game software.
- The instruction from the game software may be generated, for example, when another virtual object 300 contacts the listener 302a.
- Next, the corrected vibration data generation unit 206 of the game machine 200 generates corrected vibration data based on the vibration data received from the processing unit 204 and the information received from the pressure sensor 104.
- That is, the corrected vibration data generation unit 206 generates corrected vibration data based on the information from the pressure sensor 104 indicating the contact state between the vibration device 102a and the user.
- Next, the processing unit 204 of the game machine 200 transmits the corrected vibration data generated by the corrected vibration data generation unit 206 to the wearable terminal 100 via the communication unit 202.
- Then, the vibration signal generation unit 110 generates a vibration signal by performing processing such as D/A conversion on the corrected vibration data received from the game machine 200.
- Finally, the vibration device 102 vibrates based on the vibration signal generated by the vibration signal generation unit 110.
- As described above, the vibration data may be corrected by an information processing apparatus other than the wearable terminal 100, such as the game machine 200. Further, by arranging a plurality of pressure sensors 104 corresponding to the plurality of vibration devices 102, corrected vibration data can be generated according to the contact state between each vibration device 102 and the user.
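- A per-device correction loop for this multi-sensor arrangement might look like the following sketch, where each device's data is corrected with the pressure reported by its own sensor; the dictionary-based interface is an assumption for illustration.

```python
from typing import Callable

# Hypothetical per-device correction: one pressure reading per vibration
# device (102a-102f in FIG. 7), each corrected independently. `correct`
# is any pressure-based correction, e.g. correct_vibration_data from the
# earlier sketch.
def correct_all(vibration_data: dict[str, list[float]],
                pressures: dict[str, float],
                correct: Callable[[list[float], float], list[float]]) -> dict[str, list[float]]:
    return {
        device_id: correct(samples, pressures[device_id])
        for device_id, samples in vibration_data.items()
    }
```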
- In the embodiments described above, the pressure sensor 104 is used to detect the pressing pressure of the vibration device 102 against the user.
- However, the detection unit that detects the pressing pressure of the vibration device 102 against the user is not limited to the pressure sensor 104.
- For example, the detection unit may be an acceleration sensor or a gyro sensor.
- In this case, the acceleration sensor or gyro sensor detects secondary vibration that occurs when the wearable terminal 100 is not pressed firmly against the user, and the processing unit 106 may estimate the pressing pressure of the vibration device 102 against the user based on the detected secondary vibration.
- Here, the secondary vibration detected by the acceleration sensor or gyro sensor is vibration generated when the wearable terminal 100 wobbles because it is not attached firmly to the user. For example, when the wearable terminal 100 is worn on the wrist, such vibration is detected when the user shakes their hand and the wearable terminal 100 wobbles.
- The corrected vibration data generation unit 108 may then generate corrected vibration data based on the pressing pressure estimated by the processing unit 106 from the information detected by the acceleration sensor or gyro sensor. For example, the corrected vibration data generation unit 108 may increase the intensity of the vibration data when the magnitude of the acceleration, angular velocity, or angular acceleration of the detected secondary vibration is large, and may weaken it when that magnitude is small.
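- The estimation from secondary vibration could be sketched as below. The RMS measure, thresholds, and gain values are assumptions; the patent only states that a larger secondary vibration (a loose fit, hence low pressing pressure) calls for stronger vibration data, and a smaller one for weaker data.

```python
import math

def secondary_vibration_magnitude(accel_samples: list[tuple[float, float, float]]) -> float:
    """RMS magnitude of (x, y, z) accelerometer samples captured while
    the terminal wobbles, e.g. when the user shakes their hand."""
    sq = [x * x + y * y + z * z for x, y, z in accel_samples]
    return math.sqrt(sum(sq) / len(sq))

def gain_from_secondary_vibration(magnitude: float) -> float:
    """Large wobble -> loose fit -> boost vibration; small wobble -> attenuate.
    Thresholds and gains are illustrative assumptions."""
    TIGHT_THRESHOLD = 0.5  # m/s^2
    LOOSE_THRESHOLD = 2.0  # m/s^2
    if magnitude <= TIGHT_THRESHOLD:
        return 0.8
    if magnitude >= LOOSE_THRESHOLD:
        return 1.5
    # Interpolate between the tight and loose regimes.
    t = (magnitude - TIGHT_THRESHOLD) / (LOOSE_THRESHOLD - TIGHT_THRESHOLD)
    return 0.8 + t * (1.5 - 0.8)
```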
- Although the wearable terminal 100 has mainly been described above, the above processing may also be applied to an information processing apparatus held by the user, such as a smartphone or a game controller. In that case, the information processing apparatus may detect the user's grip pressure with the pressure sensor 104.
- Alternatively, the pressing pressure of the vibration device 102 against the user may be detected using a touch panel; for example, a smartphone may detect the pressing pressure with its touch panel.
- <<Embodiment in which vibration data is corrected according to the wearing position>>
- <2-1. Embodiment in which the intensity of vibration data is corrected according to the wearing position> The embodiment in which vibration data is corrected according to the pressing pressure of the vibration device 102 against the user has been described above. In the following, an embodiment in which vibration data is corrected according to the wearing position of the vibration device 102 is described.
- FIG. 12 is a diagram schematically showing the relationship between each part of the body and its sensitivity to vibration.
- In the present embodiment, because this sensitivity differs between body parts, the intensity of the vibration data is corrected based on the position where the wearable terminal 100 is worn so that the user perceives the same vibration.
- FIG. 13 is a diagram showing positions where the wristband type wearable terminal 100 shown in FIG. 3 may be worn.
- As shown in FIG. 13, the wristband type wearable terminal 100 may be worn on the wrist, the upper arm, or the ankle.
- The wearable terminal 100 according to the present embodiment therefore corrects the vibration data according to the wearing position.
- The reference device 800 shown in FIG. 13 is used to determine the wearing position of the wearable terminal 100, as described later.
- FIG. 14 is a block diagram showing the configuration of the wearable terminal 100 of the present embodiment. As shown in FIG. 14, the wearable terminal 100 according to the present embodiment further includes a position detection sensor 114 and a storage unit 116.
- The position detection sensor 114 detects the position of the wearable terminal 100.
- For example, the position detection sensor 114 may be a motion sensor such as an acceleration sensor or a gyro sensor, and may estimate the wearing position of the wearable terminal 100 from trends in the changes of the information detected by the motion sensor.
- The information detected by the motion sensor may be acceleration detected by the acceleration sensor, or angular velocity or angular acceleration detected by the gyro sensor.
- Alternatively, the position detection sensor 114 may be a magnetic sensor, an ultrasonic sensor, or a sensor using radio waves.
- In this case, the position detection sensor 114 may estimate the wearing position of the wearable terminal 100 based on the distance or direction from the reference device 800 shown in FIG. 13, using magnetism, ultrasonic waves, or radio waves as described above to detect that distance or direction.
- The storage unit 116 stores the relationship between the wearing position of the wearable terminal 100 and correction information, as shown in FIG. 15. For example, when the wearable terminal 100 is worn on the wrist, which has high sensitivity, correction information is stored such that the vibration intensity becomes relatively weak; when it is worn on the upper arm, which has medium sensitivity, correction information is stored such that the vibration intensity becomes relatively stronger than for the wrist.
- The method of correcting the vibration data is not limited to the above example; the corrected vibration data generation unit 108 may also generate the corrected vibration data by multiplying the vibration data by the reciprocal of the vibration sensitivity value of each body part shown in FIG. 12.
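- A minimal sketch of such a position-to-correction table follows. The gain values are assumptions; the patent only fixes their ordering (wrist: high sensitivity, relatively weak vibration; upper arm: medium sensitivity, relatively stronger vibration).

```python
# Hypothetical version of the correction table of FIG. 15.
POSITION_GAIN = {
    "wrist": 0.7,      # high sensitivity: relatively weak vibration
    "upper_arm": 1.0,  # medium sensitivity
    "ankle": 1.3,      # lower sensitivity: relatively strong vibration
}

def correct_for_position(samples: list[float], position: str) -> list[float]:
    """Scale the vibration data by the gain stored for the wearing position."""
    gain = POSITION_GAIN[position]
    return [max(-1.0, min(1.0, s * gain)) for s in samples]
```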
- FIG. 16 is a flowchart showing an example of the processing performed in the wearable terminal 100 of the present embodiment.
- First, the processing unit 106 generates vibration data for driving the vibration device 102.
- Next, the position detection sensor 114 detects the wearing position of the wearable terminal 100.
- Next, the corrected vibration data generation unit 108 generates corrected vibration data based on the vibration data received from the processing unit 106 and the wearing position detected by the position detection sensor 114.
- Specifically, the corrected vibration data generation unit 108 reads the correction information from the storage unit 116 based on the detected wearing position, as described above, and generates the corrected vibration data using that correction information.
- Next, the vibration signal generation unit 110 receives the corrected vibration data from the corrected vibration data generation unit 108 and generates a vibration signal by performing processing such as D/A conversion.
- Finally, the vibration device 102 vibrates based on the vibration signal generated by the vibration signal generation unit 110.
- As described above, the vibration data is corrected based on the wearing position of the wearable terminal 100 detected by the position detection sensor 114. When the vibration device 102 vibrates according to the corrected vibration data, vibration of the same perceived intensity is generated for the user regardless of the wearing position of the wearable terminal 100.
- The vibration data may also be corrected in the jacket-type wearable terminal 100 having the plurality of vibration devices 102 shown in FIG. 7.
- In this case, the vibration data is corrected according to the position of each vibration device 102 provided in the jacket-type wearable terminal 100.
- Since the positions of the vibration devices 102 in the jacket-type wearable terminal 100 are fixed, the vibration data may be corrected without a process of detecting the wearing positions of the vibration devices 102.
- For example, the vibration data for the vibration devices 102c and 102f, located near the lower abdomen where vibration sensitivity is relatively low, is corrected to be stronger than the vibration data for the vibration devices located near the chest, where vibration sensitivity is relatively high.
- In this way, the user can feel more uniform vibration regardless of the mounting site of each vibration device 102, and as a result can experience vibrations that feel more realistic in a game, for example.
- As described above, the sensitivity with which the user feels vibration changes depending on the wearing position of the wearable terminal 100. In addition, frequency sensitivity varies with the body part, and sensitivity to vibrations in the X-axis, Y-axis, and Z-axis directions also varies with frequency and body part.
- Therefore, for example, when a smartphone is held in the hand, the vibration data is preferably corrected so that its vibration device 102 vibrates at a frequency to which the palm is highly sensitive.
- Likewise, when the smartphone is accommodated in a pocket near the buttocks, the vibration data is preferably corrected so that the vibration device 102 vibrates at a frequency to which the buttocks are highly sensitive.
- That is, when the smartphone is in the pocket, it is desirable to vibrate it at a relatively lower frequency than in a normal state such as being held in the hand.
- Whether the smartphone is held in the hand or accommodated in a pocket near the buttocks may be detected by the same configuration as the position detection sensor 114 described above; that is, the position detection sensor 114 may estimate the position of the smartphone from trends in the changes of the information detected by the motion sensor.
- Alternatively, the position detection sensor 114 may be a magnetic sensor, an ultrasonic sensor, or a sensor using radio waves, and may estimate the position of the smartphone based on the distance or direction from the reference device 800.
- Whether the smartphone is held in the hand or accommodated in a pocket may also be estimated by a proximity sensor mounted on the smartphone. Specifically, when the proximity sensor determines that an object exists near the screen of the smartphone, the smartphone may be estimated to be in a pocket or a bag; when no object is determined to be near the screen, the smartphone may be estimated to be held in the hand.
- The above estimation may also be performed by combining the values of the proximity sensor and the motion sensor.
- Furthermore, since sensitivity to vibration intensity differs greatly between the hand and the buttocks, it is preferable to correct the vibration data with respect to vibration intensity in addition to the frequency correction described above.
- As described above, the vibration data is corrected so that the frequency at which the vibration device 102 vibrates changes based on the position detected by the position detection sensor 114.
- The user can thereby experience vibrations suited to the intensity and frequency sensitivity characteristics of each body part.
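- One way to picture this joint frequency-and-intensity adaptation is the sketch below, which synthesizes a drive signal from a per-part profile. The frequencies and gains are assumptions chosen only to respect the ordering in the text (a pocket near the buttocks: lower frequency and higher intensity than the hand-held state).

```python
import math

# Hypothetical per-part drive profiles; values are illustrative.
PART_PROFILE = {
    "palm":     {"freq_hz": 200.0, "gain": 0.8},
    "buttocks": {"freq_hz": 100.0, "gain": 1.4},
}

def synthesize_drive(part: str, duration_s: float, sample_rate: int = 8000) -> list[float]:
    """Generate a sine drive signal tuned to the detected body part."""
    profile = PART_PROFILE[part]
    n = int(duration_s * sample_rate)
    w = 2.0 * math.pi * profile["freq_hz"] / sample_rate
    return [profile["gain"] * math.sin(w * i) for i in range(n)]
```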
- FIG. 17 is a diagram illustrating an example of a system including a display device 500 that displays an image, a head-mounted wearable terminal 600 that presents vibration to a user 902, and a vibration stand 700.
- In this system, an image giving the user 902 the sensation of flying through the air may be displayed on the display device 500.
- Vibration is presented to the front of the body of the user 902 lying on the vibration stand 700, and vibration is also presented to the user 902 by the head-mounted wearable terminal 600.
- The display device 500 may be a head mounted display instead of the screen shown in FIG. 17.
- The head-mounted wearable terminal 600 and the vibration stand 700 are examples of information processing apparatuses that correct vibration data as described above.
- The head-mounted wearable terminal 600 may present vibration to the back of the head or the neck of the user 902.
- This gives the user 902 an illusion of the center of gravity and enhances the sensation of acceleration and deceleration felt by the user 902.
- Further, vibration expressing a sense of collision may be presented to the front of the head of the user 902, further enhancing the feeling of flying through the sky.
- The head-mounted wearable terminal 600 may further include a pressure sensor 104, and the vibration data correction described above may be performed based on the pressure detected by the pressure sensor 104.
- The vibration stand 700 of this embodiment includes vibration plates 702 and a pedestal 704.
- A plurality of vibration plates 702 may be provided in the vibration stand 700; when the vibration device 102 vibrates, the vibration plates 702 present vibration to the user 902. As shown in FIG. 17, when the user 902 lies on the vibration stand 700, vibration is presented to the legs of the user 902 without their weight being applied, thereby enhancing the floating sensation felt by the user 902.
- FIG. 18 is a diagram illustrating an example of the structure between the vibration plate 702 and the pedestal 704.
- Each vibration plate 702 includes a vibration device 102, and the vibration plate 702 vibrates together with the vibration device 102.
- Each vibration plate 702 is supported by an elastic member 706 so that it can vibrate with respect to the pedestal 704.
- In the vibration stand 700, the correspondence between the body parts of the user 902 and the positions of the vibration plates 702 is basically fixed, as in the jacket-type wearable terminal 100 described above. It is therefore preferable that the vibration data be corrected so that the vibration intensity and frequency of each vibration plate 702 change in accordance with the differences in the vibration sensitivity of each body part described above.
- However, the positional relationship between the body parts of the user 902 and the vibration plates 702 may change according to the height of the user 902. The positional relationship may therefore be corrected automatically using the height information of the user 902 or the like.
- Each vibration plate 702 of the vibration stand 700 may further include a pressure sensor 104 for detecting the contact pressure with the body of the user 902, and the vibration data may be corrected as described above based on the pressure detected by the pressure sensor 104.
- Alternatively, a distance sensor or the like that detects the amount of deformation of the elastic member 706 may be provided instead of the pressure sensor 104, and the vibration data may be corrected based on the distance detected by the distance sensor. In this case, when the distance is small, the pressing pressure against the vibration device 102 is estimated to be large, and correction that weakens the vibration data may be performed, for example.
- FIG. 19 is a diagram illustrating the appearance of the vibration device 102.
- FIG. 20 is a diagram illustrating a cross section of the vibration device 102.
- As shown in FIG. 20, the vibration device 102 includes a case 400 having an elliptical cross section. Because the cross section of the case 400 is elliptical, the case 400 contacts the body of the user 902 with similar pressure even when the angle at which the vibration device 102 touches the body changes.
- The cross section of the case 400 has a major axis L1 and a minor axis L2.
- The vibrator 118 is disposed on one of the surfaces 400a and 400b whose tangent is perpendicular to the minor axis L2.
- The surface 400a, on which the vibrator 118 is not arranged, vibrates more strongly due to the vibration of the vibrator 118 than the surface 400b, on which the vibrator 118 is arranged.
- Because the sound produced decreases as the area of the strongly vibrating surface 400a decreases, an opening 402 is provided so that the generation of sound from the strongly vibrating surface 400a of the case 400 is suppressed.
- FIG. 21 is a diagram illustrating the vibration device 102 described above attached to the user 902.
- As shown in FIG. 21, the vibration device 102 is disposed such that the surface 400a, on which the vibrator 118 is not disposed, serves as the contact surface with the user 902.
- Since the surface 400a of the case 400 on which the vibrator 118 is not disposed vibrates more strongly due to the vibration of the vibrator 118, vibration is transmitted to the user 902 efficiently.
- FIG. 22 is a diagram illustrating an example of the resonance frequency (frequency characteristics) of the vibration device 102 described above.
- FIG. 22 shows that the vibration intensity increases near 200 Hz, 400 Hz, and 600 Hz.
- By adjusting the resonance frequency of the case 400 in accordance with the vibration-frequency sensitivity characteristic of each body part, vibration is transmitted to the user 902 with high energy efficiency.
- For example, the vibration device 102 provided in the wearable terminal 100 attached to the palm may be configured to have a resonance frequency of 200 Hz, so that the perceived vibration intensity is maximized.
- Conversely, by adjusting the resonance frequency of the case 400 to the sensitivity of a body part where human vibration sensitivity is low (for example, the buttocks), the frequency characteristics of the vibration felt by the user 902 can be made closer to flat.
- The vibration intensity may also be maximized or flattened by applying electrical or software frequency-correction processing to the input signal instead of relying on the structure of the case 400.
- Since the frequency sensitivity characteristic varies from one body part to another, it is desirable that the characteristics of the case 400 and the frequency correction of the input vibration be changed according to the position where the vibration device 102 is disposed.
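The software variant of this frequency correction could look like the following sketch, which boosts the bands where a body part is less sensitive before the signal is sent to the vibrator; the sensitivity table and the band width are made-up placeholders, not data from this disclosure.

```python
import numpy as np

# Hedged sketch of software frequency correction applied to the input signal.
# PART_GAIN_DB is a placeholder sensitivity table, not data from the patent.
PART_GAIN_DB = {"palm": {200: 0.0}, "buttocks": {200: 6.0, 400: 3.0}}

def equalize(signal, fs, part):
    """Boost the bands where the given body part is less sensitive."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    for f0, gain_db in PART_GAIN_DB[part].items():
        band = np.abs(freqs - f0) < 50           # +/-50 Hz around each center
        spectrum[band] *= 10 ** (gain_db / 20)   # linear gain from dB
    return np.fft.irfft(spectrum, n=len(signal))
```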
- A pressure sensor 104 that detects the pressing pressure of the vibration device 102 against the user 902 may also be provided on the surface of the case 400 of the vibration device 102.
- In recent years, events in a game's virtual space have been fed back to the user by vibration: when the game machine itself is held by the user, the game machine vibrates, and when a controller separate from the game machine is held, the controller vibrates.
- Moreover, the game machine can control feedback by vibration even when the operation virtual object does not collide with another virtual object.
- FIG. 23 is a diagram illustrating an example in which feedback by vibration is performed based on a shock wave 82 generated by a shock wave generation source 80 located away from the operation virtual object 40a.
- The shock wave generation source 80 may be, for example, an explosion, and the propagation of the shock wave 82 may be simulated in the virtual space by a physics engine.
- When the operation virtual object 40a is placed at the position designated by the controller 280 and the shock wave 82 generated by an explosion in the virtual space reaches the operation virtual object 40a, feedback by vibration is performed.
- Feedback by vibration may also be performed according to the properties of the medium between the operation virtual object 40a and the shock wave generation source 80.
- For example, the strength of vibration may differ between feedback when the medium is air and feedback when the medium is water.
- The vibration when the medium is water may be weaker than the vibration when the medium is air, because the propagation characteristics of the simulated shock wave 82 differ depending on the medium.
- Alternatively, the vibration data may simply be generated according to the distance between the shock wave generation source 80 and the operation virtual object 40a, without simulating the propagation of the shock wave 82 in the virtual space. In this way, feedback by vibration can also be provided by a physics engine with a simpler configuration.
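A minimal sketch of this simpler, distance-based variant follows; the exponential falloff and the per-medium attenuation coefficients are illustrative assumptions, reflecting only the statement above that vibration through water may be weaker than through air.

```python
import math

# Hedged sketch: vibration amplitude from source-object distance and medium.
MEDIUM_ATTENUATION = {"air": 0.8, "water": 2.0}   # water damps more strongly

def shockwave_amplitude(source_pos, object_pos, base_amplitude, medium="air"):
    distance = math.dist(source_pos, object_pos)  # Python 3.8+
    return base_amplitude * math.exp(-MEDIUM_ATTENUATION[medium] * distance)
```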
- FIGS. 24 to 27 are diagrams illustrating a state where the operation virtual object 40a passes over the semicircular virtual object 40d.
- The semicircular virtual object 40d has a low-friction surface (a smooth feel).
- As shown in FIG. 25, when the operation virtual object 40a reaches one end of the semicircular virtual object 40d, vibration feedback with a short vibration time is performed.
- As shown in FIG. 26, while the operation virtual object 40a is moving on the surface of the semicircular virtual object 40d, no feedback by vibration is performed.
- As shown in FIG. 27, when the operation virtual object 40a descends to the other end of the semicircular virtual object 40d, vibration feedback with a short vibration time is performed again.
- Because a vibration with a short vibration time is presented at the timing when the shape of the surface in contact with the operation virtual object 40a changes (the states in FIGS. 25 and 27), the user can feel the change in the shape of the surface.
- While the operation virtual object 40a is moving on the low-friction surface (the state in FIG. 26), no vibration is presented, so the user can feel a smooth touch.
- In addition, since the operation virtual object 40a moves along the surface of the semicircular virtual object 40d, the user can also perceive the bulging shape of the semicircular virtual object 40d visually.
- FIG. 28 is a diagram illustrating a state in which a part 40e of the semicircular virtual object has a high-friction surface.
- The vibration may be changed according to the speed at which the user operates the operation virtual object 40a; for example, in the example of FIG. 28, the time interval at which the controller 280 vibrates may be shortened as that speed increases.
- When the operation virtual object 40a moves onto the high-friction part 40e, the vibration presented to the user changes, so the user can perceive from this change in vibration that the texture has changed.
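One way to read this speed-dependent timing is as evenly spaced virtual bumps: the faster the tracing, the shorter the interval between pulses. The sketch below assumes a hypothetical vibrate() callback and made-up spacing constants.

```python
import time

# Hedged sketch of speed-dependent pulse timing; the bump spacing, pulse
# length, and the vibrate() callback are assumptions for illustration.
def texture_pulse_interval(speed_m_s, bump_spacing_m=0.01):
    """Faster tracing over evenly spaced virtual bumps -> shorter interval."""
    return bump_spacing_m / max(speed_m_s, 1e-6)   # seconds between pulses

def play_texture(vibrate, speed_m_s, duration_s, pulse_ms=5):
    elapsed = 0.0
    interval = texture_pulse_interval(speed_m_s)
    while elapsed < duration_s:
        vibrate(pulse_ms)          # e.g. a controller's short-vibration call
        time.sleep(interval)
        elapsed += interval
```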
- Consider an application in which the top, bottom, left, and right edges of the screen are set as the side walls of a virtual box, and a ball virtually placed in the box rolls up, down, left, and right on the screen according to the tilt of the terminal detected by the acceleration sensor. (The scene looks like a box with a ball viewed from the open side, and the behavior of the ball rolling inside the box is simulated, including the bounce when it hits a wall.)
- When the ball rolls or collides with a wall, the application generates audio and tactile feedback whose magnitude varies according to the relative speed between the box and the ball. (With output from a single actuator, the low-frequency components are perceived by the user as tactile information, and the middle and high frequencies are perceived as sound.)
- A predetermined waveform pattern is preset according to the type of the ball. In other words, the user obtains visual, auditory, and tactile feedback from the terminal. In addition, because the tilt of the terminal is reflected in the ball's behavior, the user can also draw on somatic sensation (information on how his or her own body is moving).
- The user can intuitively perceive which wall the ball hit using only tactile information, without visual or auditory information, and feels as if the impact vibration originates from the ball's collision point and is transmitted to the hand. This is because the actuators are arranged on the left and right: the output of the left actuator increases when the ball hits the left wall, and the output of the right actuator increases when it hits the right wall. As with the stereo effect of sound, the user thus perceives from the relative magnitude of the tactile information whether the wall the ball hit is on the left or the right.
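The left/right weighting described here amounts to haptic panning. A minimal sketch, assuming the collision point is normalized to x in [-1, 1] with -1 at the left wall:

```python
# Hedged sketch of left/right haptic panning; 'x' is the collision point
# normalized to [-1, 1] (left wall = -1), 'impact' the collision strength.
def pan_outputs(x, impact):
    left = impact * (1.0 - x) / 2.0    # full output when the ball hits left
    right = impact * (1.0 + x) / 2.0   # full output when the ball hits right
    return left, right
```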
- For collisions with the upper and lower walls, the audio and tactile signals output from the actuators have the same magnitude on both sides (if the collision speed is the same).
- Even so, the user intuitively perceives which of the upper and lower walls the ball hit, and feels as if the impact vibration is generated at the collision point on that wall and transmitted to the hand.
- A virtual ball is set in this application, and the material and size of the ball can be changed virtually.
- The difference in the texture of the ball is expressed not only by a different image but also by the sound and tactile signal patterns generated during rolling and collision. These signal patterns are obtained by sampling actual ball collision sounds and vibrations, by synthesizing data, or by processing existing sound effects. The visuals of the box sides may also be varied depending on the ball material for a more natural presentation.
- A metallic character is expressed by letting the sound and tactile information reverberate for a relatively long time after the collision.
- A rubber-like character can be expressed by suppressing the sound relative to the tactile information.
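These material characters reduce to different decay envelopes and tactile/audio mixes. The sketch below synthesizes such a waveform; the time constants, frequencies, and mix ratios are assumptions for illustration, not values from this disclosure.

```python
import numpy as np

# Hedged sketch: material character as decay time plus tactile/audio mix.
# Time constants, frequencies, and mix ratios are illustrative assumptions.
def material_waveform(material, fs=8000, duration=0.5):
    t = np.arange(int(fs * duration)) / fs
    if material == "metal":
        tau, audio_mix = 0.30, 1.0    # long ring-out, audible overtones kept
    else:                             # "rubber"
        tau, audio_mix = 0.05, 0.1    # fast damping, audible band suppressed
    envelope = np.exp(-t / tau)
    tactile = np.sin(2 * np.pi * 200 * t)             # low band felt as vibration
    audio = audio_mix * np.sin(2 * np.pi * 1200 * t)  # mid band heard as sound
    return envelope * (tactile + audio)
```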
- The pseudo "feeling of weight" produced by such visual, audio, and tactile presentation can likewise be generated in a system using the HapticJacket described separately.
- A sensation can also be presented by deliberately not matching the visual information to the movement of the user's controller, in the same manner as in the example above. Specifically, when the virtual ball climbs over a step of an uneven surface, the left-right movement of the ball is delayed by a certain amount (the ball is not displaced left and right even if the actual controller is), which presents a pseudo "catching feeling." After the difference between the two displacements reaches a certain amount, the displacements are matched again so that the error does not accumulate. Generating tactile and audio feedback immediately before and after the catching is also effective for presenting a pseudo "striking feeling."
- Similarly, the difference between the rough surface and the smooth surface shown in FIG. 28 can be expressed by tactile sensation together with visual and auditory information.
- A pseudo "frictional feeling" in response to tracing can be presented by reproducing a waveform in which pink noise is shaped according to the tracing speed.
- With this tactile presentation, the user feels as if tracing a rough surface such as a file.
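A sketch of this pink-noise friction texture is shown below; shaping the noise simply by scaling it with the tracing speed is an illustrative simplification of the "shaped according to the speed" description.

```python
import numpy as np

# Hedged sketch of the pink-noise friction texture; scaling by the tracing
# speed is a simplification of shaping the waveform by the speed.
def pink_noise(n, rng=None):
    rng = rng or np.random.default_rng()
    half = n // 2 + 1
    spectrum = rng.standard_normal(half) + 1j * rng.standard_normal(half)
    spectrum /= np.sqrt(np.arange(1, half + 1))   # 1/f power spectrum
    return np.fft.irfft(spectrum, n=n)

def friction_signal(trace_speed, fs=1000, duration=0.1):
    noise = pink_noise(int(fs * duration))
    return noise * min(abs(trace_speed), 1.0)     # stronger when traced faster
```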
- In this way, the cross-modal effect, an illusion phenomenon that occurs in the user's brain, can be exploited, particularly through presentations related to tactile feedback, to improve the sense of reality in the virtual space.
- In the above, the vibration data is corrected according to the pressure detected by a single pressure sensor 104.
- Alternatively, a plurality of pressure sensors 104 may be provided in the information processing apparatus, and the vibration data may be corrected based on the pressure distribution obtained from the plurality of pressure sensors 104.
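Reducing such a multi-sensor pressure distribution to a single correction gain could, for example, look like the following sketch; averaging the sensors and the reference pressure are assumptions, not the disclosed method.

```python
# Hedged sketch: one correction gain from several pressure sensors.
def distribution_gain(pressures, p_ref=5.0):
    """Average the sensor readings (a weighted sum near the actuator would
    also be plausible) and correct toward a reference contact pressure."""
    effective = sum(pressures) / len(pressures)
    return p_ref / max(effective, 1e-3)   # weak overall contact -> more gain
```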
- In the above, the vibration data is corrected according to the pressing pressure of the wearable terminal 100 or its mounting position.
- However, human vibration sensitivity also varies depending on the contact area between the information processing apparatus and the user 902.
- Therefore, for an information processing apparatus held by the user 902, such as a smartphone or a game controller, the vibration data may be corrected depending on how the user 902 holds the apparatus.
- For this purpose, the information processing apparatus held by the user 902 may include a detection unit that detects the contact area between the user 902 and the apparatus, and the detection unit may further detect which fingers of the user 902 are in contact with the apparatus.
- The vibration intensity may then be corrected based on the pressure values of the parts of the user 902 holding the apparatus and/or information on which parts are in contact (for example, whether the apparatus is gripped strongly with the index and middle fingers or with the ring and little fingers).
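A grip-dependent correction of this kind might be sketched as follows; the per-finger coupling weights and the gain limits are placeholders, not values from this disclosure.

```python
# Hedged sketch of grip-dependent correction; the per-finger coupling
# weights and constants are placeholders, not values from the patent.
FINGER_WEIGHT = {"index": 1.0, "middle": 1.0, "ring": 0.7, "little": 0.5}

def grip_gain(contacts):
    """contacts: {finger_name: pressure}. A loose grip couples the vibration
    to the hand poorly, so the intensity is raised; a firm grip lowers it."""
    coupling = sum(FINGER_WEIGHT.get(f, 0.5) * p for f, p in contacts.items())
    if coupling <= 0:
        return 1.0
    return max(0.5, min(2.0, 10.0 / coupling))
```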
- In the above, the wearing position of the wearable terminal 100 is detected using magnetism, ultrasonic waves, or radio waves.
- Alternatively, the wearing position of the wearable terminal 100 may be input explicitly by the user 902.
- In the above, the jacket-type wearable terminal 100 receives the corrected vibration data from the game machine 200 and generates a vibration signal. However, the jacket-type wearable terminal 100 may instead store the positions of the plurality of vibration devices 102a to 102f and correct the vibration data received from the game machine 200 in accordance with those positions. For example, the jacket-type wearable terminal 100 may receive the vibration data from the game machine 200 and generate corrected vibration data in which the vibration is increased for the vibration devices 102c and 102f disposed on the lower abdomen. That is, the corrected vibration data may be generated based on vibration data received from another information processing apparatus.
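Terminal-side correction keyed to stored device positions could be sketched as below; the device identifiers mirror the reference signs in this document, but the gain values are illustrative assumptions.

```python
# Hedged sketch of terminal-side correction using stored device positions.
# Device ids mirror the reference signs; gains are illustrative assumptions.
DEVICE_GAIN = {"102a": 1.0, "102b": 1.0, "102c": 1.5,   # 102c/102f: lower abdomen
               "102d": 1.0, "102e": 1.0, "102f": 1.5}

def correct_received(vibration_data):
    """vibration_data: {device_id: samples} received from the game machine."""
    return {dev: [s * DEVICE_GAIN.get(dev, 1.0) for s in samples]
            for dev, samples in vibration_data.items()}
```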
- The processing unit 106 and the corrected vibration data generation unit 108 described above may be realized using a general-purpose processor.
- In addition, a computer program for causing the processor to operate as described above may be provided.
- A storage medium storing such a program may also be provided.
- As described above, the information processing apparatus of the present disclosure corrects the vibration data according to the state of the information processing apparatus and the user 902.
- For example, the information processing apparatus of the present disclosure corrects the vibration data according to the pressing pressure with which the vibration device 102 is pressed against the user 902.
- As a result, even when the vibration device 102 vibrates at the same physical intensity, the perceived vibration intensity for the user 902 remains constant.
- The information processing apparatus of the present disclosure also corrects the vibration data according to the mounting position where it is worn on the user 902 or the contact position where it touches the user 902, so that the perceived vibration intensity for the user 902 likewise remains constant.
- (1) An information processing apparatus including: a corrected vibration data generation unit that generates corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and a vibration signal generation unit that generates a vibration signal from the corrected vibration data.
- (2) The information processing apparatus according to (1), wherein the detection unit is a pressure sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the pressure detected by the pressure sensor.
- (3) The information processing apparatus according to (2), wherein the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the pressure detected by the pressure sensor is weak.
- (4) The information processing apparatus according to (1), wherein the detection unit is an acceleration sensor or a gyro sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the acceleration, angular velocity, or angular acceleration detected by the acceleration sensor or the gyro sensor.
- (5) The information processing apparatus according to (4), wherein the acceleration sensor or the gyro sensor detects a secondary vibration that arises when the information processing apparatus is not pressed against the user, and the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the magnitude of the acceleration, angular velocity, or angular acceleration of the secondary vibration detected by the acceleration sensor or the gyro sensor is large.
- (6) The information processing apparatus according to (2) or (3), wherein the pressure sensor is disposed between the vibration device and the contact surface at which the information processing apparatus contacts the user.
- (7) The information processing apparatus according to any one of (1) to (6), wherein the vibration device includes a case whose cross section is elliptical, and the vibrator is disposed on one of the surfaces whose tangent line is perpendicular to the minor axis of the case.
- (8) The information processing apparatus according to (7), wherein the vibration device is disposed such that the surface of the case on which the vibrator is not disposed serves as the contact surface with the user.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the detection unit further detects a mounting position of the vibration device on the user, and the corrected vibration data generation unit generates the corrected vibration data based on information on the mounting position detected by the detection unit.
- (10) The information processing apparatus according to (9), wherein the detection unit detects the mounting position based on a distance or direction from a reference device.
- (11) The information processing apparatus according to (9) or (10), wherein the corrected vibration data generation unit corrects the vibration data so as to correct the frequency at which the vibrator is vibrated according to the detected mounting position.
- (12) A method including: generating, by a processor, corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and generating, by the processor, a vibration signal based on the corrected vibration data.
- (13) A computer program that causes a processor to generate corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device, and to generate a vibration signal based on the corrected vibration data.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Textile Engineering (AREA)
- Heart & Thoracic Surgery (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Health & Medical Sciences (AREA)
- Cardiology (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
0. Background
1. Embodiment in which vibration data is corrected according to pressing pressure
1-1. Wristband-type wearable terminal
1-2. Jacket-type wearable terminal
2. Embodiment in which vibration data is corrected according to mounting position
2-1. Embodiment in which the intensity of vibration data is corrected according to mounting position
2-2. Embodiment in which the frequency of vibration data is corrected according to mounting position
3. Structure of the vibration device
4. Feedback generated by events other than contact with the operation virtual object
5. Feedback by vibration based on the shape and material of virtual objects
6. Application examples of feedback
7. Supplementary notes
8. Conclusion
First, the background of the present disclosure is described. FIG. 1 shows an example of a wearable terminal 10 to be contrasted with the information processing apparatus of the present disclosure. The wearable terminal 10 shown in FIG. 1 is of a wristband type and is worn by wrapping its band around the user's arm or the like.
<1-1. Wristband-type wearable terminal>
The background of the present disclosure has been described above. In the following, a wearable terminal 100, which is an example of the information processing apparatus of this embodiment, is described. FIG. 3 is a diagram showing an example of the appearance of the wristband-type wearable terminal 100 of this embodiment.
The configuration of the wristband-type wearable terminal 100 has been described above. In the following, the jacket-type wearable terminal 100 is described. FIG. 7 is a diagram showing an example of the appearance of the jacket-type wearable terminal 100. The jacket-type wearable terminal 100 may be used, for example, to feed back to the user vibrations generated in game software or the like.
<2-1. Embodiment in which the intensity of vibration data is corrected according to mounting position>
An embodiment in which the vibration data is corrected according to the pressing pressure of the vibration device 102 against the user has been described above. In the following, an embodiment in which the vibration data is corrected according to the mounting position of the vibration device 102 is described.
An embodiment in which the intensity of vibration data is corrected according to the mounting position has been described above. In the following, an embodiment in which the frequency of vibration data is corrected according to the mounting position is described.
Embodiments in which the vibration data is corrected according to the mounting position have been described above. In the following, a specific configuration of the vibration device 102 of the present disclosure is described. FIG. 19 is a diagram showing the appearance of the vibration device 102, and FIG. 20 is a diagram showing a cross section of the vibration device 102. As shown in FIGS. 19 and 20, the vibration device 102 has a case 400 whose cross section is elliptical. Because the cross section of the case 400 is an ellipse, the case 400 contacts the body of the user 902 with a similar pressure even when the angle at which the vibration device 102 touches the body changes.
In recent years, events in a virtual space for games have been fed back to the user by vibration. When the game machine itself is held by the user, the game machine vibrates; when a controller separate from the game machine is held by the user, the controller may vibrate.
An example in which feedback by vibration is performed based on the shock wave 82 has been described above. In the following, feedback by vibration based on the shape and material of the virtual object 40 is described in more detail.
In the following, a specific example of vibration feedback is described for a portable terminal held by the user, such as the one disclosed in JP 2015-231098 A, which has a vibration device (actuator) in each of the grip portions held by the user's left and right hands.
Preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and these naturally belong to the technical scope of the present disclosure.
As described above, the information processing apparatus of the present disclosure corrects vibration data according to the state of the information processing apparatus and the user 902. For example, the information processing apparatus of the present disclosure corrects vibration data according to the pressing pressure with which the vibration device 102 is pressed against the user 902. As a result, even when the vibration device 102 vibrates at the same physical intensity, the perceived vibration intensity for the user 902 remains constant.
(1)
An information processing apparatus including: a corrected vibration data generation unit that generates corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and
a vibration signal generation unit that generates a vibration signal from the corrected vibration data.
(2)
The information processing apparatus according to (1), wherein the detection unit is a pressure sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the pressure detected by the pressure sensor.
(3)
The information processing apparatus according to (2), wherein the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the pressure detected by the pressure sensor is weak.
(4)
The information processing apparatus according to (1), wherein the detection unit is an acceleration sensor or a gyro sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the acceleration, angular velocity, or angular acceleration detected by the acceleration sensor or the gyro sensor.
(5)
The information processing apparatus according to (4), wherein the acceleration sensor or the gyro sensor detects a secondary vibration that arises when the information processing apparatus is not pressed against the user, and the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the magnitude of the acceleration, angular velocity, or angular acceleration of the secondary vibration detected by the acceleration sensor or the gyro sensor is large.
(6)
The information processing apparatus according to (2) or (3), wherein the pressure sensor is disposed between the vibration device and the contact surface at which the information processing apparatus contacts the user.
(7)
The information processing apparatus according to any one of (1) to (6), wherein the vibration device includes a case whose cross section is elliptical, and the vibrator is disposed on one of the surfaces whose tangent line is perpendicular to the minor axis of the case.
(8)
The information processing apparatus according to (7), wherein the vibration device is disposed such that the surface of the case on which the vibrator is not disposed serves as the contact surface with the user.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the detection unit further detects a mounting position of the vibration device on the user, and the corrected vibration data generation unit generates the corrected vibration data based on information on the mounting position detected by the detection unit.
(10)
The information processing apparatus according to (9), wherein the detection unit detects the mounting position based on a distance or direction from a reference device.
(11)
The information processing apparatus according to (9) or (10), wherein the corrected vibration data generation unit corrects the vibration data so as to correct the frequency at which the vibrator is vibrated according to the detected mounting position.
(12)
A method including: generating, by a processor, corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and generating, by the processor, a vibration signal based on the corrected vibration data.
(13)
A computer program that causes a processor to generate corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device, and to generate a vibration signal based on the corrected vibration data.
102 vibration device
104 pressure sensor
106 processing unit
108 corrected vibration data generation unit
110 vibration signal generation unit
112 communication unit
114 position detection sensor
116 storage unit
118 vibrator
200 game machine
202 communication unit
204 processing unit
206 corrected vibration data generation unit
300 virtual object
302 listener
400 case
402 opening
500 display device
600 head-mounted wearable device
700 vibration stand
702 vibration plate
704 pedestal
706 elastic member
800 reference device
Claims (13)
- An information processing apparatus comprising: a corrected vibration data generation unit that generates corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and a vibration signal generation unit that generates a vibration signal from the corrected vibration data.
- The information processing apparatus according to claim 1, wherein the detection unit is a pressure sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the pressure detected by the pressure sensor.
- The information processing apparatus according to claim 2, wherein the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the pressure detected by the pressure sensor is weak.
- The information processing apparatus according to claim 1, wherein the detection unit is an acceleration sensor or a gyro sensor, and the corrected vibration data generation unit generates the corrected vibration data based on information on the acceleration, angular velocity, or angular acceleration detected by the acceleration sensor or the gyro sensor.
- The information processing apparatus according to claim 4, wherein the acceleration sensor or the gyro sensor detects a secondary vibration that arises when the information processing apparatus is not pressed against the user, and the corrected vibration data generation unit generates the corrected vibration data so as to increase the intensity of the vibration data when the magnitude of the acceleration, angular velocity, or angular acceleration of the secondary vibration detected by the acceleration sensor or the gyro sensor is large.
- The information processing apparatus according to claim 2, wherein the pressure sensor is disposed between the vibration device and the contact surface at which the information processing apparatus contacts the user.
- The information processing apparatus according to claim 1, wherein the vibration device includes a case whose cross section is elliptical, and the vibrator is disposed on one of the surfaces whose tangent line is perpendicular to the minor axis of the case.
- The information processing apparatus according to claim 7, wherein the vibration device is disposed such that the surface of the case on which the vibrator is not disposed serves as the contact surface with the user.
- The information processing apparatus according to claim 1, wherein the detection unit further detects a mounting position of the vibration device on the user, and the corrected vibration data generation unit generates the corrected vibration data based on information on the mounting position detected by the detection unit.
- The information processing apparatus according to claim 9, wherein the detection unit detects the mounting position based on a distance or direction from a reference device.
- The information processing apparatus according to claim 9, wherein the corrected vibration data generation unit corrects the vibration data so as to correct the frequency at which the vibrator is vibrated according to the detected mounting position.
- A method comprising: generating, by a processor, corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device; and generating, by the processor, a vibration signal based on the corrected vibration data.
- A computer program that causes a processor to generate corrected vibration data in which the intensity of vibration data for a vibration device including a vibrator is corrected, based on information from a detection unit that detects a contact state of the vibration device, and to generate a vibration signal based on the corrected vibration data.
Priority Applications (7)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/743,874 US10353470B2 (en) | 2015-09-08 | 2016-09-08 | Information processing device, method, and computer |
| CN201680050699.XA CN107949818B (zh) | 2015-09-08 | 2016-09-08 | 信息处理设备、方法和计算机程序 |
| JP2017538528A JP6822408B2 (ja) | 2015-09-08 | 2016-09-08 | 情報処理装置、方法およびコンピュータプログラム |
| EP20166761.5A EP3690613A1 (en) | 2015-09-08 | 2016-09-08 | Information processing device, method, and computer |
| EP16844471.9A EP3349097B1 (en) | 2015-09-08 | 2016-09-08 | Information processing apparatus and method, and computer program |
| US16/419,463 US10942573B2 (en) | 2015-09-08 | 2019-05-22 | Information processing device, method, and computer |
| US17/193,387 US11314333B2 (en) | 2015-09-08 | 2021-03-05 | Information processing device, method, and computer |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201562215572P | 2015-09-08 | 2015-09-08 | |
| US62/215,572 | 2015-09-08 | ||
| JPPCT/JP2016/075581 | 2016-08-31 | ||
| PCT/JP2016/075581 WO2017043400A1 (ja) | 2015-09-08 | 2016-08-31 | 情報処理装置、方法およびコンピュータプログラム |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/075581 Continuation-In-Part WO2017043400A1 (ja) | 2015-09-08 | 2016-08-31 | 情報処理装置、方法およびコンピュータプログラム |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/743,874 A-371-Of-International US10353470B2 (en) | 2015-09-08 | 2016-09-08 | Information processing device, method, and computer |
| US16/419,463 Continuation US10942573B2 (en) | 2015-09-08 | 2019-05-22 | Information processing device, method, and computer |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017043610A1 true WO2017043610A1 (ja) | 2017-03-16 |
Family
ID=58239593
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/075581 Ceased WO2017043400A1 (ja) | 2015-09-08 | 2016-08-31 | 情報処理装置、方法およびコンピュータプログラム |
| PCT/JP2016/076531 Ceased WO2017043610A1 (ja) | 2015-09-08 | 2016-09-08 | 情報処理装置、方法およびコンピュータプログラム |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/075581 Ceased WO2017043400A1 (ja) | 2015-09-08 | 2016-08-31 | 情報処理装置、方法およびコンピュータプログラム |
Country Status (6)
| Country | Link |
|---|---|
| US (5) | US10331214B2 (ja) |
| EP (3) | EP3349096B1 (ja) |
| JP (2) | JP6834962B2 (ja) |
| KR (1) | KR102639118B1 (ja) |
| CN (2) | CN107924236B (ja) |
| WO (2) | WO2017043400A1 (ja) |
Cited By (23)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2018110432A1 (ja) * | 2016-12-15 | 2018-06-21 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理システム、コントローラデバイス、コントローラデバイスの制御方法、及びプログラム |
| WO2018193650A1 (ja) * | 2017-04-18 | 2018-10-25 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| WO2019124068A1 (ja) * | 2017-12-19 | 2019-06-27 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
| JPWO2018193513A1 (ja) * | 2017-04-18 | 2019-08-08 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| WO2019244716A1 (ja) * | 2018-06-19 | 2019-12-26 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| WO2020026443A1 (ja) * | 2018-08-03 | 2020-02-06 | 株式会社ソニー・インタラクティブエンタテインメント | 触覚表現用振動制御システム、触覚表現用振動発生装置、触覚表現用振動制御装置、および触覚表現用振動制御方法 |
| WO2020153116A1 (ja) * | 2019-01-21 | 2020-07-30 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| JP2020526065A (ja) * | 2017-06-13 | 2020-08-27 | ビーハプティクス インコーポレイテッド | ヘッドマウントディスプレイ |
| JP2021041325A (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| JP2021041326A (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| US10963054B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Information processing system, vibration control method and program |
| US10963055B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Vibration device and control system for presenting corrected vibration data |
| US11013990B2 (en) | 2017-04-19 | 2021-05-25 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| WO2021145025A1 (ja) * | 2020-01-16 | 2021-07-22 | ソニーグループ株式会社 | 情報処理装置、情報処理端末及びプログラム |
| US11195293B2 (en) | 2017-07-20 | 2021-12-07 | Sony Interactive Entertainment Inc. | Information processing device and positional information obtaining method |
| US11198059B2 (en) | 2017-08-29 | 2021-12-14 | Sony Interactive Entertainment Inc. | Vibration control apparatus, vibration control method, and program |
| WO2022123667A1 (ja) * | 2020-12-09 | 2022-06-16 | 日本電信電話株式会社 | 触覚システム |
| US11458389B2 (en) | 2017-04-26 | 2022-10-04 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| WO2022269799A1 (ja) * | 2021-06-23 | 2022-12-29 | 日本電信電話株式会社 | 擬似触覚提示装置、擬似触覚提示方法、およびプログラム |
| JP2023007801A (ja) * | 2021-07-02 | 2023-01-19 | 日本電信電話株式会社 | 振動信号生成装置、振動提示装置、それらの方法、およびプログラム |
| US11738261B2 (en) | 2017-08-24 | 2023-08-29 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| US11779836B2 (en) | 2017-08-24 | 2023-10-10 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| KR20240045828A (ko) * | 2022-09-30 | 2024-04-08 | 광주과학기술원 | 포스피드백 장치 및 광대역 공진 진동자를 이용한 힘 및 복합 진동 렌더링 시스템과 이를 이용한 힘 및 복합 진동 제공 방법 |
Families Citing this family (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6834962B2 (ja) * | 2015-09-08 | 2021-02-24 | ソニー株式会社 | 情報処理装置、方法およびコンピュータプログラム |
| CN106371573B (zh) * | 2015-12-04 | 2020-06-02 | 北京智谷睿拓技术服务有限公司 | 触觉反馈的方法、装置和虚拟现实交互系统 |
| TWI679079B (zh) * | 2016-12-22 | 2019-12-11 | 日商西鐵城時計股份有限公司 | 工具機及其控制裝置 |
| US10592199B2 (en) * | 2017-01-24 | 2020-03-17 | International Business Machines Corporation | Perspective-based dynamic audio volume adjustment |
| JP6936976B2 (ja) * | 2017-05-24 | 2021-09-22 | 株式会社村田製作所 | 刺激伝達装置 |
| JP6837921B2 (ja) | 2017-06-02 | 2021-03-03 | 任天堂株式会社 | ゲームプログラム、情報処理装置、情報処理システム、および、情報処理方法 |
| JP6613267B2 (ja) * | 2017-06-02 | 2019-11-27 | 任天堂株式会社 | 情報処理システム、情報処理プログラム、情報処理装置、および、情報処理方法 |
| JP6653293B2 (ja) | 2017-06-05 | 2020-02-26 | 任天堂株式会社 | 情報処理システム、情報処理プログラム、情報処理装置、および、情報処理方法 |
| CN109690450B (zh) | 2017-11-17 | 2020-09-29 | 腾讯科技(深圳)有限公司 | Vr场景下的角色模拟方法和终端设备 |
| JP2021073749A (ja) * | 2018-03-07 | 2021-05-13 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| US10569167B1 (en) * | 2018-08-22 | 2020-02-25 | Abdul Hotaki | System and methods for providing wireless feedback to a video game |
| JP2022503796A (ja) * | 2018-10-01 | 2022-01-12 | レイア、インコーポレイテッド | ホログラフィックリアリティシステム、マルチビューディスプレイ、および方法 |
| US11813518B2 (en) | 2018-10-19 | 2023-11-14 | Sony Interactive Entertainment Inc. | Information processing system, controller apparatus, information processing apparatus, and program |
| US11429187B2 (en) * | 2018-10-19 | 2022-08-30 | Sony Interactive Entertainment Inc. | Controller apparatus, control method thereof, and program |
| JP2020080122A (ja) * | 2018-11-14 | 2020-05-28 | ソニー株式会社 | 情報処理装置、情報処理方法、および記憶媒体 |
| KR20200070607A (ko) * | 2018-12-10 | 2020-06-18 | (주)리얼감 | 강도를 이용한 포스 피드백 방법 및 시스템, 기계로 읽을 수 있는 저장 매체 |
| KR20200073951A (ko) * | 2018-12-13 | 2020-06-24 | (주)리얼감 | 포스 피드백 방법 및 시스템, 기계로 읽을 수 있는 저장 매체 |
| WO2020132784A1 (en) * | 2018-12-24 | 2020-07-02 | Intel Corporation | Methods and apparatus to detect collision of virtual camera with objects in three-dimensional volumetric model |
| KR102233004B1 (ko) * | 2018-12-28 | 2021-03-26 | 중앙대학교 산학협력단 | 가상현실 노래방 인터랙션 방법 및 장치 |
| KR102218091B1 (ko) * | 2019-01-22 | 2021-02-19 | (주)스코넥엔터테인먼트 | 가상 현실 제어 시스템 |
| KR102280916B1 (ko) * | 2019-05-24 | 2021-07-26 | 한양대학교 산학협력단 | 임의 위치에서 발생하는 진동 피드백 구현 장치 및 방법 |
| US11397508B1 (en) * | 2019-06-11 | 2022-07-26 | Hyper Reality Partners, Llc | Virtual experience pillars |
| JP2021067877A (ja) * | 2019-10-25 | 2021-04-30 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、ヘッドマウントディスプレイ、および画像表示方法 |
| JP7170680B2 (ja) * | 2020-02-13 | 2022-11-14 | 任天堂株式会社 | ゲームシステム、ゲームプログラム、情報処理装置、および、ゲーム処理方法 |
| WO2023108131A1 (en) * | 2021-12-10 | 2023-06-15 | Shaw Industries Group, Inc. | Visceral surface covering simulator and method of use |
| US20250110558A1 (en) * | 2022-02-08 | 2025-04-03 | Sony Group Corporation | Information processing apparatus, information processing method, and program |
| JP2025068140A (ja) * | 2022-03-07 | 2025-04-25 | アルプスアルパイン株式会社 | 触覚提示装置、シートシステム、及び触覚提示方法 |
| KR102594789B1 (ko) * | 2022-06-08 | 2023-10-27 | 한국전자기술연구원 | 핸드트래킹 기반 슈도햅틱 피드백을 활용한 실감형 인터랙션 방법 |
| JP7632424B2 (ja) * | 2022-09-14 | 2025-02-19 | カシオ計算機株式会社 | 電子機器、電子機器の制御方法及びプログラム |
| WO2024090303A1 (ja) * | 2022-10-24 | 2024-05-02 | ソニーグループ株式会社 | 情報処理装置及び情報処理方法 |
Family Cites Families (51)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6084587A (en) * | 1996-08-02 | 2000-07-04 | Sensable Technologies, Inc. | Method and apparatus for generating and interfacing with a haptic virtual reality environment |
| US7084869B2 (en) * | 2000-03-31 | 2006-08-01 | Massachusetts Institute Of Technology | Methods and apparatus for detecting and correcting penetration between objects |
| US20070203435A1 (en) * | 2004-03-26 | 2007-08-30 | Peter Novak | System And Method For Gait Synchronized Vibratory Stimulation Of The Feet |
| JP4484570B2 (ja) * | 2004-04-14 | 2010-06-16 | 富士ゼロックス株式会社 | 音響情報処理装置、音響情報提供方法 |
| JP4926799B2 (ja) * | 2006-10-23 | 2012-05-09 | キヤノン株式会社 | 情報処理装置、情報処理方法 |
| KR101310969B1 (ko) | 2006-12-01 | 2013-09-23 | 삼성전자주식회사 | 디바이스의 환경을 분석하는 방법 및 이를 이용한 디바이스 |
| CN101563884B (zh) * | 2006-12-19 | 2013-03-27 | 日本电气株式会社 | 共同数据生成方法和用于该方法的设备 |
| CN101496954B (zh) * | 2008-01-28 | 2012-11-21 | 联想(北京)有限公司 | 一种游戏控制器及其游戏处理方法 |
| CN101394439B (zh) * | 2008-11-10 | 2011-11-02 | 华为终端有限公司 | 自动控制终端振动强度的方法、终端 |
| CN102473035B (zh) * | 2009-07-22 | 2015-01-21 | 意美森公司 | 具有横跨平台的触觉反馈的交互式触摸屏游戏象征 |
| JP5197521B2 (ja) * | 2009-07-29 | 2013-05-15 | 京セラ株式会社 | 入力装置 |
| TWI473490B (zh) | 2009-09-03 | 2015-02-11 | Htc Corp | 調整事件提示程度的方法與其行動電子裝置及電腦程式產品 |
| JP5278259B2 (ja) | 2009-09-07 | 2013-09-04 | ソニー株式会社 | 入力装置、入力方法及びプログラム |
| US8487759B2 (en) * | 2009-09-30 | 2013-07-16 | Apple Inc. | Self adapting haptic device |
| JP5668076B2 (ja) * | 2009-11-17 | 2015-02-12 | イマージョン コーポレーションImmersion Corporation | 電子デバイスにおける触覚帯域を増加するためのシステム及び方法 |
| US8540571B2 (en) * | 2010-03-31 | 2013-09-24 | Immersion Corporation | System and method for providing haptic stimulus based on position |
| JP5195859B2 (ja) * | 2010-09-27 | 2013-05-15 | トヨタ自動車株式会社 | 睡眠装置 |
| US8352643B2 (en) * | 2010-09-30 | 2013-01-08 | Immersion Corporation | Haptically enhanced interactivity with interactive content |
| US20120190460A1 (en) * | 2011-01-21 | 2012-07-26 | Cecil Sessions | Vibrating Gaming Vest |
| JPWO2013018267A1 (ja) | 2011-07-29 | 2015-03-05 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | 提示制御装置、及び提示制御方法 |
| KR101328054B1 (ko) | 2011-08-09 | 2013-11-08 | 엘지전자 주식회사 | 실감 진동을 발생시키는 영상표시장치 및 실감 진동 구현방법 |
| US9101812B2 (en) * | 2011-10-25 | 2015-08-11 | Aquimo, Llc | Method and system to analyze sports motions using motion sensors of a mobile device |
| CN103246379A (zh) * | 2012-02-10 | 2013-08-14 | 联想移动通信科技有限公司 | 触摸反馈方法、装置及无线终端 |
| US8711118B2 (en) * | 2012-02-15 | 2014-04-29 | Immersion Corporation | Interactivity model for shared feedback on mobile devices |
| JP2013187148A (ja) * | 2012-03-09 | 2013-09-19 | Gs Yuasa Corp | 制御弁式鉛蓄電池 |
| EP2856289A4 (en) * | 2012-05-25 | 2016-01-06 | Immerz Inc | HAPTIC INTERFACE FOR PORTABLE ELECTRONIC DEVICE |
| US9874964B2 (en) * | 2012-06-04 | 2018-01-23 | Sony Interactive Entertainment Inc. | Flat joystick controller |
| WO2014041032A1 (en) * | 2012-09-11 | 2014-03-20 | L.I.F.E. Corporation S.A. | Wearable communication platform |
| US20140198130A1 (en) * | 2013-01-15 | 2014-07-17 | Immersion Corporation | Augmented reality user interface with haptic feedback |
| KR20140112648A (ko) | 2013-03-13 | 2014-09-24 | 삼성전기주식회사 | 수평 진동 발생 장치 |
| US9606721B2 (en) * | 2013-07-22 | 2017-03-28 | Lg Electronics Inc. | Mobile terminal and control method thereof |
| US9293015B2 (en) * | 2013-09-09 | 2016-03-22 | Immersion Corporation | Electrical stimulation haptic feedback interface |
| US20150105129A1 (en) * | 2013-10-15 | 2015-04-16 | Kevin Chapman | Video Game Body Suit |
| US9164587B2 (en) * | 2013-11-14 | 2015-10-20 | Immersion Corporation | Haptic spatialization system |
| US9619029B2 (en) * | 2013-11-14 | 2017-04-11 | Immersion Corporation | Haptic trigger control system |
| US9671826B2 (en) * | 2013-11-27 | 2017-06-06 | Immersion Corporation | Method and apparatus of body-mediated digital content transfer and haptic feedback |
| CN103995584B (zh) * | 2014-04-29 | 2017-11-17 | 深圳超多维光电子有限公司 | 立体交互方法及其显示装置、操作棒和系统 |
| US9679546B2 (en) * | 2014-05-16 | 2017-06-13 | Not Impossible LLC | Sound vest |
| US10379614B2 (en) * | 2014-05-19 | 2019-08-13 | Immersion Corporation | Non-collocated haptic cues in immersive environments |
| JP2015231098A (ja) | 2014-06-04 | 2015-12-21 | ソニー株式会社 | 振動装置、および振動方法 |
| GB201410648D0 (en) * | 2014-06-14 | 2014-07-30 | Pape Lise S | Walking aid providing tactile and visual cues to trigger and improve mobility |
| CN204484695U (zh) * | 2014-06-30 | 2015-07-22 | 岳川凯 | 一种可穿戴式导盲系统 |
| JP2016025620A (ja) | 2014-07-24 | 2016-02-08 | ソニー株式会社 | 画像処理システム、クライアントシステム、画像処理方法、および記憶媒体 |
| US9690381B2 (en) | 2014-08-21 | 2017-06-27 | Immersion Corporation | Systems and methods for shape input and output for a haptically-enabled deformable surface |
| KR102254705B1 (ko) * | 2014-09-19 | 2021-05-24 | 삼성전자주식회사 | 단말 장치 및 그 제어방법 |
| CN104545936A (zh) * | 2014-12-31 | 2015-04-29 | 戴晓伟 | 腰部姿态检测方法及检测结果的触觉反馈方法 |
| JP6834962B2 (ja) * | 2015-09-08 | 2021-02-24 | ソニー株式会社 | 情報処理装置、方法およびコンピュータプログラム |
| JP6922908B2 (ja) * | 2016-07-07 | 2021-08-18 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| KR20180019270A (ko) * | 2016-08-16 | 2018-02-26 | (주)알비케이이엠디 | 증강현실 또는 가상현실 구현 기기와 연동되는 햅틱 디바이스 |
| KR102442179B1 (ko) * | 2017-11-17 | 2022-09-08 | 삼성전자주식회사 | 웨어러블 장치를 통하여 햅틱 피드백을 제공하는 전자 장치 및 그 방법 |
| US10569167B1 (en) * | 2018-08-22 | 2020-02-25 | Abdul Hotaki | System and methods for providing wireless feedback to a video game |
2016
- 2016-08-31 JP JP2017539139A patent/JP6834962B2/ja active Active
- 2016-08-31 WO PCT/JP2016/075581 patent/WO2017043400A1/ja not_active Ceased
- 2016-08-31 EP EP16844261.4A patent/EP3349096B1/en active Active
- 2016-08-31 US US15/742,651 patent/US10331214B2/en active Active
- 2016-08-31 CN CN201680050120.XA patent/CN107924236B/zh active Active
- 2016-08-31 KR KR1020187001671A patent/KR102639118B1/ko active Active
- 2016-09-08 WO PCT/JP2016/076531 patent/WO2017043610A1/ja not_active Ceased
- 2016-09-08 JP JP2017538528A patent/JP6822408B2/ja active Active
- 2016-09-08 US US15/743,874 patent/US10353470B2/en active Active
- 2016-09-08 CN CN201680050699.XA patent/CN107949818B/zh active Active
- 2016-09-08 EP EP20166761.5A patent/EP3690613A1/en active Pending
- 2016-09-08 EP EP16844471.9A patent/EP3349097B1/en active Active
2019
- 2019-05-15 US US16/413,140 patent/US10838500B2/en active Active
- 2019-05-22 US US16/419,463 patent/US10942573B2/en active Active
2021
- 2021-03-05 US US17/193,387 patent/US11314333B2/en active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH1185400A (ja) * | 1997-09-11 | 1999-03-30 | Sony Corp | 表示装置 |
| JP2012187148A (ja) * | 2011-03-08 | 2012-10-04 | Sony Computer Entertainment Inc | 情報処理装置および情報処理方法 |
| JP2013054645A (ja) * | 2011-09-06 | 2013-03-21 | Denso Wave Inc | 光学的情報読取装置 |
| JP2013150201A (ja) * | 2012-01-20 | 2013-08-01 | Nec Access Technica Ltd | 端末装置、報知制御方法、プログラム |
| JP2014179088A (ja) * | 2013-03-14 | 2014-09-25 | Immersion Corp | 接触子に基づいたハプティックフィードバック生成 |
Cited By (48)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10963054B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Information processing system, vibration control method and program |
| WO2018110432A1 (ja) * | 2016-12-15 | 2018-06-21 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理システム、コントローラデバイス、コントローラデバイスの制御方法、及びプログラム |
| US10963055B2 (en) | 2016-12-15 | 2021-03-30 | Sony Interactive Entertainment Inc. | Vibration device and control system for presenting corrected vibration data |
| JPWO2018110432A1 (ja) * | 2016-12-15 | 2019-04-11 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理システム、コントローラデバイス、コントローラデバイスの制御方法、及びプログラム |
| US10969867B2 (en) | 2016-12-15 | 2021-04-06 | Sony Interactive Entertainment Inc. | Information processing system, controller device, controller device control method and program |
| JPWO2018193650A1 (ja) * | 2017-04-18 | 2019-11-07 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| JPWO2018193513A1 (ja) * | 2017-04-18 | 2019-08-08 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| US10981053B2 (en) | 2017-04-18 | 2021-04-20 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| US11145172B2 (en) | 2017-04-18 | 2021-10-12 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| WO2018193514A1 (ja) * | 2017-04-18 | 2018-10-25 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| WO2018193650A1 (ja) * | 2017-04-18 | 2018-10-25 | 株式会社ソニー・インタラクティブエンタテインメント | 振動制御装置 |
| US11013990B2 (en) | 2017-04-19 | 2021-05-25 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| US11458389B2 (en) | 2017-04-26 | 2022-10-04 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| JP2020526065A (ja) * | 2017-06-13 | 2020-08-27 | ビーハプティクス インコーポレイテッド | ヘッドマウントディスプレイ |
| US11131856B2 (en) | 2017-06-13 | 2021-09-28 | Bhaptics Inc. | Head-mounted display |
| US11195293B2 (en) | 2017-07-20 | 2021-12-07 | Sony Interactive Entertainment Inc. | Information processing device and positional information obtaining method |
| US11779836B2 (en) | 2017-08-24 | 2023-10-10 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| US11738261B2 (en) | 2017-08-24 | 2023-08-29 | Sony Interactive Entertainment Inc. | Vibration control apparatus |
| US11198059B2 (en) | 2017-08-29 | 2021-12-14 | Sony Interactive Entertainment Inc. | Vibration control apparatus, vibration control method, and program |
| JPWO2019124068A1 (ja) * | 2017-12-19 | 2020-12-24 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
| JP7192793B2 (ja) | 2017-12-19 | 2022-12-20 | ソニーグループ株式会社 | 情報処理装置および方法、並びにプログラム |
| WO2019124068A1 (ja) * | 2017-12-19 | 2019-06-27 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
| WO2019244716A1 (ja) * | 2018-06-19 | 2019-12-26 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| US11709550B2 (en) | 2018-06-19 | 2023-07-25 | Sony Corporation | Information processing apparatus, method for processing information, and program |
| JPWO2019244716A1 (ja) * | 2018-06-19 | 2021-06-24 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
| WO2020026443A1 (ja) * | 2018-08-03 | 2020-02-06 | 株式会社ソニー・インタラクティブエンタテインメント | 触覚表現用振動制御システム、触覚表現用振動発生装置、触覚表現用振動制御装置、および触覚表現用振動制御方法 |
| JP7456388B2 (ja) | 2019-01-21 | 2024-03-27 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| WO2020153116A1 (ja) * | 2019-01-21 | 2020-07-30 | ソニー株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| JPWO2020153116A1 (ja) * | 2019-01-21 | 2021-12-02 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、及びプログラム |
| JP2021041325A (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| WO2021049078A1 (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| WO2021049080A1 (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| JP2021041326A (ja) * | 2019-09-10 | 2021-03-18 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| JP7360282B2 (ja) | 2019-09-10 | 2023-10-12 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| JP7360281B2 (ja) | 2019-09-10 | 2023-10-12 | 株式会社東海理化電機製作所 | 制御装置、制御方法、及びプログラム |
| JPWO2021145455A1 (ja) * | 2020-01-16 | 2021-07-22 | ||
| WO2021145455A1 (ja) * | 2020-01-16 | 2021-07-22 | ソニーグループ株式会社 | 情報処理装置、情報処理端末及びプログラム |
| JP7586099B2 (ja) | 2020-01-16 | 2024-11-19 | ソニーグループ株式会社 | 情報処理装置 |
| US11983324B2 (en) | 2020-01-16 | 2024-05-14 | Sony Group Corporation | Information processing device, information processing terminal, and program |
| WO2021145025A1 (ja) * | 2020-01-16 | 2021-07-22 | ソニーグループ株式会社 | 情報処理装置、情報処理端末及びプログラム |
| WO2022123667A1 (ja) * | 2020-12-09 | 2022-06-16 | 日本電信電話株式会社 | 触覚システム |
| JPWO2022269799A1 (ja) * | 2021-06-23 | 2022-12-29 | ||
| JP7548435B2 (ja) | 2021-06-23 | 2024-09-10 | 日本電信電話株式会社 | 擬似触覚提示装置、擬似触覚提示方法、およびプログラム |
| WO2022269799A1 (ja) * | 2021-06-23 | 2022-12-29 | 日本電信電話株式会社 | 擬似触覚提示装置、擬似触覚提示方法、およびプログラム |
| JP2023007801A (ja) * | 2021-07-02 | 2023-01-19 | 日本電信電話株式会社 | 振動信号生成装置、振動提示装置、それらの方法、およびプログラム |
| JP7642942B2 (ja) | 2021-07-02 | 2025-03-11 | 日本電信電話株式会社 | 振動信号生成装置、振動提示装置、それらの方法、およびプログラム |
| KR20240045828A (ko) * | 2022-09-30 | 2024-04-08 | 광주과학기술원 | 포스피드백 장치 및 광대역 공진 진동자를 이용한 힘 및 복합 진동 렌더링 시스템과 이를 이용한 힘 및 복합 진동 제공 방법 |
| KR102885148B1 (ko) | 2022-09-30 | 2025-11-12 | 광주과학기술원 | 포스피드백 장치 및 광대역 공진 진동자를 이용한 힘 및 복합 진동 렌더링 시스템과 이를 이용한 힘 및 복합 진동 제공 방법 |
Also Published As
| Publication number | Publication date |
|---|---|
| US20190272038A1 (en) | 2019-09-05 |
| JP6822408B2 (ja) | 2021-01-27 |
| CN107949818B (zh) | 2021-07-27 |
| EP3349096B1 (en) | 2023-08-23 |
| US11314333B2 (en) | 2022-04-26 |
| US10838500B2 (en) | 2020-11-17 |
| EP3349097A4 (en) | 2019-04-10 |
| CN107924236B (zh) | 2021-09-21 |
| EP3690613A1 (en) | 2020-08-05 |
| EP3349097A1 (en) | 2018-07-18 |
| US20210191517A1 (en) | 2021-06-24 |
| EP3349097B1 (en) | 2020-04-01 |
| KR20180051482A (ko) | 2018-05-16 |
| CN107949818A (zh) | 2018-04-20 |
| CN107924236A (zh) | 2018-04-17 |
| EP3349096A1 (en) | 2018-07-18 |
| JP6834962B2 (ja) | 2021-02-24 |
| WO2017043400A1 (ja) | 2017-03-16 |
| US10331214B2 (en) | 2019-06-25 |
| US20180203510A1 (en) | 2018-07-19 |
| US10942573B2 (en) | 2021-03-09 |
| JPWO2017043610A1 (ja) | 2018-06-28 |
| KR102639118B1 (ko) | 2024-02-22 |
| EP3349096A4 (en) | 2019-04-10 |
| US10353470B2 (en) | 2019-07-16 |
| US20200026355A1 (en) | 2020-01-23 |
| US20180203509A1 (en) | 2018-07-19 |
| JPWO2017043400A1 (ja) | 2018-06-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6822408B2 (ja) | 情報処理装置、方法およびコンピュータプログラム | |
| US10936072B2 (en) | Haptic information presentation system and method | |
| CN108170262B (zh) | 触觉环绕功能 | |
| JP4926799B2 (ja) | 情報処理装置、情報処理方法 | |
| US10444844B2 (en) | Systems and methods for providing haptic feedback via a case | |
| US20080100588A1 (en) | Tactile-feedback device and method | |
| CN110096131A (zh) | 触感交互方法、装置、以及触感可穿戴设备 | |
| JP2020013549A (ja) | 動的システム識別に基づく適応触覚効果レンダリング | |
| JP2017063916A (ja) | 提示する力覚を決定する装置、方法、およびプログラム | |
| EP3333674A1 (en) | Systems and methods for compliance simulation with haptics | |
| EP3489804A1 (en) | Haptic accessory apparatus | |
| EP3470960A1 (en) | Haptic effects with multiple peripheral devices | |
| JP2018206058A (ja) | 情報処理システム、情報処理プログラム、情報処理装置、および、情報処理方法 | |
| JP2022002129A (ja) | 触力覚情報提示システム | |
| JP2015153387A (ja) | 仮想現実体感装置、仮想現実体感方法、及びプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16844471; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2017538528; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 15743874; Country of ref document: US |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2016844471; Country of ref document: EP |