
WO2015185389A1 - Method and device for controlling a haptic device - Google Patents

Method and device for controlling a haptic device

Info

Publication number
WO2015185389A1
WO2015185389A1 (PCT/EP2015/061556; EP2015061556W)
Authority
WO
WIPO (PCT)
Prior art keywords
haptic
actuator
information
effect
body model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2015/061556
Other languages
English (en)
Inventor
Fabien DANIEAU
Julien Fleureau
Philippe Guillotel
Didier Doyen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thomson Licensing SAS
Original Assignee
Thomson Licensing SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing SAS filed Critical Thomson Licensing SAS
Publication of WO2015185389A1
Anticipated expiration
Current legal status: Ceased


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/013 Force feedback applied to a game

Definitions

  • the present disclosure relates to the domain of haptics.
  • the present disclosure is also understood in the context of haptic effect(s) of any kind, e.g. motion, vibration, rendered by using one or more actuators.
  • the present disclosure may for example be implemented in automobile or aircraft simulators, video games, theme park attractions, at home for the rendering of haptic effects while watching a movie, or in auditoriums, for example movie theatres.
  • combining haptics with audiovisual content increases the feeling of immersion while watching the audiovisual content.
  • Such a combination of haptics and audiovisual content is known under the acronym HAV, standing for haptics-audiovisual.
  • MPEG-V architecture is one formalization of the workflow for producing, distributing and rendering HAV content.
  • the MPEG-V standard describes "sensory effects", which include haptic effects but also gustatory and olfactory effects.
  • MPEG-V targets the stimulation of the entire user's body and also the stimulation of a specific point in the space in which the user moves.
  • the haptic devices (also called actuators) can be defined by setting their capabilities (in terms of ability to render haptic effects) and their location in the user's space.
  • a point in the user's space is defined to render the haptic effects, which may be an issue when the user moves in the space: in such a case, the haptic effect will not be felt by the user.
  • the purpose of the present disclosure is to overcome at least one of these disadvantages of the prior art.
  • one purpose of the present disclosure is to control the user's experience of haptic effects by one or more actuators.
  • the present disclosure relates to a method of controlling at least one actuator adapted to render at least one haptic effect.
  • the method comprises: receiving a first information representative of at least a part of a body model onto which the at least one haptic effect is to be applied; and computing at least one parameter to be applied to the at least one actuator to render the at least one haptic effect,
  • the at least one parameter being computed according to said first information and to a second information representative of an association of the at least one actuator with a part of the body model.
  • the method further comprises receiving at least an image associated with the first information.
  • the at least one parameter is computed by a transfer function associated with a physics engine.
  • the method further comprises transmitting the at least one parameter to the at least one actuator.
  • the at least one parameter belongs to a group of parameters comprising:
  • the present disclosure also relates to a device configured for controlling at least one actuator adapted to render at least one haptic effect, the device comprising at least one processor configured for receiving a first information representative of at least a part of a body model onto which the at least one haptic effect is to be applied, and for computing at least one parameter to be applied to the at least one actuator to render the at least one haptic effect,
  • the at least one parameter being computed according to the first information and to a second information representative of an association of the at least one actuator with a part of the body model.
  • the device further comprises means (e.g. a receiver) for receiving at least an image associated with said first information.
  • the device further comprises at least a memory storing a transfer function and a physics engine.
  • the device further comprises means (e.g. a transmitter) for transmitting the at least one parameter to the at least one actuator.
  • the present disclosure also relates to a computer program product comprising instructions of program code for execution by at least one processor to perform the method for controlling at least one actuator adapted to render at least one haptic effect, when the program is executed on a computer.
  • the present disclosure also relates to a (non-transitory) processor readable medium having stored therein instructions for causing a processor to perform at least the abovementioned method of controlling at least one actuator adapted to render at least one haptic effect.
  • Figure 1 shows a workflow for controlling one or more actuators to render haptic effects, according to a particular embodiment;
  • Figure 2 shows a body model of a user, according to a particular embodiment;
  • Figure 3 shows a method relying on the user's body model of figure 2 and on a physics engine to render haptic effects, according to a particular embodiment;
  • Figure 4 shows a graphical representation of a user wearing actuators to render haptic effects, according to a particular embodiment;
  • Figure 5 diagrammatically shows a device implementing a method of controlling the actuators of figure 4, according to a particular embodiment;
  • Figure 6 shows a method of controlling actuator(s) of figure 4, according to a particular embodiment.
  • haptic effects correspond to the mechanical stimulation of parts of the body of the user, for example tactile or motion effects obtained by applying forces and vibrations on parts of the body of the user.
  • the haptic effects are advantageously associated with an audiovisual content and enable the user to feel immersed in the audiovisual content while watching it.
  • a multimedia device, for example a set-top box, a computer, a smartphone or a tablet, receives first information associated with the audiovisual content.
  • the first information corresponds to information about the part(s) of a body model onto which the haptic effects have to be applied.
  • based on this first information and on a second information representative of the location of the actuators on the body model, the multimedia device computes parameters (for example the amplitude of the haptic effects, the duration of the haptic effects, the starting time of the haptic effects, etc.) usable to control the actuators.
  • the association of the haptic effects with the body model links the haptic effects with the user, which makes it possible to render the haptic effects on the body of the user even if the user moves in the space.
  • Figure 1 shows a workflow for controlling one or more actuators to render haptic effects, according to a particular and non-limitative embodiment of the present disclosure.
  • a multimedia content 11 having haptic information associated with it is transmitted to a unit 12 configured for converting the haptic effect associated with the multimedia content into commands adapted to control one or more actuators 13 able to render haptic effects.
  • the multimedia content 11 corresponds for example to an audiovisual content (for example a movie comprising real and/or computer graphics images), to a video game or to the content of a web site.
  • the haptic information, called first information, associated with the multimedia content comprises information about where the haptic effect has to be applied on the body of the user consuming the multimedia content.
  • a body model is used to provide the information about the location of the application of the haptic effect.
  • the first information may for example correspond to the part(s) of the body model onto which the haptic effect(s) is(are) to be applied.
  • Further information about the haptic effects may be associated with the multimedia content 11, such as the type of the haptic effect (e.g. motion, vibration, etc.), the duration of the haptic effect, and the starting time of the haptic effect (based for example on the timeline of the multimedia content).
  • the first information is then converted by the unit 12 into commands, i.e. control parameters, to control the actuators 13 used to render the haptic effects associated with the multimedia content.
  • the control parameters are advantageously computed based on the first information and on information about the actuators.
  • the information about the actuators comprises for example the location of the actuators on the body of the user, this location information being defined by using the body model of the user used for the first information. Further information about the actuators 13 may be advantageously provided to the conversion unit by the actuators themselves, this additional information comprising for example the type of the actuator (e.g. force-feedback device, vibration motors, motion simulator, etc.).
  • Figure 2 shows the body model used to locate where the haptic effect is to be applied and where the actuator(s) is(are) located, according to a particular and non-limitative embodiment of the present disclosure.
  • the body model 2 of figure 2 corresponds to a simplified user's body model represented with joints 21 and segments 22.
  • arms are considered as two segments (arm and forearm) and two joints (elbow and shoulder).
  • the neck is composed of one segment and one joint.
  • the size of the segments and angle limits of joints are advantageously defined by anatomical data, as described for example in "General Anatomy and Musculoskeletal System" by M. Schuenke, E. Schulte, U. Schumacher, L. M. Ross, E. D. Lamperti and M. Voll (Thieme Medical Publishers Inc., 2010).
  • the following table 1 gives examples of parameters associated with the body model 2, the examples being limited to the neck and the arm:
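  • (table 1 itself is not reproduced in this extract; the following is a purely illustrative sketch, with hypothetical values and field names, of how such segment and joint parameters could be encoded:)

```python
# Illustrative body-model parameters for the neck and the arm.
# The values and field names are hypothetical examples in the spirit of
# the anatomical data cited above; they are not the patent's actual table 1.
body_model_parameters = {
    "segments": {  # segment name -> length in metres
        "neck":    {"length_m": 0.12},
        "arm":     {"length_m": 0.30},   # shoulder to elbow
        "forearm": {"length_m": 0.27},   # elbow to wrist
    },
    "joints": {    # joint name -> flexion/extension limits in degrees
        "neck":     {"angle_limits_deg": (-60.0, 60.0)},
        "shoulder": {"angle_limits_deg": (-90.0, 180.0)},
        "elbow":    {"angle_limits_deg": (0.0, 150.0)},
    },
}
```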
  • the parameters of the body model are updated in real time according to data provided by sensors attached to the user, for example a tracking system.
  • the position of each joint is updated according to the data provided by the tracking system, which enables the haptic rendering to be updated according to the updated position of the user's body. If the user is facing an explosion for example, the user may feel vibrations on his torso, but if it is his back that faces the display of the multimedia content, he may feel the vibrations on his back.
  • the body model 2 is not limited to the simplified body model used for illustration purpose with regard to figure 2 but extends to any body model.
  • the body model may comprise a more accurate representation of the human skeleton (size and weight of bones, constraints on the joints) and/or comprise a representation of the muscles.
  • a more detailed body model will enable the computation of more realistic haptic feedback. For example, the propagation of the vibrations in the human body may take into account the bone density and the damping properties of the muscles.
  • Figure 3 shows the rendering of haptic effects based on the use of a body model 34 (corresponding for example to the body model 2 of figure 2) and on a physics engine 35, according to a particular and non-limitative embodiment of the present disclosure.
  • when a haptic effect 31 (e.g. a vibration, a force, etc.) is to be rendered, the physics engine 35 computes the new state of the body model 34 resulting from the haptic stimulus (vibration propagation, force distribution, etc.), which provides the new states of the parts of the body model 34 attached to the haptic devices (also called actuators).
  • the body model is updated to take into account the deformation of the joints and segments induced by the haptic effect and computed by the physics engine 35.
  • the haptic rendering unit may then compute the commands (control parameters) 33 to be transmitted to the actuator(s) to activate the actuator(s) and to place the actual user's body parts in the desired states.
  • Information 32 about the actuators' capabilities may be provided to the body model 34. Such information 32 may correspond to the capabilities of the haptic devices in terms of rendering the haptic effect, for example the type of effect the device is able to render, the maximum intensity of the effect the device is able to render, etc.
  • the physics engine 35 may advantageously take the form of software.
  • a physics engine provides a simulation, for example in real time, of physical systems, such as the body of a user via the body model 34.
  • a full physics engine may be used to compute the haptic effect to be rendered by each haptic device.
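  • as a minimal illustration of this idea, the sketch below propagates a vibration over a simplified body graph with a fixed per-joint damping factor; the graph, the function name and the damping value are assumptions for the sketch, not the patent's actual engine:

```python
from collections import deque

# Simplified body model as an adjacency list between named body parts.
BODY_GRAPH = {
    "torso": ["neck", "left_shoulder", "right_shoulder", "left_hip", "right_hip"],
    "neck": ["torso", "head"], "head": ["neck"],
    "left_shoulder": ["torso", "left_elbow"],
    "left_elbow": ["left_shoulder", "left_hand"], "left_hand": ["left_elbow"],
    "right_shoulder": ["torso", "right_elbow"],
    "right_elbow": ["right_shoulder", "right_hand"], "right_hand": ["right_elbow"],
    "left_hip": ["torso"], "right_hip": ["torso"],
}

def propagate_vibration(source, intensity, damping=0.5):
    """Breadth-first propagation: each traversed joint multiplies the
    intensity by a fixed damping factor (a full physics engine would use
    bone density and muscle damping instead)."""
    levels = {source: intensity}
    queue = deque([source])
    while queue:
        part = queue.popleft()
        for neighbour in BODY_GRAPH[part]:
            if neighbour not in levels:
                levels[neighbour] = levels[part] * damping
                queue.append(neighbour)
    return levels

# A vibration of intensity 4.1 on the torso reaches the left hand after
# three hops, attenuated to 4.1 * 0.5**3 ~= 0.51.
print(propagate_vibration("torso", 4.1))
```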
  • Figure 4 shows a graphical representation of a user wearing actuators to render a haptic effect, according to a particular and non-limitative embodiment of the present disclosure.
  • the user 4 wears for example a smartwatch 42 on his left hand and carries a smartphone 43 in his pocket. Both devices 42 and 43 comprise one vibrating motor which may be used to generate haptic effects.
  • the first information associated with the multimedia content and received by the computing device comprising the physics engine and the haptic rendering unit targets the torso of the user as the location for the haptic effect to be felt by the user. This location is illustrated with a square 41 on figure 4.
  • the haptic effect to be rendered on the user 4 is a vibration with an intensity I located at a position P_E corresponding to the user's torso.
  • the information about this haptic effect to be rendered may for example take the following form:
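  • (the exact serialization is not reproduced in this extract; the sketch below is a hypothetical encoding, with illustrative field names, carrying the fields and values described in the next item:)

```python
# Hypothetical encoding of the haptic effect description; the field names
# are illustrative, only the values (vibration, 4.1, torso, 0.0-50.0,
# 7 seconds, fade 3) come from the text.
haptic_effect = {
    "type": "vibration",             # type of the effect
    "intensity": 4.1,                # intensity of the effect
    "location": "torso",             # first information: body-model part
    "intensity_range": (0.0, 50.0),  # admissible intensity range
    "duration_s": 7,                 # duration in seconds
    "fade": 3,                       # fade parameter
}
```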
  • This information comprises the type of the effect (vibration), the intensity of the effect (4.1), the first information corresponding to the location on the body model (torso), the intensity range (0.0 to 50.0), the duration (7 seconds) and the fade (3).
  • the body model is also used to describe the location of the haptic devices (i.e. the smartwatch 42 and the smartphone 43 according to the illustrative example of figure 4).
  • the smartwatch 42 is located on the user's left hand (position P_W) and the smartphone 43 is located in the user's right pocket (position P_P).
  • the information about the location of the haptic devices on the body model may for example take the form of the following code:
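  • (again, the patent's exact code is not reproduced here; the following hypothetical encoding carries the fields described in the next item:)

```python
# Hypothetical encoding of the haptic device descriptions; the field names
# are illustrative, the values (vibration capability, maximum intensity 600,
# left hand and top right hip locations) come from the text.
haptic_devices = [
    {"id": "smartwatch_42", "capability": "vibration",
     "max_intensity": 600, "location": "left_hand"},
    {"id": "smartphone_43", "capability": "vibration",
     "max_intensity": 600, "location": "top_right_hip"},
]
```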
  • This code advantageously further comprises the capability of each haptic device, i.e. which kind of haptic effect the device is able to render (vibration in this example for both devices), the maximum intensity of effect the device is able to render (600 according to this example), the id of each device and the location of the device on the body model (respectively left hand and top right hip).
  • the physics engine computes the vibrations to be applied on the user's hand and near his right pocket to obtain the desired haptic effect on the torso 41.
  • the physics engine computes the control parameters (for example the type of effect, the intensity of the effect, the duration of the effect) to be transmitted to each haptic device 42 and 43.
  • to this end, a transfer function associated with the physics engine may be developed. The rule implemented by the transfer function may be, for example: the longer the distance between the vibration source (i.e. the torso 41) and the target point (i.e. the haptic device 42, 43), the lower the intensity of the vibration (i.e. the higher the damping factor is).
  • the haptic effect, i.e. the vibration obtained at the target point 42, 43, is actually rendered by the haptic device 42, 43.
  • the intensity I_D for each device may be defined as: I_D = k · I_E, with k = 1 − min(dist(P_E, P_D), 1), k being the damping factor.
  • P_D corresponds to the position of the haptic device and dist corresponds to the distance between P_E (the position of the haptic effect) and P_D.
  • several methods known to the person skilled in the art may be used to compute dist, for example the Euclidean distance between these two points or the distance along the joints of the body model between these two points.
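  • a minimal sketch of such a transfer function, assuming 3D positions expressed in normalized units and the clamped damping rule above (the helper names and the example coordinates are hypothetical):

```python
import math

def device_intensity(i_effect, p_effect, p_device):
    """Transfer function sketch: I_D = k * I_E with
    k = 1 - min(dist(P_E, P_D), 1), so a device more than one
    (normalized) unit away from the effect location renders nothing."""
    k = 1.0 - min(math.dist(p_effect, p_device), 1.0)
    return k * i_effect

def joint_path_distance(path_points):
    """Alternative dist: sum of segment lengths along a joint-to-joint
    path of the body model between P_E and P_D."""
    return sum(math.dist(a, b) for a, b in zip(path_points, path_points[1:]))

# Torso effect of intensity 4.1; the smartwatch is about 0.59 units away,
# so it renders roughly 4.1 * (1 - 0.59) ~= 1.7.
p_torso, p_watch = (0.0, 1.3, 0.0), (0.35, 0.9, 0.25)
print(device_intensity(4.1, p_torso, p_watch))
```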
  • the transfer function is not limited to the abovementioned example but extends to any transfer function enabling the computation of the parameters of the haptic devices adapted to render the desired haptic effect.
  • Figure 5 diagrammatically illustrates a hardware embodiment of a device 5 configured for computing the control parameters to be transmitted to the one or more haptic devices (actuators).
  • the device 5 corresponds for example to a set-top box, a personal computer (PC), a laptop, a tablet, a Smartphone, a games console or a multimedia terminal.
  • the device 5 comprises the following elements, connected to each other by a bus 50 of addresses and data that also transports a clock signal:
  • a microprocessor 51 (or CPU),
  • a memory 53, for example of the Random Access Memory (RAM) type, for storing the transfer function, the physics engine and the description of the body model,
  • I/O devices 54 such as for example a keyboard, a mouse, a webcam, and
  • a communication interface configured to receive the first information, the information about the haptic effect to be rendered and data representative of the multimedia content, and configured to transmit the control parameters to the haptic devices (the actuators).
  • the device 5 is also configured for the creation of display signals of one or several images, for example images representative of a Graphical User Interface which may be used to visualize the 3D representation of the user 4.
  • a first information representative of the part(s) of the body model onto which the haptic effect(s) is(are) to be applied is received.
  • the first information is for example received via a wired link (for example Ethernet) or via a wireless link (for example according to Wi-Fi).
  • the first information is for example received with the multimedia content with which it is associated. According to a variant, the first information is received before the multimedia content and stored in a memory until rendering, i.e. when the corresponding part of the multimedia content is displayed.
  • Steps 61 and 62 are advantageously reiterated for each haptic effect associated with the multimedia content to be rendered.
  • the present disclosure is not limited to a method for controlling actuator(s) but also extends to any device implementing this method, and notably any device comprising at least one CPU and/or at least one GPU.
  • the present disclosure also relates to a method (and a device configured) for generating haptic effects.
  • the implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program).
  • An apparatus may be implemented in, for example, appropriate hardware, software, and firmware.
  • the methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices, such as, for example, Smartphones, tablets, computers, mobile phones, portable/personal digital assistants ("PDAs”), and other devices that facilitate communication of information between end-users.
  • Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications, particularly, for example, equipment or applications associated with data encoding, data decoding, view generation, texture processing, and other processing of images and related texture information and/or depth information.
  • examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and other communication devices.
  • the equipment may be mobile and even installed in a mobile vehicle.
  • a processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
  • implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted.
  • the information may include, for example, instructions for performing a method, or data produced by one of the described implementations.
  • a signal may be formatted to carry as data the rules for writing or reading the syntax of a described embodiment, or to carry as data the actual syntax-values written by a described embodiment.
  • Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal.
  • the formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream.
  • the information that the signal carries may be, for example, analog or digital information.
  • the signal may be transmitted over a variety of different wired or wireless links, as is known.
  • the signal may be stored on a processor-readable medium.
  • the present disclosure may be used in theatres, at home, in automobile or aircraft simulators, theme park attractions, ...
  • the device 5 described with respect to figure 5 is advantageously equipped with interaction means such as a keyboard, a mouse, a joystick or any other mode for introducing commands, voice recognition being for instance also possible.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Cardiology (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method and a device for controlling at least one actuator adapted to render at least one haptic effect. The method comprises receiving first information representative of at least a part of a body model (34) onto which said haptic effect is to be applied, and computing at least one parameter to be applied to said actuator to render said haptic effect, said parameter being computed according to the first information and to second information representative of an association of said actuator with a part of the body model.
PCT/EP2015/061556 2014-06-02 2015-05-26 Method and device for controlling a haptic device Ceased WO2015185389A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP14305823 2014-06-02
EP14305823.8 2014-06-02

Publications (1)

Publication Number Publication Date
WO2015185389A1 (fr) 2015-12-10

Family

ID=51136396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/061556 Ceased WO2015185389A1 (fr) Method and device for controlling a haptic device

Country Status (1)

Country Link
WO (1) WO2015185389A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3413168A1 (fr) * 2017-06-05 2018-12-12 Immersion Corporation Rendering haptics with an illusion of flexible joint movement
EP4300263A1 (fr) 2022-07-01 2024-01-03 Go Touch VR Method and apparatus of signaling/parsing haptic data

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040254771A1 (en) * 2001-06-25 2004-12-16 Robert Riener Programmable joint simulator with force and motion feedback
US20080094351A1 (en) * 2006-10-23 2008-04-24 Canon Kabushiki Kaisha Information processing apparatus and information processing method
US20080120029A1 (en) * 2006-02-16 2008-05-22 Zelek John S Wearable tactile navigation system
WO2009136345A1 (fr) * 2008-05-09 2009-11-12 Koninklijke Philips Electronics N.V. Procédé et système pour communiquer une émotion
US20100302015A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems and methods for immersive interaction with virtual objects

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040254771A1 (en) * 2001-06-25 2004-12-16 Robert Riener Programmable joint simulator with force and motion feedback
US20080120029A1 (en) * 2006-02-16 2008-05-22 Zelek John S Wearable tactile navigation system
US20080094351A1 (en) * 2006-10-23 2008-04-24 Canon Kabushiki Kaisha Information processing apparatus and information processing method
WO2009136345A1 (fr) * 2008-05-09 2009-11-12 Koninklijke Philips Electronics N.V. Procédé et système pour communiquer une émotion
US20100302015A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Systems and methods for immersive interaction with virtual objects

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3413168A1 (fr) * 2017-06-05 2018-12-12 Immersion Corporation Rendering haptics with an illusion of flexible joint movement
US10366584B2 (en) 2017-06-05 2019-07-30 Immersion Corporation Rendering haptics with an illusion of flexible joint movement
EP4300263A1 (fr) 2022-07-01 2024-01-03 Go Touch VR Method and apparatus of signaling/parsing haptic data
WO2024002812A1 (fr) 2022-07-01 2024-01-04 Go Touch Vr Method and apparatus of signaling/parsing haptic data

Similar Documents

Publication Publication Date Title
CN110675474B (zh) Learning method of virtual character model, electronic device and readable storage medium
EP3659118B1 (fr) Systems and methods for real-time complex character animations and interactivity
US9821236B2 (en) Method and device for controlling a haptic device
EP3118723A1 (fr) Method and apparatus for providing haptic feedback and interactivity based on user haptic space (Hapspace)
US20170039986A1 (en) Mixed Reality Social Interactions
US20110128292A1 (en) Dynamics-based motion generation apparatus and method
Kervegant et al. Touch hologram in mid-air
JP2023549747A (ja) Representation format for haptic objects
EP3180911A1 (fr) Immersive video
KR101799980B1 (ko) Apparatus, system and method for controlling virtual reality video and motion simulator
KR102137326B1 (ko) Animation generation apparatus and animation generation method using a rigged character
JP2018206368A (ja) Haptic representation using the illusion of flexible joint movement
Kang Effect of interaction based on augmented context in immersive virtual reality environment
JP2015515691A (ja) Method for providing a global six-degrees-of-freedom motion effect using multiple local force feedbacks
WO2015185389A1 (fr) Method and device for controlling a haptic device
US20240153188A1 (en) Physics-based simulation of dynamic character motion using generative artificial intelligence
US20160014386A1 (en) Method for reproducing an item of audiovisual content having haptic actuator control parameters and device implementing the method
Eid et al. Slingshot 3D: A synchronous haptic-audio-video game
US20190311589A1 (en) Apparatus and method for providing virtual texture
CN112102461B (zh) Face rendering method and apparatus, electronic device and storage medium
Tian et al. 3d immersive cardiopulmonary resuscitation (cpr) trainer
KR20230163820A (ko) Multisensory interface system using electronic gloves for virtual reality experience
Ma et al. Value evaluation of human motion simulation based on speech recognition control
He Virtual reality for budget smartphones
CN116489414B (zh) Live-streaming interaction method, apparatus, system, computing device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15724663

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15724663

Country of ref document: EP

Kind code of ref document: A1