WO2019163260A1 - Information processing apparatus, information processing method, and program - Google Patents
- Publication number
- WO2019163260A1 (application PCT/JP2018/045563)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tactile
- presentation device
- information processing
- processing apparatus
- perceived
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a program.
- Patent Document 1 describes a technique for outputting a tactile stimulus to a predetermined device when an event occurs in a virtual space.
- Although the perceived tactile stimulus differs depending on the position of the target of the tactile stimulus, conventionally the same tactile stimulus was output regardless of the position information.
- Moreover, the effect of the tactile presentation could only be confirmed from the waveform of the signal output to each tactile stimulation unit, and the actual presentation effect was difficult to understand.
- the present disclosure proposes an information processing apparatus, an information processing method, and a program that enable an intuitive operation regarding the setting of a perceptual position of a tactile stimulus.
- Proposed is an information processing apparatus comprising: a display control unit that displays information related to a tactile presentation device and a perceived position, designated by the user, on the tactile presentation device; and a generation unit that generates output control signals to be output to a plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position, in accordance with the perceived position and the positions of the plurality of tactile stimulation units provided in the tactile presentation device.
- Also proposed is an information processing method in which a processor displays information related to a tactile presentation device and a perceived position, designated by the user, on the tactile presentation device, and generates output control signals to be output to a plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position, in accordance with the perceived position and the positions of the plurality of tactile stimulation units provided in the tactile presentation device.
- Further proposed is a program for causing a computer to function as: a display control unit that displays information related to a tactile presentation device and a perceived position, designated by the user, on the tactile presentation device; and a generation unit that generates output control signals to be output to a plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position, in accordance with the perceived position and the positions of the plurality of tactile stimulation units provided in the tactile presentation device.
- FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure.
- the information processing system according to the present embodiment includes a tactile presentation device 10 that presents a tactile stimulus to a user, and an information processing apparatus 20 that performs output setting of the tactile stimulus.
- a haptic presentation device 10 illustrated in FIG. 1 includes a plurality of haptic stimulation units 100 (also referred to as actuators) and two audio output units 102 therein.
- A predetermined number (for example, six) of tactile stimulation units 100 are arranged on each of the front side and the back side, in a positional relationship such that the individual tactile stimulation units 100 arranged on the front side face those arranged on the back side.
- FIG. 1 shows an example in which the tactile sense presentation device 10 is a vest type (clothes without sleeves).
- However, the tactile sense presentation device 10 may have sleeves.
- one or more tactile stimulation units 100 may be disposed not only on the user's chest and abdomen but also on positions corresponding to the user's arms.
- the tactile sense presentation device 10 is not limited to the jacket as shown in FIG. 1, and may be trousers, socks, shoes, a belt, a hat, gloves, a mask, or the like.
- one audio output unit 102 is disposed on each of the left and right sides of the shoulder portion.
- Alternatively, only one audio output unit 102 may be arranged, or three or more may be arranged. Further, instead of being included in the tactile presentation device 10, the audio output unit 102 may be disposed in the predetermined space as an independent device, or may be a wearable device (for example, headphones or a headset) or a portable device (for example, a portable music player, a smartphone, or a portable game machine) separate from the tactile presentation device 10.
- the tactile sensation presentation device 10 is not limited to the above-mentioned accessories, and examples include a controller, a gun-type controller, a bed, and a chair.
- Tactile stimulation unit 100: When each of the plurality of tactile stimulation units 100 included in the tactile presentation device 10 generates vibration alone, the generated vibration can be perceived only in the vicinity of that tactile stimulation unit 100. That is, when the individual tactile stimulation units 100 are arranged apart from each other, the vibrations they generate separately are perceived discretely on the user's body.
- Meanwhile, an illusion phenomenon called phantom sensation is known: when stimuli are presented simultaneously at different positions on the skin, a human perceives only a single stimulus at a point between the presented stimulus positions.
- For example, when two tactile stimulation units 100 arranged on the user's body output stimuli simultaneously, the position of the stimulus perceived by the user (hereinafter referred to as a perceived position) is known to usually lie between the two tactile stimulation units 100.
- By utilizing this phantom sensation, the range of tactile stimuli that can be presented by the plurality of tactile stimulation units 100 can be continuously expanded without changing the arrangement interval of the individual tactile stimulation units 100.
- For example, assume that the output intensity of the first tactile stimulation unit 100 is continuously weakened, for example, "1", "0.6", "0", while the output intensity of the second tactile stimulation unit 100 is continuously increased, for example, "0", "0.6", "1".
- the perceived position (perceived by the user) can continuously move from the contact position of the first haptic stimulation unit 100 to the contact position of the second haptic stimulation unit 100.
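The crossfade described above can be sketched in a few lines. The following is a purely illustrative model (the function names and the linear-interpolation assumptions are ours, not the patent's):

```python
def crossfade_steps(steps):
    """Intensity pairs (unit A, unit B) for a linear crossfade,
    e.g. 3 steps gives (1, 0), (0.5, 0.5), (0, 1)."""
    return [(1.0 - i / (steps - 1), i / (steps - 1)) for i in range(steps)]

def perceived_position(pos_a, pos_b, intensity_a, intensity_b):
    """Approximate the perceived point as the intensity-weighted average
    of the two contact positions (a common phantom-sensation model)."""
    total = intensity_a + intensity_b
    return tuple((intensity_a * a + intensity_b * b) / total
                 for a, b in zip(pos_a, pos_b))
```

Under this model the perceived point sits at unit A's contact position when A outputs alone, and slides continuously toward unit B as the intensities are crossfaded.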
- the perceived position and the perceived intensity can be designated on the GUI, and output signals to the corresponding plurality of tactile stimulation units are automatically generated based on the designated perceived position and intensity.
- the user can perform an intuitive operation regarding the setting of the perceptual position of the tactile stimulus.
- Conventionally, it was necessary to repeat three steps: conceiving or correcting the perceptual effect to be realized on the GUI, controlling the output of each tactile stimulation unit, and confirming the perceptual effect by actually experiencing it.
- In contrast, in the present embodiment, the perceptual effect desired to be realized is designated on the GUI (designation of perceived position and intensity) and confirmed by experience, so that a perceptual effect can be designed intuitively by repeating this single step.
- When the specification of the perceptual effect (that is, the perceived position and perceived intensity) is changed, the output of the tactile presentation device 10 changes accordingly. In this way, by designating how the stimulus should actually feel (at what place and with what strength), the user can omit controlling the individual tactile stimulation units 100 directly.
- the information processing apparatus 20 includes a control unit 200, a communication unit 210, an operation input unit 220, a display unit 230, and a storage unit 240.
- Control unit 200: The control unit 200 functions as an arithmetic processing device and a control device, and controls the overall operation in the information processing apparatus 20 according to various programs.
- the control unit 200 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- the control unit 200 may include a ROM (Read Only Memory) that stores programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- the control unit 200 also functions as a perceptual position / intensity determination unit 201, a screen generation unit 202, and an output control unit 203.
- The perceptual position/intensity determination unit 201 functions as a generation unit that determines (calculates) the vibration intensities of the plurality of tactile stimulation units 100 corresponding to a target perceived position, in accordance with the designated perceived position and perceived intensity (so that the designated perceived intensity is perceived at the designated perceived position), and generates the output control signals to be output to the plurality of tactile stimulation units 100.
- the designated perceptual position includes a movement path of the perceptual position.
- the perceived position and the movement path of the perceived position can be set on the surface of the user's body.
- the movement path of the perceived position can be set as a path connecting the first surface of the user's body, the inside of the user's body, and the second surface facing the first surface.
- the first surface may be the front surface of the user, and the second surface may be the back surface of the user.
- the first surface may be a surface on the front side of a predetermined part such as an arm, and the second surface may be a surface on the back side of the part.
- For example, when the movement path is a path connecting a position on the front of the user, the inside of the user's body, and a position on the back of the user, the user can be presented with a sensation of something piercing through the body from front to back.
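One way to realize such a piercing path is to crossfade a facing front/back actuator pair over time. The following is a minimal sketch under an assumed linear depth model (the patent does not prescribe one):

```python
def piercing_intensities(depth):
    """Map a depth in [0, 1] (0 = front surface, 1 = back surface) to
    (front_intensity, back_intensity) for a facing actuator pair."""
    if not 0.0 <= depth <= 1.0:
        raise ValueError("depth must be in [0, 1]")
    return (1.0 - depth, depth)

def piercing_sequence(steps):
    """Sample the front-to-back path; over the sequence the stimulus
    appears to travel from the front surface, through the body, to the back."""
    return [piercing_intensities(i / (steps - 1)) for i in range(steps)]
```

Driving the facing pair with these pairs in order yields the front-to-back progression described above.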
- the perceptual position / intensity determination unit 201 can set a perceptual range.
- the perceived position / intensity determining unit 201 can also associate the perceived position with the timing according to the content being played (movie, game content, etc.). Thereby, it becomes possible to give a predetermined tactile stimulus at a predetermined timing to a user who is viewing the content using the tactile sense presentation device 10.
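The association between perceived positions and content playback timing could be kept in a simple timeline structure. This sketch (the class and method names are illustrative assumptions) stores events sorted by playback time:

```python
import bisect

class HapticTimeline:
    """Perceived-position events keyed to content playback time."""

    def __init__(self):
        self._events = []  # sorted (time_sec, position, intensity) tuples

    def add(self, time_sec, position, intensity):
        # insort keeps the list ordered by time_sec.
        bisect.insort(self._events, (time_sec, position, intensity))

    def events_between(self, start, end):
        """Events scheduled in [start, end) of content playback,
        e.g. for the interval about to be rendered."""
        return [e for e in self._events if start <= e[0] < end]
```

During playback, the player would poll `events_between` for each rendered interval and forward the matching events to the tactile presentation device.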
- Specifically, the perceived position/intensity determining unit 201 first specifies a plurality (for example, three) of tactile stimulation units 100 located in the vicinity of the designated perceived position. It then determines the output intensity of each of these tactile stimulation units 100 based on the positional relationship between each unit and the designated perceived position, and on the designated perceived intensity. That is, the perceived position/intensity determining unit 201 calculates the output intensities of the plurality of tactile stimulation units 100 based on the specified perceived position and perceived intensity and on the distances to the plurality of tactile stimulation units 100 arranged in the vicinity, and generates the output control signal to be output to each tactile stimulation unit 100. The technique described in PCT/JP2017/14379 can be used to adjust the output intensities of the plurality of tactile stimulation units 100 for presenting a predetermined tactile stimulus at a target position on the user's body.
- For example, the perceived position/intensity determination unit 201 determines the output intensity of the first tactile stimulation unit 100 based on the distance between the contact position of the first tactile stimulation unit 100 on the user's body and the target perceived position. Similarly, it determines the output intensity of the second tactile stimulation unit 100 based on the distance between the contact position of the second tactile stimulation unit 100 on the user's body and the target perceived position.
- Alternatively, the perceived position/intensity determining unit 201 may determine the output intensities of the first and second tactile stimulation units 100 based on the positional relationship between the target perceived position and the intermediate position between the contact positions of the first and second tactile stimulation units 100.
- For example, the perceived position/intensity determining unit 201 may determine the output intensities of the first tactile stimulation unit 100 and the second tactile stimulation unit 100 so that the sum of the two output intensities increases as the target perceived intensity increases.
- Further, the perceived position/intensity determination unit 201 may determine the output intensity of the first tactile stimulation unit 100 such that it becomes larger as the distance between the contact position of the first tactile stimulation unit 100 and the target perceived position becomes smaller. The same applies to the second tactile stimulation unit 100 (that is, with the relationship reversed).
- In other words, the perceived position/intensity determining unit 201 changes the ratio between the output intensity of the first tactile stimulation unit 100 and that of the second tactile stimulation unit 100 based on the positional relationship between the intermediate position and the target perceived position.
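As a concrete, purely illustrative model of this calculation (the patent points to PCT/JP2017/14379 for the exact method), one can weight the nearest actuators by inverse distance and scale the outputs so they sum to the requested perceived intensity:

```python
import math

def output_intensities(actuator_positions, target_pos, target_intensity, k=3):
    """Return {actuator_index: output_intensity} for the k nearest actuators.
    Inverse-distance weighting is an assumed model, not the patent's formula."""
    dists = sorted(
        (math.dist(p, target_pos), i) for i, p in enumerate(actuator_positions)
    )[:k]
    # Epsilon avoids division by zero when the target sits on a contact point.
    weights = {i: 1.0 / (d + 1e-6) for d, i in dists}
    total = sum(weights.values())
    # Scale so the outputs sum to the requested perceived intensity.
    return {i: target_intensity * w / total for i, w in weights.items()}
```

With this weighting, an actuator at the target position dominates the output, and the ratio between actuators shifts continuously as the target moves between them.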
- The screen generation unit 202 can generate a setting screen for the target perceived position and perceived intensity.
- On the setting screen as information related to the tactile presentation device 10, an image showing the positions of the plurality of tactile stimulation units 100 provided in the tactile presentation device 10 and an image showing the outer shape of the tactile presentation device 10 are displayed.
- the user can specify the position (perceived position) of the tactile stimulus to be presented.
- the positions of the plurality of tactile stimulation units 100 may be arranged freely by the user as virtual ones. When the positions of the plurality of tactile stimulation units 100 are preset and known, an image indicating the position of each tactile stimulation unit 100 is displayed.
- Alternatively, an image showing only the outer shape of the tactile presentation device 10 may be displayed.
- On the setting screen, it is possible to input a movement path of the perceived position. It is also possible to set a plurality of perceived positions (movement paths of a plurality of perceived positions), and to set the perceived position and perceived intensity so that a tactile stimulus is generated at a predetermined position at a predetermined timing corresponding to the reproduction of a predetermined content.
- These inputs can be performed by, for example, a mouse operation, a touch operation, or a controller such as a 3D pointer. Examples of specific setting screens according to the present embodiment will be described later with reference to the figures.
- the screen generation unit 202 can also generate a screen that displays the output control signals (waveforms) generated by the perceptual position / intensity determination unit 201 and output to each tactile stimulation unit 100.
- Output control unit 203: The output control unit 203 performs output control of the tactile stimulus to the tactile presentation device 10 according to the determination of the perceived position/intensity determination unit 201. Thereby, for example, when the user actually wears the tactile presentation device 10, the effect (perceptual effect) of the designated tactile stimulus can be confirmed. Specifically, the output control unit 203 outputs the generated output control signals to the plurality of predetermined tactile stimulation units 100 determined by the perceived position/intensity determination unit 201 (controls the generation of vibration). In addition, the output control unit 203 controls the reproduction of content (video) on a display device (a display, HMD, projector, PC, smartphone, or the like) and controls the tactile stimulus output at the predetermined timing set in accordance with the reproduction of the content.
- the output control unit 203 can also perform control to reproduce sound such as impact sound corresponding to the tactile stimulus to be presented from the sound output unit 102 of the tactile presentation device 10. Further, the output control unit 203 can perform control to display various screens generated by the screen generation unit 202 on the display unit 230.
- The communication unit 210 transmits/receives information to/from other devices. For example, the communication unit 210 transmits the output control signal of the tactile stimulus to each of the plurality of tactile stimulation units 100 (or to the tactile presentation device 10) according to the control of the output control unit 203. Further, the communication unit 210 transmits a control signal for displaying the video to be reproduced to a display device (not shown), and transmits a control signal for outputting the audio to be reproduced to each of the plurality of audio output units 102 (or to the tactile presentation device 10).
- The communication unit 210 is communicably connected with other devices via, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), short-range wireless communication, or a mobile communication network (LTE (Long Term Evolution) or 3G (third-generation mobile communication system)).
- the operation input unit 220 receives an operation instruction from the user and outputs the operation content to the control unit 200.
- the operation input unit 220 may be a touch sensor, a pressure sensor, or a proximity sensor.
- the operation input unit 220 may have a physical configuration such as a keyboard, a mouse, a button, a switch, and a lever.
- the display unit 230 is a display device that outputs a setting screen or the like on which a perceptual effect can be set.
- the display unit 230 may be a display device such as a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display.
- the storage unit 240 is realized by a ROM (Read Only Memory) that stores programs used in the processing of the control unit 200, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- the configuration of the information processing apparatus 20 according to the present embodiment has been specifically described above.
- the configuration of the information processing apparatus 20 is not limited to the example illustrated in FIG.
- the information processing device 20 may be configured by a plurality of devices.
- the information processing apparatus 20 may further include a voice input unit and a voice output unit.
- the information processing apparatus 20 is not limited to a PC as shown in FIG. 1, and may be realized by a smartphone, a mobile phone terminal, a tablet terminal, a dedicated terminal, or the like. Further, at least a part of the control unit 200 of the information processing apparatus 20 may be realized by a server on the network.
- the display unit 230 may be realized by a projector, and a setting screen or the like may be projected on a wall, a table, a screen, or the like. In this case, the user's operation input on the projection screen may be such that a touch operation on the projection screen is detected by a separately provided camera.
- the haptic presentation device 10 includes a plurality of haptic stimulation units 100a to 100c, a control unit 110, a communication unit 120, and an audio output unit 102.
- the control unit 110 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the tactile sense presentation device 10 according to various programs.
- the control unit 110 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- The control unit 110 may include a ROM (Read Only Memory) that stores programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- The control unit 110 causes the plurality of tactile stimulation units 100a to 100c to output tactile stimuli (for example, vibration) according to the output control signals, corresponding to the respective tactile stimulation units 100, received from the information processing apparatus 20 via the communication unit 120.
- the communication unit 120 transmits / receives information to / from other devices.
- the communication unit 120 receives from the information processing apparatus 20 a control signal for the output of a tactile stimulus corresponding to each tactile stimulus unit 100.
- the communication unit 120 receives a control signal for outputting audio to be reproduced from the information processing apparatus 20.
- The communication unit 120 is communicably connected with other devices via, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), short-range wireless communication, or a mobile communication network (LTE (Long Term Evolution) or 3G (third-generation mobile communication system)).
- the tactile stimulation unit 100 is an actuator that presents tactile stimulation to the user.
- the tactile stimulus unit 100 generates vibration as a tactile stimulus.
- the tactile stimulus presented by the tactile stimulus unit 100 is not limited to vibration, and for example, stimuli such as coldness, warmth, wind, water, and pressure are also assumed.
- the audio output unit 102 includes a speaker that reproduces an audio signal and an amplifier circuit for the speaker.
- the configuration of the tactile presentation device 10 according to the present embodiment has been described above. Note that the configuration of the tactile sense presentation device 10 according to the present embodiment is not limited to the example shown in FIG. For example, the tactile sense presentation device 10 may be configured without the audio output unit 102.
- FIG. 5 is a diagram illustrating an example of a setting screen 40 for setting the position of the haptic stimulation unit 100 according to the present embodiment.
- For example, displays 402a to 402g indicating the tactile stimulation units 100 are shown on a display 401 of the outer shape of the jacket-type tactile presentation device 10.
- the displays 402a to 402g indicating the tactile stimulation unit 100 can be moved to an arbitrary position by operating the cursor 403 using a mouse operation, touch operation, or other controller.
- Alternatively, the positions of the tactile stimulation units 100 may be known in advance. That is, already generated 3D data of the tactile presentation device 10 may be imported, and the outer shape of the tactile presentation device 10 and the positions of the tactile stimulation units 100 arranged in it may be displayed.
- a front image and a side image are shown as an example, but a rear image may be displayed. Further, the front image, the side image, and the back image may be switched according to a user operation.
- FIG. 6 is a diagram showing an example of the perceptual effect setting screen 42 according to the present embodiment.
- the setting screen 42 displays a display 420 indicating the arrangement of the plurality of tactile stimulation units 100.
- the user designates a position (that is, a perceived position) where the tactile stimulus is desired to be generated by operating the mouse.
- a perceived position display 423 is displayed at the designated position.
- the movement path 424 can also be designated by a mouse operation or the like. The user can also set the movement speed of the perceived position.
- When playback is instructed, a moving image showing the movement locus of the perceived position is reproduced; that is, a moving image in which the perceived position display 423 moves along the movement path 424 can be reproduced. As a result, the user can confirm the setting contents.
- When the playback button 426 is selected, the information processing apparatus 20 may reproduce the moving image showing the movement locus of the perceived position on the setting screen 42 displayed on the display unit 230, and may also perform output control of the corresponding tactile stimulus from the tactile presentation device 10 to which it is connected. As a result, the user can immediately receive feedback of the perceptual effect.
- FIG. 7 is a diagram showing an example of the setting screen 44 for setting the tactile stimulus waveform type.
- On the setting screen 44, selectable waveform types of tactile stimuli, such as an impact vibration, are displayed.
- The user can set the type of tactile stimulus (for example, the type of vibration waveform) at an arbitrary perceived position 443 by moving the waveform type of an arbitrary tactile stimulus (such waveform data is also referred to as a "library") to the perceived position 443 by a mouse drag or the like.
- A large number of libraries may be prepared in advance, or the information processing apparatus 20 may import arbitrary audio waveforms or moving image data and generate libraries from them.
- the information processing apparatus 20 includes a filter / algorithm for converting media into a tactile stimulus signal.
- the information processing apparatus 20 can also recommend an optimal library according to the location and length of the movement path of the perceived position designated by the user, the moving speed of the perceived position, and the like.
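The conversion of imported audio into a tactile stimulus signal is not specified in detail. A common minimal approach, shown here purely as an assumed sketch, is to rectify the waveform and extract a moving-average amplitude envelope to drive the actuator:

```python
def audio_to_haptic_envelope(samples, window=4):
    """Return a smoothed amplitude envelope of the input audio samples.
    Rectify-and-average is an assumed conversion, not the patent's filter."""
    rectified = [abs(s) for s in samples]
    env = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)           # trailing window of samples
        env.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return env
```

The envelope can then be downsampled and used as the intensity track of a vibration library; real converters would typically also band-limit the signal to the actuator's resonant range.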
- FIG. 8 is a diagram showing an example of a reproduction screen for the set perceptual effect.
- The reproduction screens 46-1 to 46-3 include a display screen 460 for the perceived position, a seek bar 462, and a content bar 464 indicating the (temporal) length of the tactile stimulus set at the perceived position.
- When the play button 463 is selected, the change in the perceived position is shown, and the seek position of the corresponding tactile stimulus content also advances.
- the intensity of the perceived position is indicated by the size of the perceived position display 461, for example. In the example shown in FIG. 8, it can be seen that the perceived intensity is higher at the perceived position 461b shown in the reproduction screen 46-2 than at the perceived position 461a as the starting point.
- the user can also check the perceived position and perceived intensity by moving the seek bar.
- a tactile stimulus content bar 466 corresponding to the perceived position 465 may also be displayed.
- the user can arbitrarily adjust the time length of the content bars 464 and 466 (the length of the tactile stimulus).
- Furthermore, the corresponding (synchronized) video content and audio content may also be displayed so that the change in the perceived position and the change in the seek position of each content can be confirmed.
- On the perceptual effect setting screen, it may also be possible to control the tactile quality (presentation size, gradual strengthening/weakening, temperature control, etc.) and other modal effects (lighting effects, video effects, sound effects, etc.).
- a warning is displayed or the like so that the tactile presentation device 10 does not break down.
- an output control signal for realizing a specified perceived position and perceived intensity is generated.
- For example, when the tactile presentation is too strong or its presentation time is too long, or when a long driving time would cause heat generation, the system may simulate in advance that the stimulus could become a burden on the user (for example, a low-temperature or high-temperature burn caused by a temperature presentation device), display a warning on the setting screen, and automatically correct the output so that it does not become a burden.
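Such a pre-simulation could look like the following sketch; the load model (intensity times duration) and the budget constant are illustrative assumptions, not values from the patent:

```python
MAX_LOAD = 10.0   # assumed load budget per actuator

def simulate_load(intensities, dt=0.1):
    """Accumulated load = sum of drive intensity times step duration."""
    return sum(i * dt for i in intensities)

def check_and_correct(intensities, dt=0.1):
    """Return (warning, corrected_intensities); scale the drive signal
    down uniformly if the simulated load exceeds the budget."""
    load = simulate_load(intensities, dt)
    if load <= MAX_LOAD:
        return (False, list(intensities))
    scale = MAX_LOAD / load
    return (True, [i * scale for i in intensities])
```

The tool would run this check when the user edits the effect, showing the warning whenever `warning` is true and offering the corrected signal as the automatic fix.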
- the strength of tactile presentation may be expressed by the size of the marker, or the length of the presentation time may be expressed by the color or transparency of the marker.
- the actual output need not be strictly linked to the perceptual effect specified by the user. For example, even if the setting calls for generating 100 separate tactile perceptions in a short time, the actual output may be limited to about 10. In this case, the number of presentations and the presentation time after optimization may or may not be displayed on the setting screen.
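The thinning described above can be sketched as a greedy pass that enforces a minimum gap between retained pulses. The 10 Hz cap standing in for "about 10 times" is an assumed device limit, not a value given in the text.

```python
def limit_pulses(requested_times, max_rate_hz=10.0):
    """Thin a dense list of pulse timestamps (seconds) so that no two
    retained pulses are closer than 1 / max_rate_hz."""
    min_gap = 1.0 / max_rate_hz
    kept = []
    for t in sorted(requested_times):
        if not kept or t - kept[-1] >= min_gap:
            kept.append(t)  # keep this pulse; skip any that follow too soon
    return kept
```

Requesting 100 pulses over one second with this limiter yields on the order of ten actual outputs, matching the behavior the text describes.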
- when the perceived position is near the stomach (for example, when a ball hits the stomach), the volume of the headphones (the corresponding sound volume) may be decreased, and when the perceived position is near the chest, the volume may be increased. It is also possible to automatically adjust sound effects and video effects according to the perceived position.
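A minimal sketch of such position-dependent volume adjustment is shown below. The body landmark heights, the base volume, and the gain range are all illustrative assumptions.

```python
def volume_for_position(y, chest_y=1.3, stomach_y=1.0, base=0.5):
    """Map the vertical perceived position y (meters from the floor) to a
    headphone volume in [0, 1]: quieter near the stomach, louder near
    the chest, linearly interpolated in between."""
    lo, hi = base * 0.5, min(base * 1.5, 1.0)
    if y <= stomach_y:
        return lo
    if y >= chest_y:
        return hi
    t = (y - stomach_y) / (chest_y - stomach_y)
    return lo + t * (hi - lo)
```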
- a control signal lasting several minutes can be output to the tactile presentation device 10, or the tool itself can function as a player (playback software).
- the arrangement of the plurality of tactile stimulation units 100 is determined before the perceived position is set.
- the information processing apparatus 20 can also recommend an optimal arrangement of the plurality of tactile stimulation units 100 based on an arbitrary perceived position and movement path designated by the user.
- the optimal arrangement is assumed to be one that at least enables tactile presentation at the set perceived position and that can also realize, for example, power saving and load reduction.
- the user may be allowed to input information such as the number of tactile stimulation units 100 that can be used and restrictions on the range in which they can be installed.
- FIG. 9 is a diagram illustrating an example of arrangement recommendation of a plurality of tactile stimulation units 100.
- the user draws movement paths 481a and 481b of one or more perceptual positions in the outline of the tactile sense presentation device 10 on the setting screen 48-2, and sets the perceptual effect.
- when the optimum position display button 483 is selected on the setting screen 48-3, for example, the optimum arrangement of the plurality of tactile stimulation units 100 for realizing the set perceptual effect (displays 484a to 484f) is shown.
- the recommendation of the optimal arrangement of the plurality of tactile stimulation units 100 includes a proposal for correcting the plurality of tactile stimulation units 100 that have already been arranged.
- a description will be given with reference to FIG. 10.
- FIG. 10 is a diagram illustrating an example of arrangement correction of a plurality of tactile stimulation units 100.
- displays 501a to 501i indicating the positions of a plurality of tactile stimulation units 100 that are preset (or arbitrarily arranged by the user) are shown on the setting screen 50-1.
- the user operates the mouse or the like to draw the movement paths 503a and 503b of the target perceived position.
- the information processing apparatus 20 calculates the optimum positions of the plurality of tactile stimulation units 100 according to the designated movement paths 503a and 503b of the perceived position and, as shown in the figure, superimposes displays 505a to 505i indicating those optimum positions.
- it can be seen that the preset displays 501c, 501f, and 501i of the tactile stimulation units 100 deviate from the optimum position displays 505c, 505f, and 505i and are therefore targets for correction.
- when the user selects the optimization display button 506 on the setting screen 50-3, the state in which the plurality of tactile stimulation units 100 have been moved to the optimal positions is displayed and can be confirmed.
- the user can adjust the arrangement of the actual plurality of tactile stimulation units 100 provided in the tactile presentation device 10 in accordance with the optimum positions of the plurality of tactile stimulation units 100 presented in this way.
- FIG. 11 is a diagram illustrating an example of an optimized display of the movement path of the perceived position.
- the user first inputs the perceived position and the movement path 520 on the setting screen 52-1.
- the information processing device 20 calculates the optimum route of the perceived position and displays it as the optimum route 522.
- the optimum path can be calculated in consideration of, for example, the influence of the tactile stimulation on the human body, the load on each tactile stimulation unit 100, the arrangement of the surrounding tactile stimulation units 100, and the like.
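As a minimal stand-in for such path optimization (which per the text also weighs the effect on the human body and the actuator load), a drawn path can at least be smoothed with a moving average. This is an illustrative simplification, not the disclosed calculation.

```python
def smooth_path(points, window=3):
    """Moving-average smoothing of a drawn movement path of 2-D points.

    Each output point is the mean of up to `window` neighboring input
    points; endpoints average over the samples that exist."""
    half = window // 2
    out = []
    for i in range(len(points)):
        seg = points[max(0, i - half): i + half + 1]
        out.append((sum(p[0] for p in seg) / len(seg),
                    sum(p[1] for p in seg) / len(seg)))
    return out
```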
- FIG. 12 is a flowchart showing tactile stimulus determination processing according to the present embodiment.
- a position (and intensity) at which a tactile stimulus is to be perceived is designated on the GUI using an input device such as a mouse or a touch panel (step S103).
- the information processing apparatus 20 determines the tactile stimulation units 100 that perform the tactile presentation and their presentation strengths (the magnitudes of the tactile control signals output from the tactile stimulation units 100) based on the specified perceived position (and perceived intensity) (step S106).
- the content determined in this manner may be output to the tactile sense presentation device 10 in accordance with, for example, an operation of a playback button on the GUI.
- the user can set intuitive perceptual effects on the GUI, and can immediately feel and confirm the set perceptual effects with the tactile sense presentation device 10, and can adjust this repeatedly.
- the setting of the perceptual effect is not limited to a mouse, a touch panel, or the like; it can also be input directly on a real object using a tactile pointer (3D pointer).
- FIG. 13 is a diagram illustrating the haptic pointer 60 according to the present embodiment.
- the locus 601 drawn on the haptic presentation device 10 may be displayed on the display unit 640, or the locus 601 on the haptic presentation device 10 may be visualized by emitting visible light such as from an LED.
- tactile presentation control in the tactile presentation device 10 may be performed in real time by the tactile pointer 60, or by the information processing apparatus 20 connected to the tactile pointer 60, so that the tactile stimulus is presented along the locus drawn with the tactile pointer 60. This allows the user to specify the movement path of the perceived position in real time and to experience the perceptual effect. Note that the user wearing the tactile presentation device 10 may operate the tactile pointer 60 himself or herself. By checking the perceptual effect in real time with the haptic pointer 60, the user can create data as he or she feels it, even without expert knowledge.
- the locus 601 is recognized by, for example, detecting the distance d to the object (the tactile presentation device 10) with a distance sensor such as an IR (infrared) sensor emitting from the tactile pointer 60 to obtain the three-dimensional position of the object, and acquiring the movement of the tactile pointer 60 with a posture sensor such as a gyro sensor or an acceleration sensor provided in the tactile pointer 60.
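Combining the pointer's position, its orientation from the posture sensor, and the measured distance into a 3-D point on the device surface can be sketched as below. The axis convention (yaw about z, pitch toward +z) and the degree-based interface are assumptions for illustration.

```python
import math

def pointed_location(origin, yaw_deg, pitch_deg, distance):
    """Project the measured IR distance along the pointer's orientation
    to get the 3-D point being pointed at on the device surface."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # unit direction vector of the pointer's tip
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)
    return (origin[0] + distance * dx,
            origin[1] + distance * dy,
            origin[2] + distance * dz)
```

Sampling this point while the pointer moves yields the sequence of 3-D positions that forms the locus 601.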
- the position of each haptic stimulation unit 100 arranged in the haptic presentation device 10 may be known in advance, may be detected by a camera provided on the haptic pointer 60, or the user may designate and store the position (relative position) of each tactile stimulation unit 100 using the haptic pointer 60.
- the tactile pointer 60 is provided with operation input units 620 such as a recording start / stop button and a playback start / stop button.
- when the recording start button is pressed, the haptic pointer 60 emits infrared rays and starts recognizing the three-dimensional position of the object (the haptic presentation device 10).
- when the recording stop button is pressed, the haptic pointer 60 ends the recognition of the three-dimensional position and stores the movement path of the recognized three-dimensional positions (that is, the locus 601 on the haptic presentation device 10).
- when the playback start button is pressed, the haptic pointer 60 starts output control so that the stored data, that is, the locus 601 on the haptic presentation device 10, is reproduced as tactile stimuli from the haptic presentation device 10.
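The record/stop/play button flow above can be sketched as a small state machine. The class and method names are hypothetical; in the real system `play()` would drive the tactile presentation device along the stored locus rather than merely return it.

```python
class TrajectoryRecorder:
    """Minimal record/stop/play flow mirroring the pointer's buttons."""

    def __init__(self):
        self.recording = False
        self.samples = []
        self.stored = None

    def start(self):          # recording start button
        self.recording = True
        self.samples = []

    def add(self, point):     # called per sampled 3-D position
        if self.recording:
            self.samples.append(point)

    def stop(self):           # recording stop button: freeze the locus
        self.recording = False
        self.stored = list(self.samples)

    def play(self):           # playback start button
        return self.stored
```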
- the tactile pointer 60 includes a control unit 600, a communication unit 610, an operation input unit 620, a sensor 630, a display unit 640, a tactile sense presentation unit 650, and a storage unit 660.
- the control unit 600 functions as an arithmetic processing unit and a control unit, and controls the entire operation in the haptic pointer 60 according to various programs.
- the control unit 600 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor, for example.
- the control unit 600 may include a ROM (Read Only Memory) that stores programs to be used, calculation parameters, and the like, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
- according to a user operation from the operation input unit 620, the control unit 600 performs, for example, library selection (tactile type), intensity adjustment, recording start/stop (storing recorded multi-channel haptic data), play/stop (playback of recorded data), tactile presentation area setting (narrow/wide), tactile presentation mode selection (surface tracing (with a border contrast setting: clear/blurry) or penetration expression (presentation of a penetrating sensation)), temperature adjustment (thermal presentation setting), force adjustment (force presentation setting), and the like.
- vibration, temperature, and force information may be included in one library.
- the control unit 600 can recognize the locus on the tactile presentation device 10 based on the detection result of the sensor 630. Specifically, the control unit 600 detects the distance d to the haptic presentation device 10 using, for example, infrared rays, a beacon (Bluetooth), or a camera, and acquires the position (three-dimensional position) of the haptic presentation device 10 relative to the haptic pointer 60. The control unit 600 then detects the movement of the haptic pointer 60 from the detection result of the posture sensor and, together with the three-dimensional position, recognizes the locus on the haptic presentation device 10. The control unit 600 can also recognize the shape of the tactile presentation device 10 by analyzing a captured image from a camera provided on the tactile pointer 60, and recognize the relative position of the locus with respect to the tactile presentation device 10.
- the control unit 600 acquires the positions of the haptic stimulation units 100 provided in the haptic presentation device 10 and can generate an output control signal for each haptic stimulation unit 100 for perceptual presentation along the designated locus.
- the generation method is as described in the perceptual position / intensity determination unit 201 of the information processing apparatus 20 described above.
- the position of the haptic stimulation unit 100 provided in the haptic presentation device 10 may be input by the user using the haptic pointer 60.
- the user points at a haptic stimulation unit 100 provided in the haptic presentation device 10 with the haptic pointer 60, the haptic pointer 60 acquires the three-dimensional position of that target, and the acquired three-dimensional position is stored as the "position of the haptic stimulation unit 100".
- when a marker is attached to each tactile stimulation unit 100, the control unit 600 can detect the marker from a camera image and grasp the position of each tactile stimulation unit 100.
- control unit 600 may acquire three-dimensional position information from the haptic presentation device 10 or each haptic stimulation unit 100.
- the communication unit 610 transmits / receives information to / from other devices.
- the communication unit 610 may obtain library update information from the network, upload recorded data, or send the information to the information processing apparatus 20.
- the communication unit 610 may transmit an output control signal (a control signal for outputting a tactile stimulus) to the tactile sense presentation device 10 or each tactile stimulus unit 100.
- the communication unit 610 connects to other devices using, for example, a wired/wireless LAN (Local Area Network), Wi-Fi (registered trademark), Bluetooth (registered trademark), short-range wireless communication, or a mobile communication network (LTE (Long Term Evolution) or 3G (third-generation mobile communication system)).
- the operation input unit 620 receives an operation instruction from the user and outputs the operation content to the control unit 600.
- the operation input unit 620 may be a touch sensor, a pressure sensor, or a proximity sensor.
- the operation input unit 620 may have a physical configuration such as a keyboard, a mouse, a button, a switch, and a lever.
- the sensor 630 includes, for example, a three-dimensional position sensor (a so-called distance sensor such as an infrared sensor, a beacon, or a camera) and an attitude sensor (a gyro sensor, an acceleration sensor, or the like).
- the sensor 630 may include an infrared camera, an RGB camera, or the like that detects markers indicating the positions of the plurality of tactile stimulation units 100 provided in the tactile presentation device 10.
- the display unit 640 is a display device that outputs various operation screens, a screen that displays an input locus (movement path of a perceived position), and the like.
- the display unit 640 may be a display device such as a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display.
- the tactile sense presentation unit 650 has a function of performing feedback to the user's hand holding the tactile pointer 60.
- the tactile sense providing unit 650 can present vibration, force, temperature, and the like as tactile stimuli.
- the tactile sense providing unit 650 may present vibration in real time, for example, when the user inputs a perceptual effect to the tactile sense providing device 10. Thereby, even when the operator of the tactile pointer 60 is not wearing the tactile sense presentation device 10, it is possible to grasp the tactile sense set in real time.
- the storage unit 660 is realized by a ROM (Read Only Memory) that stores programs and calculation parameters used for the processing of the control unit 600, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate. For example, the storage unit 660 can save the library and the recorded data.
- the configuration of the haptic pointer 60 according to the present embodiment has been specifically described above.
- the configuration illustrated in FIG. 14 is an example, and the present embodiment is not limited to this.
- the haptic pointer 60 may be configured without the display unit 640 or the haptic presentation unit 650.
- FIG. 15 is a diagram for explaining the adjustment of the presentation area of the tactile stimulus by the button operation of the tactile pointer 60.
- the presentation area of the tactile stimulus can be changed according to, for example, how far a predetermined button provided on the tactile pointer 60 is pushed. For example, as shown on the left side of FIG. 15, when the push amount of the button is small, the presentation area is small (narrow), and as shown on the right side of FIG. 15, when the push amount is large, the presentation area becomes large (wide).
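The push-amount-to-area mapping can be sketched as a clamped linear interpolation; the radius bounds below are illustrative assumptions, not values from the text.

```python
def presentation_radius(push_amount, r_min=0.02, r_max=0.20):
    """Map a normalized button push amount (0..1) to the radius of the
    presented tactile area: a light press gives a narrow area, a deep
    press a wide one."""
    push = max(0.0, min(1.0, push_amount))  # clamp out-of-range input
    return r_min + push * (r_max - r_min)
```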
- the trajectories 602a and 602b in this case may be visualized by LEDs that emit light from the tactile pointer 60.
- FIG. 16 is a diagram illustrating a case where a movement path of a perceived position that penetrates the body is set.
- with the haptic pointer 60, it is possible to select the type of haptic stimulus presentation; for example, a "penetration mode" and a "surface tracing mode" are assumed.
- when the penetration mode is selected, as shown in FIG. 16, it is possible to input a movement path connecting the front of the body of the user wearing the tactile presentation device 10 to the inside and through to the back, that is, a movement path 603 that penetrates the user's body.
- when the body is traced with the tactile pointer 60 in this state, it is possible to present a sensation as if the body were being cut in two with a sword.
- FIG. 17 is a diagram for explaining a case where the perceptual effect is set using only the haptic pointer 60 without using the haptic presentation device 10. As shown in FIG. 17, it is possible to draw a trajectory with the tip of the tactile pointer 60 in direct contact with the body (an example of a real object) and set the movement path of the perceived position on the body.
- the haptic pointer 60 can acquire the three-dimensional coordinate information of the tip. At this time, a tactile sensation such as vibration or temperature change may be presented at the tip of the tactile pointer 60.
- based on the recorded three-dimensional position information, that is, the trajectory drawn on the body, the tactile pointer 60 or the information processing apparatus 20 determines the output levels (the magnitudes of the tactile control signals) of the surrounding tactile stimulation units 100 so that the tactile stimulus is perceived along the target locus when the tactile presentation device 10 is worn.
- FIG. 18 is a diagram illustrating the setting of the perceptual effect using a doll.
- a locus is drawn with the tactile pointer 60 on a doll 70 (an example of a real object), and the tactile stimulus corresponding to that locus can be fed back (in real time) by the tactile presentation device 10.
- irradiation with a laser pointer from the tactile pointer 60 may be performed so that the perceived position can be visually recognized.
- the tactile pointer 60 acquires the three-dimensional position of the doll by the distance sensor, detects the movement of the tactile pointer 60 by the posture sensor, and detects the locus drawn on the body of the doll based on these.
- the tactile pointer 60 can acquire the irradiation position of the laser pointer irradiated on the doll with a camera and detect a locus drawn on the doll's body.
- the user draws a locus by moving the tactile pointer 60 while pressing the recording button of the tactile pointer 60.
- the haptic pointer 60 stores data for a plurality of channels (that is, output control signals of the haptic stimulation units 100 generated according to the drawn trajectory).
- FIG. 19 is a diagram for explaining various settings of the perceptual effect using a doll.
- the user can irradiate the doll 70 with the laser pointer of the tactile pointer 60 to confirm the position, and input the movement path of the perceived position while viewing the doll 70 as a stand-in for the person wearing the tactile presentation device 10.
- the size of the doll 70 is not particularly limited. When the size of the doll 70 differs from that of the tactile presentation device 10 (or the person wearing it), the tactile pointer 60 performs scale matching between the size of the doll and the size of the tactile presentation device 10 (or the human body).
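Such scale matching can be sketched as a uniform scaling of points drawn on the doll onto the wearer's body by the ratio of heights. Uniform scaling is a simplification; the real mapping might scale each axis or body segment separately.

```python
def map_doll_to_body(p_doll, doll_height, body_height):
    """Scale a 3-D point drawn on the doll onto the wearer's body by the
    ratio of overall heights (illustrative uniform scaling)."""
    s = body_height / doll_height
    return tuple(c * s for c in p_doll)
```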
- the intensity of tactile presentation can be expressed by the intensity of irradiation by the laser pointer.
- the user can operate the button of the tactile pointer 60 to adjust the intensity of the tactile presentation.
- for example, a light (pale) pointer 606a may be projected when the tactile presentation is weak, and a dark (deep) pointer 606b may be projected when it is strong.
- the range of tactile presentation can be expressed by the magnitude of irradiation by the laser pointer.
- the user can adjust the range of the tactile presentation by operating the button of the tactile pointer 60.
- for example, a small pointer 607a may be projected when the range of the tactile presentation is narrow, and a large pointer 607b may be projected when it is wide.
- it is also possible to create a computer program for causing hardware such as a CPU, ROM, and RAM incorporated in the haptic presentation device 10, the information processing apparatus 20, or the haptic pointer 60 described above to exhibit the functions of the haptic presentation device 10, the information processing apparatus 20, or the haptic pointer 60. A computer-readable storage medium storing the computer program is also provided.
- the present technology can also be configured as follows.
- a display control unit for displaying information related to the tactile presentation device and a perceived position in the tactile presentation device designated by the user;
- a generation unit that generates an output control signal to be output to the plurality of tactile stimulation units so as to perceive a tactile stimulation at the perceptual position according to the perceived position and the positions of the plurality of tactile stimulation units provided in the tactile presentation device.
- An information processing apparatus comprising the above. (2) The information processing apparatus according to (1), wherein the information related to the tactile presentation device is information indicating positions of the plurality of tactile stimulation units in the tactile presentation device. (3) The information processing apparatus according to (1), wherein the information related to the tactile presentation device is information indicating an outer shape of the tactile presentation device.
- (11) The information processing apparatus according to (9) or (10), wherein the information processing apparatus performs control to output the generated output control signal to the tactile presentation device in accordance with playback of a moving image showing the movement of the perceived position.
- (12) The information processing apparatus according to any one of (1) to (11), wherein the display control unit displays a recommendation screen in which the positions of the plurality of tactile stimulation units are optimized according to the perceived position designated by the user.
- (13) The information processing apparatus according to any one of (1) to (12), wherein the display control unit displays a recommendation screen in which the perceived position is optimized according to the perceived position designated by the user and the positions of the plurality of tactile stimulation units.
- (14) The information processing apparatus according to any one of (1) to (13), wherein the perceived position and the movement path of the perceived position are input by a controller that acquires a three-dimensional position on a real object.
- (15) An information processing method in which a processor displays information related to a tactile presentation device and a perceived position on the tactile presentation device designated by a user, and generates, according to the perceived position and the positions of a plurality of tactile stimulation units provided in the tactile presentation device, an output control signal to be output to the plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position.
- (16) A program for causing a computer to function as: a display control unit that displays information related to a tactile presentation device and a perceived position on the tactile presentation device designated by a user; and a generation unit that generates, according to the perceived position and the positions of a plurality of tactile stimulation units provided in the tactile presentation device, an output control signal to be output to the plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position.
- 10 Tactile presentation device, 100 Tactile stimulation unit, 102 Audio output unit
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
1. Overview of an information processing system according to an embodiment of the present disclosure
2. Configuration
2-1. Configuration of the information processing apparatus
2-2. Configuration of the tactile presentation device
3. Examples of perceptual-effect setting screens
4. Operation processing
5. Tactile pointer
6. Conclusion
FIG. 1 is a diagram illustrating an overview of an information processing system according to an embodiment of the present disclosure. As shown in FIG. 1, the information processing system according to the present embodiment includes a tactile presentation device 10 that presents tactile stimuli to a user, and an information processing apparatus 20 that performs output settings of the tactile stimuli and the like.
When each of the plurality of tactile stimulation units 100 included in the tactile presentation device 10 generates vibration independently, the generated vibration can be perceived only in the vicinity of that tactile stimulation unit 100. That is, when the individual tactile stimulation units 100 are arranged apart from each other, the vibrations they generate separately can be perceived discretely on the user's body.
Here, in designing tactile signals for various tactile presentation devices, intuitive operation is desired. However, even if the tactile stimulus presented by each tactile stimulation unit 100 (actuator) is displayed as a waveform (tactile signal) as shown in FIG. 2, the actual perceptual effect (which part of the body feels what, and how) remains unclear.
Next, the configurations of the information processing apparatus 20 and the tactile presentation device 10 according to the present embodiment will be described in detail with reference to FIG. 4.
As shown in FIG. 4, the information processing apparatus 20 according to the present embodiment includes a control unit 200, a communication unit 210, an operation input unit 220, a display unit 230, and a storage unit 240.
The control unit 200 functions as an arithmetic processing device and a control device, and controls the overall operation of the information processing apparatus 20 according to various programs. The control unit 200 is realized by an electronic circuit such as a CPU (Central Processing Unit) or a microprocessor. The control unit 200 may also include a ROM (Read Only Memory) that stores programs, calculation parameters, and the like to be used, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
The perceived position/intensity determination unit 201 functions as a generation unit that, according to a designated perceived position and perceived intensity, determines (calculates) the vibration intensities for the plurality of tactile stimulation units 100 corresponding to the target perceived position (so that the designated perceived intensity is perceived at the designated perceived position) and generates output control signals to be output to the plurality of tactile stimulation units 100. The designated perceived position may include a movement path of the perceived position.
The screen generation unit 202 can generate a setting screen for the target perceived position and perceived intensity. On the setting screen, as information related to the tactile presentation device 10, an image showing the positions of the plurality of tactile stimulation units 100 provided in the tactile presentation device 10, or an image showing the outer shape of the tactile presentation device 10, is displayed, and the user can designate the position (perceived position) of the tactile stimulus to be presented. The positions of the plurality of tactile stimulation units 100 may be virtual, so that the user can arrange them freely. When the positions of the plurality of tactile stimulation units 100 are preset and known, an image showing the position of each tactile stimulation unit 100 is displayed. When the positions are preset and known but are set not to be shown to the user (for example, when the position of each tactile stimulation unit 100 is a trade secret), an image showing only the outer shape of the tactile presentation device 10 may be displayed. On the setting screen, a movement path of the perceived position can be input, and a plurality of perceived positions (movement paths of a plurality of perceived positions) can also be set. It is also possible to set the perceived position and perceived intensity so that a tactile stimulus is generated at a predetermined position at a predetermined timing corresponding to the playback of predetermined content. The designation of the perceived position and perceived intensity on such a setting screen can be performed with, for example, a mouse operation, a touch operation, or a controller such as a 3D pointer. Specific examples of the setting screens according to the present embodiment will be described later with reference to FIGS. 5 to 11.
The output control unit 203 controls the output of tactile stimuli to the tactile presentation device 10 according to the determination made by the perceived position/intensity determination unit 201. This makes it possible for the user to confirm the effect of the designated tactile stimulus (the perceptual effect), for example by actually wearing and experiencing the tactile presentation device 10. Specifically, the output control unit 203 outputs the generated output control signals to the predetermined plurality of tactile stimulation units 100 determined by the perceived position/intensity determination unit 201 (controls the generation of vibration). The output control unit 203 can also control the playback of content (video) reproduced on a display device (a display, HMD, projector, PC, smartphone, or the like) and, in accordance with the playback of that content, control the output of tactile stimuli from the tactile presentation device 10 at the set timings. The output control unit 203 can further control the audio output unit 102 of the tactile presentation device 10 to reproduce sound, such as an impact sound, corresponding to the presented tactile stimulus. In addition, the output control unit 203 can control the display unit 230 to display the various screens generated by the screen generation unit 202.
The communication unit 210 transmits and receives information to and from other devices. For example, under the control of the output control unit 203, the communication unit 210 transmits control signals for outputting tactile stimuli to each of the plurality of tactile stimulation units 100 (or to the tactile presentation device 10). Also under the control of the output control unit 203, the communication unit 210 transmits a control signal for displaying the image to be reproduced to a display device (not shown), and transmits control signals for outputting the audio to be reproduced to each of the plurality of audio output units 102 (or to the tactile presentation device 10).
The operation input unit 220 receives an operation instruction from the user and outputs the operation content to the control unit 200. The operation input unit 220 may be a touch sensor, a pressure sensor, or a proximity sensor. Alternatively, the operation input unit 220 may have a physical configuration such as a keyboard, a mouse, buttons, switches, and levers.
The display unit 230 is a display device that outputs setting screens for configuring perceptual effects and the like. The display unit 230 may be, for example, a display device such as a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display.
The storage unit 240 is realized by a ROM (Read Only Memory) that stores programs, calculation parameters, and the like used in the processing of the control unit 200, and a RAM (Random Access Memory) that temporarily stores parameters that change as appropriate.
Next, the configuration of the tactile presentation device 10 according to the present embodiment will be described with reference to FIG. 4. As shown in FIG. 4, the tactile presentation device 10 according to the present embodiment includes a plurality of tactile stimulation units 100a to 100c, a control unit 110, a communication unit 120, and an audio output unit 102.
Next, examples of perceptual-effect setting screens according to the present embodiment will be described in detail with reference to FIGS. 5 to 11.
In the example described above, the arrangement of the plurality of tactile stimulation units 100 is determined before the perceived position is set, but the present embodiment is not limited to this; the information processing apparatus 20 can also recommend an optimal arrangement of the plurality of tactile stimulation units 100 based on an arbitrary perceived position and movement path designated by the user. The optimal arrangement is assumed to be one that at least enables tactile presentation at the set perceived position and that can also realize, for example, power saving and load reduction. In doing so, the user may be allowed to input information such as the number of tactile stimulation units 100 that can be used and constraints on the range in which they can be installed.
Next, the operation processing of the information processing system according to the present embodiment will be described with reference to FIG. 12. FIG. 12 is a flowchart showing the tactile stimulus determination processing according to the present embodiment.
FIG. 13 is a diagram illustrating the tactile pointer 60 according to the present embodiment. As shown in FIG. 13, by moving the tactile pointer 60 toward the tactile presentation device 10, it is possible to intuitively draw a locus 601 on the tactile presentation device 10 (a real object) and set the perceived position and its movement path. The locus 601 drawn on the tactile presentation device 10 may be displayed on the display unit 640, or the locus 601 on the tactile presentation device 10 may be visualized by emitting visible light such as from an LED.
Next, FIG. 14 shows an example of the configuration of the tactile pointer 60 according to the present embodiment. As shown in FIG. 14, the tactile pointer 60 includes a control unit 600, a communication unit 610, an operation input unit 620, a sensor 630, a display unit 640, a tactile presentation unit 650, and a storage unit 660.
Next, specific examples of setting operations for perceptual effects using the tactile pointer 60 according to the present embodiment will be described with reference to the drawings.
As described above, the information processing system according to the embodiment of the present disclosure enables intuitive operation in setting the perceived position of a tactile stimulus.
(1)
触覚提示デバイスに関連する情報と、ユーザにより指定された前記触覚提示デバイスにおける知覚位置と、を表示する表示制御部と、
前記知覚位置および前記触覚提示デバイスに設けられた複数の触覚刺激部の位置に応じて、前記知覚位置で触覚刺激を知覚するよう前記複数の触覚刺激部に出力する出力制御信号を生成する生成部と、
を備える、情報処理装置。
(2)
前記触覚提示デバイスに関連する情報は、前記触覚提示デバイスにおける複数の触覚刺激部の位置を示す情報である、前記(1)に記載の情報処理装置。
(3)
前記触覚提示デバイスに関連する情報は、前記触覚提示デバイスの外形を示す情報である、前記(1)に記載の情報処理装置。
(4)
前記生成部は、前記ユーザに指定された前記知覚位置と知覚強度に応じて、前記出力制御信号を生成する、前記(1)~(3)のいずれか1項に記載の情報処理装置。
(5)
前記生成部は、前記ユーザに選択された種類の触覚刺激が前記知覚位置で知覚されるよう、前記出力制御信号を生成する、前記(1)~(4)のいずれか1項に記載の情報処理装置。
(6)
前記表示制御部は、複数種類の触覚刺激の選択画面を表示する、前記(5)に記載の情報処理装置。
(7)
前記表示制御部は、前記ユーザに指定された前記知覚位置に応じて、適した触覚刺激の種類を推薦する推薦画面を表示する、前記(5)に記載の情報処理装置。
(8)
前記表示制御部は、前記知覚位置の移動経路を示す表示を行う、前記(1)~(7)のいずれか1項に記載の情報処理装置。
(9)
前記表示制御部は、
前記知覚位置の移動を示す動画と、
前記動画のシークバーと、を表示する、前記(8)に記載の情報処理装置。
(10)
前記表示制御部は、
前記知覚位置で触覚刺激が提示される際に再生される動画コンテンツを併せて表示する、前記(9)に記載の情報処理装置。
(11)
前記情報処理装置は、
前記知覚位置の移動を示す動画の再生に合わせて、前記生成した出力制御信号を前記触覚提示デバイスに出力する制御を行う、前記(9)または(10)に記載の情報処理装置。
(12)
前記表示制御部は、前記ユーザに指定された知覚位置に応じて、前記複数の触覚刺激部の位置を最適化した推薦画面を表示する、前記(1)~(11)のいずれか1項に記載の情報処理装置。
(13)
前記表示制御部は、前記ユーザに指定された知覚位置と、前記複数の触覚刺激部の位置に応じて、前記知覚位置を最適化した推薦画面を表示する、前記(1)~(12)のいずれか1項に記載の情報処理装置。
(14)
前記知覚位置および前記知覚位置の移動経路は、実物体上の3次元位置を取得するコントローラにより入力される、前記(1)~(13)のいずれか1項に記載の情報処理装置。
(15)
プロセッサが、
触覚提示デバイスに関連する情報と、ユーザにより指定された前記触覚提示デバイスにおける知覚位置と、を表示することと、
前記知覚位置および前記触覚提示デバイスに設けられた複数の触覚刺激部の位置に応じて、前記知覚位置で触覚刺激を知覚するよう前記複数の触覚刺激部に出力する出力制御信号を生成することと、
を含む、情報処理方法。
(16)
コンピュータを、
触覚提示デバイスに関連する情報と、ユーザにより指定された前記触覚提示デバイスにおける知覚位置と、を表示する表示制御部と、
前記知覚位置および前記触覚提示デバイスに設けられた複数の触覚刺激部の位置に応じて、前記知覚位置で触覚刺激を知覚するよう前記複数の触覚刺激部に出力する出力制御信号を生成する生成部と、
として機能させるための、プログラム。
100 Tactile stimulation unit
102 Audio output unit
110 Control unit
120 Communication unit
20 Information processing apparatus
200 Control unit
201 Perceived position/intensity determination unit
202 Screen generation unit
203 Output control unit
210 Communication unit
220 Operation input unit
230 Display unit
240 Storage unit
60 Tactile pointer
600 Control unit
610 Communication unit
620 Operation input unit
630 Sensor
640 Display unit
650 Tactile presentation unit
660 Storage unit
Claims (16)
- A display control unit that displays information related to a tactile presentation device and a perceived position on the tactile presentation device designated by a user; and
- a generation unit that generates, according to the perceived position and the positions of a plurality of tactile stimulation units provided in the tactile presentation device, an output control signal to be output to the plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position;
- an information processing apparatus comprising the above. - The information processing apparatus according to claim 1, wherein the information related to the tactile presentation device is information indicating positions of the plurality of tactile stimulation units in the tactile presentation device.
- The information processing apparatus according to claim 1, wherein the information related to the tactile presentation device is information indicating an outer shape of the tactile presentation device.
- The information processing apparatus according to claim 1, wherein the generation unit generates the output control signal according to the perceived position and a perceived intensity designated by the user.
- The information processing apparatus according to claim 1, wherein the generation unit generates the output control signal so that a type of tactile stimulus selected by the user is perceived at the perceived position.
- The information processing apparatus according to claim 5, wherein the display control unit displays a selection screen for a plurality of types of tactile stimuli.
- The information processing apparatus according to claim 5, wherein the display control unit displays a recommendation screen that recommends a suitable type of tactile stimulus according to the perceived position designated by the user.
- The information processing apparatus according to claim 1, wherein the display control unit performs display indicating a movement path of the perceived position.
- The information processing apparatus according to claim 8, wherein the display control unit displays
- a moving image showing the movement of the perceived position, and
- a seek bar for the moving image. - The information processing apparatus according to claim 9, wherein the display control unit
- also displays video content that is reproduced when the tactile stimulus is presented at the perceived position. - The information processing apparatus according to claim 9, wherein the information processing apparatus
- performs control to output the generated output control signal to the tactile presentation device in accordance with playback of the moving image showing the movement of the perceived position. - The information processing apparatus according to claim 1, wherein the display control unit displays a recommendation screen in which the positions of the plurality of tactile stimulation units are optimized according to the perceived position designated by the user.
- The information processing apparatus according to claim 1, wherein the display control unit displays a recommendation screen in which the perceived position is optimized according to the perceived position designated by the user and the positions of the plurality of tactile stimulation units.
- The information processing apparatus according to claim 1, wherein the perceived position and the movement path of the perceived position are input by a controller that acquires a three-dimensional position on a real object.
- An information processing method in which a processor
- displays information related to a tactile presentation device and a perceived position on the tactile presentation device designated by a user, and
- generates, according to the perceived position and the positions of a plurality of tactile stimulation units provided in the tactile presentation device, an output control signal to be output to the plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position. - A program for causing a computer to function as:
- a display control unit that displays information related to a tactile presentation device and a perceived position on the tactile presentation device designated by a user; and
- a generation unit that generates, according to the perceived position and the positions of a plurality of tactile stimulation units provided in the tactile presentation device, an output control signal to be output to the plurality of tactile stimulation units so that a tactile stimulus is perceived at the perceived position.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP18906864.6A EP3757721A4 (en) | 2018-02-20 | 2018-12-11 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM |
| JP2020502043A JP7314926B2 (ja) | 2018-02-20 | 2018-12-11 | 情報処理装置、情報処理方法、およびプログラム |
| CN201880089356.3A CN111712779B (zh) | 2018-02-20 | 2018-12-11 | 信息处理装置、信息处理方法和程序 |
| US16/969,695 US11334226B2 (en) | 2018-02-20 | 2018-12-11 | Information processing device, information processing method, and program |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018027689 | 2018-02-20 | ||
| JP2018-027689 | 2018-02-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019163260A1 true WO2019163260A1 (ja) | 2019-08-29 |
Family
ID=67687521
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/045563 Ceased WO2019163260A1 (ja) | 2018-02-20 | 2018-12-11 | Information processing device, information processing method, and program |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US11334226B2 (ja) |
| EP (1) | EP3757721A4 (ja) |
| JP (1) | JP7314926B2 (ja) |
| CN (1) | CN111712779B (ja) |
| WO (1) | WO2019163260A1 (ja) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021148712A (ja) * | 2020-03-23 | 2021-09-27 | Casio Computer Co., Ltd. | Position measurement system, position measurement device, position measurement method, and program |
| JP2021149585A (ja) * | 2020-03-19 | 2021-09-27 | Toyoda Gosei Co., Ltd. | Actuator device |
| WO2021210341A1 (ja) * | 2020-04-14 | 2021-10-21 | Sony Group Corporation | Information processing device and information processing method |
| JPWO2022118746A1 (ja) * | 2020-12-04 | ||
| JP2023007801A (ja) * | 2021-07-02 | 2023-01-19 | Nippon Telegraph and Telephone Corporation | Vibration signal generation device, vibration presentation device, methods therefor, and program |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021026618A (ja) * | 2019-08-07 | 2021-02-22 | Sony Corporation | Generation device, generation method, program, and tactile presentation device |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014203960A1 (ja) * | 2013-06-21 | 2014-12-24 | Nikon Corporation | Vibration data generation program and vibration data generation device |
| JP2015111417A (ja) * | 2013-11-14 | 2015-06-18 | Immersion Corporation | Haptic spatialization system |
| JP2016001472A (ja) * | 2014-06-09 | 2016-01-07 | Immersion Corporation | Haptic devices and methods for providing haptic effects via an audio track |
| US20160291694A1 (en) * | 2015-04-03 | 2016-10-06 | Disney Enterprises, Inc. | Haptic authoring tool for animated haptic media production |
| WO2017009181A1 (en) * | 2015-07-13 | 2017-01-19 | Thomson Licensing | Method and apparatus for providing haptic feedback and interactivity based on user haptic space (hapspace) |
| WO2017053761A1 (en) * | 2015-09-25 | 2017-03-30 | Immersion Corporation | Haptic effects design system |
| WO2018008217A1 (ja) * | 2016-07-07 | 2018-01-11 | Sony Corporation | Information processing device, information processing method, and program |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102016759A (zh) * | 2008-05-09 | 2011-04-13 | Koninklijke Philips Electronics N.V. | Method and system for conveying an emotion |
| KR20100078294A (ko) | 2008-12-30 | 2010-07-08 | Samsung Electronics Co., Ltd. | Vibration pattern generation method and portable terminal using the same |
| US8540571B2 (en) * | 2010-03-31 | 2013-09-24 | Immersion Corporation | System and method for providing haptic stimulus based on position |
| US9880621B2 (en) * | 2010-04-08 | 2018-01-30 | Disney Enterprises, Inc. | Generating virtual stimulation devices and illusory sensations using tactile display technology |
| US8872762B2 (en) * | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
| ES2705526T3 (es) * | 2012-09-11 | 2019-03-25 | Life Corp Sa | Wearable communication platform |
| US9619029B2 (en) * | 2013-11-14 | 2017-04-11 | Immersion Corporation | Haptic trigger control system |
| US10532181B2 (en) * | 2014-02-04 | 2020-01-14 | Team Turquoise Ltd. | Wearable apparatus |
| CN106102848B (zh) * | 2014-03-26 | 2020-03-13 | Sony Corporation | Sensation introduction device, sensation introduction system, and sensation introduction method |
| US20170098350A1 (en) * | 2015-05-15 | 2017-04-06 | Mick Ebeling | Vibrotactile control software systems and methods |
| WO2016007798A2 (en) * | 2014-07-09 | 2016-01-14 | Akari Systems, Inc. | Wearable therapeutic light source |
| CN105589594B (zh) * | 2014-11-06 | 2019-12-31 | Tianma Micro-Electronics Co., Ltd. | Electronic device and operation control method of the electronic device |
| US20180036531A1 (en) * | 2015-02-18 | 2018-02-08 | Wearable Life Science Gmbh | Device, system and method for the transmission of stimuli |
| WO2017196666A1 (en) * | 2016-05-09 | 2017-11-16 | Subpac, Inc. | Tactile sound device having active feedback system |
| US9797729B1 (en) * | 2016-10-25 | 2017-10-24 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for automatic fit adjustment of a wearable device |
| US10744058B2 (en) * | 2017-04-20 | 2020-08-18 | Neosensory, Inc. | Method and system for providing information to a user |
| CN107320095B (zh) * | 2017-06-30 | 2020-06-23 | Lenovo (Beijing) Co., Ltd. | Electrocardiogram monitoring method and electrocardiogram monitoring device |
2018
- 2018-12-11 WO PCT/JP2018/045563 patent/WO2019163260A1/ja not_active Ceased
- 2018-12-11 JP JP2020502043A patent/JP7314926B2/ja active Active
- 2018-12-11 EP EP18906864.6A patent/EP3757721A4/en not_active Withdrawn
- 2018-12-11 US US16/969,695 patent/US11334226B2/en active Active
- 2018-12-11 CN CN201880089356.3A patent/CN111712779B/zh active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014203960A1 (ja) * | 2013-06-21 | 2014-12-24 | Nikon Corporation | Vibration data generation program and vibration data generation device |
| JP2015111417A (ja) * | 2013-11-14 | 2015-06-18 | Immersion Corporation | Haptic spatialization system |
| JP2016001472A (ja) * | 2014-06-09 | 2016-01-07 | Immersion Corporation | Haptic devices and methods for providing haptic effects via an audio track |
| US20160291694A1 (en) * | 2015-04-03 | 2016-10-06 | Disney Enterprises, Inc. | Haptic authoring tool for animated haptic media production |
| WO2017009181A1 (en) * | 2015-07-13 | 2017-01-19 | Thomson Licensing | Method and apparatus for providing haptic feedback and interactivity based on user haptic space (hapspace) |
| WO2017053761A1 (en) * | 2015-09-25 | 2017-03-30 | Immersion Corporation | Haptic effects design system |
| WO2018008217A1 (ja) * | 2016-07-07 | 2018-01-11 | Sony Corporation | Information processing device, information processing method, and program |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3757721A4 * |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021149585A (ja) * | 2020-03-19 | 2021-09-27 | Toyoda Gosei Co., Ltd. | Actuator device |
| JP7205512B2 (ja) | 2020-03-19 | 2023-01-17 | Toyoda Gosei Co., Ltd. | Actuator device |
| JP2021148712A (ja) * | 2020-03-23 | 2021-09-27 | Casio Computer Co., Ltd. | Position measurement system, position measurement device, position measurement method, and program |
| JP7006714B2 (ja) | 2020-03-23 | 2022-01-24 | Casio Computer Co., Ltd. | Position measurement system, position measurement device, position measurement method, and program |
| WO2021210341A1 (ja) * | 2020-04-14 | 2021-10-21 | Sony Group Corporation | Information processing device and information processing method |
| JPWO2022118746A1 (ja) * | 2020-12-04 | ||
| WO2022118746A1 (ja) * | 2020-12-04 | 2022-06-09 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
| US12422929B2 (en) | 2020-12-04 | 2025-09-23 | Sony Group Corporation | Information processing device, information processing method, program, and information processing system |
| JP2023007801A (ja) * | 2021-07-02 | 2023-01-19 | Nippon Telegraph and Telephone Corporation | Vibration signal generation device, vibration presentation device, methods therefor, and program |
| JP7642942B2 (ja) | 2021-07-02 | 2025-03-11 | Nippon Telegraph and Telephone Corporation | Vibration signal generation device, vibration presentation device, methods therefor, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2019163260A1 (ja) | 2021-02-04 |
| CN111712779A (zh) | 2020-09-25 |
| US11334226B2 (en) | 2022-05-17 |
| US20210004132A1 (en) | 2021-01-07 |
| JP7314926B2 (ja) | 2023-07-26 |
| EP3757721A4 (en) | 2021-04-21 |
| CN111712779B (zh) | 2025-04-15 |
| EP3757721A1 (en) | 2020-12-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12135838B2 (en) | Information processing device and information processing method for tactile stimulation | |
| JP7314926B2 (ja) | Information processing device, information processing method, and program | |
| US11347311B2 (en) | Systems and methods for providing haptic feedback for remote interactions | |
| US20170221379A1 (en) | Information terminal, motion evaluating system, motion evaluating method, and recording medium | |
| US7732694B2 (en) | Portable music player with synchronized transmissive visual overlays | |
| CN109984911B (zh) | Massage device with virtual reality function and control method therefor | |
| JP2019080920A (ja) | Visual display method and device for compensating audio information, recording medium, program, and electronic apparatus | |
| CN117311494A (zh) | Light-emitting user input device | |
| WO2018008217A1 (ja) | Information processing device, information processing method, and program | |
| JP6908053B2 (ja) | Information processing device, information processing method, and program | |
| JP6834614B2 (ja) | Information processing device, information processing method, and program | |
| KR101415944B1 (ko) | Virtual golf simulation apparatus and method for providing stereophonic sound for weather conditions | |
| KR102254705B1 (ko) | Terminal device and control method therefor | |
| EP4054297B1 (en) | Image display system, display control method, and light emission control method | |
| KR20210075082A (ko) | Information processing device, information processing method, and program | |
| US11190874B2 (en) | Information processing device and information processing method | |
| US11228855B2 (en) | Information processing device and information processing method | |
| JP2021073749A (ja) | Information processing device, information processing method, and program | |
| US20240367035A1 (en) | Information processing method, information processing system and computer program | |
| US11392203B2 (en) | Information processing apparatus, information processing method, and program | |
| US20250287168A1 (en) | 3d audio generating device, 3d audio reproduction device, 3d audio generation method, 3d audio generating program, and memory medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18906864 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2020502043 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2018906864 Country of ref document: EP Effective date: 20200921 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 201880089356.3 Country of ref document: CN |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 2018906864 Country of ref document: EP |