
WO2020049705A1 - Content providing device, content providing system, content providing server device, content acquisition device, content providing method, and content providing program - Google Patents


Info

Publication number
WO2020049705A1
WO2020049705A1 (PCT/JP2018/033125)
Authority
WO
WIPO (PCT)
Prior art keywords
content
tactile
unit
haptic
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2018/033125
Other languages
English (en)
Japanese (ja)
Inventor
理絵子 鈴木 (Rieko Suzuki)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Facetherapie Inc
Original Assignee
Facetherapie Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Facetherapie Inc filed Critical Facetherapie Inc
Priority to PCT/JP2018/033125 (WO2020049705A1)
Priority to JP2019557643A (JP6644293B1)
Publication of WO2020049705A1
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00: Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units

Definitions

  • The present invention relates to a content providing device, a content providing system, a content providing server device, a content acquisition device, a content providing method, and a content providing program, and in particular to vibration information generated so as to make a user imagine a predetermined tactile sensation. More specifically, the present invention relates to an apparatus for providing or using haptic content, and to a system including the apparatus.
  • The present inventor has devised techniques for generating haptic content that can convey truly meaningful information as a tactile sensation, for example, a message that is meaningful to a recipient, by means of a haptic effect (see, for example, Patent Documents 1 and 2).
  • Patent Document 1: JP 2018-29802 A. Patent Document 2: JP 2018-32236 A.
  • Patent Documents 1 and 2 disclose techniques for generating haptic content, but do not disclose in detail how to provide and use haptic content. For example, they do not disclose how to provide a user with a plurality of types of haptic content generated to make the user imagine various tactile sensations. Therefore, an object of the present invention is to provide an environment in which a user can easily use tactile content.
  • In order to achieve the above object, the present invention provides a tactile mode designation unit that designates one of a plurality of tactile modes having different tactile characteristics, acquires haptic content, which is vibration information generated so as to make the user imagine a predetermined tactile sensation, according to the tactile mode designated by the tactile mode designation unit, and supplies the acquired haptic content to a content use device.
  • According to the present invention configured as described above, haptic content that produces a specific tactile sensation through vibration can be provided to the user appropriately according to the tactile mode. This provides an environment in which the user can easily use tactile content.
  • FIG. 1 is a diagram schematically illustrating the relationship between a content supply device and a content use device.
  • FIGS. 2A to 2C are block diagrams illustrating functional configuration examples of the content supply device according to the first embodiment.
  • A block diagram showing a functional configuration example of the tactile mode designation unit.
  • FIG. 9 is a diagram for describing an example of processing by the tactile parameter generation unit.
  • A diagram showing an example of tactile mode designation by the mode designation unit.
  • FIG. 3 is a block diagram illustrating a functional configuration example of a haptic content acquisition unit according to a first configuration example.
  • FIG. 5 is a flowchart illustrating an operation example of the content supply device using the haptic content acquisition unit according to the first configuration example.
  • A block diagram showing a functional configuration example of the haptic content acquisition unit according to the second configuration example.
  • A diagram schematically showing the processing performed by the vibration waveform processing unit.
  • A block diagram showing a functional configuration example of the vibration waveform processing unit.
  • FIG. 5 is a diagram for explaining the processing of the feature extraction unit and the weight information generation unit.
  • A diagram showing the waveform information of the haptic content processed by the vibration waveform processing unit together with the waveform information of the audio content.
  • FIG. 9 is a flowchart illustrating an operation example of the content supply device using the haptic content acquisition unit according to the second configuration example.
  • A block diagram showing a functional configuration example of the haptic content acquisition unit according to the third configuration example.
  • FIG. 4 is a diagram schematically illustrating an example of tactile content stored in the tactile content storage unit.
  • A flowchart showing an operation example of the content supply device using the haptic content acquisition unit according to the third configuration example.
  • FIG. 9 is a diagram for describing the repetition pattern determination algorithm used by the repetition extraction unit.
  • A flowchart showing an operation example of the content supply device using the haptic content acquisition unit according to the fourth configuration example.
  • FIG. 7 is a diagram for describing an example of processing contents by a tactile parameter generation unit when image information is used as target information.
  • FIG. 9 is a diagram for describing an example of processing by the tactile parameter generation unit when text information is used as the target information.
  • A diagram for explaining an example of processing by the tactile parameter generation unit when massage operation information is input.
  • A block diagram showing a functional configuration example of the content supply device according to the second embodiment.
  • The following describes a content supply device for supplying tactile content, which is vibration information generated to make a user imagine a predetermined tactile sensation, to a content use device.
  • FIG. 1 is a diagram schematically showing a relationship between a content supply device 100 according to the present embodiment and a content use device 500 that uses tactile content supplied from the content supply device 100.
  • the content supply device 100 of the present embodiment can be connected to the content use device 500 by wire.
  • the content supply device 100 supplies the haptic content to the content use device 500 via a wired cable or the like.
  • the content supply device 100 of the present embodiment can be wirelessly connected to the content use device 500.
  • the content supply device 100 supplies the haptic content to the content use device 500 using communication according to a predetermined wireless standard.
  • the content supply device 100 can be built in a content using device or attached to the content using device 500.
  • the content supply device 100 supplies the vibration information of the haptic content to the vibration imparting unit of the content use device 500 via an internal bus or a signal line of the content use device 500.
  • The content use device 500 is a device having at least a function of performing a given operation using vibration as a medium.
  • Examples include massage devices, beauty devices, in-vehicle devices, home appliances, audio devices, virtual-experience devices for VR (virtual reality), AR (augmented reality), and MR (mixed reality), medical devices, welfare devices, toys, game machines and their controllers, switches and buttons, vibration alarms, touch panels with vibration feedback, and the like.
  • the devices listed here are only examples of the content use device 500, and the present invention is not limited to these.
  • The content use device 500 may be a device that performs a given operation exclusively using vibration as a medium, or a device that additionally has a function of performing a given operation using media other than vibration (for example, audio or video).
  • The content supply device 100 need only have at least a function of storing haptic content (including the case where content received from an external device is stored or temporarily stored, as described later) and supplying the stored haptic content to the content use device 500. Alternatively, the content supply device 100 may store information necessary for generating haptic content (again including the case where the information received from an external device is stored or temporarily stored), generate the haptic content from that information, and supply it to the content use device 500.
  • As the content supply device 100 having the above functions, for example, a personal computer, a smartphone, a tablet terminal, a server device, an in-vehicle device, or a game machine can be used, as can a dedicated terminal created for providing the content of the present embodiment. In a configuration where the content use device 500 incorporates or mounts the content supply device 100 as shown in FIG. 1, the content supply device 100 may be provided in the massage device, beauty device, in-vehicle device, home appliance, audio device, VR/AR/MR device, medical device, welfare device, toy, game machine, personal computer, smartphone, tablet terminal, server device, or the like.
  • FIGS. 2A to 2C are block diagrams showing a functional configuration example of the content supply device 100 according to the first embodiment.
  • the content supply device 100 supplies audio content to the content use device 500 in addition to the tactile content.
  • Audio content is content that includes audio, and is not limited to content whose main data is audio alone, such as music, speech, sound effects, and alarm sounds; it may also be audio attached to other content, such as the audio track of a video.
  • The content supply device 100 has, as its functional configuration, a haptic content acquisition unit 11, a haptic content supply unit 12, a tactile mode designation unit 13, an audio content acquisition unit 14, and an audio content supply unit 15.
  • Each of the functional blocks 11 to 15 can be configured by any of hardware, DSP (Digital Signal Processor), and software.
  • For example, when configured by software, each of the functional blocks 11 to 15 is actually realized by a computer including a CPU, a RAM, a ROM, and the like, through the operation of a content providing program stored in a storage medium such as the RAM, the ROM, a hard disk, or a semiconductor memory.
  • the audio content acquisition unit 14 acquires audio content in response to an instruction from the user for selecting audio content.
  • the selection of the audio content includes selecting a content such as a moving image or a video accompanied by the audio content.
  • Selection of the original information for speech synthesis (for example, selection of text information when a sentence or the like is synthesized into speech, or selection of a function button whose description is read out using synthesized speech) is also included as an example of selecting audio content. In this case, the synthesized speech corresponds to the audio content.
  • the audio content acquisition unit 14 includes an audio content reading unit 141 and an audio content storage unit 142.
  • the audio content storage unit 142 is a non-volatile storage medium such as a hard disk, a semiconductor memory, or the like, and can continue to store audio content unless a user explicitly gives an instruction for deletion.
  • the audio content reading unit 141 acquires audio content by reading audio content selected by a user operation from one or more audio contents stored in the audio content storage unit 142.
  • The selection of audio content by a user operation can be performed, for example, by the user operating an operation unit provided in the content supply device 100 or the content use device 500, or through a predetermined graphical user interface displayed by the content supply device 100 or the content use device 500.
  • FIG. 2A shows a form in which one or more audio contents are stored in the audio content storage unit 142 in advance.
  • FIG. 2B shows a form in which the content supply device 100 is configured to be connectable to an external device (for example, a server) 700 via a communication network 600, and is provided with a function of receiving desired audio content from the external device 700 via the communication network 600 and storing it in the audio content storage unit 142. That is, in the example of FIG. 2B, the content supply device 100 further includes an audio content recording unit 16.
  • the audio content recording unit 16 receives audio content from the external device 700 via the communication network 600 and stores the audio content in the audio content storage unit 142. The user can download the favorite audio content from the external device 700 and save it in the audio content storage unit 142.
  • the audio content storage unit 142 ' may be a storage medium for temporarily storing audio content received from the external device 700.
  • the audio content recording unit 16 notifies the external device 700 of the audio content selected by the user operation.
  • the audio content reading unit 141 ′ reads the audio content transmitted from the external device 700 as a response to the notification and temporarily stored in the audio content storage unit 142 ′, and supplies the audio content to the audio content supply unit 15.
  • The reading of the audio content temporarily stored in the audio content storage unit 142′ by the audio content reading unit 141′, and its supply to the content use device 500 by the audio content supply unit 15, are performed as a series of processes together with the process of receiving the audio content from the external device 700 and storing it in the audio content storage unit 142′.
  • the audio content supply unit 15 supplies the audio content acquired by the audio content acquisition unit 14 to the content use device 500.
  • the content use device 500 outputs a sound to a sound output unit such as a speaker based on the sound content supplied from the sound content supply unit 15.
  • the content using device 500 outputs the audio to an audio output unit such as a speaker based on the audio content, and outputs the video to a display.
  • The tactile mode designation unit 13 designates any one of a plurality of tactile modes having different tactile characteristics.
  • the tactile mode designating unit 13 analyzes the audio content acquired by the audio content acquiring unit 14, and designates the tactile mode according to the analysis result. That is, the tactile mode designating unit 13 analyzes a feature of the waveform information of the audio content, and designates a tactile mode representing a tactile quality according to the feature. The details of this analysis processing will be described later.
  • the haptic content acquisition unit 11 acquires haptic content according to the tactile mode specified by the tactile mode specification unit 13. The details of the tactile content acquisition processing (there are a plurality of patterns) will be described later.
  • the haptic content supply unit 12 supplies the haptic content acquired by the haptic content acquisition unit 11 to the content use device 500.
  • the content use device 500 executes a given operation using vibration as a medium based on the haptic content supplied from the haptic content supply unit 12.
  • the haptic content acquisition unit 11 automatically acquires the haptic content when, for example, the user performs an operation of selecting the audio content.
  • the acquisition of the audio content by the audio content acquisition unit 14 and the acquisition of the haptic content by the haptic content acquisition unit 11 are performed according to the operation of selecting the audio content.
  • the haptic content acquisition unit 11 may acquire the haptic content when an operation for instructing acquisition of the haptic content is performed by the user separately from the operation of selecting the audio content. In this case, the acquisition of the audio content by the audio content acquisition unit 14 and the acquisition of the haptic content by the haptic content acquisition unit 11 are performed asynchronously.
  • When the acquisition of the haptic content is automatically performed in response to the selection operation of the audio content as in the former case, the supply of the haptic content from the content supply device 100 to the content use device 500 and the supply of the audio content from the content supply device 100 to the content use device 500 can be synchronized. Accordingly, simply by selecting audio content, the content use device 500 can simultaneously apply vibration based on the tactile content and output audio based on the audio content.
  • the audio content is used only for the analysis for specifying the tactile mode, and is not supplied to the content using device 500.
  • the content use device 500 has only a function of operating using vibration as a medium, a form in which the content supply device 100 is configured as a dedicated device for such a content use device 500 can be considered. In this case, the audio content supply unit 15 may be omitted.
  • As described above, the haptic content acquisition unit 11 acquires the haptic content according to the tactile mode, and the tactile mode is designated by analyzing the audio content. Therefore, even when the operation for instructing acquisition of the tactile content is performed separately from the operation of selecting the audio content, the operation of selecting the audio content is basically required. However, for example, when a default tactile mode is set and only the acquisition instruction operation for the tactile content is performed, tactile content according to the default tactile mode may be acquired.
  • Alternatively, combined audio content may be generated from the haptic content and the audio content.
  • the content supply device 100 generates audio content using the haptic content acquired by the haptic content acquisition unit 11 as first channel information and the audio content acquired by the audio content acquisition unit 14 as second channel information, This audio content may be supplied to the content use device 500.
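  • As an illustrative sketch only (not part of the patent disclosure), packing the haptic content as a first channel and the audio content as a second channel might look like the following; the function name, list-based samples, and zero padding are assumptions:

```python
def combine_channels(haptic, audio, fill=0.0):
    """Interleave haptic samples (channel 1) with audio samples (channel 2).

    The shorter signal is padded with `fill` so both channels have the same
    length; the result is an interleaved two-channel sample list.
    """
    n = max(len(haptic), len(audio))
    haptic = list(haptic) + [fill] * (n - len(haptic))
    audio = list(audio) + [fill] * (n - len(audio))
    # Interleave: [h0, a0, h1, a1, ...]
    return [sample for pair in zip(haptic, audio) for sample in pair]
```

  • A device accepting such two-channel input could then route channel 1 to its vibration imparting unit and channel 2 to its speaker.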
  • the configuration may be such that the haptic content is supplied to the content using device 500, while the audio content is supplied to a content using device (not shown) different from the content using device 500.
  • a configuration is possible in which tactile content is supplied to a massage device, and audio content is supplied to an audio reproducing device.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of the tactile-mode specifying unit 13.
  • FIGS. 3A and 3B show two examples of the structure of the tactile-mode specifying unit 13.
  • The tactile mode designation unit 13 includes a tactile parameter generation unit 131, a tactile feature amount calculation unit 132, and a mode designation unit 133 as functional components.
  • The tactile parameter generation unit 131 uses the waveform information of the audio content acquired by the audio content acquisition unit 14 to generate two or more combinations of n (n ≥ 2) tactile parameters, each representing one element of tactile sensation. For example, the tactile parameter generation unit 131 generates, from the waveform information of the audio content, two or more combinations of a first tactile parameter relating to the intensity of the information and a second tactile parameter relating to the length of a divided section of the information.
  • The waveform information of audio content is information in which amplitude corresponding to intensity continues (in the case of an analog signal) or occurs intermittently (in the case of digital data) along the time axis. The first tactile parameter, relating to the intensity of the information, is the amplitude. The second tactile parameter, relating to the length of a divided section of the information, is the time length of each divided section when the waveform information is divided into a plurality of sections in the time-axis direction. Both the intensity and the length of the divided section constitute elements of tactile sensation (the strength and length of a touch).
  • the tactile parameter generation unit 131 first divides the waveform information of the audio content into a plurality of pieces in the time axis direction. Next, from each of the divided sections, the representative amplitude in the divided section is specified as the first tactile parameter, and the time length of the divided section is specified as the second tactile parameter, whereby the first tactile parameter is obtained. Two or more combinations of the tactile parameter and the second tactile parameter are generated.
  • the tactile parameter generation unit 131 generates a tactile parameter for the entire area of the waveform information of the audio content (from the beginning to the end of the audio content).
  • the tactile parameter may be generated by designating a part of the entire area of the audio content waveform information.
  • the section is specified in accordance with a predetermined rule. For example, it is conceivable to specify a predetermined time from the beginning of the audio content.
  • meta information indicating the target section may be stored for each audio content, and the section may be designated based on the meta information.
  • Note that the tactile parameter generation unit 131 may extract the envelope of the input waveform information by applying a low-pass filter process to the waveform information, and may then specify the first tactile parameter and the second tactile parameter from a plurality of divided sections of the envelope.
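  • The text does not specify the low-pass filter itself; as a hedged sketch, a moving average of the rectified signal serves as a crude stand-in for envelope extraction:

```python
def envelope(samples, window=5):
    """Approximate envelope: moving average of the rectified (absolute) signal.

    A real implementation would use a proper low-pass filter; this moving
    average is only an illustrative stand-in.
    """
    rectified = [abs(s) for s in samples]
    half = window // 2
    out = []
    for i in range(len(rectified)):
        lo, hi = max(0, i - half), min(len(rectified), i + half + 1)
        out.append(sum(rectified[lo:hi]) / (hi - lo))
    return out
```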
  • FIG. 4 is a diagram for describing an example of the processing content by the tactile parameter generation unit 131.
  • FIG. 4 shows waveform information of audio content acquired by the audio content acquisition unit 14 (or envelope waveform information subjected to low-pass filter processing as preprocessing by the tactile parameter generation unit 131).
  • the tactile parameter generation unit 131 first divides the waveform information shown in FIG. 4 into a plurality of pieces in the time axis direction.
  • For example, the waveform is divided at each time point at which its amplitude takes a local minimum. That is, the waveform information is divided into a plurality of sections in the time-axis direction: a first divided section T1 from the start point of the waveform to the first local minimum, a second divided section T2 from the first local minimum to the second local minimum, a third divided section T3 from the second local minimum to the third local minimum, and so on.
  • the waveform information may be divided into a plurality of sections at each time when the amplitude of the waveform is maximized.
  • the waveform information may be divided into a plurality of sections at each time when the amplitude value becomes zero.
  • the waveform information may be divided into a plurality of sections at each time when the amplitude value becomes a predetermined value other than zero.
  • Then, the tactile parameter generation unit 131 specifies the representative amplitudes h1, h2, h3, ... as the first tactile parameter from each of the divided sections T1, T2, T3, ..., and specifies the time lengths t1, t2, t3, ... of the divided sections as the second tactile parameter.
  • Here, each of the representative amplitudes h1, h2, h3, ... is the difference between the local minimum at the start point or the end point of the corresponding divided section T1, T2, T3, ... and the local maximum within that divided section. For the first divided section T1, the difference between the local minimum at the end point and the local maximum is the representative amplitude h1; for the second divided section T2, the difference between the local minimum at the start point and the local maximum is the representative amplitude h2; for the third divided section T3, the difference between the local minimum at the end point and the local maximum is the representative amplitude h3.
  • the method of specifying the representative amplitude shown here is an example, and the present invention is not limited to this.
  • For example, the difference between the smaller of the local minimum at the start point and the local minimum at the end point of each divided section T1, T2, T3, ... and the local maximum within that divided section may be specified as the representative amplitude. Alternatively, the positive maximum value in each divided section may be specified as the representative amplitude of the first tactile parameter, or the absolute value of the negative minimum value in each divided section may be specified as the representative amplitude of the first tactile parameter.
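  • The segmentation and parameter extraction described above can be sketched as follows. This is an assumption-laden illustration: sections are cut at interior local minima of the envelope, and the representative amplitude uses the variant that takes the smaller of the two boundary minima:

```python
def split_at_minima(env):
    """Indices of interior local minima of an envelope signal."""
    return [i for i in range(1, len(env) - 1)
            if env[i - 1] > env[i] <= env[i + 1]]

def tactile_parameters(env, rate=1.0):
    """Return (h_i, t_i) combinations, one per divided section.

    h_i: local maximum in the section minus the smaller boundary minimum
         (one of the representative-amplitude variants in the text).
    t_i: section length in samples divided by the sample rate.
    """
    cuts = [0] + split_at_minima(env) + [len(env) - 1]
    params = []
    for a, b in zip(cuts, cuts[1:]):
        section = env[a:b + 1]
        h = max(section) - min(env[a], env[b])
        t = (b - a) / rate
        params.append((h, t))
    return params
```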
  • The tactile feature amount calculation unit 132 calculates tactile feature amounts relating to the waveform information of the audio content, based on the two or more tactile parameters (two or more combinations of the first tactile parameter and the second tactile parameter) generated by the tactile parameter generation unit 131. In the present embodiment, an arithmetic value is calculated from the first tactile parameter and the second tactile parameter using one of the four arithmetic operations; the length of the section until the same arithmetic value reappears is calculated as a first tactile feature amount, and the degree of diversity of the arithmetic values is calculated as a second tactile feature amount.
  • The first tactile feature amount P1 may be calculated using, as the section length, the number of sections until the same value of hi/ti reappears. For example, if the value of h3/t3 is the same as the value of h1/t1, then "2", the number of sections from the divided section T1 to the divided section T3, is the first tactile feature amount P1. In practice, the first tactile feature amount P1 is calculated for each section, and the average of these values is determined as the final first tactile feature amount P1. Alternatively, the maximum value, minimum value, median value, or the like may be determined as the first tactile feature amount P1.
  • The first tactile feature amount P1 obtained as described above represents the rhythm of the tactile sensation latent in the waveform information of the audio content. The second tactile feature amount P2 represents the degree of diversity of the tactile sensation latent in the waveform information of the audio content. In other words, the waveform information of the audio content is characterized by two types of tactile feature amounts indicating the rhythm and the diversity of the tactile sensation.
  • Although the value of hi/ti is calculated here, another of the four arithmetic operations may be used instead. For example, the value of ti/hi, hi × ti, hi + ti, or hi - ti may be calculated.
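  • The two feature amounts can be sketched as follows, under stated assumptions (exact reappearance of hi/ti within a tolerance; P2 as a simple count of distinct values rather than a formal diversity measure):

```python
def tactile_features(params, tol=1e-6):
    """P1: average number of sections until the same h/t value reappears
    (rhythm).  P2: number of distinct h/t values (diversity).
    Both are simplifications of the feature amounts described in the text.
    """
    ratios = [h / t for h, t in params]
    gaps = []
    for i, r in enumerate(ratios):
        # Distance to the next section with (approximately) the same value.
        for j in range(i + 1, len(ratios)):
            if abs(ratios[j] - r) < tol:
                gaps.append(j - i)
                break
    p1 = sum(gaps) / len(gaps) if gaps else float(len(ratios))
    p2 = len({round(r / tol) for r in ratios})
    return p1, p2
```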
  • The mode designation unit 133 designates a tactile mode according to the content of the combination of the plurality of types of tactile feature amounts calculated by the tactile feature amount calculation unit 132.
  • Specifically, the mode designation unit 133 maps the combination of the two types of tactile feature amounts calculated by the tactile feature amount calculation unit 132 onto a predetermined coordinate space, and designates a tactile mode according to the mapped position.
  • The predetermined coordinate space can be, for example, a two-dimensional coordinate space in which the horizontal axis (x-axis) represents the first tactile feature amount P1 and the vertical axis (y-axis) represents the second tactile feature amount P2.
  • FIG. 5 is a diagram showing an example of the specification content of the touch mode by the mode specifying unit 133.
  • The mode designation unit 133 divides the two-dimensional coordinate space, which has the first tactile feature amount P1 on the horizontal axis and the second tactile feature amount P2 on the vertical axis, into four quadrants. Then, a different tactile mode is assigned to each of the four quadrants.
  • the first quadrant is an area where the first tactile feature quantity P1 is larger than the first threshold Th1 and the second tactile feature quantity P2 is larger than the second threshold Th2.
  • The first tactile mode, corresponding to the first quadrant, is a tactile mode in which the rhythm of the tactile sensation is slow and the diversity of the tactile sensation is large.
  • the second quadrant is an area where the first tactile feature value P1 is equal to or less than the first threshold value Th1 and the second tactile feature value P2 is larger than the second threshold value Th2.
  • the second tactile mode in the second quadrant is a tactile mode having the property that the rhythm of the tactile sensation is fast and the diversity of tactile sensations is large.
  • the third quadrant is an area where the first tactile feature quantity P1 is equal to or less than the first threshold Th1 and the second tactile feature quantity P2 is equal to or less than the second threshold Th2.
  • the third tactile mode in the third quadrant is a tactile mode having the property that the rhythm of the tactile sensation is fast and the diversity of tactile sensations is small.
  • the fourth quadrant is an area where the first tactile feature P1 is larger than the first threshold Th1 and the second tactile feature P2 is equal to or smaller than the second threshold Th2.
  • the fourth tactile mode in the fourth quadrant is a tactile mode having the property that the rhythm of the tactile sensation is slow and the diversity of tactile sensations is small.
  • in the two-dimensional coordinate space whose four quadrants are assigned the first to fourth tactile modes as described above, the mode designating unit 133 determines to which quadrant the xy coordinates specified by the combination of tactile feature amounts (P1, P2) calculated by the tactile feature amount calculating unit 132 belong. Then, the tactile mode assigned to that quadrant is specified.
  • the value of the first threshold Th1 for the first tactile feature P1 and the value of the second threshold Th2 for the second tactile feature P2 can be arbitrarily determined.
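The quadrant determination described above can be sketched in code as follows. This is a minimal illustration only; the function name and the concrete threshold values are assumptions, not part of the embodiment.

```python
def designate_tactile_mode(p1, p2, th1, th2):
    """Map a pair of tactile feature amounts (P1, P2) to one of the
    four tactile modes assigned to the quadrants of FIG. 5A."""
    if p1 > th1 and p2 > th2:
        return 1  # first quadrant: slow tactile rhythm, large diversity
    if p1 <= th1 and p2 > th2:
        return 2  # second quadrant: fast tactile rhythm, large diversity
    if p1 <= th1 and p2 <= th2:
        return 3  # third quadrant: fast tactile rhythm, small diversity
    return 4      # fourth quadrant: slow tactile rhythm, small diversity
```

The boundary cases follow the text: a feature amount exactly equal to its threshold falls on the "equal to or less" side.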
  • in the above example, the two-dimensional coordinate space represented by the two types of tactile feature amounts P1 and P2 is divided into four quadrants to which four tactile modes are assigned, but the present invention is not limited to this. For example, as shown in FIG. 5B, the area belonging to the third quadrant may be set to the first tactile mode, and the remaining area (the combination of the first quadrant, the second quadrant, and the fourth quadrant) may be set to the second tactile mode.
  • alternatively, only the first tactile feature amount P1 may be calculated by the tactile feature amount calculating unit 132, and the first tactile mode and the second tactile mode may be assigned with the first threshold Th1 as the boundary, as illustrated in the figure.
  • likewise, only the second tactile feature amount P2 may be calculated by the tactile feature amount calculation unit 132, and the first tactile mode and the second tactile mode may be assigned with the second threshold Th2 as the boundary, as illustrated in the figure.
  • in the configuration shown in FIG. 3B, the tactile mode specifying unit 13 includes, as its functional components, a tactile parameter generation unit 131, a tactile difference parameter generation unit 134, a tactile feature amount calculation unit 132 ′, and a mode specification unit 133.
  • in FIG. 3B, components denoted by the same reference numerals as those shown in FIG. 3A have the same functions, and redundant description is therefore omitted.
  • the tactile difference parameter generation unit 134 calculates, for the two or more sets of tactile parameters generated by the tactile parameter generation unit 131, the difference values of the tactile parameters between each pair of sets, thereby generating two or more sets of n tactile difference parameters.
  • for example, suppose the two or more sets of tactile parameters generated by the tactile parameter generation unit 131 are the three sets {h1, t1}, {h2, t2}, {h3, t3}, where h1, h2, and h3 are first tactile parameters and t1, t2, and t3 are second tactile parameters.
  • in this case, the tactile difference parameter generation unit 134 calculates the tactile parameter difference values between the pair {h1, t1} and {h2, t2}, and also calculates the tactile parameter difference values between the pair {h2, t2} and {h3, t3}.
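The generation of tactile difference parameters from consecutive parameter sets can be sketched as follows. This is an illustrative reading under the assumption that differences are taken between adjacent sets; the function name is hypothetical.

```python
def tactile_difference_parameters(parameter_sets):
    """Given two or more sets of tactile parameters such as
    [(h1, t1), (h2, t2), (h3, t3)], return the difference values
    between each adjacent pair of sets."""
    return [
        tuple(curr_v - prev_v for prev_v, curr_v in zip(prev, curr))
        for prev, curr in zip(parameter_sets, parameter_sets[1:])
    ]
```

With three input sets this yields two sets of difference parameters, matching the {h1, t1}/{h2, t2} and {h2, t2}/{h3, t3} pairs in the text.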
  • the tactile feature amount calculation unit 132 ′ calculates a tactile feature amount regarding the waveform information of the audio content based on the two or more sets of tactile difference parameters generated by the tactile difference parameter generation unit 134.
  • the tactile feature amounts to be calculated are the same as those calculated by the tactile feature amount calculation unit 132 described above.
  • alternatively, correspondence information (for example, table information) indicating the correspondence between audio contents and tactile modes may be stored in advance, and the tactile mode designation unit 13 may designate the tactile mode by referring to this correspondence information.
  • that is, for each audio content stored in the audio content storage unit 142, the tactile mode specifying unit 13 analyzes the stored audio content in advance (the analysis described with reference to FIG. 3), creates correspondence information indicating the correspondence between the stored audio content and the tactile mode specified based on the tactile feature amounts obtained by that analysis, and stores the correspondence information. Then, when an audio content is acquired by the audio content acquisition unit 14, the tactile mode designation unit 13 acquires and designates, from the correspondence information, the tactile mode corresponding to the acquired audio content.
  • the generation of the correspondence information by analyzing the tactile modes corresponding to the audio contents stored in the audio content storage unit 142 may be executed, for example, at a timing instructed by the user. Further, in the case of the configuration shown in FIG. 2B, it may be performed periodically at predetermined intervals, or may be executed for each newly stored audio content every time an audio content received from the external device 700 is stored in the audio content storage unit 142.
  • in the first configuration example, tactile contents corresponding to the tactile modes are stored in advance, and when a tactile mode is designated by the tactile mode designating unit 13, the tactile content corresponding to the designated tactile mode is read out. In the second configuration example, when a tactile mode is designated by the tactile mode designation unit 13, the tactile content corresponding to the designated tactile mode is generated.
  • FIG. 6 is a block diagram illustrating a functional configuration example of the haptic content acquisition unit 11 according to the first configuration example.
  • the haptic content acquisition unit 11 according to the first configuration example includes a haptic content reading unit 1111 (or 1111 ′) and a haptic content storage unit 1112 (or 1112 ′).
  • similarly to FIGS. 2A to 2C, which show three modes for storing audio contents, FIGS. 6(a) to 6(c) show three modes: a mode in which haptic contents are stored in advance in the haptic content storage unit 1112 (FIG. 6(a)), a mode in which haptic contents downloaded from the external device 700 are stored in the haptic content storage unit 1112 (FIG. 6(b)), and a mode in which haptic contents are received from the external device 700 in a streaming manner (FIG. 6(c)).
  • the haptic content storage unit 1112 stores a plurality of haptic contents respectively corresponding to a plurality of tactile modes.
  • the haptic content storage unit 1112 is a non-volatile storage medium such as a hard disk or a semiconductor memory, and can keep storing the haptic contents unless the user explicitly gives an instruction to delete them.
  • the tactile content corresponding to each tactile mode is, for example, vibration information having the tactile feature properties of the tactile mode assigned as shown in FIG. 5A.
  • the haptic content corresponding to the first tactile mode in the first quadrant in FIG. 5A is vibration information having a property that the rhythm of vibration is slow and the variety of vibration is large.
  • the haptic content corresponding to the second tactile mode in the second quadrant is vibration information having a property that the rhythm of vibration is fast and the variety of vibration is large.
  • the haptic content corresponding to the third tactile mode in the third quadrant is vibration information having a property that the rhythm of vibration is fast and the variety of vibration is small.
  • the haptic content corresponding to the fourth tactile mode in the fourth quadrant is vibration information having a property that the rhythm of the vibration is slow and the diversity of the vibration is small.
  • each of the tactile contents corresponding to the first to fourth tactile modes is a tactile content having a unique tactile effect specified by n (n ≥ 2) tactile parameters or tactile difference parameters, each representing one element of the tactile sensation. That is, when the tactile feature amounts are calculated as shown in FIG. 3A, each tactile content has a unique tactile effect specified by the first tactile parameter and the second tactile parameter. On the other hand, when the tactile feature amounts are calculated as shown in FIG. 3B, each tactile content has a unique tactile effect specified by the first tactile difference parameter and the second tactile difference parameter.
  • such haptic contents are generated in advance for each tactile mode, and the plurality of haptic contents thus generated are stored in the haptic content storage unit 1112 in advance.
  • the method of generating the tactile content according to the tactile mode will be described in detail later.
  • the haptic content acquisition unit 11 acquires the haptic content according to the tactile mode specified by the tactile mode specification unit 13 by causing the haptic content reading unit 1111 to read it from the haptic content storage unit 1112.
  • FIG. 6B shows a form in which the content supply device 100 receives a plurality of haptic contents generated in advance for each tactile mode from the external device 700 via the communication network 600 and stores them in the haptic content storage unit 1112. That is, in the example of FIG. 6B, the content supply device 100 further includes a tactile content recording unit 17.
  • the haptic content recording unit 17 receives haptic content from the external device 700 via the communication network 600 and stores the haptic content in the haptic content storage unit 1112.
  • this allows the user to select a desired haptic content, download it to the content supply device 100, and store it in the haptic content storage unit 1112.
  • the selection of a desired tactile content by the user can be performed, for example, by operating an operation unit provided in the content supply device 100 or the content use device 500, or by manipulating a predetermined graphical user interface displayed by the content supply device 100 or the content use device 500.
  • when the haptic content acquisition unit 11 is configured as shown in FIG. 6B and a plurality of haptic contents are downloaded from the external device 700 and stored in the haptic content storage unit 1112, a plurality of types of haptic contents may be stored in the haptic content storage unit 1112 for one tactile mode. In this case, an arbitrarily set rule can be applied to decide which haptic content is to be read according to the tactile mode specified by the tactile mode specifying unit 13.
  • the haptic content reading unit 1111 can read any haptic content in response to a user's instruction on selection of haptic content.
  • alternatively, the tactile feature amounts may be stored as additional information in the tactile contents stored in the tactile content storage unit 1112, and the tactile content readout unit 1111 may selectively read the tactile content having the tactile feature amounts closest to those calculated by the tactile mode designation unit 13.
  • FIG. 6C shows a form that includes a haptic content reading unit 1111 ′ and a haptic content storage unit 1112 ′ instead of the haptic content reading unit 1111 and the haptic content storage unit 1112, and in which the haptic content received from the external device 700 is supplied to the content use device 500 in a streaming manner.
  • the haptic content storage unit 1112 ′ may be a storage medium that temporarily stores the haptic content received from the external device 700 by the haptic content recording unit 17.
  • the haptic content reading unit 1111 ′ notifies the haptic content recording unit 17 of the tactile mode specified by the tactile mode specifying unit 13, and the haptic content recording unit 17 notifies the external device 700 of that tactile mode. The haptic content reading unit 1111 ′ then reads the haptic content that is transmitted from the external device 700 in response to the notification and temporarily stored in the haptic content storage unit 1112 ′, and supplies it to the haptic content supplying unit 12.
  • the process in which the haptic content reading unit 1111 ′ and the haptic content supplying unit 12 supply the haptic content, received by the haptic content recording unit 17 and temporarily stored in the haptic content storage unit 1112 ′, to the content use device 500 is performed in series with the process of recording the haptic content in the haptic content storage unit 1112 ′.
  • when a plurality of types of tactile contents corresponding to each tactile mode are stored in the external device 700, an arbitrarily set rule can likewise be applied to decide which haptic content is read according to the tactile mode designated by the tactile mode designating unit 13.
  • for example, the haptic content reading unit 1111 ′ notifies the external device 700 of the tactile mode specified by the tactile mode specifying unit 13, and in response the external device 700 transmits information on a plurality of haptic contents corresponding to the specified tactile mode; a selection screen for presenting and selecting these haptic contents is then generated and displayed on the display of the content supply device 100. In response to the user's operation of selecting a haptic content on the selection screen, the haptic content recording unit 17 receives the selected haptic content from the external device 700 and temporarily stores it in the haptic content storage unit 1112 ′, after which it may be read by the haptic content reading unit 1111 ′ and supplied to the haptic content supplying unit 12.
  • alternatively, the tactile feature amounts may be stored as accompanying information in the tactile contents stored in the external device 700, and the tactile content readout unit 1111 ′ may notify the external device 700 of the tactile feature amounts calculated by the tactile mode designating unit 13. The external device 700 then selectively reads out the haptic content having the tactile feature amounts closest to the notified ones and transmits it to the content supply device 100, and the haptic content recording unit 17 receives it and stores it in the tactile content storage unit 1112 ′.
  • the haptic content reading unit 1111 selects the haptic content having the haptic feature amount closest to the haptic feature amount calculated by the haptic mode designating unit 13 from the haptic content storage unit 1112.
  • in this case, a tactile mode does not exist for each area obtained by dividing the two-dimensional coordinate space of the tactile feature amounts as shown in FIG. 5; rather, it can be said that a tactile mode exists for each combination of tactile feature amounts calculated by the tactile mode designating unit 13. That is, the tactile mode designation unit 13 designates the tactile mode based on the calculated tactile feature amounts themselves.
  • as described above, the storage of the haptic content in the haptic content storage units 1112 and 1112 ′ can take the three forms shown in FIGS. 6A to 6C. The same applies to the second configuration example and to the second to fourth embodiments.
  • as for the audio content, three modes have been described: a mode in which the audio content is stored in advance in the audio content storage unit 142, a mode in which audio content downloaded from the external device 700 is stored in the audio content storage unit 142, and a mode in which the audio content is received in a streaming manner from the external device 700 and temporarily stored in the audio content storage unit 142 ′.
  • similarly, as for the haptic content, three forms have been described: a form in which the haptic content is stored in advance in the haptic content storage unit 1112, a form in which haptic content downloaded from the external device 700 is stored in the haptic content storage unit 1112, and a form in which the haptic content is received in a streaming manner from the external device 700 and temporarily stored in the haptic content storage unit 1112 ′.
  • Any one of the three modes relating to the audio content and any one of the three modes relating to the haptic content can be arbitrarily combined and applied.
  • for example, it is possible to read the audio content stored in the audio content storage unit 142 and supply it to the content use device 500, analyze the audio content to designate a tactile mode, and receive the haptic content corresponding to the designated tactile mode from the external device 700 to supply it to the content use device 500 in a streaming manner.
  • alternatively, it is also possible to receive the audio content selected by the user from the external device 700 and supply it to the content use device 500 in a streaming manner, analyze the audio content received from the external device 700 to designate a tactile mode, read out the tactile content corresponding to the designated tactile mode from among the tactile contents stored in the haptic content storage unit 1112, and supply the read tactile content to the content use device 500.
  • it is further possible to receive the audio content selected by the user from the external device 700 and supply it to the content use device 500 in a streaming manner, analyze the audio content to designate a tactile mode, and receive the haptic content corresponding to the designated tactile mode from the external device 700 to supply it to the content use device 500 in a streaming manner. In this case, even if neither the audio content nor the tactile content is stored in the content supply device 100, the audio content desired by the user and the tactile content corresponding to the tactile mode analyzed from that audio content can be obtained from the external device 700 and supplied to the content use device 500.
  • FIG. 7 is a flowchart illustrating an operation example of the content supply device 100 for the simplest combination, in which the acquisition of audio content is configured as shown in FIG. 2A and the acquisition of tactile content is configured as shown in FIG. 6A. Note that the flowchart illustrated in FIG. 7 starts when the content supply device 100 is activated.
  • the audio content reading unit 141 of the audio content acquisition unit 14 determines whether a user operation for selecting any of the one or more audio contents stored in the audio content storage unit 142 has been performed (step S1). If no such user operation has been performed, the determination in step S1 is repeated.
  • if the user operation has been performed, the audio content reading unit 141 obtains the audio content by reading the audio content selected by the user operation from the audio content storage unit 142 (step S2).
  • the tactile mode designation unit 13 analyzes the audio content read from the audio content storage unit 142 by the audio content reading unit 141, and designates a tactile mode according to the analysis result (step S3).
  • the haptic content acquisition unit 11 acquires the haptic content according to the tactile mode specified by the tactile mode specifying unit 13 by causing the haptic content reading unit 1111 to read it from the haptic content storage unit 1112 (step S4).
  • the audio content supply unit 15 supplies the audio content acquired by the audio content acquisition unit 14 to the content use device 500 (step S5), and the haptic content supply unit 12 supplies the haptic content acquired by the haptic content acquisition unit 11 to the content use device 500 (step S6).
  • the audio content reading unit 141 determines whether or not the supply of the audio content to the content using device 500 has been completed (Step S7).
  • the supply of the audio content ends either when the audio content read in step S2 has been supplied to the content use device 500 to its end, or when the user has instructed to stop the supply of the audio content before it has been supplied to the end.
  • if the supply of the audio content has not ended, the process returns to step S5, and the supply of the audio content and the tactile content to the content use device 500 continues. On the other hand, when the supply of the audio content ends, the processing of the flowchart illustrated in FIG. 7 ends.
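The flow of steps S2 to S6 can be sketched as follows. This is a simplified model in which the content stores are plain dictionaries and the supply to the content use device is a callback; all names and the stub data are illustrative assumptions.

```python
def supply_once(selected_title, audio_store, haptic_store, designate_mode, supply):
    """One pass of steps S2-S6: read the selected audio content,
    designate a tactile mode from it, read the matching haptic
    content, and supply both to the content use device."""
    audio = audio_store[selected_title]   # S2: read the selected audio content
    mode = designate_mode(audio)          # S3: analyze and designate a tactile mode
    haptic = haptic_store[mode]           # S4: read the haptic content for that mode
    supply("audio", audio)                # S5: supply the audio content
    supply("haptic", haptic)              # S6: supply the haptic content
    return mode

# Illustrative usage with stub stores and a collecting "content use device".
supplied = []
mode = supply_once(
    "song1",
    audio_store={"song1": [0.1, 0.9, 0.2]},
    haptic_store={1: [0.5, 0.5], 2: [0.9, 0.1]},
    designate_mode=lambda audio: 1 if max(audio) > 0.5 else 2,
    supply=lambda kind, data: supplied.append((kind, data)),
)
```

Steps S1 and S7 (waiting for a selection, and looping until the supply ends) are omitted for brevity.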
  • FIG. 8 is a block diagram illustrating a functional configuration example of the haptic content acquisition unit 11 according to the second configuration example.
  • the haptic content acquisition unit 11 according to the second configuration example includes a haptic content reading unit 1121 (or 1121 ′), a haptic content storage unit 1122 (or 1122 ′), and a vibration waveform processing unit 1123.
  • in the first configuration example, the haptic content acquisition unit 11 acquires the haptic content according to the tactile mode by reading out the haptic content stored in the haptic content storage units 1112 and 1112 ′.
  • in contrast, in the second configuration example, the haptic content acquisition unit 11 generates the haptic content according to the tactile mode by processing the vibration information of the reference haptic content read from the haptic content storage units 1122 and 1122 ′.
  • FIG. 8A shows a form in which the haptic content is stored in advance in the haptic content storage unit 1122, FIG. 8B shows a form in which haptic content downloaded from the external device 700 is stored in the haptic content storage unit 1122, and FIG. 8C shows a form in which the haptic content is received in a streaming manner from the external device 700 and temporarily stored in the haptic content storage unit 1122 ′.
  • the haptic content storage units 1122 and 1122 ′ are non-volatile storage media such as a hard disk or a semiconductor memory, and store the reference haptic content.
  • the haptic content stored in the haptic content storage units 1122 and 1122 ′ is not haptic content generated in accordance with a tactile mode, but reference haptic content consisting of predetermined vibration information that serves as a basis for generating the haptic content corresponding to each tactile mode.
  • the haptic content reading units 1121 and 1121 ′ read the haptic content from the haptic content storage units 1122 and 1122 ′ when the user performs an operation for selecting audio content or an operation for instructing acquisition of haptic content. That is, the haptic content reading unit 1121 reads the haptic content stored in advance in the haptic content storage unit 1122, while the haptic content reading unit 1121 ′ reads the haptic content received from the external device 700 and temporarily stored in the haptic content storage unit 1122 ′.
  • the vibration waveform processing unit 1123 generates weight information for the vibration information of the haptic content read by the haptic content reading units 1121 and 1121 ′, according to the tactile mode specified by the tactile mode specification unit 13, and processes the vibration information of the haptic content based on the generated weight information, thereby generating the haptic content corresponding to the designated tactile mode. The generated haptic content is then supplied to the haptic content supply unit 12.
  • FIG. 9 is an image diagram schematically showing the processing contents of the vibration waveform processing unit 1123.
  • FIG. 9 shows an example in which one of the first to fourth tactile modes is designated based on the combination of the two tactile feature amounts P1 and P2, as in the example of FIG. 5A.
  • the combination of tactile feature amounts corresponding to the reference tactile content stored in the tactile content storage units 1122 and 1122 ′ is indicated by diamond marks.
  • the four arrows indicate the directions of weighting according to the tactile modes. Note that the arrows do not indicate the exact directions of the weighting, but schematically illustrate the directions toward each quadrant as an image.
  • in this example, the reference tactile content has a first tactile feature amount P1 equal to the first threshold Th1 and a second tactile feature amount P2 equal to the second threshold Th2, so it corresponds to none of the first to fourth tactile modes. FIG. 9 schematically illustrates that, by generating weight information with a different direction for each tactile mode and processing the vibration information of this reference tactile content accordingly, a haptic content having vibration information according to each tactile mode is generated.
  • alternatively, the haptic content may be generated by processing, with the weight information, the vibration information of a haptic content that already corresponds to one of the tactile modes. For example, when the tactile content corresponding to the first tactile mode is stored in the tactile content storage units 1122 and 1122 ′ as the reference, no weighting is performed if the tactile mode specified by the tactile mode designating unit 13 is the first tactile mode, and weighting that changes the first tactile mode into one of the second to fourth tactile modes may be performed if the specified tactile mode is one of the second to fourth tactile modes.
  • FIG. 10 is a block diagram illustrating an example of a functional configuration of the vibration waveform processing unit 1123.
  • the vibration waveform processing unit 1123 includes a feature extraction unit 1123A, a weight information generation unit 1123B, and a weight processing unit 1123C as its functional configuration.
  • the feature extracting unit 1123A extracts a plurality of characteristic locations that can be distinguished from other locations in the waveform information of the audio content acquired by the audio content acquiring unit 14. For example, the feature extracting unit 1123A extracts, from the waveform information of the audio content, locations where the amplitude value increases by a predetermined value or more within a predetermined time as characteristic locations.
  • the weight information generation unit 1123B generates, based on the plurality of characteristic locations extracted by the feature extraction unit 1123A, weight information whose value changes with time in the time sections between the characteristic locations. For example, the weight information generation unit 1123B generates weight information whose value gradually decreases with time from the time at which one characteristic location is extracted to the time at which the next characteristic location is extracted.
  • FIG. 11 is a diagram for explaining the processing contents of the feature extraction unit 1123A and the weight information generation unit 1123B.
  • FIG. 11A illustrates a part of the waveform information of the audio content acquired by the audio content acquisition unit 14.
  • FIG. 11B schematically illustrates the weight information generated by the weight information generation unit 1123B for the haptic content waveform information read from the haptic content storage units 1122 and 1122 ′ by the haptic content reading units 1121 and 1121 ′.
  • as shown in FIG. 11A, the feature extracting unit 1123A extracts locations where the amplitude value increases by a predetermined value or more within a predetermined time (for example, 0.1 second) as a plurality of characteristic locations F1, F2, F3, .... That is, the feature extracting unit 1123A extracts locations where the amplitude value of the waveform information of the audio content increases sharply as the characteristic locations F1, F2, F3, ....
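The extraction of sharp-increase locations can be sketched as follows, assuming the waveform is given as discrete amplitude samples. The window length and rise threshold stand in for the "predetermined time" and "predetermined value"; the function name and the skip-ahead behavior are illustrative assumptions.

```python
def extract_feature_locations(samples, window, min_rise):
    """Return sample indices where the amplitude value increases by at
    least `min_rise` within `window` samples (the locations F1, F2, ...)."""
    locations = []
    i = 0
    while i < len(samples) - window:
        if samples[i + window] - samples[i] >= min_rise:
            locations.append(i + window)
            i += window  # skip ahead so one rise is not counted repeatedly
        else:
            i += 1
    return locations
```

For real audio one would typically work on an amplitude envelope rather than raw samples; that preprocessing is omitted here.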
  • the weight information is information in which the weight value (always a positive value) ranges from a minimum value to a maximum value, and is schematically shown as a sawtooth wave in FIG. 11B.
  • that is, weight information is generated such that the weight value becomes maximum at the time when one characteristic location Fi is extracted, gradually decreases linearly or stepwise with time from that point, and becomes maximum again at the time when the next characteristic location Fi+1 is extracted.
  • for example, the weight information generation unit 1123B generates weight information such that the weight value becomes maximum at the time when one characteristic location Fi is extracted and becomes exactly the minimum value at the time when the next characteristic location Fi+1 is extracted.
  • FIG. 11B shows an example in which the weight value gradually decreases linearly at a constant rate, but the weight information may instead be generated such that the value gradually decreases along a curve according to a predetermined quadratic function or logarithmic function from the time when one characteristic location Fi is extracted to the time when the next characteristic location Fi+1 is extracted.
  • alternatively, the rate at which the weight value gradually decreases may be set to be the same in all sections. In this case, the weight value may reach the minimum value before the next characteristic location Fi+1 is reached; the weight information generation unit 1123B then generates the weight information such that the weight value is held at the minimum value until the next characteristic location Fi+1.
  • the maximum value and the minimum value of the weight value may not be fixed, and may be a fluctuation value that fluctuates according to a predetermined condition.
  • the maximum value of the weight value may be made variable according to the magnitude of the amplitude value at the characteristic portion.
  • for example, the weight information generation unit 1123B sets the maximum weight value larger as the amplitude value at one characteristic location Fi is larger, and generates weight information whose value gradually decreases from there to the next characteristic location Fi+1. In this way, among the plurality of characteristic locations Fi where the amplitude value increases by the predetermined value or more within the predetermined time, a larger weight value is set for a characteristic location Fi having a larger amplitude value.
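The sawtooth-like weight information can be sketched as follows. This is one possible reading: a fixed linear decay per sample clamped at the minimum value (the constant-rate variant above); the parameter names are illustrative, and scaling the peak by the amplitude at each characteristic location would be a straightforward extension.

```python
def generate_weight_information(length, feature_times, w_max, w_min, decay):
    """Weight values jump to w_max at each characteristic location and
    then decrease linearly by `decay` per sample, never below w_min."""
    feature_set = set(feature_times)
    weights = []
    w = w_min
    for t in range(length):
        if t in feature_set:
            w = w_max              # maximum at each characteristic location
        weights.append(w)
        w = max(w_min, w - decay)  # linear decrease, held at the minimum
    return weights
```

With a short decay interval the weight reaches the minimum well before the next characteristic location and stays there, as described above.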
  • the weight processing unit 1123C processes the vibration information of the haptic content read from the haptic content storage units 1122 and 1122 'by the haptic content reading units 1121 and 1121' using the weight information generated by the weight information generation unit 1123B. For example, the weight processing unit 1123C processes the vibration information of the haptic content by multiplying the amplitude value of the waveform information of the haptic content by the weight value of the weight information.
  • that is, the weight processing unit 1123C multiplies the amplitude value of the waveform information of the haptic content at each time, shown in FIG. 11B, by the weight value at the same time, also schematically shown as a sawtooth wave in FIG. 11B. In FIG. 11B, the waveform information of the haptic content and the weight information are superimposed in order to indicate the correspondence between the amplitude value of the waveform information at each time and the weight value by which that amplitude value is multiplied.
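The weighting itself is then a per-sample multiplication, which can be sketched as follows (the function name is illustrative; the two sequences are assumed to be sampled at the same rate):

```python
def apply_weight_information(haptic_samples, weights):
    """Process the haptic content's vibration information by multiplying
    the amplitude value at each time by the weight value at that time."""
    return [amp * w for amp, w in zip(haptic_samples, weights)]
```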
  • FIG. 12 is a diagram showing the waveform information of the haptic content processed by the vibration waveform processing unit 1123 together with the waveform information of the audio content.
  • FIG. 12A shows waveform information of audio content acquired by the audio content acquisition unit 14, and
  • FIG. 12B shows waveform information of haptic content processed by the vibration waveform processing unit 1123.
  • the waveform information of the audio content shown in FIG. 12A is the same as the waveform information of the audio content shown in FIG.
• That is, in this example, the waveform information of the audio content from which the characteristic portions are extracted is the same as the waveform information of the audio content shown in FIG. 12A (the waveform information of the audio content supplied to the content use device 500).
• Note that the audio content waveform information shown in FIG. 11A and the audio content waveform information shown in FIG. 12A may be the same, but need not necessarily be the same.
  • the audio content from which the characteristic portion is to be extracted may be a left channel audio signal
  • the audio content supplied to the content use device 500 may be a right channel audio signal.
  • the reverse is also possible.
  • the audio signal of any one of the left and right channels may be analyzed to extract a characteristic portion, and the audio signals of both channels may be supplied to the content use device 500.
  • the feature extracting unit 1123A may extract, as the feature location, a location where the amplitude value is equal to or greater than a predetermined value in the audio content waveform information.
  • the waveform information of the audio content may be frequency-analyzed for each time, and a portion where the included frequency component changes abruptly may be extracted as a characteristic portion.
• Alternatively, the same analysis as in FIG. 4 may be performed on the waveform information of the audio content, and a portion where the four arithmetic operation value of the first tactile parameter and the second tactile parameter changes abruptly may be extracted as a characteristic portion.
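One of the extraction rules above, extracting as a feature point each place where the amplitude value rises to or above a predetermined value, can be sketched as follows. This is a hypothetical illustration with assumed names, not the original implementation:

```python
def extract_features(amplitudes, threshold):
    # Sketch of one rule of the feature extracting unit 1123A: a sample
    # becomes a feature point when its amplitude first reaches or exceeds
    # the threshold after having been below it.
    features = []
    below = True
    for i, a in enumerate(amplitudes):
        if below and a >= threshold:
            features.append(i)
            below = False
        elif a < threshold:
            below = True
    return features

points = extract_features([0.1, 0.2, 0.9, 0.8, 0.1, 0.95], 0.5)
```

Raising or lowering the threshold changes the position and number of extracted feature points, which is exactly the lever the tactile mode can act on.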
  • the weight information generation unit 1123B generates the weight information such that the value gradually decreases from the time at which one feature location F i is extracted to the time at which the next feature location F i + 1 is extracted.
  • the present invention is not limited to this.
• the feature extraction unit 1123A may extract, as a feature point, a location where the amplitude value suddenly decreases within a predetermined time in the waveform information of the audio content, and
• the weight information generation unit 1123B may generate weight information such that the value gradually increases from the time at which one feature location F i is extracted to the time at which the next feature location F i + 1 is extracted.
• the haptic content generated as described above has had its waveform amplitude values processed based on weight information whose weight value fluctuates in synchronization with the characteristic portions in the waveform information of the audio content used in combination with the haptic content in the content use device 500.
• Therefore, by supplying the haptic content to the content use device 500 together with the audio content, the audio reproduced based on the audio content is provided to the user and, at the same time, the vibration of the haptic content synchronized with the characteristics of that audio can be provided to the user.
• the vibration waveform processing unit 1123 further generates the weight information by performing processing according to the tactile mode specified by the tactile mode specifying unit 13. Specifically, according to the specified tactile mode, the vibration waveform processing unit 1123 changes at least one of the extraction of characteristic portions by the feature extracting unit 1123A and the generation of weight information by the weight information generation unit 1123B.
• For example, the extraction method may be changed to one that extracts, as a characteristic portion, a portion where the amplitude value is equal to or larger than a predetermined value in the waveform information of the audio content, or the predetermined value used in this extraction method may be changed.
• By such changes, the position and number of the extracted characteristic portions can be changed.
• Alternatively, the method may be changed to one in which the waveform information of the audio content is frequency-analyzed for each time and a portion where the included frequency components change abruptly is extracted as a characteristic portion, or
• to one in which an analysis similar to that of FIG. 4 is performed on the waveform information of the audio content and a portion where the four arithmetic operation value of the first tactile parameter and the second tactile parameter changes abruptly is extracted as a characteristic portion.
• When the position or the number of the characteristic portions extracted by the feature extracting unit 1123A is changed, the weight information generated by the weight information generation unit 1123B changes based on the changed characteristic portions.
  • the waveform information of the haptic content processed by the weight processing unit 1123C using the weight information changes.
  • the tactile sensation of the vibration applied based on the tactile content can be changed according to the tactile mode.
  • the maximum value or the minimum value of the weight value may be changed according to the tactile mode.
• Alternatively, the manner in which the weight value changes from the time when one characteristic portion is extracted to the time when the next characteristic portion is extracted may be changed according to the tactile mode. That is, as the method of gradually changing the weight value from its maximum value to its minimum value, any one of a linear shape, a staircase shape, a quadratic function, a logarithmic function, and the like may be adopted according to the tactile mode. In this case, the rate at which the weight value gradually decreases may also be changed according to the tactile mode.
• As a result, the generated weight information changes.
  • the waveform information of the haptic content processed by the weight processing unit 1123C using the weight information changes.
  • the tactile sensation of the vibration applied based on the tactile content can be changed according to the tactile mode.
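The mode-dependent decay shapes mentioned above (linear, staircase, quadratic, logarithmic) could be selected as in the following hypothetical sketch. The mode names and normalization are assumptions for illustration only:

```python
import math

def decay(mode, r):
    # r is the normalized position between two characteristic portions
    # (0.0 at a feature point, 1.0 at the next one); the return value is
    # a decay factor from 1.0 down toward 0.0, whose shape depends on
    # the (hypothetical) tactile mode name.
    if mode == "linear":
        return 1.0 - r
    if mode == "staircase":          # four discrete steps
        return 1.0 - int(r * 4) / 4
    if mode == "quadratic":          # fast initial drop
        return (1.0 - r) ** 2
    if mode == "logarithmic":        # slow initial drop
        return 1.0 - math.log1p(r) / math.log1p(1.0)
    raise ValueError(mode)
```

Scaling the sawtooth weight envelope by `decay(mode, r)` instead of a fixed linear ramp is one way the same feature points can yield differently textured vibrations per tactile mode.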
• As described above, haptic content according to the tactile mode can be generated by changing only one of the content extracted by the feature extracting unit 1123A and the content generated by the weight information generation unit 1123B. However, by appropriately changing both the extraction of characteristic portions and the generation of weight information, it becomes possible to generate haptic content that more appropriately expresses the tactile sensation to be embodied, that is, haptic content that better corresponds to the tactile mode.
• FIG. 13 is a flowchart illustrating an operation example of the content supply device 100 according to the second configuration example. Note that the flowchart illustrated in FIG. 13 shows an operation example of the content supply device 100 when the acquisition of audio content is configured as illustrated in FIG. 2A and the acquisition of haptic content is configured as described in the second configuration example. The flowchart shown in FIG. 13 starts when the content supply device 100 is activated.
  • the audio content reading unit 141 of the audio content acquisition unit 14 determines whether or not a user operation for selecting any one or more of the one or more audio contents stored in the audio content storage unit 142 has been performed ( Step S11). If the user operation has not been performed, the determination in step S11 is repeated.
• When the user operation has been performed, the audio content reading unit 141 acquires the audio content by reading the audio content selected by the user operation from the audio content storage unit 142 (step S12).
  • the tactile mode designation unit 13 analyzes the audio content read from the audio content storage unit 142 by the audio content reading unit 141, and designates a tactile mode according to the analysis result (step S13).
  • the haptic content reading unit 1121 of the haptic content acquiring unit 11 reads the reference haptic content from the haptic content storage unit 1122 (Step S14).
• Next, the vibration waveform processing unit 1123 generates, as weight information for the vibration information of the haptic content read by the haptic content reading unit 1121, weight information corresponding to the tactile mode specified by the tactile mode specifying unit 13 (step S15), and
• generates the haptic content according to the specified tactile mode by processing the vibration information of the haptic content based on the generated weight information (step S16).
• Next, the audio content supply unit 15 supplies the audio content acquired by the audio content acquisition unit 14 to the content use device 500 (step S17), and the haptic content supply unit 12
• supplies the haptic content processed by the vibration waveform processing unit 1123 to the content use device 500 (step S18).
  • the audio content reading unit 141 determines whether or not the supply of the audio content to the content using device 500 has been completed (step S19). If the supply of the audio content has not been completed, the process returns to step S16, and the supply of the audio content and the haptic content processed based on the weight information to the content using device 500 is continued. On the other hand, when the supply of the audio content ends, the processing of the flowchart illustrated in FIG. 13 ends.
• In the above description, an example has been described in which the haptic content storage units 1122 and 1122 ′ store, instead of haptic content generated in accordance with the tactile mode, reference haptic content
• composed of predetermined vibration information serving as a reference for generating haptic content in accordance with the tactile mode.
• Alternatively, the haptic content storage units 1122 and 1122 ′ may store a plurality of haptic contents respectively corresponding to a plurality of tactile modes, and the haptic content reading units 1121 and 1121 ′
• may read the haptic content according to the tactile mode specified by the tactile mode specifying unit 13 from the haptic content storage units 1122 and 1122 ′.
• FIGS. 14A to 14C are block diagrams illustrating a functional configuration example of the haptic content acquisition unit 11 according to the third configuration example.
  • the haptic content acquisition unit 11 according to the third configuration example stores haptic content generated in advance in the haptic content storage units 1112 and 1112 ′ of FIG. 6 and the haptic content storage units 1122 and 1122 ′ of FIG. Instead, the haptic content is generated in real time according to the specification of the haptic mode.
  • the haptic content acquisition unit 11 includes a tactile content readout unit 1131, a tactile content storage unit 1132, a target information acquisition unit 1133, and a tactile parameter generation unit 1134.
• In FIG. 14A, a form (a form corresponding to FIG. 6B) in which tactile content (described later) downloaded from the external device 700 can be stored in the tactile content storage unit 1132 is shown.
• However, a form in which the tactile content is stored in the tactile content storage unit 1132 in advance (a form corresponding to FIG. 6A), or a form in which the tactile content is received from the external device 700 in a streaming manner (a form corresponding to FIG. 6C), is also applicable.
• In FIG. 14B, the haptic content acquisition unit 11 includes a target information acquisition unit 1133, a tactile parameter generation unit 1134, a tactile content reading unit 1141, a tactile content storage unit 1142, and a vibration waveform processing unit 1123.
• In FIG. 14B, components having the same functions as those shown in FIGS. 14A and 8B are denoted by the same reference numerals.
• Although FIG. 14B shows only a form in which the tactile content downloaded from the external device 700 can be stored in the tactile content storage unit 1142, a form in which the tactile content is stored in advance (the form corresponding to FIG. 8A), or a form in which the tactile content is received in a streaming manner from the external device 700 (the form corresponding to FIG. 8C), is also applicable.
  • FIG. 14C shows a modification of the configuration shown in FIGS. 14A and 14B.
  • FIG. 14C also shows only a configuration (a configuration corresponding to FIG. 8B) in which the tactile content downloaded from the external device 700 can be stored in the tactile content storage unit 1142.
  • the tactile content is stored in the storage unit 1142 in advance (corresponding to FIG. 8A), or the tactile content is streamed from the external device 700 (corresponding to FIG. 8C). Form) is also applicable.
• In the following, FIG. 14A will first be described, and then points different from FIG. 14A will be described with reference to FIGS. 14B and 14C.
• In FIG. 14A, the tactile content storage unit 1132 stores a plurality of tactile contents, each having a unique tactile effect specified by n (n ≥ 2) tactile parameters each representing one element of the tactile sensation and each corresponding to one of a plurality of tactile
• modes, in association with combinations of the n tactile parameters and the tactile modes.
  • the first tactile parameter relating to the intensity of the information described in FIG. 4 and the second tactile parameter relating to the length of the divided section of the information are used as the two tactile parameters.
• That is, the tactile content storage unit 1132 stores tactile content having a unique tactile effect specified by the first tactile parameter relating to the intensity of the information and the second tactile parameter relating to the length of the divided section of the information,
• in association with combinations of the first tactile parameter and the second tactile parameter.
• the tactile parameter used here can be said to be a parameter indicating the degree of opposing tactile properties (hereinafter referred to as a tactile pair) such as <hard-soft> or <coarse-smooth>.
• For example, the intensity of the information is a tactile parameter relating to the <hard-soft> tactile pair.
• a larger value of the intensity indicates a harder tactile sensation, and
• a smaller value of the intensity indicates a softer tactile sensation.
  • the length of the information division section can be used as the tactile parameter relating to the tactile pair ⁇ rough-smooth>.
• a larger value of the length indicates a smoother tactile sensation, and
• a smaller value of the length indicates a coarser tactile sensation.
  • FIG. 15 is a diagram schematically illustrating a storage example of the tactile content stored in the tactile content storage unit 1132.
• FIG. 15 shows a matrix-like tactile pair space in which the vertical axis represents the "strength" as the first tactile parameter and the horizontal axis represents the "length of the divided section" as the second tactile parameter.
• In this example, <hard-soft> is used as the tactile pair represented by the first tactile parameter (strength), and
• <rough-smooth> is used as the tactile pair
• represented by the second tactile parameter (length of the divided section).
• In the example of FIG. 15, 25 matrix spaces are formed by dividing the range from the maximum value to the minimum value of the strength into five levels and dividing the range from the maximum value to the minimum value of the length of the divided section into five levels. Index numbers 1 to 25 are assigned to the 25 matrix spaces, respectively.
  • the tactile content storage unit 1132 stores the tactile content according to the combination of the intensity and the length of the divided section in each of the 25 matrix spaces.
• For example, in the matrix space corresponding to the maximum strength and the minimum length of the divided section, vibration information of the tactile content capable of conveying the hardest and coarsest image is stored.
• In the matrix space corresponding to the minimum strength and the maximum length of the divided section, vibration information of the tactile content capable of conveying the softest and smoothest image is stored.
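One conceivable way to map the two tactile parameters onto the index numbers 1 to 25 of the 5 × 5 matrix space is sketched below. This is a hypothetical illustration: the quantization and the orientation of the index numbering are assumptions, not taken from the disclosure:

```python
def matrix_index(strength, length, s_min, s_max, l_min, l_max, levels=5):
    # Quantize each tactile parameter into `levels` equal steps and
    # combine the two quantized values into a single index number
    # 1..levels*levels (25 for the 5x5 space of FIG. 15).
    def quantize(v, lo, hi):
        step = (hi - lo) / levels
        return min(int((v - lo) / step), levels - 1)   # 0..levels-1
    row = quantize(strength, s_min, s_max)
    col = quantize(length, l_min, l_max)
    return row * levels + col + 1
```

The tactile content storage unit would then hold one piece of vibration information per index number, per tactile mode.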
• In the example described here, <hard-soft> and <rough-smooth> are used as the two tactile pairs, but the present invention is not limited to this.
• For example, a pair such as <large-small>, <sharp-dull>, <heavy-light>, <rough-slippery>, <fluctuating-stable>, or <disappearing-remaining> may be used.
• the tactile content storage unit 1132 stores a plurality of pieces of correspondence information in matrix form as shown in FIG. 15, corresponding to the plurality of tactile modes. For example, when there are four tactile modes as shown in FIG. 5A, four pieces of correspondence information corresponding to the four tactile modes are stored. Thereby, even if the value of the first tactile parameter relating to the intensity of the information and the value of the second tactile parameter relating to the length of the divided section of the information are the same, the vibration information of the tactile content changes according to the tactile mode specified by the tactile mode designation unit 13.
• For example, among the plurality of pieces of correspondence information having different contents according to the tactile mode, the combination of the tactile pair <hard-soft> represented by the first tactile parameter and the tactile pair
• <rough-smooth> represented by the second tactile parameter may be the same, while the contents of the vibration information stored in association with each index number differ.
• Alternatively, in the plurality of pieces of correspondence information having different contents according to the tactile mode, the combination itself of the tactile pair represented by the first tactile parameter and the tactile pair represented by the second tactile parameter may be changed according to the tactile mode.
• FIG. 14A shows a form in which the tactile content downloaded from the external device 700 can be stored in the tactile content storage unit 1132. That is, the tactile content recording unit 17 ′ acquires the tactile content from the external device 700 via the communication network 600 and stores it in the tactile content storage unit 1132.
• In the form in which the tactile content is received from the external device 700 in a streaming manner, a tactile content reading unit 1131 ′ (not shown) is used.
• In this case, the supply by the haptic content supply unit 12, to the content use device 500, of the haptic content generated based on the tactile content received by the tactile content recording unit 17 ′ and temporarily stored in the tactile content storage unit 1132 ′ is performed in parallel with the recording process of the tactile content in the tactile content storage unit 1132 ′.
  • the target information acquisition unit 1133 acquires target information from which a haptic content is generated.
• For example, the target information acquisition unit 1133 acquires, as the target information, information in which values corresponding to intensity are continuously or intermittently connected, such as
• time-series waveform information.
  • the waveform information acquired here may be an analog signal or digital data.
  • the waveform information to be obtained may be any information whose amplitude changes over time, and its type is arbitrary.
  • audio signals, video signals, measurement signals of various measuring devices such as seismometers, anemometers, and light meters, waveform measurement data with an oscilloscope, stock price fluctuation data, and the like are given, but these are merely examples.
  • the target information acquired by the target information acquisition unit 1133 is information different from the audio content acquired by the audio content acquisition unit 14.
  • the target information obtained by the target information obtaining unit 1133 can be arbitrarily specified by the user. That is, the user inputs target information prepared in advance to the content supply device 100, and the target information obtaining unit 1133 of the tactile content obtaining unit 11 obtains this.
  • the processing content of the tactile parameter generation unit 1134 is the same as the processing content of the tactile parameter generation unit 131 shown in FIG.
  • the tactile content readout unit 1131 specifies one or more combinations of the n tactile parameters (the first tactile parameter and the second tactile parameter) generated by the tactile parameter generation unit 1134, and One or more tactile contents corresponding to the specified one or more combinations and tactile modes are read from the tactile content storage unit 1132.
• For example, the tactile content reading unit 1131 specifies, for each divided section, the combination of the first tactile parameter (representative amplitude) and the second tactile parameter (time length of the divided section) generated by the tactile parameter generation unit 1134.
• That is, the tactile content reading unit 1131 specifies the combinations {h1, t1}, {h2, t2}, {h3, t3}, ... of the representative amplitude and the length of the divided section specified from each of the divided sections T1, T2, T3, .... Then, on the matrix space of the tactile content shown in FIG. 15, the tactile contents corresponding to each of the specified combinations are sequentially read.
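The per-section lookup just described can be sketched as follows. This is a hypothetical illustration: `store` stands in for the correspondence information of the currently specified tactile mode in the tactile content storage unit 1132, and the entries are illustrative only:

```python
def read_tactile_contents(sections, store):
    # sections: list of (h, t) combinations specified from the divided
    # sections T1, T2, T3, ..., e.g. [(h1, t1), (h2, t2), ...].
    # store: mapping from an (h, t) combination to vibration information.
    # The contents are read sequentially, one per divided section.
    return [store[(h, t)] for (h, t) in sections]

store = {(3, 2): "vib_A", (1, 4): "vib_B"}      # illustrative entries only
contents = read_tactile_contents([(3, 2), (1, 4), (3, 2)], store)
```

Concatenating the returned vibration information in section order would yield the time-series tactile content supplied downstream.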
• Alternatively, among the combinations of tactile parameters generated by the tactile parameter generation unit 1134, any one or more combinations may be designated, and one or more tactile contents corresponding to the designated combinations may be read from the tactile content storage unit 1132.
• The specification of the combination of tactile parameters may be performed manually by a user who wants to generate a desired tactile content, or may be performed automatically by the tactile content reading unit 1131 based on a predetermined rule.
  • the tactile content read by the tactile content reading unit 1131 is set as the tactile content, and the tactile content is supplied to the tactile content supply unit 12.
  • a configuration similar to the tactile difference parameter generation unit 134 illustrated in FIG. 3B may be provided at a subsequent stage of the tactile parameter generation unit 1134.
• That is, the tactile parameter generation unit 1134 uses the target information acquired by the target information acquisition unit 1133 to generate two or more combinations of n (n ≥ 2) tactile parameters each representing one element of the tactile sensation. Then, the tactile difference parameter generation unit provided at the subsequent stage of the tactile parameter generation unit 1134 generates one or more combinations of n tactile difference parameters by calculating, for the two or more sets of tactile parameters generated by the tactile parameter generation unit 1134, the difference values of the tactile parameters between two sets.
  • the tactile content storage unit 1132 stores a plurality of tactile contents each having a unique tactile effect specified by the n tactile difference parameters and corresponding to a plurality of tactile modes. , And a combination of n tactile difference parameters and a tactile mode are stored. Then, the tactile content reading unit 1131 specifies one or more combinations of the n tactile difference parameters generated by the tactile difference parameter generation unit, and sets the combination to the specified one or more combinations and tactile mode. The corresponding one or more tactile contents are read from the tactile content storage unit 1132. Also in this configuration, the tactile content read by the tactile content reading unit 1131 is set as the tactile content.
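The tactile difference parameter generation described above can be sketched as follows. This is a hypothetical illustration with assumed names; it simply takes the element-wise difference between each pair of consecutive tactile parameter sets:

```python
def tactile_difference_params(param_sets):
    # param_sets: two or more sets of n tactile parameters, e.g.
    # [(h1, t1), (h2, t2), (h3, t3)]. For each pair of consecutive
    # sets, compute the element-wise difference, yielding one or more
    # combinations of n tactile difference parameters.
    return [tuple(b - a for a, b in zip(s1, s2))
            for s1, s2 in zip(param_sets, param_sets[1:])]

diffs = tactile_difference_params([(5, 2), (3, 6), (4, 1)])
```

Each difference combination would then serve as the key into the storage indexed by n tactile difference parameters and the tactile mode.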
• Not only in the form in which the tactile content downloaded from the external device 700 is stored in the tactile content storage unit 1132, but also in the form in which the tactile content is stored in the tactile content storage unit 1132 in advance,
• it goes without saying that the tactile difference parameters can be used instead of the tactile parameters.
  • the target information acquisition unit 1133 and the tactile parameter generation unit 1134 are the same as those described with reference to FIG. 14A.
• In FIG. 14B, the tactile content storage unit 1142 is a non-volatile storage medium such as a hard disk or a semiconductor memory, and stores tactile contents each having a unique haptic effect specified by n tactile parameters in association with combinations of the n tactile parameters.
• The tactile content stored in the tactile content storage unit 1142 is similar to that described with reference to FIG. 14A, but is not generated in accordance with the tactile mode;
• instead, it is composed of predetermined vibration information serving as a reference for generating tactile content in accordance with the tactile mode.
• the tactile content reading unit 1141 specifies one or more combinations of the n tactile parameters generated by the tactile parameter generation unit 1134, and reads one or more tactile
• contents corresponding to the specified one or more combinations from the tactile content storage unit 1142.
• The vibration waveform processing unit 1123 generates, as weight information for the vibration information of the tactile content read by the tactile content reading unit 1141, weight information corresponding to the tactile mode specified by the tactile mode designating unit 13, and
• processes the vibration information of the tactile content based on the generated weight information.
  • the tactile content processed by the vibration waveform processing unit 1123 is set as the tactile content.
  • a configuration similar to the tactile difference parameter generation unit 134 illustrated in FIG. 3B may be provided at a subsequent stage of the tactile parameter generation unit 1134.
• That is, the tactile parameter generation unit 1134 uses the target information acquired by the target information acquisition unit 1133 to generate two or more combinations of n (n ≥ 2) tactile parameters each representing one element of the tactile sensation. Then, the tactile difference parameter generation unit provided at the subsequent stage of the tactile parameter generation unit 1134 generates one or more combinations of n tactile difference parameters by calculating, for the two or more sets of tactile parameters generated by the tactile parameter generation unit 1134, the difference values of the tactile parameters between two sets.
• The vibration waveform processing unit 1123 then generates, as weight information for the vibration information of the tactile content read by the tactile content reading unit 1141, weight information corresponding to the tactile mode specified by the tactile mode specifying unit 13, and processes the vibration information of the tactile content based on the generated weight information. Also in the case of this configuration, the tactile content processed by the vibration waveform processing unit 1123 serves as the haptic content.
  • the tactile content storage unit 1142 stores tactile content having a unique haptic effect specified by n tactile difference parameters in association with a combination of n tactile difference parameters. Then, the tactile content reading unit 1141 specifies one or more combinations of the n tactile difference parameters generated by the tactile difference parameter generation unit, and selects one or more combinations corresponding to the specified one or more combinations. Is read from the tactile content storage unit 1142.
• In the above description, an example has been described in which the tactile content storage unit 1142 stores, instead of tactile content generated in accordance with the tactile mode, tactile content
• composed of predetermined vibration information serving as a reference for generating the tactile content in accordance with the tactile mode.
• Alternatively, the tactile content storage unit 1142 may store a plurality of tactile contents respectively corresponding to the plurality of tactile modes, and the tactile content reading unit 1141
• may read the tactile content according to the tactile mode specified by the tactile mode specifying unit 13 from the tactile content storage unit 1142.
• In FIGS. 14A and 14B, the target information acquisition unit 1133 acquires, as the target information, information different from the audio content acquired by the audio content acquisition unit 14, that is, waveform information specified by the user separately from the audio content.
• In contrast, in FIG. 14C, the target information acquisition unit 1133 ′ acquires the audio content acquired by the audio content acquisition unit 14 as the target information. The rest is the same as in FIG. 14A or FIG. 14B.
• In the case of the configuration of FIG. 14C, the tactile content is in accordance with the tactile mode specified by the tactile mode specifying unit 13 based on the tactile feature amount of the audio content selected by the user, and
• corresponds to the tactile parameters relating to the audio content selected by the user. This makes it possible to generate tactile content having high affinity with the audio content.
• FIG. 16 is a flowchart illustrating an operation example of the content supply device 100 when the acquisition of audio content is configured as illustrated in FIG. 2A and the haptic content acquisition unit 11 is configured as illustrated in FIG. 14A. Note that the flowchart illustrated in FIG. 16 starts when the content supply device 100 is activated.
  • the target information acquisition unit 1133 acquires target information specified by the user (Step S21).
  • the tactile parameter generation unit 1134 generates a combination of n tactile parameters each representing one element of the tactile sensation from the target information acquired by the target information acquisition unit 1133 (Step S22).
• In the case of the configuration provided with the tactile difference parameter generation unit, the tactile difference parameters are generated based on the tactile parameters generated in step S22.
• Next, the audio content reading unit 141 of the audio content acquisition unit 14 determines whether or not a user operation for selecting any one or more of the one or more audio contents stored in the audio content storage unit 142 has been performed (step S23). If the user operation has not been performed, the determination in step S23 is repeated.
• When the user operation has been performed, the audio content reading unit 141 acquires the audio content by reading the audio content selected by the user operation from the audio content storage unit 142 (step S24).
  • the tactile mode designation unit 13 analyzes the audio content read from the audio content storage unit 142 by the audio content reading unit 141, and designates a tactile mode according to the analysis result (step S25).
  • the tactile content reading unit 1131 of the tactile content acquisition unit 11 specifies one or more combinations of the n tactile parameters generated by the tactile parameter generation unit 1134 in step S22, and One or more tactile contents corresponding to the combination and the tactile mode are read from the tactile content storage unit 1132 (step S26).
• Next, the audio content supply unit 15 supplies the audio content acquired by the audio content acquisition unit 14 in step S24 to the content use device 500 (step S27), and the haptic content supply unit 12 supplies
• the tactile content read from the tactile content storage unit 1132 by the tactile content reading unit 1131 in step S26 to the content use device 500 as the haptic content (step S28).
  • the audio content reading unit 141 determines whether or not the supply of the audio content to the content using device 500 has been completed (step S29). If the supply of the audio content has not been completed, the process returns to step S27, and the supply of the audio content and the tactile content to the content using device 500 is continued. On the other hand, when the supply of the audio content ends, the processing of the flowchart illustrated in FIG. 16 ends.
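For illustration, the flow of steps S21 to S29 above can be sketched as follows. Everything in this sketch is a hypothetical stand-in: the dictionary stores represent the storage units, `selection` represents the user operation, and `designate_mode` represents the tactile mode designation unit 13; none of these names appear in the specification.

```python
def supply_contents(target_info, audio_store, tactile_store, selection, designate_mode):
    """Hypothetical sketch of steps S21-S29 of FIG. 16 (illustrative names only)."""
    # S21-S22: derive a combination of tactile parameters from the target information
    params = tuple(sorted(target_info))  # placeholder for the real parameter generation
    # S23-S24: read the audio content selected by the user operation
    audio = audio_store[selection]
    # S25: analyze the audio content and designate a tactile mode
    mode = designate_mode(audio)
    # S26: read the tactile content matching the parameter combination and the mode
    tactile = tactile_store[(params, mode)]
    # S27-S28: supply both contents to the content use device (here, simply return them)
    return audio, tactile
```

In an actual device the last step would loop, supplying both contents until the audio content ends, as step S29 describes.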
  • FIG. 17 is a flowchart showing an operation example of the content supply device 100 in a case where the acquisition of audio content is configured as shown in FIG. 2A and the haptic content acquisition unit 11 is configured as shown in FIG. 14B. Note that the flowchart illustrated in FIG. 17 starts when the content supply device 100 is activated.
  • the target information obtaining unit 1133 obtains target information specified by the user (step S31).
  • the tactile parameter generation unit 1134 generates a combination of n tactile parameters each representing one element of the tactile sensation from the target information acquired by the target information acquisition unit 1133 (Step S32).
  • the tactile difference parameter is generated based on the tactile parameter generated in step S32.
  • the audio content reading unit 141 of the audio content acquisition unit 14 determines whether or not a user operation for selecting any one or more of the one or more audio contents stored in the audio content storage unit 142 has been performed (step S33). If the user operation has not been performed, the determination in step S33 is repeated.
  • the audio content reading unit 141 acquires the audio content by reading the audio content selected by the user operation from the audio content storage unit 142 (step S34).
  • the tactile mode designation unit 13 analyzes the audio content read from the audio content storage unit 142 by the audio content reading unit 141, and designates a tactile mode according to the analysis result (step S35).
  • the tactile content reading unit 1141 of the tactile content acquisition unit 11 specifies one or more combinations of the n tactile parameters generated by the tactile parameter generation unit 1134 in step S32, and reads one or more tactile contents corresponding to the combination from the tactile content storage unit 1142 (step S36).
  • the vibration waveform processing unit 1123 generates weight information for the vibration information of the tactile content read by the tactile content reading unit 1141, the weight information corresponding to the tactile mode specified by the tactile mode designating unit 13 (step S37), and generates tactile content according to the designated tactile mode by processing the vibration information of the tactile content using the generated weight information (step S38).
  • the audio content supply unit 15 supplies the audio content acquired by the audio content acquisition unit 14 in step S34 to the content use device 500 (step S39), and the haptic content supply unit 12 supplies the tactile content processed by the vibration waveform processing unit 1123 to the content use device 500 as tactile content (step S40).
  • the audio content reading unit 141 determines whether or not the supply of the audio content to the content using device 500 has been completed (Step S41). If the supply of the audio content has not been completed, the process returns to step S38, and the supply of the haptic content processed based on the audio content and the weight information to the content using device 500 is continued. On the other hand, when the supply of the audio content ends, the processing of the flowchart illustrated in FIG. 17 ends.
  • FIGS. 18A to 18C are block diagrams illustrating a functional configuration example of the haptic content acquisition unit 11 according to the fourth configuration example.
  • the haptic content acquisition unit 11 according to the fourth configuration example is a modification of the haptic content acquisition unit 11 according to the third configuration example, and components having the same functions as those illustrated in FIGS. 14A to 14C are given the same reference numerals.
  • the haptic content acquisition unit 11 according to the fourth configuration example further includes a repetition extraction unit 1145 in addition to the functional configuration of the haptic content acquisition unit 11 according to the third configuration example.
  • the repetition extraction unit 1145 determines whether the plurality of tactile contents sequentially read by the tactile content reading unit 1131 in accordance with the tactile mode specified by the tactile mode designating unit 13 have a repeating pattern consisting of a combination of two or more different tactile contents. Then, at least one cycle of the tactile content group constituting the repetition is extracted and generated as tactile content.
  • the repetition extraction unit 1145 determines whether the plurality of tactile contents sequentially read by the tactile content reading unit 1141 have a repeating pattern consisting of a combination of two or more different tactile contents. Then, at least one cycle of the tactile content group constituting the repetition is extracted and generated as tactile content.
  • the vibration waveform processing unit 1123 generates weight information corresponding to the tactile mode specified by the tactile mode specifying unit 13, which is weight information for the vibration information of the tactile content generated by the repetitive extraction unit 1145. Then, the vibration information of the tactile content is processed by the generated weight information.
  • the tactile content group that constitutes the repetition is information that can convey a meaningful message to the user who receives the vibration. The reason will be described below.
  • the tactile content of No. 20 and the tactile content of No. 25 are not merely extracted one by one in order; by repeating them two or more times in the order No. 20 → No. 25 → No. 20 → No. 25, it is possible to convey the message more strongly.
  • a tactile content having strong message characteristics is also obtained when three or more tactile contents are repeated.
  • FIG. 19 is a diagram for explaining a repetition pattern determination algorithm performed by the repetition extraction unit 1145.
  • FIG. 19A shows a permutation of index numbers of ten tactile contents sequentially read by, for example, the tactile content reading unit 1131 (the same applies to the tactile content reading unit 1141).
  • the repetition extraction unit 1145 sets the first index number among the plurality of index numbers corresponding to the tactile contents read by the tactile content reading unit 1131 as a key, and extracts, as a pattern, the sequence up to immediately before the same index number as the key appears.
  • the pattern refers to a combination of two or more different index numbers (tactile content indicated by the index numbers).
  • note that, without the restriction of "two or more different", a sequence up to immediately before the same index number as the key appears could consist of only the first index number "1" (for example, when the same index number appears twice in succession). In the present embodiment, a pattern consisting of only one index number is not extracted as a pattern.
  • the repetition extraction unit 1145 sets a next index number of the extracted pattern as a key, and newly extracts a pattern until immediately before the same index number as the key appears.
  • the repetitive extraction unit 1145 performs the above processing up to the end of a plurality of index numbers corresponding to the tactile content sequentially read by the tactile content reading unit 1131.
  • FIG. 19B shows the result of performing such a pattern extraction process.
  • a leading index number “1” is used as a key, and a pattern is extracted until immediately before the same index number as the key appears.
  • “123” is extracted as the first pattern.
  • the next index number (fourth index number) “1” of the extracted first pattern is set as a key, and the pattern is extracted just before the same index number as the key appears.
  • “123” is extracted as the second pattern.
  • the next index number (the seventh index number) "1" of the extracted second pattern is set as a key, and the pattern up to immediately before the same index number as the key appears is extracted. As a result, "14" is extracted as the third pattern. Further, the next index number (the ninth index number) "1" of the extracted third pattern is set as a key, and extraction of a pattern up to immediately before the same index number as the key appears is attempted; however, no such index number is found, and the remaining "13" is extracted as the fourth pattern.
  • the repetition extraction unit 1145 extracts the tactile content group “123123” constituting the repetition, and generates this as the tactile content.
  • note that the repetition extraction unit 1145 may extract only one cycle (one pattern) of "123" from the tactile content group "123123" constituting the repetition, and generate this as tactile content.
  • this is because repeated haptic content like "123123" can be obtained by the repeat reproduction function of the content use device 500.
  • the repetition extraction unit 1145 may extract a plurality of tactile content groups "123123" and "231231" constituting repetitions by performing the pattern extraction process while sequentially changing the set position of the key from the top, and generate these as tactile content.
  • all index numbers may be sequentially set to the key, and the repetitive pattern determination process may be performed.
  • the repetition extraction unit 1145 may group the plurality of tactile contents sequentially read by the tactile content reading unit 1131 by dividing them by a predetermined number from the top, and perform the repetitive pattern determination process for each group.
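The pattern extraction and repetition determination described above can be sketched as follows, with tactile contents represented by their index numbers (the function names are illustrative, not taken from the specification):

```python
def extract_patterns(indices):
    """Split an index-number sequence into patterns: each pattern starts at a key
    and runs up to immediately before the same index number as the key appears."""
    patterns, i = [], 0
    while i < len(indices):
        key, j = indices[i], i + 1
        while j < len(indices) and indices[j] != key:
            j += 1
        patterns.append(indices[i:j])
        i = j
    return patterns

def extract_repetition(patterns):
    """Return one repetition (two consecutive identical patterns made of two or
    more different index numbers), or None if no repetition is found."""
    for a, b in zip(patterns, patterns[1:]):
        if a == b and len(set(a)) >= 2:
            return a + b
    return None
```

Applied to the permutation of FIG. 19 (1 2 3 1 2 3 1 4 1 3), this yields the patterns "123", "123", "14", "13" and the repetition "123123".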
  • FIG. 20 is a flowchart showing an operation example of the content supply device 100 in a case where the acquisition of audio content is configured as shown in FIG. 2A and the haptic content acquisition unit 11 is configured as shown in FIG. 18A.
  • the flowchart illustrated in FIG. 20 starts when the content supply device 100 is activated.
  • the processes with the same step numbers as those shown in FIG. 16 are the same, and thus the duplicate description will be omitted here.
  • the repetition extraction unit 1145 determines whether or not the plurality of tactile contents have a repetitive pattern (step S51). Then, the repetition extraction unit 1145 extracts at least one cycle of the tactile content group constituting the repetition, and generates this as tactile content (step S52). Thereafter, the process proceeds to step S27.
  • in the above description, an example has been shown in which a tactile content group composed of one type of repetition pattern is extracted from the plurality of tactile contents read from the tactile content storage units 1132 and 1142, and tactile content is generated from the extracted tactile content group.
  • a tactile content may be generated by extracting at least one cycle of a tactile content group including a plurality of types of repetitive patterns and combining them.
  • in the above description, one index number is set as the key and the sequence up to immediately before the same index number as the key appears is extracted as a pattern; however, the present invention is not limited to this.
  • for example, a combination of a plurality of index numbers may be set as the key, and the sequence up to immediately before the same combination of index numbers as the key appears may be extracted as a pattern.
  • a sequence in which the sequence of “123” is replaced by a cyclic sequence (“231” or “312”) may be generated, and one or a combination of these sequences may be generated as tactile content.
  • in this case, the haptic content may be generated by combining repetitions of the same pattern like "231231" or "312312", or by arbitrarily combining different patterns like "123231313".
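The cyclic replacement of a sequence described above can be sketched as below (the function name is illustrative only):

```python
def cyclic_variants(pattern):
    """All rotations of a pattern, e.g. '123' -> '123', '231', '312'."""
    return [pattern[i:] + pattern[:i] for i in range(len(pattern))]
```

Any one of these rotations, or a concatenation of repetitions of them, could then serve as the generated tactile content.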
  • in the above description, an example has been described in which the tactile content storage unit 1142 stores tactile content whose vibration information does not depend on the tactile mode, and tactile content according to the tactile mode is generated by processing. Instead, the tactile content storage unit 1142 may store predetermined tactile contents respectively corresponding to the plurality of tactile modes, and the tactile content reading unit 1141 may read the tactile content corresponding to the tactile mode specified by the tactile mode designating unit 13 from the tactile content storage unit 1142.
  • the present invention is not limited to this.
  • Information that can be used as a processing target in the present embodiment is information having a unique haptic effect specified by n (n ≥ 2) tactile parameters. In particular, if the value corresponding to the intensity is continuous or intermittent, or can be converted into a continuous state, it can be used as the target information.
  • the above-described waveform information is information in which the amplitude corresponding to the intensity is continuous (in the case of an analog signal) or intermittent (in the case of digital data) along the time axis, and is a typical example of the information that can be acquired by the target information acquisition unit 1133.
  • text information composed of a character string is not information in which values corresponding to intensities are continuous, but is information that can be converted into such information, as described in detail later. Therefore, the text information is also one of the kinds of information that can be acquired by the target information acquisition unit 1133. The first tactile parameter and the second tactile parameter in this case will be described later.
  • note that, strictly speaking, only text information composed of more than a single unit of information can be handled in the present embodiment. Information consisting of only one character such as "a" is not information in which a value corresponding to intensity is continuous, nor can it be converted into such information, and is therefore excluded. If the text information is composed of a plurality of characters, it can be converted into information in which values corresponding to the intensities are linked, as described later, and thus can be handled in the present embodiment.
  • the tactile parameter generation unit 1134 divides the histogram information into a plurality of sections in the frequency axis direction. Then, from each divided section, the representative intensity in the divided section is specified as the first tactile parameter, and the frequency width of the divided section is specified as the second tactile parameter, thereby generating a combination of the first tactile parameter and the second tactile parameter.
  • FIG. 21 is a diagram for explaining an example of the processing content by the tactile parameter generation unit 1134 when the histogram information is input.
  • FIG. 21A shows the histogram information acquired by the target information acquisition unit 1133.
  • the outer shape of the histogram information is similar to that of the waveform information. Therefore, the same processing as for the waveform information can be applied.
  • the envelope of the histogram information may be extracted as shown in FIG. 21B, and the first tactile parameter and the second tactile parameter may be specified for the envelope.
  • the tactile parameter generation unit 1134 first divides the histogram information shown in FIG. 21A (or the envelope information shown in FIG. 21B) into a plurality of pieces in the frequency axis direction.
  • the envelope is divided at each point at which its amplitude becomes minimal. That is, the first divided section F1 extends from the start point of the envelope to the first minimal value, the second divided section F2 from the first minimal value to the second minimal value, the third divided section F3 from the second minimal value to the third minimal value, and so on; in this way, the histogram information is divided into a plurality of sections in the frequency axis direction.
  • the division method is not limited to this.
  • the tactile parameter generation unit 1134 specifies the representative intensities h1, h2, h3, ... as the first tactile parameters from the respective divided sections F1, F2, F3, ..., and specifies the frequency widths f1, f2, f3, ... of the divided sections as the second tactile parameters.
  • in this example, the representative intensities h1, h2, h3, ... are the maximum values in the respective divided sections F1, F2, F3, ....
  • the method of obtaining the representative strength is not limited to this.
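A minimal sketch of this first example, assuming the envelope of FIG. 21B is given as a list of discrete samples and using the maximum as the representative intensity (the function name and data layout are illustrative assumptions):

```python
def divide_envelope(samples):
    """Divide an envelope at its local minima and return, per divided section,
    (representative intensity h = maximum, frequency width f = section length)."""
    minima = [i for i in range(1, len(samples) - 1)
              if samples[i - 1] > samples[i] < samples[i + 1]]
    cuts = [0] + minima + [len(samples)]
    return [(max(samples[s:e]), e - s) for s, e in zip(cuts, cuts[1:])]
```

Each returned pair corresponds to one (h, f) combination of the first and second tactile parameters.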
  • Image information is an example of spatial distribution information.
  • the image information is information in which color information and luminance information having an intensity element change according to the position of a pixel or an area including a plurality of pixel groups.
  • the tactile parameter generation unit 1134 divides such image information into a plurality of sections according to the position of a pixel or a region of a pixel group, and generates a combination of the first tactile parameter and the second tactile parameter from each divided section.
  • FIG. 22 is a diagram for explaining an example of the processing content by the tactile parameter generation unit 1134 when image information is input.
  • the image information shown in FIG. 22A shows an example of a pattern image divided into meshes of various sizes in vertical and horizontal directions.
  • the second example is suitable for generating a combination of the first tactile parameter and the second tactile parameter from such a pattern image.
  • the tactile parameter generation unit 1134 first divides the pattern image shown in FIG. 22A into a plurality of pieces according to the mesh area.
  • that is, the upper left mesh is set as a first divided section A1, the mesh to its right as a second divided section A2, the next mesh to its right as a third divided section A3, and so on; upon reaching the right end, the division descends one row and proceeds again in order from the left end to the right end. In this way, the entire pattern image is divided into a plurality of sections in accordance with the area of each mesh.
  • the tactile parameter generation unit 1134 specifies the representative intensities h1, h2, h3, ... as the first tactile parameters from the respective divided sections A1, A2, A3, ..., and specifies the areas a1, a2, a3, ... of the divided sections as the second tactile parameters.
  • the representative intensities h1, h2, h3, ... can be, for example, the average value of any one of the saturation, lightness, and luminance in each of the divided sections A1, A2, A3, ....
  • alternatively, the tactile parameter generation unit 1134 may specify, as the second tactile parameter, the length (corresponding to the number of pixels) from the start point to the end point of each divided section A1, A2, A3, ....
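A minimal sketch of this second example, assuming the pattern image is given as a 2D list of luminance values and, for simplicity, a uniform mesh size (the specification also allows meshes of various sizes; names are illustrative):

```python
def mesh_parameters(image, mesh_h, mesh_w):
    """Scan meshes left-to-right, top-to-bottom and return, per mesh,
    (h = average luminance as representative intensity, a = area in pixels)."""
    rows, cols = len(image), len(image[0])
    params = []
    for r in range(0, rows, mesh_h):
        for c in range(0, cols, mesh_w):
            block = [image[i][j]
                     for i in range(r, min(r + mesh_h, rows))
                     for j in range(c, min(c + mesh_w, cols))]
            params.append((sum(block) / len(block), len(block)))
    return params
```

Each returned pair corresponds to one (h, a) combination of the first and second tactile parameters for one divided section.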
  • the tactile parameter generation unit 1134 converts the image information into frequency spectrum information as preprocessing, and generates a combination of the first tactile parameter and the second tactile parameter from the frequency spectrum information.
  • the tactile parameter generation unit 1134 analyzes the image information acquired by the target information acquisition unit 1133 to generate frequency spectrum information of the intensity-frequency distribution, then divides the frequency spectrum information into a plurality of sections in the frequency axis direction, and, from each divided section, specifies the representative intensity in the divided section as the first tactile parameter and the frequency width of the divided section as the second tactile parameter, thereby generating a combination of the first tactile parameter and the second tactile parameter.
  • a specific processing example in this case is the same as that of the first example described with reference to FIG.
  • this third example can be applied to audio signals in addition to image information.
  • this also applies to audio signals to which the example of FIG. 15 cannot be applied.
  • the tactile parameter generation unit 1134 morphologically analyzes the text information, divides it into sentences, divides each sentence into phrases, and specifies the number of phonemes in each phrase. Then, from each sentence, the reciprocal of the representative phoneme number in the sentence is regarded as the intensity and specified as the first tactile parameter, and the number of phrases in the sentence is specified as the second tactile parameter, thereby generating a combination of the first tactile parameter and the second tactile parameter.
  • FIG. 23 is a diagram for explaining an example of the processing content by the tactile parameter generation unit 1134 when text information is input.
  • FIG. 23 shows the text information obtained by the target information obtaining unit 1133 with kana added.
  • the text information input in the example illustrated in FIG. 23 is information including a character string of a sentence “Today is Sunday. I went to school today.”
  • the tactile parameter generation unit 1134 divides the text information of the sentence “Today is Sunday. I went to school today” for each sentence. That is, it is divided into two parts, “Today is Sunday” and “I went to school today”.
  • that is, the first sentence is set as a first divided section B1, and the second sentence as a second divided section B2.
  • the tactile parameter generation unit 1134 performs a morphological analysis of each sentence, divides each sentence into phrases, and specifies the number of phonemes in each phrase. That is, the first divided section B1 is divided into two phrases, "today" and "Sunday". Then, the phoneme number of the first phrase "Kyoha" is specified as "3", and the phoneme number of the second phrase "Nichiyobi" is specified as "5". Similarly, the second divided section B2 is divided into four phrases, "I", "today", "to school", and "went". Then, the numbers of phonemes of the respective phrases are specified as "4", "2", "5", and "5".
  • the tactile parameter generation unit 1134 regards the reciprocal of the number of representative phonemes in the divided section as the intensity from each of the first divided section B1 and the second divided section B2, and uses this as the first tactile parameter.
  • the representative phoneme number can be, for example, the minimum phoneme number, the maximum phoneme number, or the average phoneme number in the divided section. For example, when the average phoneme number is used, the average phoneme number of the first divided section B1 is “4”, and the reciprocal thereof is specified as the first tactile parameter. The average number of phonemes in the second divided section B2 is also “4”, and the reciprocal thereof is specified as the first tactile parameter.
  • the tactile parameter generation unit 1134 specifies the number of phrases as the second tactile parameter from each of the first divided section B1 and the second divided section B2. That is, the number of phrases “2” is specified as the second tactile parameter for the first divided section B1, and the number of phrases “4” is specified as the second tactile parameter for the second divided section B2.
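A minimal sketch of this example, assuming morphological analysis has already produced the per-phrase phoneme counts (e.g. [[3, 5], [4, 2, 5, 5]] for the two sentences above) and using the average phoneme number as the representative (the function name and input layout are illustrative assumptions):

```python
def text_parameters(sentences):
    """sentences: one list of per-phrase phoneme counts per sentence.
    Returns, per sentence, (first parameter = reciprocal of the average
    phoneme number, second parameter = number of phrases)."""
    out = []
    for counts in sentences:
        average = sum(counts) / len(counts)  # representative phoneme number
        out.append((1.0 / average, len(counts)))
    return out
```

For the example text, both sentences have an average phoneme number of 4, so the first tactile parameter is 0.25 for each, while the second tactile parameters are 2 and 4.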
  • the tactile parameter generation unit 1134 divides the series of motion information for each motion. Then, from each motion, the representative value relating to the applied pressure, speed, acceleration, or reaching height of the human body in the motion is regarded as the intensity and specified as the first tactile parameter, and the length of time required for one motion is specified as the second tactile parameter, thereby generating a combination of the first tactile parameter and the second tactile parameter.
  • FIG. 24 is a diagram for explaining an example of the processing content by the tactile parameter generation unit 1134 when the operation information of the massage is input.
  • Massage is performed by a combination of a plurality of procedures. Then, one procedure is performed by applying pressure for a predetermined time.
  • the motion information representing the movement of the massage treatment can be represented by bar-like information in which the horizontal axis is the time axis and the vertical axis is the pressure.
  • a portion where the space between adjacent bar graphs is open means that two procedures are performed at a certain time interval. Further, a portion where adjacent bar graphs are connected means that two procedures are performed continuously without a time gap. For example, when massage is actually performed on a humanoid doll or the like having a pressure sensor on its surface, sensor information as shown in FIG. 24 is obtained, and it is possible to input this to the target information acquisition unit 1133.
  • the tactile parameter generation unit 1134 divides the series of motion information, which is acquired by the target information acquisition unit 1133 and consists of a combination of a plurality of procedures, for each procedure (one motion). Described using the image of FIG. 24, this corresponds to dividing into individual bar graphs. In this case, one procedure (bar graph) corresponds to one divided section. That is, the first procedure corresponds to a first divided section M1, the second procedure to a second divided section M2, the third procedure to a third divided section M3, and so on; in this way, the series of motion information is divided for each procedure.
  • the tactile parameter generation unit 1134 specifies the applied pressures p1, p2, p3, ... in the respective divided sections M1, M2, M3, ... as the first tactile parameters, and specifies the lengths of time t1, t2, t3, ... required for performing the respective procedures as the second tactile parameters, thereby generating combinations of the first tactile parameter and the second tactile parameter.
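As a minimal sketch of this division, assuming the massage motion information is given as a pressure-versus-time series sampled at a fixed interval dt, with zero pressure separating procedures (the function name and data layout are illustrative assumptions):

```python
def massage_parameters(pressure, dt=1.0):
    """Split a pressure-vs-time series into procedures (runs of nonzero pressure)
    and return, per procedure, (p = peak applied pressure, t = duration)."""
    params, run = [], []
    for p in list(pressure) + [0]:  # trailing 0 flushes the final run
        if p > 0:
            run.append(p)
        elif run:
            params.append((max(run), len(run) * dt))
            run = []
    return params
```

Each returned pair corresponds to one (p, t) combination of the first and second tactile parameters for one divided section (one bar graph in FIG. 24).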
  • Another example of a series of motion information representing a person's motion is motion information captured by motion capture.
  • the motion information obtained by the motion capture is provided as moving image information in which the motion of a person is digitally recorded.
  • the tactile parameter generation unit 1134 analyzes the images of the moving image information, specifies the starting point of the movement of a specific part (for example, the hand) of the person, and regards the movement until the hand returns to the starting point as one motion. Then, the series of motion information is divided for each motion. In this case, one motion until the hand returns to the start point corresponds to one divided section.
  • the tactile parameter generation unit 1134 regards, as the first tactile parameter relating to the strength, a representative value relating to the speed, acceleration, or reaching height of the human body due to the operation of the divided section from each divided section. Identify. For example, when the representative value of the operation speed is used as the first tactile parameter, the maximum speed, the minimum speed, or the average speed in one operation can be used as the representative value. Further, when the representative value of the reaching height of the human body is used as the first tactile parameter, for example, the highest point from the floor when the user operates by moving his hand can be used as the representative value. In addition, the tactile parameter generation unit 1134 specifies the length of time required for one operation as a second tactile parameter.
  • the tactile parameter generation unit 1134 analyzes the moving image information by optical flow to calculate a displacement amount between frame images, thereby obtaining waveform information in which the displacement amount changes with the elapse of time. Then, the waveform information is divided into a plurality of sections in the time axis direction, a representative intensity in the divided section is specified as the first tactile parameter from each divided section, and the time length of the divided section is specified as the second tactile parameter, thereby generating a combination of the first tactile parameter and the second tactile parameter.
  • the moving image information is any moving image information including the motion information by the motion capture described above.
  • a known method can be applied to the optical flow analysis.
  • when the optical flow values in the x and y directions obtained for each pixel are Fx and Fy, for example, a value of (Fx^2 + Fy^2)/2 is obtained for each pixel, and an average value is further obtained over all the pixels.
  • this average value is set as the displacement amount at time t2.
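The per-frame displacement computation described above can be sketched as below, assuming the per-pixel optical flow values (Fx, Fy) between two frames are already available (the function name and input layout are illustrative; the optical flow itself would come from a known method):

```python
def frame_displacement(flow):
    """flow: list of per-pixel optical-flow values (Fx, Fy) between two frames.
    Returns the displacement amount: the average over all pixels of
    (Fx^2 + Fy^2) / 2."""
    return sum((fx * fx + fy * fy) / 2.0 for fx, fy in flow) / len(flow)
```

Evaluating this for each pair of consecutive frames yields the waveform of displacement amount versus time described above.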
  • in the first embodiment described above, the tactile mode designation unit 13 analyzes the audio content acquired by the audio content acquisition unit 14, and designates the tactile mode based on the analysis result.
  • the tactile mode can be designated according to a user operation instead of or in addition to the designation of the tactile mode based on the analysis result of the audio content.
  • FIGS. 25 and 26 are block diagrams showing a functional configuration example of the content supply device 100 according to the second embodiment.
  • although FIGS. 25 and 26 show a modification of the content supply device 100 shown in FIG. 2B, the modification is also applicable to the content supply device 100 shown in FIG. 2A or 2C.
  • as the configuration example of the tactile content acquisition unit 11, any of the configurations in FIGS. 6, 8, 14A to 14C, and 18A to 18C can be applied.
  • FIG. 25 shows a configuration in which the tactile mode can be designated according to a user operation in addition to the designation of the tactile mode based on the analysis result of the audio content. That is, the tactile mode designating unit 13A further has, in addition to the function of designating the tactile mode based on the analysis result of the audio content, a function of designating the tactile mode in accordance with an instruction from the user regarding the selection of the tactile mode.
  • for example, the tactile mode designating unit 13A may set, as a default, the state of designating the tactile mode according to the analysis result of the audio content acquired by the audio content acquisition unit 14, and, when an instruction regarding the selection of the tactile mode is given by the user, change to a state of designating the tactile mode according to the instruction from the user.
  • in this case, the tactile mode specified based on the analysis result of the audio content selected by the user is set as the recommended mode, and unless otherwise specified by the user, the haptic content acquisition unit 11 acquires the haptic content according to the recommended mode.
  • the haptic content acquisition unit 11 acquires haptic content according to the specified tactile mode.
  • in this way, it is also possible to acquire tactile content having a tactile sensation different from that of the tactile content in the recommended mode, according to the user's desire.
  • the user selects the desired audio content and the desired tactile mode.
  • the haptic content reading unit 1111 of the haptic content acquisition unit 11 reads the corresponding haptic content from the haptic content storage unit 1112 by switching to the tactile mode designated by the tactile mode designating unit 13A in response to the instruction from the user regarding the selection of the tactile mode.
  • Also in this configuration, the user can select a desired audio content and a desired tactile mode. In this case, the haptic content reading unit 1121 of the haptic content acquisition unit 11 switches to the tactile mode specified by the tactile mode specifying unit 13A in response to the instruction from the user regarding the selection of the tactile mode, and reads the corresponding tactile content from the tactile content storage unit 1122. Then, the vibration waveform processing unit 1123 processes the vibration information of the tactile content read by the haptic content reading unit 1121.
  • Alternatively, the tactile content reading unit 1131 of the tactile content acquisition unit 11 can read from the tactile content storage unit 1132 tactile content that corresponds to the tactile parameter generated from the audio content selected by the user and also corresponds to a tactile mode independent of the tactile parameter related to that audio content.
  • FIG. 26 shows a configuration in which the tactile mode can be designated only by a user operation, instead of based on the analysis result of the audio content. That is, the tactile mode designating unit 13B has a function of designating the tactile mode in response to an instruction related to the selection of the tactile mode from the user, instead of the function of designating the tactile mode based on the analysis result of the audio content.
  • FIG. 27 shows a modification of the configuration in which only the tactile mode can be designated based on a user operation.
  • The content supply device 100 having the configuration shown in FIG. 27 does not include any configuration related to the acquisition of audio content. That is, only the configuration relating to the acquisition of the tactile content is provided, and the tactile mode is designated by the tactile mode designating unit 13B in response to an instruction from the user regarding the selection of the tactile mode.
  • In this case, as the configuration of the haptic content acquisition unit 11, any of the configurations in FIGS. 6, 14A, 14C(a), 18A, and 18C(a) can be applied.
  • That is, the haptic content reading units 1111 and 1111' of the haptic content acquiring unit 11 read the haptic content corresponding to the tactile mode specified by the tactile mode specifying unit 13B from the haptic content storage units 1112 and 1112', which store a plurality of haptic contents respectively corresponding to a plurality of tactile modes.
  • Alternatively, the haptic content reading unit 1131 of the haptic content acquisition unit 11 reads the tactile content corresponding to the tactile mode specified by the tactile mode specifying unit 13B from the tactile content storage unit 1132, which stores a plurality of tactile contents respectively corresponding to a plurality of tactile modes.
  • the third embodiment relates to a content providing system in which a content acquisition device 200 and a content providing server device 300 are connected via a communication network 600.
  • the haptic content acquired by the content acquisition device 200 from the content providing server device 300 can be used arbitrarily.
  • For example, the haptic content acquired by the content acquisition device 200 can be stored in the haptic content storage unit 1112 of the content supply device 100 described in the first embodiment, and the haptic content can be supplied from the content supply device 100 to the content use device 500.
  • FIG. 28 is a block diagram illustrating a functional configuration example of the content providing system according to the first configuration example. In FIG. 28, components having the same functions as the components shown in FIG. 2 are denoted by the same reference numerals.
  • the content acquisition device 200 includes a haptic content reception unit 201, a haptic content storage unit 202, an audio content reception unit 203, an audio content storage unit 204, and an instruction transmission unit 205 as its functional configuration.
  • The content providing server device 300 includes, as its functional components, a haptic content acquisition unit 11, a tactile mode designation unit 13, an audio content acquisition unit 14, a haptic content transmission unit 301, an audio content transmission unit 302, and an instruction reception unit 303.
  • Each functional block provided in the content acquisition device 200 can be configured by any of hardware, DSP, and software.
  • For example, in the case of a software configuration, each of the functional blocks is actually configured to include a CPU, RAM, ROM, and the like of a computer, and is realized by operating a content providing program stored in a storage medium such as the RAM, the ROM, a hard disk, or a semiconductor memory. The same applies to the second to fifth configuration examples shown in FIGS. 29 to 32.
  • Each functional block included in the content providing server device 300 can be configured by any of hardware, DSP, and software.
  • For example, in the case of a software configuration, each of the functional blocks is actually configured to include a CPU, RAM, ROM, and the like of a computer, and is realized by operating a content providing program stored in a storage medium such as the RAM, the ROM, a hard disk, or a semiconductor memory. The same applies to the second to fifth configuration examples shown in FIGS. 29 to 32.
  • the instruction transmission unit 205 of the content acquisition device 200 transmits to the content providing server device 300 an instruction related to selection of audio content from the user via the operation unit of the content acquisition device 200 or a predetermined graphical user interface.
  • The instruction receiving unit 303 of the content providing server device 300 receives this instruction. Note that the instruction related to the selection of the audio content is equivalent to an instruction related to the acquisition of the haptic content, because the haptic content is acquired from the content providing server device 300 in response to it.
  • The predetermined graphical user interface is, for example, a selection screen on which a list of the audio contents stored in the audio content storage unit 142 of the content providing server device 300 is displayed and from which the user can select one of the audio contents.
  • the selection screen is created by the content providing server device 300 and displayed on a display of the content acquisition device 200 or the like.
  • The audio content acquisition unit 14 of the content providing server device 300 acquires the audio content in response to the instruction related to the selection of the audio content, received by the instruction receiving unit 303 from the user. Specifically, the audio content reading unit 141 of the audio content acquisition unit 14 acquires the audio content by reading the audio content selected by the user operation from the plurality of audio contents stored in the audio content storage unit 142.
  • The tactile mode designating unit 13 of the content providing server device 300 designates a tactile mode according to the analysis result of the audio content acquired by the audio content acquiring unit 14. Here, "specifying the tactile mode according to the analysis result" includes the case in which, as in the first embodiment, when the content providing server device 300 receives an instruction related to the selection of audio content, the tactile mode specifying unit 13 analyzes the audio content acquired by the audio content acquisition unit 14 and designates a tactile mode in accordance with that analysis result, as well as the case of specifying the tactile mode based on an analysis result obtained in advance.
  • That is, the audio content to be provided to the content acquisition device 200 is stored in the audio content storage unit 142 in advance. Therefore, even without an instruction from the user regarding the selection of audio content, the same analysis as in the first embodiment can be performed in advance for all or part of the audio contents stored in the audio content storage unit 142, and a tactile mode can be specified for each audio content. Then, based on the analysis results, it is possible to create and store correspondence information (table information) indicating the correspondence between the audio contents and the tactile modes. When such table information is created, the tactile mode specifying unit 13 refers to the table information when the instruction receiving unit 303 receives an instruction related to audio content selection, and specifies the tactile mode corresponding to the audio content selected by the instruction.
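A minimal sketch of such pre-computed table information and its lookup (content names and mode names below are purely illustrative):

```python
# Correspondence (table) information prepared in advance: each audio content is
# analyzed once and the resulting tactile mode is stored, so specifying the mode
# at request time reduces to a simple lookup. Entries are hypothetical examples.

AUDIO_TO_TACTILE_MODE = {
    "ABC": "fluffy mode",
    "DEF": "smooth mode",
}

def specify_tactile_mode(audio_content_name):
    # Look up the tactile mode pre-computed for the selected audio content.
    return AUDIO_TO_TACTILE_MODE[audio_content_name]

print(specify_tactile_mode("ABC"))  # fluffy mode
```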
  • the haptic content acquisition unit 11 acquires haptic content according to the tactile mode specified by the tactile mode specification unit 13.
  • As a configuration example of the haptic content acquisition unit 11, any of the configurations of FIGS. 6, 8, 14A to 14C, and 18A to 18C can be applied.
  • the haptic content transmitting unit 301 transmits the haptic content acquired by the haptic content acquiring unit 11 to the content acquiring device 200.
  • the audio content transmission unit 302 transmits the audio content acquired by the audio content acquisition unit 14 to the content acquisition device 200.
  • the haptic content receiving unit 201 of the content acquisition device 200 receives the haptic content transmitted by the haptic content transmitting unit 301 of the content providing server device 300 as a response to the instruction from the instruction transmitting unit 205.
  • the haptic content storage unit 202 stores the haptic content received by the haptic content receiving unit 201.
  • the audio content receiving unit 203 of the content acquisition device 200 receives the audio content transmitted by the audio content transmitting unit 302 of the content providing server device 300 as a response to the instruction from the instruction transmitting unit 205.
  • the audio content storage unit 204 stores the audio content received by the audio content reception unit 203.
  • The haptic content stored in the haptic content storage unit 202 and the audio content stored in the audio content storage unit 204 can be supplied to the content use device 500. Further, the haptic content and the audio content stored in the content acquisition device 200 can be transmitted to another terminal via a removable storage medium or a communication network, and can be supplied from that terminal to the content use device 500.
  • both the audio content and the tactile content can be downloaded from the content providing server device 300 to the content obtaining device 200 and stored.
  • At this time, the selected audio content is downloaded together with the haptic content corresponding to the tactile mode specified based on the tactile feature amount of the selected audio content. That is, it is possible to download tactile content that matches the characteristics of the audio content.
  • the haptic content may be transmitted from the content providing server device 300 to the content obtaining device 200 only when the instruction relating to the acquisition of the haptic content is transmitted from the content obtaining device 200 to the content providing server device 300.
  • the audio content may be used only for the analysis for specifying the tactile mode, and may not be supplied to the content use device 500.
  • the audio content receiving unit 203 and the audio content storage unit 204 of the content acquisition device 200 and the audio content transmitting unit 302 of the content providing server device 300 may be omitted.
  • Note that the tactile mode may also be specified according to a user operation. In this case, the instruction transmitting unit 205 transmits an instruction relating to the specification of the tactile mode to the content providing server device 300, and the instruction receiving unit 303 receives the instruction and notifies the tactile mode specifying unit 13.
  • FIG. 29 is a block diagram illustrating a functional configuration example of a content providing system according to the second configuration example.
  • components having the same functions as those shown in FIGS. 27 and 28 are denoted by the same reference numerals.
  • the content providing system shown in FIG. 29 does not include a configuration related to acquisition of audio content. That is, only the configuration relating to the acquisition of the tactile content is provided, and the specification of the tactile mode by the tactile mode designating unit 13B is performed in response to an instruction relating to the selection of the tactile mode from the user of the content acquisition device 200.
  • the instruction transmission unit 205 of the content acquisition device 200 transmits an instruction relating to the specification of the tactile mode from the user via the operation unit of the content acquisition device 200 or the graphical user interface to the content providing server device 300.
  • the instruction receiving unit 303 of the content providing server device 300 receives the instruction related to the specification of the tactile mode transmitted from the instruction transmitting unit 205.
  • the tactile mode designating unit 13B designates the tactile mode in accordance with the instruction received by the instruction receiving unit 303 and related to the selection of the tactile mode.
  • the haptic content acquisition unit 11 acquires haptic content according to the haptic mode specified by the haptic mode specification unit 13B.
  • As a configuration example of the haptic content acquisition unit 11, any one of the configurations of FIGS. 6, 14A, and 18A can be applied.
  • FIG. 30 is a block diagram illustrating a functional configuration example of the content providing system according to the third configuration example.
  • components having the same functions as the components shown in FIG. 28 are denoted by the same reference numerals.
  • the content acquisition device 200 includes a haptic content reception unit 201, a haptic content storage unit 202, an audio content transmission unit 206, and an audio content storage unit 207 as its functional configuration.
  • the content providing server device 300 includes a haptic content acquisition unit 11, a haptic mode designation unit 13, a haptic content transmission unit 301, and an audio content reception unit 304 as its functional configuration.
  • the audio content storage unit 207 of the content acquisition device 200 stores one or more audio contents. Although not shown, the audio content storage unit 207 can additionally store audio content downloaded from an external device via a communication network as needed.
  • the audio content transmitting unit 206 transmits the audio content selected by the user from the audio contents stored in the audio content storage unit 207 to the content providing server device 300.
  • The selection of the audio content is performed through the operation unit of the content acquisition device 200 or a predetermined graphical user interface. That is, the user selects, from a list of audio contents provided through the graphical user interface, the audio content for which he or she wishes to obtain haptic content matching its audio characteristics, and instructs its transmission.
  • the audio content transmission unit 206 reads the audio content selected by the user from the audio content storage unit 207 and transmits the audio content to the content providing server device 300.
  • the audio content receiving unit 304 of the content providing server device 300 receives the audio content transmitted from the content acquisition device 200.
  • the tactile mode designating unit 13 analyzes the audio content received by the audio content receiving unit 304, and designates a tactile mode according to the analysis result.
  • the haptic content acquisition unit 11 acquires haptic content according to the tactile mode specified by the tactile mode specification unit 13.
  • As a configuration example of the haptic content acquisition unit 11, any of the configurations of FIGS. 6, 8, 14A to 14C, and 18A to 18C can be applied.
  • the haptic content transmitting unit 301 transmits the haptic content acquired by the haptic content acquiring unit 11 to the content acquiring device 200.
  • The haptic content receiving unit 201 of the content acquisition device 200 receives the haptic content transmitted by the haptic content transmitting unit 301 of the content providing server device 300 as a response to the transmission of the audio content by the audio content transmitting unit 206.
  • The content providing system according to the third configuration example shown in FIG. 30 is a preferred example in the case where the audio content is present in the content acquisition device 200 in advance and only the tactile content corresponding to that audio content is acquired from the content providing server device 300.
  • Note that, if the tactile mode designating unit 13A shown in FIG. 25 or the tactile mode designating unit 13B shown in FIG. 26 is provided, the tactile mode may also be specified according to a user operation. In this way, both the haptic content according to the tactile parameter of the audio content selected by the user and the haptic content corresponding to the tactile mode selected by the user can be obtained.
  • FIG. 31 is a block diagram illustrating a functional configuration example of a content providing system according to a fourth configuration example.
  • components having the same functions as the components shown in FIG. 30 are denoted by the same reference numerals.
  • the fourth configuration example is a modification of the third configuration example, in which the content providing server device 300 returns the audio content received from the content acquisition device 200 to the content acquisition device 200 together with the tactile content.
  • In the fourth configuration example, the content acquisition device 200 includes an audio content reception unit 208 and an audio content storage unit 209 instead of the haptic content reception unit 201 and the haptic content storage unit 202 shown in FIG. 30.
  • The content providing server device 300 includes an audio content generation unit 305 and an audio content transmission unit 306 instead of the haptic content transmission unit 301 shown in FIG. 30.
  • The audio content generation unit 305 of the content providing server device 300 generates audio content by combining the audio content received by the audio content reception unit 304 (the audio content transmitted from the content acquisition device 200) and the haptic content acquired by the haptic content acquisition unit 11.
  • For example, the audio content generation unit 305 generates the audio content using the audio content received by the audio content reception unit 304 as the first channel information and the haptic content acquired by the haptic content acquisition unit 11 as the second channel information.
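The two-channel combination described here can be sketched as follows (sample values and function names are illustrative only; a real implementation would also align sampling rates and write the frames into an actual audio container):

```python
# Combine an audio waveform (first channel) and a haptic vibration waveform
# (second channel) into interleaved two-channel frames.

def combine_channels(audio_samples, haptic_samples):
    # Pad the shorter signal with zeros (silence) so the channels stay aligned.
    n = max(len(audio_samples), len(haptic_samples))
    audio = audio_samples + [0] * (n - len(audio_samples))
    haptic = haptic_samples + [0] * (n - len(haptic_samples))
    # Each frame carries (first channel, second channel).
    return list(zip(audio, haptic))

print(combine_channels([10, 20, 30], [5, 6]))  # [(10, 5), (20, 6), (30, 0)]
```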
  • the audio content transmission unit 306 transmits the audio content generated by the audio content generation unit 305 to the content acquisition device 200.
  • The audio content receiving unit 208 of the content acquisition device 200 receives the audio content transmitted by the audio content transmitting unit 306 of the content providing server device 300 in response to the transmission of the audio content to the content providing server device 300 by the audio content transmitting unit 206.
  • the audio content storage unit 209 stores the audio content received by the audio content reception unit 208.
  • the content acquisition device 200 can acquire the audio content and the tactile content in a synchronized state, and has an advantage that the content can be easily used.
  • That is, when the audio content and the haptic content are supplied from the content acquisition device 200 to the content use device 500, instead of separately using the audio content originally stored in the content acquisition device 200 and the haptic content acquired from the content providing server device 300, the audio content and the tactile content obtained together as one audio content from the content providing server device 300 can be supplied to the content use device 500 in a synchronized state.
  • FIG. 32 is a block diagram illustrating a functional configuration example of a content providing system according to a fifth configuration example.
  • components having the same functions as those shown in FIGS. 28 and 31 are denoted by the same reference numerals.
  • the content acquisition device 200 includes an instruction transmission unit 205, an audio content reception unit 208, and an audio content storage unit 209.
  • the content providing server device 300 includes a haptic content acquisition unit 11, a tactile mode designation unit 13, an instruction reception unit 303, and an audio content transmission unit 306.
  • In the fifth configuration example, the haptic content acquisition unit 11 differs from the various configurations described in the first and second embodiments, and includes an acoustic content reading unit 181 and an acoustic content storage unit 182.
  • The acoustic content storage unit 182 of the haptic content acquisition unit 11 stores acoustic contents corresponding to the tactile modes, each of which is content including audio content and haptic content.
  • The acoustic content storage unit 182 is a non-volatile storage medium such as a hard disk or a semiconductor memory, and continues to store the acoustic content unless the user explicitly gives an instruction to delete it.
  • The acoustic content referred to here is a combination of audio content and tactile content corresponding to the tactile mode according to the tactile feature amount of that audio content, that is, content in which the audio content is the first channel information and the tactile content is the second channel information.
  • The acoustic content reading unit 181 reads the acoustic content corresponding to the tactile mode specified by the tactile mode specifying unit 13 from one or more acoustic contents stored in the acoustic content storage unit 182.
  • The specification of the tactile mode by the tactile mode specifying unit 13 is performed, similarly to the first configuration example illustrated in FIG. 28, according to an instruction related to the selection of audio content transmitted from the instruction transmitting unit 205 of the content acquisition device 200 to the instruction receiving unit 303 of the content providing server device 300.
  • the tactile mode may be designated in accordance with the user's instruction related to the selection of the tactile mode transmitted from the instruction transmitting unit 205 of the content acquisition device 200 to the instruction receiving unit 303 of the content providing server device 300. All of these instructions correspond to “instructions regarding acquisition of tactile content”.
  • The audio content transmission unit 306 transmits the acoustic content acquired by the haptic content acquisition unit 11 (that is, read from the acoustic content storage unit 182 by the acoustic content reading unit 181) to the content acquisition device 200.
  • the audio content receiving unit 208 of the content acquisition device 200 receives the audio content transmitted by the audio content transmitting unit 306 of the content providing server device 300.
  • the audio content storage unit 209 stores the audio content received by the audio content reception unit 208.
  • the configuration of the haptic content acquisition unit 11 shown in FIG. 32 can be applied to the content supply device 100 shown in the first embodiment and the second embodiment.
  • For example, when the configuration of FIG. 32 is employed as the configuration of the haptic content acquisition unit 11 shown in FIGS. 2A to 2C, it is possible to omit the audio content acquisition unit 14 and the audio content supply unit 15 and to provide an audio content supply unit instead of the haptic content supply unit 12.
  • Similarly, when the configuration of FIG. 32 is adopted as the configuration of the haptic content acquisition unit 11 shown in FIGS. 25 and 26, it is possible to omit the audio content acquisition unit 14, the audio content supply unit 15, and the audio content recording unit 16, and to provide an audio content supply unit instead of the haptic content supply unit 12.
  • the configuration of FIG. 32 may be adopted as the configuration of the haptic content acquisition unit 11 shown in FIG. 27, and an audio content supply unit may be provided instead of the haptic content supply unit 12.
  • In the third embodiment described above, the configuration in which the content acquisition device 200 is connected to the content providing server device 300 and acquires and stores content from the content providing server device 300 has been described. The content acquisition device 200 may further be provided with a function of supplying the acquired content to the content use device 500.
  • FIG. 33 is a block diagram illustrating a functional configuration example of a content supply device 100 according to the fourth embodiment. In FIG. 33, components having the same functions as the components shown in FIG. 25 are denoted by the same reference numerals.
  • The fourth embodiment will be described as a modification of FIG. 25, but it can also be applied to modes different from FIG. 25 (FIGS. 2A and 2C). Further, the fourth embodiment can be applied as a modification of the configuration shown in FIG.
  • The content supply device 100 includes, as its functional components, a haptic content acquisition unit 11, a tactile mode designation unit 13A, an audio content acquisition unit 14, an audio content generation unit 18, an audio content supply unit 19, and an information providing unit 20.
  • The haptic content acquisition unit 11 acquires haptic content corresponding to the tactile mode specified by the tactile mode specifying unit 13A, or haptic content corresponding to the tactile parameter relating to the audio content selected by the user.
  • Here, the tactile content according to the tactile mode specified by the tactile mode specifying unit 13A is the tactile content of the tactile mode specified based on the analysis result (tactile feature amount) of the audio content selected by the user, or of the tactile mode selected by a user operation.
  • the tactile mode selected by the user operation may be a tactile mode having a tactile property unrelated to the tactile feature value of the audio content.
  • The "tactile content according to the tactile parameter related to the audio content selected by the user" means the tactile content obtained when the target information acquisition unit 1133' in FIG. 14C or 18C acquires the audio content selected by the user as the target information.
  • The audio content generation unit 18 has the same function as the audio content generation unit 305 shown in FIG. 31. That is, the audio content generation unit 18 generates audio content by combining the audio content acquired by the audio content acquisition unit 14 and the haptic content acquired by the haptic content acquisition unit 11. For example, the audio content generation unit 18 generates the audio content using the audio content acquired by the audio content acquisition unit 14 as the first channel information and the haptic content acquired by the haptic content acquisition unit 11 as the second channel information.
  • The audio content supply unit 19 supplies the audio content generated by the audio content generation unit 18 to the content use device 500.
  • FIG. 33 shows a configuration for supplying the audio content to the content use device 500, but the haptic content supply unit 12 and the audio content supply unit 15 may be provided similarly to FIG. 25 so that the haptic content and the audio content are supplied to the content use device 500.
  • the information providing unit 20 provides the user with information on a predetermined physical effect or psychological effect expected when the vibration of the haptic content acquired by the haptic content acquiring unit 11 is presented to the user.
  • the physical effect or the psychological effect is an effect that can be caused by the “tactile content according to the tactile parameter related to the audio content selected by the user”.
  • the tactile parameter generated from the audio content represents one element of the tactile sensation as described above.
  • the tactile content generated from the tactile content read from the tactile content storage units 1132 and 1142 based on the tactile parameter also represents a predetermined tactile sensation.
  • The tactile sensation represented by the tactile parameter is based on a tactile pair such as ⟨hard-soft⟩ or ⟨rough-smooth⟩. For this reason, for example, the physical or psychological effect expected when vibration is presented to the user based on tactile content composed of "vibration information having a hard touch" differs from that expected when vibration is presented based on tactile content composed of "vibration information having a soft touch".
  • That is, when the tactile content acquisition unit 11 generates haptic content by inputting the audio content selected by the user as the target information to the target information acquisition unit 1133', the information providing unit 20 provides the user with information on a predetermined physical effect or psychological effect expected from that haptic content.
  • The method of presenting the information is arbitrary; as one example, the information is displayed on the display of the content supply device 100.
  • Regarding what physical or psychological effect is expected from which tactile content, table information created based on the results of tests or the like performed in advance is stored, and the information providing unit 20 confirms the effect by referring to the table information.
  • FIG. 34 is a diagram illustrating an example of an information providing screen provided by the information providing unit 20 to the display of the content supply device 100.
  • On the information providing screen, the name of the audio content selected by the user, the tactile mode specified by the tactile mode specifying unit 13, and the physical effect or psychological effect (vibration effect) expected from the haptic content generated by the tactile content acquisition unit 11 using the audio content selected by the user as the target information are displayed as attribute information. That is, the name of the audio content selected by the user is "ABC", the tactile mode specified by the tactile mode specifying unit 13 is "fluffy mode", and "sleep" is displayed as the vibration effect of the tactile content generated using the audio content selected by the user as the target information.
  • In this example, the tactile mode designated by analyzing the tactile parameter of the audio content selected by the user is displayed, indicating that haptic content having a "fluffy" tactile sensation and a "sleep" vibration effect is acquired by the haptic content acquiring unit 11.
  • On the other hand, when the user designates a desired tactile mode (for example, the "smooth mode"), tactile content having the "smooth" tactile sensation and the "sleep" vibration effect is acquired by the tactile content acquisition unit 11. In this case, the tactile mode displayed on the information providing screen of FIG. 34 is the "smooth mode".
  • As described above, according to the fourth embodiment, it is possible to present to the user what physical effect or psychological effect is expected from the haptic content acquired by the haptic content acquiring unit 11.
  • If the information on the tactile mode is also presented as shown in FIG. 34, the user can grasp what tactile sensation the haptic content acquired by the haptic content acquisition unit 11 has and what physical or psychological effect is expected from it. This makes it easier for the user to obtain the desired tactile content, thereby improving user convenience.
  • the information providing unit 20 has been described as being included in the content supply device 100. However, the information providing unit 20 may be included in the content acquisition device 200.
  • In the above description, n = 2, and the two tactile parameters are the first tactile parameter relating to the intensity of the information and the second tactile parameter relating to the length of a divided section of the information.
  • However, n may be 3 or more.
  • In that case, the haptic content corresponding to a combination of n tactile parameters is stored in an n-dimensional space of three or more dimensions, instead of the two-dimensional matrix space as shown in FIG.
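As a rough sketch of this idea (not taken from the patent itself), haptic content keyed by combinations of n tactile parameters can be held in a map keyed by n-tuples, which behaves like a two-dimensional matrix lookup for n = 2 and generalizes to any n. The content identifiers and parameter values below are hypothetical:

```python
# Sketch: an n-dimensional haptic content store as a dict keyed by n-tuples
# of tactile parameters. All keys and content names here are hypothetical.
haptic_store = {
    (79, 40, 80): "content_A",
    (60, 10, 60): "content_B",
    (79, 30, 40): "content_C",
}

def read_haptic_content(store, params):
    """Return the haptic content registered for a combination of n tactile
    parameters, or None if no content is registered for that combination."""
    return store.get(tuple(params))
```

For n = 2 this degenerates to the matrix-style lookup described earlier; adding a dimension only lengthens the key tuple.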
  • The tactile parameter generation unit 1134 divides the target information acquired by the target information acquisition unit 1133 into a plurality of sections and, from each divided section, generates combinations of n (n ≥ 2) tactile parameters using at least one of the first tactile parameter relating to the intensity of the information and the second tactile parameter relating to the length of the divided section.
  • For example, when n = 3, three tactile parameters of the same type may be generated, or tactile parameters of different types may be generated in combination.
  • For example, the target information acquisition unit 1133 inputs n types of target information. The tactile parameter generation unit 1134 then divides each of the n types of input target information into a plurality of sections, specifies one first tactile parameter from each divided section of each type, and thereby generates combinations of n first tactile parameters.
  • Suppose the target information acquisition unit 1133 has input the following digital values for three types of target information: heart rate, acceleration (body movement), and blood flow.
  • Heart rate: 79, 60, 79, 75, 74, 79, 75
  • Acceleration: 40, 10, 30, 40, 35, 40, 20
  • Blood flow: 80, 60, 40, 60, 80, 60, 80
  • In this case, the tactile parameter generation unit 1134 divides each of the three types of input target information at every input value and specifies a first tactile parameter from each divided section of each type.
  • As a result, combinations of three first tactile parameters are generated as follows: {79, 40, 80}, {60, 10, 60}, {79, 30, 40}, ...
  • The tactile content reading units 1131 and 1141 use the combinations of first tactile parameters generated in this way to read the haptic content corresponding to {79, 40, 80}, the haptic content corresponding to {60, 10, 60}, the haptic content corresponding to {79, 30, 40}, and so on from the tactile content storage units 1132 and 1142.
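The combination step described above can be sketched as follows, assuming one first tactile parameter (intensity value) per divided section and sections aligned across the three types of target information. This is an illustrative sketch, not the patent's implementation:

```python
# Sketch: each type of target information is divided per input value, and one
# intensity value (first tactile parameter) per section is combined across
# the n types into n-tuples.
heart_rate   = [79, 60, 79, 75, 74, 79, 75]
acceleration = [40, 10, 30, 40, 35, 40, 20]
blood_flow   = [80, 60, 40, 60, 80, 60, 80]

def first_parameter_combinations(*series):
    """Combine one first tactile parameter from each aligned section of each
    type of target information into a tuple per section."""
    return [tuple(values) for values in zip(*series)]

combos = first_parameter_combinations(heart_rate, acceleration, blood_flow)
# combos begins with (79, 40, 80), (60, 10, 60), (79, 30, 40), ...
```

Each resulting tuple is the key used to read the corresponding haptic content from storage.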
  • As another example, the target information acquisition unit 1133 acquires n types of target information. The tactile parameter generation unit 1134 then divides each of the acquired n types of target information into a plurality of sections, specifies one second tactile parameter from each divided section of each type, and thereby generates combinations of n second tactile parameters.
  • Suppose the target information acquisition unit 1133 has input the same digital values as above for the three types of information: heart rate, acceleration, and blood flow.
  • In this case, the tactile parameter generation unit 1134 divides each of the three types of target information into sections, each of which ends when the same value as its first value is input again, and specifies the second tactile parameter (the length of each divided section) for each type, thereby generating combinations of three second tactile parameters.
  • For the heart rate, the same value as the first "79" appears two values later, and the next occurrence appears three values after that, so the lengths of the divided sections are 2 and 3.
  • For the acceleration, the same value as the first "40" appears three values later, and the next occurrence appears two values after that, so the lengths of the divided sections are 3 and 2.
  • For the blood flow, the same value as the first "80" appears four values later, and the next occurrence appears two values after that, so the lengths of the divided sections are 4 and 2.
  • The tactile parameter generation unit 1134 therefore generates the combinations of second tactile parameters {2, 3, 4} and {3, 2, 2}. The tactile content reading units 1131 and 1141 use the combinations of second tactile parameters generated in this way to read the haptic content corresponding to {2, 3, 4} and the haptic content corresponding to {3, 2, 2} from the tactile content storage units 1132 and 1142.
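The section-division rule described above (a section ends where its first value reappears) can be sketched as follows; this is an illustrative reading of the example, not the patent's implementation:

```python
# Sketch: second tactile parameters are the lengths of sections that end
# where the section's first value reappears in the input stream.
def section_lengths(values):
    """Lengths of sections delimited by reappearance of each section's first value."""
    lengths = []
    start = 0
    for i in range(1, len(values)):
        if values[i] == values[start]:
            lengths.append(i - start)  # distance until the same value recurs
            start = i                  # the recurrence starts the next section
    return lengths

heart_rate   = [79, 60, 79, 75, 74, 79, 75]
acceleration = [40, 10, 30, 40, 35, 40, 20]
blood_flow   = [80, 60, 40, 60, 80, 60, 80]

combos = list(zip(section_lengths(heart_rate),
                  section_lengths(acceleration),
                  section_lengths(blood_flow)))
# combos -> [(2, 3, 4), (3, 2, 2)]
```

The resulting tuples {2, 3, 4} and {3, 2, 2} match the combinations given in the example.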
  • Alternatively, one type of target information may be divided into a plurality of sections, and a plurality of tactile parameters relating to intensity may be generated from each divided section (for example, the maximum value and the minimum value of the intensity within the section may both be used as tactile parameters).
  • The target information may also be divided into a plurality of sections according to different methods, and the lengths of the divided sections obtained by each method may be used as tactile parameters.
  • Furthermore, tactile parameters relating to intensity and tactile parameters relating to the length of the divided section may be generated in combination.
  • A difference value of the tactile parameters may also be calculated and used as a tactile difference parameter. That is, the tactile difference parameter may be first-order difference information of the tactile parameters, or m-th order (m ≥ 2) difference information.
  • In the above description, an example was given in which arithmetic operation values (values obtained by one of the four basic arithmetic operations, e.g., hi/ti) are calculated using the n tactile parameters for each of the plurality of divided sections, and the length of the run of sections in which the same arithmetic operation value appears is used as the first tactile feature quantity P1. However, the present invention is not limited to this.
  • For example, an arithmetic operation value using the n tactile parameters may be calculated for each divided section, and the average value or a representative value (such as the maximum value, the minimum value, or the median value) of the m-th order (m ≥ 1) difference values of those arithmetic operation values may be used as the first tactile feature quantity P1.
  • Alternatively, arithmetic operation values (hi/ti) may be calculated for the divided sections, and the first-order difference values {(h2/t2 - h1/t1), (h3/t3 - h2/t2), ...} may be used as the first tactile feature quantity P1.
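A minimal sketch of this alternative, assuming hi denotes the intensity parameter and ti the section-length parameter of divided section i, and using division as the arithmetic operation (the text allows other operations and representative values as well). The data values below are hypothetical:

```python
# Sketch: arithmetic operation values hi/ti per divided section, followed by
# their first-order differences, usable as the feature quantity P1.
def operation_values(h, t):
    """One arithmetic operation value (here division hi/ti) per section."""
    return [hi / ti for hi, ti in zip(h, t)]

def first_order_differences(values):
    """{(h2/t2 - h1/t1), (h3/t3 - h2/t2), ...}"""
    return [b - a for a, b in zip(values, values[1:])]

h = [10, 6, 12, 12]   # hypothetical intensity parameters per section
t = [2, 3, 4, 2]      # hypothetical section-length parameters per section
v = operation_values(h, t)       # [5.0, 2.0, 3.0, 6.0]
d1 = first_order_differences(v)  # [-3.0, 1.0, 3.0]
```

Applying `first_order_differences` repeatedly yields the m-th order difference values mentioned elsewhere in the text.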
  • In the above description, the variance of the plurality of arithmetic operation values (hi/ti) calculated for the plurality of divided sections was given as an example of the diversity (second tactile feature quantity P2), but the present invention is not limited to this.
  • For example, the range from the minimum value to the maximum value of the plurality of arithmetic operation values calculated for the divided sections, or the set of that range and the median value, may be calculated as the second tactile feature quantity P2.
  • The information amount of the plurality of arithmetic operation values calculated for the divided sections may also be calculated as the second tactile feature quantity P2.
  • An example of the information amount is the entropy H (Shannon's average information amount), calculated by the following equation, where p(e_i) denotes the occurrence probability of each arithmetic operation value:
  • H = -Σ_i p(e_i) log p(e_i)
  • For example, suppose the plurality of arithmetic operation values (hi/ti) calculated for the plurality of divided sections are as follows: 20, 18, 1, 9, 11, 1, 1, 1, 38, 38, 1, 16.
  • In this case, the variance is 191.8, the range is 37, the median is 16, and the entropy is 0.48.
  • The tactile feature quantity calculation unit 132 can use the variance, the set of the range and the median, or the entropy calculated in this way as the second tactile feature quantity P2 representing the diversity of the arithmetic operation values.
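The diversity measures named above can be sketched as follows. The patent does not state the variance convention (population vs. sample), the median convention, or the logarithm base of the entropy, so the numbers it quotes (191.8, 16, 0.48) may follow conventions different from this sketch; the range (37) is convention-independent:

```python
# Sketch of the diversity measures (range, variance, entropy) applied to the
# example arithmetic operation values. Conventions here are assumptions.
import math
import statistics
from collections import Counter

values = [20, 18, 1, 9, 11, 1, 1, 1, 38, 38, 1, 16]

def value_range(xs):
    # range = maximum - minimum
    return max(xs) - min(xs)

def entropy(xs):
    # Shannon average information amount: H = -sum p(e_i) * log p(e_i)
    # (base-2 logarithm assumed; the source does not state the base)
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

r = value_range(values)             # 37, matching the range quoted in the text
var = statistics.pvariance(values)  # population variance under this convention
h = entropy(values)                 # entropy in bits under this convention
```

Any one of these (or the pair of range and median) can serve as P2; they differ only in how strongly they weight extreme versus repeated values.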
  • When the set of the range and the median is used as the second tactile feature quantity P2, P2 consists of two values; combined with the first tactile feature quantity P1, there are three values in total. In this case, the tactile mode may be specified based on the three tactile feature quantities. Alternatively, the range and the median may be combined into a single value by an arithmetic operation, and that single value used as P2.
  • The tactile feature quantity calculation unit 132 may further calculate the m-th order (m ≥ 1) difference values of the plurality of arithmetic operation values calculated for the divided sections and use those difference values in place of the arithmetic operation values described above. That is, the length of the run of sections in which the same m-th order difference value appears may be calculated as the first tactile feature quantity P1, and the diversity of the m-th order difference values may be calculated as the second tactile feature quantity P2.
  • For example, suppose the plurality of arithmetic operation values (hi/ti) calculated for the plurality of divided sections are as follows: 20, 18, 1, 9, 11, 1, 1, 1, 38, 38, 1, 16.
  • The tactile feature quantity calculation unit 132 may use the diversity of the plurality of m-th order difference values calculated from these as the second tactile feature quantity P2.
  • When the diversity of the m-th order difference values is calculated as the second tactile feature quantity P2, the variance, the range, the set of the range and the median, or the entropy of the plurality of m-th order difference values can be used as the diversity.
  • The reason for using the m-th order difference values is that the manner in which the arithmetic operation values of the tactile parameters change affects the tactile quality. For example, when the m-th order difference values include a large value, there is a large change in texture, and the texture feels hard. Conversely, when the m-th order difference values contain only small values, there is little change in texture, and the texture feels soft.
  • In the above description, a combination of two types of tactile feature quantities is calculated, but a combination of three or more types of tactile feature quantities may be calculated.
  • As the third type of tactile feature quantity, for example, the average value (or the minimum value or maximum value) of the lengths of the divided sections can be used. The average length of the divided sections can be said to indicate the tempo of the tactile sensation latent in the target information.
  • Alternatively, the tactile feature quantity calculation unit 132 may further calculate the m-th order (m ≥ 1) difference values of the plurality of arithmetic operation values calculated for the divided sections and calculate the diversity of those m-th order difference values as a third tactile feature quantity. That is, the length of the run of sections in which the same arithmetic operation value appears may be used as the first tactile feature quantity, the diversity of the arithmetic operation values as the second tactile feature quantity, and the diversity of the m-th order difference values as the third tactile feature quantity.
  • In this case as well, the variance, the range, the set of the range and the median, or the entropy of the plurality of m-th order difference values can be used as the diversity.
  • For example, suppose the arithmetic operation values (hi/ti) calculated for the divided sections of two pieces of target information 1 and 2 are as follows.
  • Target information 1: 1, 2, 100, 1, 2, 100
  • Target information 2: 1, 2, 3, 1, 2, 3
  • In both cases, P1 (the number of sections until the same arithmetic operation value appears) is the same (3), so the two cannot be distinguished by P1 alone. The third tactile feature quantities P3 in target information 1 and 2 are respectively as follows.
  • Target information 1: P3 = 98 (absolute values of the first-order difference values: 1, 98, 99, 1, 98)
  • Target information 2: P3 = 1 (absolute values of the first-order difference values: 1, 1, 2, 1, 1)
  • In this way, even when the first tactile feature quantities are the same, adding the third tactile feature quantity makes it possible to produce a difference in the feature quantities.
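In the example above, P3 equals the range of the absolute first-order difference values (99 - 1 = 98 and 2 - 1 = 1), which can be sketched as follows. Treating P3 as this range is an interpretation of the example, not an explicit definition in the text:

```python
# Sketch: third tactile feature quantity P3 as the range of the absolute
# m-th order difference values of the arithmetic operation values.
def third_feature(values, m=1):
    diffs = values
    for _ in range(m):  # take absolute differences m times
        diffs = [abs(b - a) for a, b in zip(diffs, diffs[1:])]
    return max(diffs) - min(diffs)

target_info_1 = [1, 2, 100, 1, 2, 100]
target_info_2 = [1, 2, 3, 1, 2, 3]
# third_feature(target_info_1) -> 98 (|diffs| = 1, 98, 99, 1, 98)
# third_feature(target_info_2) -> 1  (|diffs| = 1, 1, 2, 1, 1)
```

Although both sequences repeat with the same period (so P1 is identical), P3 separates the jagged sequence from the smooth one.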
  • In the above description, an example was given in which the tactile feature quantity calculation unit 132 calculates an arithmetic operation value using the n tactile parameters for each divided section and calculates the tactile feature quantities from those arithmetic operation values; however, the present invention is not limited to this. For example, the tactile feature quantity calculation unit 132 may calculate, as the first tactile feature quantity, the length of the run of sections in which the same combination of the first tactile parameter and the second tactile parameter appears, and may calculate, as the second tactile feature quantity, the diversity of the combinations of the first tactile parameter and the second tactile parameter.
  • This applies when the tactile feature quantity calculation unit 132 calculates the tactile feature quantities based on two or more sets of tactile parameters generated by the tactile parameter generation unit 131.
  • The various modifications described above can be applied similarly when the tactile feature quantity calculation unit 132′ calculates the tactile feature quantities based on two or more sets of tactile difference parameters generated by the tactile difference parameter generation unit 134.
  • Each of the first to fourth embodiments described above is merely an example of a specific embodiment for carrying out the present invention, and the technical scope of the present invention is not to be interpreted restrictively on that basis. That is, the present invention can be implemented in various forms without departing from its gist or main features.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention comprises: a tactile-quality mode designation unit 13 that designates any one of a plurality of tactile-quality modes that differ in the quality of sensation relating to the sense of touch; a tactile content acquisition unit 11 that acquires tactile content, consisting of vibration information generated so as to make a user imagine a prescribed tactile sensation, in accordance with the tactile-quality mode designated by the tactile-quality mode designation unit 13; and a tactile content supply unit 12 that supplies the acquired tactile content to a content utilization device. By appropriately supplying to the user, in accordance with the tactile-quality mode, the tactile content that produces a particular tactile sensation by vibration, from among a plurality of types of tactile content that can be generated so as to evoke different tactile sensations, the present invention provides an environment in which the user can easily use tactile content.
PCT/JP2018/033125 2018-09-07 2018-09-07 Dispositif de fourniture de contenu, système de fourniture de contenu, dispositif serveur de fourniture de contenu, dispositif d'acquisition de contenu, procédé de fourniture de contenu et programme de fourniture de contenu Ceased WO2020049705A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/033125 WO2020049705A1 (fr) 2018-09-07 2018-09-07 Dispositif de fourniture de contenu, système de fourniture de contenu, dispositif serveur de fourniture de contenu, dispositif d'acquisition de contenu, procédé de fourniture de contenu et programme de fourniture de contenu
JP2019557643A JP6644293B1 (ja) 2018-09-07 2018-09-07 コンテンツ供給装置、コンテンツ提供システム、コンテンツ提供サーバ装置、コンテンツ提供方法およびコンテンツ提供用プログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/033125 WO2020049705A1 (fr) 2018-09-07 2018-09-07 Dispositif de fourniture de contenu, système de fourniture de contenu, dispositif serveur de fourniture de contenu, dispositif d'acquisition de contenu, procédé de fourniture de contenu et programme de fourniture de contenu

Publications (1)

Publication Number Publication Date
WO2020049705A1 true WO2020049705A1 (fr) 2020-03-12

Family

ID=69412175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/033125 Ceased WO2020049705A1 (fr) 2018-09-07 2018-09-07 Dispositif de fourniture de contenu, système de fourniture de contenu, dispositif serveur de fourniture de contenu, dispositif d'acquisition de contenu, procédé de fourniture de contenu et programme de fourniture de contenu

Country Status (2)

Country Link
JP (1) JP6644293B1 (fr)
WO (1) WO2020049705A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116015154A (zh) * 2022-12-30 2023-04-25 歌尔股份有限公司 马达的振动控制方法、装置、终端设备及计算机介质

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015053054A (ja) * 2013-09-06 2015-03-19 イマージョン コーポレーションImmersion Corporation 音声信号に関係付けられる触覚効果を生成するためのシステム及び方法
JP2015053049A (ja) * 2013-09-06 2015-03-19 イマージョン コーポレーションImmersion Corporation スペクトログラムの視覚処理をして触覚効果を生成するためのシステム及び方法
WO2018139150A1 (fr) * 2017-01-26 2018-08-02 株式会社ファセテラピー Machine de massage à vibration de type masque

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6401758B2 (ja) * 2016-08-25 2018-10-10 株式会社ファセテラピー 触質コンテンツ生成装置、触質コンテンツ生成方法および触質コンテンツ利用機器
JP6383765B2 (ja) * 2016-08-25 2018-08-29 株式会社ファセテラピー 触覚コンテンツ生成装置、触覚コンテンツ生成方法および触覚コンテンツ利用機器

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015053054A (ja) * 2013-09-06 2015-03-19 イマージョン コーポレーションImmersion Corporation 音声信号に関係付けられる触覚効果を生成するためのシステム及び方法
JP2015053049A (ja) * 2013-09-06 2015-03-19 イマージョン コーポレーションImmersion Corporation スペクトログラムの視覚処理をして触覚効果を生成するためのシステム及び方法
WO2018139150A1 (fr) * 2017-01-26 2018-08-02 株式会社ファセテラピー Machine de massage à vibration de type masque

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116015154A (zh) * 2022-12-30 2023-04-25 歌尔股份有限公司 马达的振动控制方法、装置、终端设备及计算机介质

Also Published As

Publication number Publication date
JPWO2020049705A1 (ja) 2020-09-24
JP6644293B1 (ja) 2020-02-12

Similar Documents

Publication Publication Date Title
US10162416B2 (en) Dynamic haptic conversion system
US10410392B2 (en) Data structure for computer graphics, information processing device, information processing method and information processing system
US10049483B2 (en) Apparatus and method for generating animation
JP7741729B2 (ja) 映像生成装置
TW201523509A (zh) 律動影像化方法、系統以及電腦可讀取記錄媒體
KR20170024374A (ko) 공연상황 상호작용 및 관객참여형 무대영상 연출장치 및 그 연출방법
JPWO2018139150A1 (ja) マスク型振動マッサージ装置
US11120633B2 (en) Interactive virtual reality system for experiencing sound
Ujitoko et al. GAN-based fine-tuning of vibrotactile signals to render material surfaces
JP6644293B1 (ja) コンテンツ供給装置、コンテンツ提供システム、コンテンツ提供サーバ装置、コンテンツ提供方法およびコンテンツ提供用プログラム
WO2020158036A1 (fr) Dispositif de traitement d'informations
US20030110026A1 (en) Systems and methods for communicating through computer animated images
JP6401758B2 (ja) 触質コンテンツ生成装置、触質コンテンツ生成方法および触質コンテンツ利用機器
JP6322780B1 (ja) 触覚コンテンツ生成装置、音響コンテンツ生成装置、音響再生装置、触覚コンテンツ生成方法および音響コンテンツ生成方法
Remache-Vinueza et al. Phantom sensation: Threshold and quality indicators of a tactile illusion of motion
JP6660637B2 (ja) 触質情報処理装置および触質情報処理方法
JP2020078446A (ja) バイブレーション装置
Kim et al. Perceptually motivated automatic dance motion generation for music
JP6383765B2 (ja) 触覚コンテンツ生成装置、触覚コンテンツ生成方法および触覚コンテンツ利用機器
JP2005071256A (ja) 画像表示装置及び画像表示方法
JP4580812B2 (ja) 映像生成方法、スタンドアロン型映像再生装置及びネットワーク配信型映像再生システム
KR102710310B1 (ko) 사용자가 원하는 의상 스킨이 적용된 3d 아바타를 개인 맞춤형으로 생성할 수 있는 전자 장치 및 그 동작 방법
JP5583066B2 (ja) 映像コンテンツ生成装置およびコンピュータプログラム
JP3521635B2 (ja) 映像生成方法および映像生成システム
JP2016080908A (ja) 信号加工装置

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019557643

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18932730

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18932730

Country of ref document: EP

Kind code of ref document: A1