WO2010070882A1 - Information display device and information display method - Google Patents
Information display device and information display method
- Publication number
- WO2010070882A1 (PCT/JP2009/006884)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- content
- interest
- degree
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42201—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440245—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/222—Secondary servers, e.g. proxy server, cable television Head-end
- H04N21/2223—Secondary servers, e.g. proxy server, cable television Head-end being a public access point, e.g. for downloading to or uploading from clients
Definitions
- the present invention relates to an information display device that displays content, and more particularly to an information display device that performs display control based on a user's state.
- an information display device that displays information in stages in response to a request from a user accepted by an input device has been proposed (see, for example, Patent Document 1).
- An information display device that presents content while omitting information that cannot be recognized at the current distance between the user and the information display device has also been proposed (see, for example, Patent Document 2).
- An information display device that displays content information during content reproduction at a more appropriate timing in accordance with the user's viewing state has also been proposed (see, for example, Patent Document 3).
- The information display device described in Patent Document 3 generates an interest level according to how long the user has been watching the content, and recommends the reproduction of other content when the generated interest level decreases.
- The information display devices described in Patent Documents 1 to 3 thus enable a user to acquire information efficiently.
- However, although the conventional information display devices can improve the user's efficiency in acquiring information, they cannot be expected to attract the user to the displayed content. In other words, the conventional information display devices cannot efficiently increase the user's interest.
- the present invention solves the above-described conventional problems, and an object thereof is to provide an information display device and the like that can efficiently increase the user's interest in displayed content.
- An information display device according to one aspect of the present invention includes: a display unit that displays first content on a screen; a user state detection unit that detects a user state, which is a physical state of a user located in front of the screen; an interest level estimation unit that estimates, based on the user state detected by the user state detection unit, an interest level indicating the degree of the user's interest in the first content displayed on the screen by the display unit; and a display control unit that, when the magnitude of the rate of change of the interest level estimated by the interest level estimation unit is less than a first threshold, causes the display unit to display second content in a state in which the definition or disclosure level of at least a part of the second content is lower than a predetermined definition or disclosure level.
- With this configuration, content can be displayed in a state in which the definition or disclosure level of at least a part of the content is lower than a predetermined definition or disclosure level according to the user state, and the user's interest can be increased efficiently by the teasing effect.
- Here, the teasing effect refers to the effect of attracting the user's attention by not revealing to the user the information indicated by at least a part of the content.
- In other words, the teasing effect raises the user's interest by teasing the user.
- Preferably, the display control unit further increases the definition or disclosure level of the second content displayed on the screen when the magnitude of the rate of change of the interest level estimated by the interest level estimation unit exceeds a second threshold.
- With this configuration, the definition or disclosure level of content whose definition or disclosure level has been lowered can be dynamically increased according to the magnitude of the rate of change of the user's interest level, so that the information indicated by the content can be disclosed to the user while the user's interest is rising.
- As a result, the user's interest can be increased more efficiently, and the information indicated by the content can be more strongly impressed on the user.
- Preferably, the display control unit further increases the definition or disclosure level of the second content displayed on the screen when the interest level estimated by the interest level estimation unit exceeds a third threshold.
- With this configuration, the definition or disclosure level of content whose definition or disclosure level has been lowered can be dynamically increased according to the user's interest level, so that the information indicated by the content can be disclosed to the user when the user's interest is high.
- As a result, the user's interest can be increased more efficiently, and the information indicated by the content can be more strongly impressed on the user.
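- The threshold logic described above (lower the definition or disclosure level while interest stagnates; raise it when interest changes quickly or is already high) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the function name, return values, and concrete threshold values are assumptions for illustration.

```python
def display_action(change_rate, interest, t1, t2, t3):
    """Decide how to adjust the displayed content's definition or
    disclosure level, following the three thresholds described above.

    change_rate: magnitude of the rate of change of the interest level, a(t)
    interest:    current interest level, k(t)
    t1, t2, t3:  first, second, and third thresholds (values assumed)
    """
    # Interest is stagnating: lower definition/disclosure to tease the user.
    if change_rate < t1:
        return "lower"
    # Interest is rising quickly, or is already high: reveal the content.
    if change_rate > t2 or interest > t3:
        return "raise"
    # Otherwise leave the display as it is.
    return "keep"

# With assumed thresholds t1=0.2, t2=0.8, t3=0.9:
assert display_action(0.1, 0.5, 0.2, 0.8, 0.9) == "lower"
assert display_action(0.9, 0.5, 0.2, 0.8, 0.9) == "raise"
```

- In this reading, the first threshold triggers the teasing display, while the second and third thresholds trigger disclosure.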
- Preferably, the display control unit causes the display unit to display the second content when the magnitude of the rate of change of the interest level estimated by the interest level estimation unit remains less than the first threshold for a predetermined time.
- Preferably, the user state detection unit detects the user's movement direction as the user state, and the interest level estimation unit estimates the interest level to be higher as the movement direction detected by the user state detection unit is closer to the direction from the user toward a position representing the screen or toward the position where the first content is displayed.
- Preferably, the user state detection unit detects the user's movement speed as the user state, and the interest level estimation unit estimates the interest level to be higher as the movement speed detected by the user state detection unit is lower.
- Preferably, the user state detection unit detects the user's position as the user state, and the interest level estimation unit estimates the interest level to be higher as the distance between the user position detected by the user state detection unit and the position where the first content is displayed is smaller.
- Preferably, the user state detection unit detects the user's line-of-sight direction as the user state, and the interest level estimation unit estimates the interest level to be higher as the line-of-sight direction detected by the user state detection unit is closer to the direction from the user toward a position representing the screen or toward the position where the first content is displayed.
- Preferably, the information display device further includes a target user selection unit that selects a target user from among the users positioned in front of the screen, and the display control unit changes the definition or disclosure level of the content according to the rate of change of the interest level of the target user selected by the target user selection unit.
- With this configuration, the definition or disclosure level of the content can be changed according to the selected user's state, and the selected user's interest can be increased efficiently.
- Preferably, the target user selection unit selects the target user in accordance with the interest level.
- Preferably, the target user selection unit selects a user as the target user when the distance between the position where the first content is displayed and the position of that user is less than a fifth threshold.
- Preferably, the information display device further includes a content database storing at least an application area indicating the position of a portion of the second content that shows the essential information, that is, the information intended to be shown to the user.
- Preferably, the display control unit causes the display unit to display the second content in a state in which the definition or disclosure level of the application area of the second content stored in the content database is lower than a predetermined definition or disclosure level.
- Preferably, the predetermined definition or disclosure level is the definition or disclosure level of the first content, and the display control unit causes the display unit to display the first content as the second content.
- An information display method according to one aspect of the present invention includes: a first display step of displaying first content on a screen; a user state detection step of detecting a user state, which is a physical state of a user located in front of the screen; an interest level estimation step of estimating an interest level based on the detected user state; and a second display step of displaying, when the magnitude of the rate of change of the interest level estimated in the interest level estimation step is less than a first threshold, second content in a state in which the definition or disclosure level of at least a part of the second content is lower than a predetermined definition or disclosure level.
- the present invention can also be realized as a program for causing a computer to execute such an information display method.
- Such a program can be distributed via a computer-readable recording medium such as a CD-ROM (Compact Disc Read-Only Memory) or via a transmission medium such as the Internet.
- According to the present invention, content can be displayed in a state in which the definition or disclosure level of at least a part of the content is lower than a predetermined definition or disclosure level, according to the rate of change of the user's interest level in the displayed content.
- As a result, the user's attention can be drawn by the teasing effect, so that the user's interest can be increased efficiently.
- FIG. 1 is an external view of an information display device according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the information display apparatus according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram for explaining the definition.
- FIG. 4 is a diagram for explaining the degree of disclosure.
- FIG. 5 is a flowchart showing a flow of processing executed by the information display device according to Embodiment 1 of the present invention.
- FIG. 6 is a flowchart showing a flow of processing executed by the information display device in the first modification of the first embodiment of the present invention.
- FIG. 7 is a flowchart showing a flow of processing executed by the information display device in the second modification of the first embodiment of the present invention.
- FIG. 8 is a block diagram showing a functional configuration of the information display apparatus according to Embodiment 2 of the present invention.
- FIG. 9A is a diagram for describing a user position calculation method according to Embodiment 2 of the present invention.
- FIG. 9B is a diagram for describing a user position calculation method according to Embodiment 2 of the present invention.
- FIG. 10 is a diagram showing an example of a content database according to Embodiment 2 of the present invention.
- FIG. 11 is a diagram for explaining the application area.
- FIG. 12 is a flowchart showing a flow of processing performed by the information display device in Embodiment 2 of the present invention.
- FIG. 13 is a diagram for explaining the interest level calculation method according to Embodiment 2 of the present invention.
- FIG. 14A is a diagram for describing a method of selecting a target user in the second embodiment of the present invention.
- FIG. 14B is a diagram for explaining a method of selecting a target user in Embodiment 2 of the present invention.
- FIG. 15 is a diagram for explaining the process of calculating the rate of change of the interest level according to Embodiment 2 of the present invention.
- FIG. 16 is a diagram showing a specific example of the operation of the information display device in Embodiment 2 of the present invention.
- FIG. 17A is a diagram showing another example of a content display method according to Embodiment 2 of the present invention.
- FIG. 17B is a diagram showing another example of a content display method according to Embodiment 2 of the present invention.
- FIG. 18 is a flowchart showing a flow of processing relating to detection of the line-of-sight direction in the modification of the second embodiment of the present invention.
- FIG. 19 is a diagram for explaining processing for detecting the orientation of the user's face in the modification of the second embodiment of the present invention.
- FIG. 20 is a diagram for describing the reference direction reference plane.
- FIG. 21A is a diagram for describing detection of the center of a black eye.
- FIG. 21B is a diagram for describing detection of the center of the black eye.
- The information display device 10 is a device that displays content on a screen, and is characterized by changing the definition or disclosure level of the content displayed on the screen (hereinafter simply referred to as "display content") according to the user's interest level in that content.
- FIG. 1 is an external view of an information display device according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the information display apparatus according to Embodiment 1 of the present invention.
- the information display device 10 includes a user state detection unit 11, an interest level estimation unit 12, a display control unit 13, and a display unit 14.
- the user state detection unit 11 detects a user state that is a physical state of a user located in front of the screen.
- The user state detection unit 11 detects, as the user state, for example, a biological signal such as an electroencephalogram, or the user's position, movement speed, face or body orientation, line-of-sight direction, facial expression, utterance content, amount of utterance, or voice volume.
- the interest level estimation unit 12 estimates the interest level indicating the degree of interest of the user with respect to the content displayed on the screen by the display unit 14 based on the user status detected by the user status detection unit 11.
- the degree-of-interest estimation unit 12 determines that the detected moving direction of the user is closer to the direction from the user to the position representing the screen or the direction from the user to the position where the content is displayed. Estimate that the degree of interest is high. That is, the interest level estimation unit 12 estimates that the interest level is high if the user is moving toward the screen or the display content.
- the position representing the screen is a position on the screen, for example, the center position of the screen (hereinafter simply referred to as “the center of the screen”) or the position of the center of gravity.
- the position where the content is displayed is a position on the screen where the content is displayed, and is a position representing the content.
- the position where the content is displayed is the center position of the area where the content is displayed (hereinafter simply referred to as “the center of the content”) or the center of gravity position.
- the position where the content is displayed may be the center position or the center of gravity position of the region where the image (for example, the image of the advertisement target product) indicated by a part of the content is displayed.
- the interest level estimation unit 12 estimates that the degree of interest is higher as the detected moving speed of the user is lower. That is, if the user moves slowly in front of the screen, it is estimated that the degree of interest is high.
- the interest level estimation unit 12 estimates that the degree of interest is higher as the distance between the detected user position and the position where the content is displayed is smaller.
- the degree of interest estimation unit 12 indicates the degree of interest as the detected line-of-sight direction of the user is closer to the direction from the user to the position representing the screen or the direction from the user to the position where the content is displayed. Is estimated to be high. That is, the interest level estimation unit 12 estimates that the interest level is high if the user is looking at the direction of the screen or display content.
- the interest level estimation unit 12 may estimate the interest level based on the detected biological signal, facial expression, utterance content, utterance volume, or voice volume information.
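- The cues above (movement direction toward the content, lower movement speed, smaller distance, and gaze direction toward the content each raising the estimated interest) could be combined into a single score as sketched below. The patent gives no formula for this combination; the equal weighting, the normalisation constants, and the function name are all assumptions for illustration.

```python
import math

def estimate_interest(user_pos, velocity, gaze_dir, content_pos,
                      max_speed=2.0, max_dist=10.0):
    """Illustrative interest-level score in [0, 1] combining the cues
    attributed to the interest level estimation unit 12. Positions and
    vectors are 2-D (x, z) coordinates on the floor in front of the screen;
    all weights and constants are assumed, not taken from the patent."""
    to_content = (content_pos[0] - user_pos[0], content_pos[1] - user_pos[1])
    dist = math.hypot(*to_content)

    # Distance cue: closer to the displayed content -> higher interest.
    dist_score = max(0.0, 1.0 - dist / max_dist)

    # Speed cue: slower movement -> higher interest.
    speed = math.hypot(*velocity)
    speed_score = max(0.0, 1.0 - speed / max_speed)

    # Direction cues: movement/gaze aligned toward the content -> higher.
    def alignment(vec):
        norm = math.hypot(*vec)
        if norm == 0 or dist == 0:
            return 1.0  # stationary user, or user at the content position
        cos = (vec[0] * to_content[0] + vec[1] * to_content[1]) / (norm * dist)
        return (cos + 1.0) / 2.0  # map cosine from [-1, 1] to [0, 1]

    move_score = alignment(velocity)
    gaze_score = alignment(gaze_dir)

    # Equal weighting is an arbitrary choice for illustration.
    return (dist_score + speed_score + move_score + gaze_score) / 4.0

# A user walking slowly toward the content scores higher than one
# walking away from it quickly while looking elsewhere.
toward = estimate_interest((0.0, 5.0), (0.0, -0.5), (0.0, -1.0), (0.0, 0.0))
away = estimate_interest((0.0, 5.0), (0.0, 1.5), (0.0, 1.0), (0.0, 0.0))
assert toward > away
```

- Any monotonic combination of these cues would satisfy the relationships stated above; the averaging here is merely one simple choice.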
- When the magnitude of the rate of change of the interest level estimated by the interest level estimation unit 12 is less than the first threshold, the display control unit 13 causes the display unit 14 to display the content in a state in which the definition or disclosure level of at least a part of the content is lower than a predetermined definition or disclosure level.
- The display control unit 13 calculates the change rate magnitude a(t), which indicates the degree of temporal change of the interest level, according to Equation (1), where k(t) is the interest level at time t. The display control unit 13 then determines whether the calculated magnitude a(t) is less than the first threshold.
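- Equation (1) itself is not reproduced in this text. A natural reading, used in the sketch below purely as an assumption, is that a(t) is the absolute discrete derivative of k(t) over one sampling interval:

```python
def change_rate_magnitude(k, t, dt=1):
    """Magnitude a(t) of the temporal change of the interest level.

    The patent's Equation (1) is not reproduced in the text; this absolute
    discrete derivative of the interest-level sequence k over the sampling
    interval dt is an assumed reading, not the patent's exact formula."""
    return abs(k[t] - k[t - dt]) / dt

# Example: the user's interest level rises and then plateaus.
k = [0.20, 0.50, 0.70, 0.72, 0.73]
first_threshold = 0.05
# While interest is still changing quickly, nothing is hidden...
assert change_rate_magnitude(k, 2) >= first_threshold
# ...but once the change stagnates, the teasing display is triggered.
assert change_rate_magnitude(k, 4) < first_threshold
```

- Under this reading, a small a(t) signals that the user's interest has stopped changing, which is exactly the condition for lowering the definition or disclosure level.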
- When a(t) is less than the first threshold, the display control unit 13 causes the display unit 14 to display the content in a state in which the definition or disclosure level of part or all of the content is lowered. That is, the display control unit 13 decreases the definition or disclosure level of at least a part of the content displayed on the screen, for example, for a predetermined time.
- sharpness indicates the sharpness of content. That is, the sharpness indicates the sharpness of information indicated by the content. For example, when a video effect such as a blur effect or a mosaic effect is applied to the content, the sharpness of the content decreases.
- the disclosure level indicates the level of content disclosure. That is, the degree of disclosure indicates the degree to which information indicated by content is disclosed. For example, if an image different from the image included in the content is arranged so as to overlap a part or all of the image included in the content, the degree of disclosure of the content decreases.
- the first threshold value is a value indicating that the change in the degree of interest of the user with respect to the content is small due to, for example, a decrease in the interest of the user with respect to the display content.
- The display unit 14 has a screen such as a plasma display panel (PDP) or a liquid crystal panel, and displays content on the screen.
- FIG. 3 is a diagram for explaining the definition.
- FIG. 4 is a diagram for explaining the degree of disclosure. In FIGS. 3 and 4, an image advertising the product "TV" with the brand name "ABCD" sold by the seller "XXX Corporation" is used as an example of content displayed on the screen.
- FIG. 3 (a) shows the content before the sharpness is lowered.
- FIG. 3B shows content that has undergone mosaic processing in the area showing the appearance of the product, which is a part of the content. That is, FIG. 3B shows the content displayed with a lower definition than the content in FIG. 3 (a).
- the display control unit 13 displays on the screen a content in which a video effect such as a blurring effect or a mosaic effect is applied to a part of the content, thereby reducing the sharpness of a part of the content.
- FIG. 4A shows the content before the disclosure level is lowered.
- FIG. 4B shows content in which an image different from the product is superimposed on the area showing the appearance of the product, which is a part of the content. That is, FIG. 4B shows the content displayed with a lower disclosure level than the content in FIG. 4A.
- the display control unit 13 reduces the disclosure level of a part of the content by displaying an image different from the original image in a partial area of the content.
- the display control unit 13 may reduce the degree of disclosure of a part of the content by morphing that naturally deforms the original image into an image different from the image.
- the display control unit 13 may decrease the definition or disclosure level for the entire content.
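- The mosaic effect mentioned above can be illustrated with a minimal sketch: each block of pixels is replaced by its average, which hides detail while preserving the overall shape. This stand-in operates on a grayscale image represented as a list of rows; real implementations would use an image-processing library, and applying it only to a sub-region corresponds to lowering the definition of just the application area.

```python
def mosaic(image, block=2):
    """Reduce the sharpness of a grayscale image (a list of rows of
    integer pixel values) by replacing each block x block tile with its
    average value — a simple stand-in for the mosaic effect."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Gather the pixels of this tile (clipped at the image edge).
            tile = [image[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            # Overwrite every pixel of the tile with the average.
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

img = [[0, 100], [50, 150]]
assert mosaic(img) == [[75, 75], [75, 75]]  # all detail within the tile lost
```

- A larger `block` hides more detail; restoring the original pixels corresponds to raising the definition again once the user's interest rises.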
- Preferably, the display control unit 13 lowers the definition or disclosure level of the portion indicating the essential information, that is, the information intended to be shown to the user.
- Thereby, the information display device 10 can lower the definition or disclosure level of the area in which the essential information is displayed, and can attract more of the user's attention through the teasing effect.
- As a result, the information display device 10 can efficiently increase the user's interest and strongly impress on the user the information intended to be shown.
- FIG. 5 is a flowchart showing a flow of processing executed by the information display device in Embodiment 1 of the present invention.
- the display unit 14 displays the content on the screen (S100).
- the user state detection unit 11 detects the user state (S101).
- the interest level estimation unit 12 estimates the interest level based on the detected user state (S102).
- The display control unit 13 calculates the degree of change of the interest level (S104). Then, the display control unit 13 determines whether the magnitude of the rate of change of the interest level is less than the first threshold (S106).
- When the magnitude is less than the first threshold (Yes in S106), the display control unit 13 decreases the definition or disclosure level of at least a part of the content displayed on the screen (S108), and the process ends. That is, the display control unit 13 displays the content on the screen in a state in which the definition or disclosure level of at least a part of the content is lower than that of the currently displayed content, according to the user's interest level.
- When the magnitude of the rate of change of the interest level is equal to or greater than the first threshold (No in S106), the process ends.
- In this way, the information display device 10 can display content on the screen in a state in which the definition or disclosure level of at least a part of the content is lower than a predetermined level, according to the user's interest level.
- the information display apparatus 10 may repeatedly execute the processing from step S101 to step S108.
- As described above, the information display device 10 can display content in a state in which the definition or disclosure level of at least a part of the content is lower than a predetermined definition or disclosure level, according to the magnitude of the rate of change of the user's interest level in the displayed content.
- Thereby, the information display device 10 can attract the user's attention through the teasing effect, so that the user's interest can be increased efficiently.
- In the above description, the display control unit 13 causes the display unit 14 to display the content with a lowered definition or disclosure level whenever the magnitude of the rate of change of the interest level falls below the first threshold. However, the display control unit 13 may instead do so only when the magnitude remains below the first threshold for a predetermined time.
- Thereby, the information display device 10 can avoid changing the definition or disclosure level of the content more than necessary in response to a merely temporary decrease in the rate of change of the user's interest level.
- The information display device 10 according to Modification 1 of Embodiment 1 differs from the information display device 10 in Embodiment 1 in that, when the magnitude of the rate of change of the user's interest level is small, it displays on the screen content different from the content currently displayed.
- this modification will be described with reference to the drawings, focusing on differences from the first embodiment.
- The information display device 10 in this modification also includes a user state detection unit 11, an interest level estimation unit 12, a display control unit 13, and a display unit 14. Since the user state detection unit 11, the interest level estimation unit 12, and the display unit 14 are the same as those in Embodiment 1, their description is omitted.
- The display control unit 13 in this modification causes the display unit 14 to display content different from the content displayed on the screen, in a state where at least a part of its definition or disclosure level is lower than a predetermined definition or disclosure level.
- The predetermined definition or disclosure level is, for example, the original definition or disclosure level of the content. That is, the predetermined definition or disclosure level is typically the definition or disclosure level of content to which no video effect or image processing for reducing the definition or disclosure level has been applied.
- FIG. 6 is a flowchart showing a flow of processing executed by the information display device in Modification 1 of Embodiment 1 of the present invention.
- the same processes as those in FIG. 5 are denoted by the same reference numerals, and description thereof is omitted.
- When the magnitude of the rate of change of the interest level is less than the first threshold, the display control unit 13 causes the display unit 14 to display content different from the content displayed on the screen, in a state where at least a part of its definition or disclosure level is lower than the predetermined definition or disclosure level (S112), and the process is terminated. That is, the display control unit 13 causes the display unit 14 to display content which differs from the content displayed on the screen and whose definition or disclosure level has been lowered in advance in at least a part.
- As described above, the information display device 10 can display newly displayed content in a state where its definition or disclosure level is lowered, so the user's interest can be increased.
- The information display device 10 in this modification also includes a user state detection unit 11, an interest level estimation unit 12, a display control unit 13, and a display unit 14. Since the user state detection unit 11, the interest level estimation unit 12, and the display unit 14 are the same as those in Embodiment 1, their description is omitted.
- the display control unit 13 in the present modification increases the clarity or disclosure level of the content displayed on the screen based on the user's interest level.
- For example, when the magnitude of the rate of change of the interest level estimated by the interest level estimation unit 12 exceeds the second threshold, the display control unit 13 increases the definition or disclosure level of the content displayed on the screen.
- Here, the second threshold is a value indicating that the rate of change of the user's interest level has been increased by lowering the definition or disclosure level of the content. The second threshold is equal to or greater than the first threshold.
- Alternatively, when the interest level estimated by the interest level estimation unit 12 exceeds a third threshold, the display control unit 13 may increase the definition or disclosure level of the content displayed on the screen. Here, the third threshold is a value indicating that the user's degree of interest has been increased by lowering the definition or disclosure level of the content.
- FIG. 7 is a flowchart showing the flow of processing executed by the information display device in Modification 2 of Embodiment 1 of the present invention.
- the same processes as those in FIG. 5 are denoted by the same reference numerals, and the description thereof is omitted.
- The display control unit 13 determines whether or not the definition or disclosure level of the content displayed on the screen has already been lowered (S122). That is, the display control unit 13 determines whether or not the content displayed on the screen is displayed in a state where its definition or disclosure level is lower than a predetermined definition or disclosure level.
- When the definition or disclosure level has not been lowered (No in S122), the information display apparatus 10 executes the processing of steps S106 to S108 and returns to step S101.
- On the other hand, when the definition or disclosure level has already been lowered (Yes in S122), the display control unit 13 determines, based on the user's interest level, whether or not to increase the definition or disclosure level of the content (S124). When it is determined that the definition or disclosure level should be increased (Yes in S124), the display control unit 13 increases the definition or disclosure level of the content displayed on the screen (S126), and the process returns to step S101. On the other hand, when it is determined that the definition or disclosure level should not be increased (No in S124), the process returns to step S101.
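- The branch structure of FIG. 7 (the S122 check, S124 to S126, falling back to S106 to S108) can be sketched as a single update step. This is an illustrative sketch, not the patent's implementation: it uses the variant that judges by the interest level against the third threshold (the rate-based variant with the second threshold is analogous), and the state dictionary and string return values are assumptions.

```python
def update_display(state: dict, rate_magnitude: float, interest: float,
                   first_threshold: float, third_threshold: float) -> str:
    """One pass of the loop in FIG. 7 (sketch).

    state["lowered"] records whether the content's definition/disclosure
    level is currently lowered (the S122 check). The names and the string
    return value are illustrative assumptions.
    """
    if not state["lowered"]:
        # S106-S108: lower the level when the magnitude of the rate of
        # change of the interest level is less than the first threshold.
        if rate_magnitude < first_threshold:
            state["lowered"] = True
            return "lowered"
        return "unchanged"
    # S124: already lowered -- decide from the interest level whether to
    # raise the level again (the variant using the third threshold).
    if interest > third_threshold:
        state["lowered"] = False  # S126: raise definition/disclosure level
        return "raised"
    return "unchanged"


state = {"lowered": False}
print(update_display(state, 0.01, 0.2, 0.05, 0.6))  # low rate of change
print(update_display(state, 0.50, 0.7, 0.05, 0.6))  # interest now high
```

Keeping the "already lowered" flag explicit mirrors the S122 branch: the device never stacks effects, it only toggles between the lowered and the original presentation.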
- As described above, the information display device 10 can raise the definition or disclosure level of the content displayed on the screen in accordance with the user's interest level or its rate of change. Therefore, the information display device 10 can dynamically disclose the information indicated by the content to the user according to the user state. As a result, the information display apparatus 10 can increase the user's interest more efficiently and can strongly impress the information indicated by the content on the user.
- the information display device 20 according to the second embodiment of the present invention is mainly different from the information display device 10 according to the first embodiment in that a target user is selected from users located in front of the screen.
- FIG. 8 is a block diagram showing a functional configuration of the information display device according to Embodiment 2 of the present invention.
- the information display device 20 includes a user identification unit 21 that obtains image information from the user detection camera 60, a user state detection unit 22, an interest level estimation unit 23, a target user selection unit 24, a content database 25, a display control unit 26, and a display unit 27.
- At least two user detection cameras 60 are installed around the screen of the display unit 27. That is, the user detection camera 60 includes a first user detection camera 60a and a second user detection camera 60b.
- the user detection camera 60 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) and an optical system.
- the first user detection camera 60a and the second user detection camera 60b simultaneously photograph the user present in front of the screen. Then, the first user detection camera 60a and the second user detection camera 60b output image information indicating the captured images to the user identification unit 21 and the user state detection unit 22.
- the user identification unit 21 extracts face areas in the image indicated by the image information obtained from the user detection camera 60. Then, for each extracted face area, the user identification unit 21 outputs user identification information, which is information indicating the features of that face area and is used to identify the user.
- the user state detection unit 22 detects a user state that is a physical state of a user located in front of the screen.
- the user state detection unit 22 associates the areas in which each user appears (hereinafter simply referred to as "user areas") between the images captured by the first user detection camera 60a and the second user detection camera 60b, and calculates the relative position between each user and the screen as the user state using this correspondence. That is, the user state detection unit 22 detects the position of each user as the user state using the parallax based on stereo vision. Furthermore, the user state detection unit 22 detects each user's movement vector (the user's movement direction and movement speed) based on the temporal change in the detected user position.
- FIGS. 9A and 9B are diagrams for explaining the user position calculation method according to Embodiment 2 of the present invention.
- the first user detection camera 60a and the second user detection camera 60b are installed parallel to the screen of the information display device 20 at a distance B from each other. Specifically, the cameras are installed such that their optical axes are parallel to each other, separated by the distance B, and perpendicular to the screen, and such that the optical center of each user detection camera lies on a plane including the screen.
- the user state detection unit 22 extracts the user area in the image captured by each user detection camera 60. Then, the user state detection unit 22 calculates the distance D between the user and the screen of the information display device 20 based on the positional shift of the corresponding user areas between the images. Specifically, the user state detection unit 22 stores, for example, an image captured in advance by each user detection camera 60 in the absence of any user, and extracts the user area by taking the difference between that stored image and the image captured when a user appears in the shooting range (the user-detectable region). The user state detection unit 22 can also extract, as the user area, a user's face area obtained by detecting and matching a face image.
- FIG. 9B shows the principle of distance measurement by stereo vision for obtaining the distance D between the user and the camera installation surface (screen of the information display device 20) based on the positional relationship between the corresponding user areas on the two images.
- In FIG. 9B, the image of the user whose position is to be measured is projected onto the imaging surfaces of the image sensors of the first user detection camera 60a and the second user detection camera 60b.
- the user state detection unit 22 can calculate the distance D between the user and the screen of the information display device 20 according to equation (2), using the focal length f of the cameras and the distance B between their optical axes. In addition, based on the position of the user area in the image and the distance D calculated by equation (2), the user state detection unit 22 can obtain the user position in the direction parallel to the screen of the information display device 20.
- the user state detection unit 22 calculates and outputs the relative position of the user with respect to the information display device 20 in this way.
- the user state detection unit 22 calculates the user's movement vector based on the temporal change of the position calculated as described above. Specifically, the user state detection unit 22 holds the calculated user position for each user, and uses the held user position to calculate the user movement vector for each user.
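- Equation (2) itself is not reproduced in this excerpt; the sketch below assumes the standard stereo-parallax relation D = f·B/Z, where Z is the disparity, i.e. the positional shift of the corresponding user areas between the two images. The function name and units are illustrative.

```python
def distance_from_stereo(focal_length: float, baseline: float,
                         disparity: float) -> float:
    """Distance D between the user and the camera/screen plane from stereo vision.

    Assumes the pinhole-camera relation D = f * B / Z: f is the focal
    length, B the distance between the optical axes of the two user
    detection cameras, and Z the disparity of the user area between the
    images. An illustrative stand-in for equation (2), not the patent's text.
    """
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length * baseline / disparity


# e.g. f = 0.01 m, B = 0.5 m, Z = 0.002 m gives D = 2.5 m
print(distance_from_stereo(0.01, 0.5, 0.002))
```

A user closer to the screen produces a larger disparity, so D shrinks, which is the behavior the user state detection unit 22 relies on.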
- the user state detection unit 22 does not necessarily have to calculate the position of the user based on stereo parallax.
- the user state detection unit 22 may calculate the relative position between the user and the information display device 20 from distance information obtained by the principle of light wave distance measurement (Time of Flight).
- at least one user detection camera 60 includes a distance image sensor that outputs distance information using the principle of light wave distance measurement.
- the user state detection unit 22 may acquire the relative position of the user with respect to the information display device 20 from the floor pressure sensor installed on the floor surface in front of the screen of the information display device 20. In this case, the user detection camera 60 does not need to be installed.
- the interest level estimation unit 23 estimates the user's interest level with respect to the display content for each user specified by the user identification information based on the user's position and the like output from the user state detection unit 22. A method of calculating the interest level will be described later.
- the target user selection unit 24 selects a target user based on the interest level of each user output from the interest level estimation unit 23. For example, the target user selection unit 24 selects a user corresponding to the degree of interest as a target user when the degree of interest in the display content is equal to or greater than a fourth threshold predetermined for each display content.
- the fourth threshold value is a value indicating that the user has some interest in the display content. That is, the fourth threshold is a value for selecting a user who is expected to further increase the interest level by lowering the definition or disclosure level of the content.
- the target user selection unit 24 may select the user as the target user when the distance between the user's position and the center of the display content is less than a fifth threshold predetermined for each display content.
- the fifth threshold is a distance indicating that the user has some interest in the display content. That is, the fifth threshold is a distance for selecting a user who is expected to further increase the interest level by lowering the definition or disclosure level of the content.
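- The two selection criteria above (interest level at or above the fourth threshold, or distance to the display content below the fifth threshold) can be sketched as follows. Treating them as alternatives chosen by a flag is an assumption, as are the data shapes; the interest levels of users B and C in the example are made up for illustration (only A = 0.7 and D = 0.4 appear in the text).

```python
def select_target_users(users, fourth_threshold, fifth_threshold,
                        by_distance=False):
    """Select target users from the users located in front of the screen.

    Each user is a (user_id, interest_level, distance_to_content) triple.
    By default, users whose interest level is >= the fourth threshold are
    selected; with by_distance=True, users whose distance to the centre of
    the display content is < the fifth threshold are selected instead.
    """
    if by_distance:
        return [uid for uid, _, d in users if d < fifth_threshold]
    return [uid for uid, k, _ in users if k >= fourth_threshold]


# Interest levels as in FIG. 14A (A = 0.7, D = 0.4), fourth threshold 0.4
users = [("A", 0.7, 1.0), ("B", 0.3, 0.8), ("C", 0.1, 3.0), ("D", 0.4, 2.0)]
print(select_target_users(users, 0.4, 1.5))  # -> ['A', 'D']
```

With `by_distance=True` and a fifth threshold of 1.5, the same data selects A and B, matching the FIG. 14B style of selection.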
- the content database 25 is a database that stores content, types of video effects applied to the content, and application areas for each interest level indicating the level of interest of the user.
- FIG. 10 is a diagram showing an example of a content database according to Embodiment 2 of the present invention.
- the content database 25 stores content, video effects, and application areas in association with each other for each level of interest.
- the case where the content stored in the content database 25 is content related to the advertisement of the product “TV” with the product name or brand name “ABCD” will be described as an example.
- the content of interest level “1” is content in which the image character of the product “TV” is drawn greatly.
- the content of interest level “2” is content in which the image character is drawn slightly smaller than interest level “1”, and the product image is drawn in the upper left instead.
- the content with the interest level "3" is content indicating a product lineup. In this way, content presenting more detailed information about the product is stored for higher interest levels, so the information display device 20 attracts the user's interest in the advertisement by showing more detailed information as the user's degree of interest increases.
- the video effect is an effect applied to content in order to reduce the clarity or disclosure level of the content.
- the video effect is an effect such as “hiding” for hiding content or “blur” for blurring content.
- “Blur (30)” indicates that the intensity of the blurring effect is “30”. That is, “blur (75)” indicates an effect that blurs more strongly than “blur (30)”.
- the content database 25 stores the intensity of the video effect in addition to the type of the video effect.
- the application area may be the entire content “all” or a partial area of the content.
- "rectangle (20, 50, 500, 300)" indicates a rectangular area having one vertex at coordinates (20, 50) and the diagonally opposite vertex at coordinates (500, 300).
- When the display content is a product advertisement and the application area to which the video effect for lowering the definition or disclosure level is applied is an area showing information important for the appeal of the product, such as the product name or the product character, the user's attention is drawn to that area, so the information display device 20 can attract the user's attention. That is, if the application area is an area indicating important information, the user's interest can be enhanced by the teasing effect. On the other hand, even if the application area is an area other than one showing important information, the information display device 20 presents only the important information clearly while preventing the user from grasping the whole image, so the user's attention can still be drawn.
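- A minimal in-memory sketch of the content database of FIG. 10 follows. The level-2 and level-3 entries mirror the effects used in the worked example later in the text ("hiding (20)" on a rectangle, "blur (30)" on the whole content); the level-1 entry and the dictionary layout are illustrative assumptions, not the patent's storage format.

```python
# Sketch of the content database (FIG. 10): per interest level, the content,
# the video effect with its intensity, and the application area.
CONTENT_DB = {
    1: {"content": "image character drawn large",
        "effect": None, "area": None},
    2: {"content": "smaller character with product image",
        "effect": ("hiding", 20), "area": ("rectangle", 20, 50, 500, 300)},
    3: {"content": "product lineup",
        "effect": ("blur", 30), "area": "all"},
}


def entry_for(interest_level: int) -> dict:
    """Return the row for an interest level, clamped to the stored range."""
    clamped = max(min(CONTENT_DB), min(interest_level, max(CONTENT_DB)))
    return CONTENT_DB[clamped]


print(entry_for(3)["effect"])  # -> ('blur', 30)
```

Clamping means a user whose estimated interest overshoots the highest stored level still gets the most detailed content rather than a lookup failure.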
- When the magnitude of the rate of change of the interest level estimated by the interest level estimation unit 23 is less than the first threshold, the display control unit 26 causes the display unit 27 to display content different from the displayed content, in a state where at least a part of its definition or disclosure level is lower than a predetermined level.
- the display control unit 26 includes an application control unit 26a and a screen drawing unit 26b.
- When updating the drawing content of the screen of the display unit 27, the application control unit 26a outputs update information, which is information relating to the update of the drawing content, to the screen drawing unit 26b.
- the screen drawing unit 26b draws the content to be displayed on the display unit 27 based on the update information, and outputs the drawn content to the display unit 27.
- the display unit 27 has a screen and displays the content drawn by the screen drawing unit 26b on the screen.
- FIG. 12 is a flowchart showing a flow of processing performed by the information display device according to Embodiment 2 of the present invention.
- FIG. 13 is a diagram for explaining the interest level calculation method according to Embodiment 2 of the present invention.
- FIGS. 14A and 14B are diagrams for explaining the method of selecting a target user in Embodiment 2 of the present invention.
- the user identification unit 21 generates user identification information for identifying a user by extracting a face area from an image photographed by the user detection camera 60 (S202).
- the user state detection unit 22 extracts a user area from the image captured by the user detection camera 60, and detects the user state for each extracted user area (S204). Specifically, the user state detection unit 22 calculates the position of the user for each user area. Furthermore, the user state detection unit 22 calculates a user movement vector based on the temporal change in the calculated user position.
- Using the user's movement vector, the interest level estimation unit 23 calculates the user's interest level k for the display content according to equation (3) (S206).
- Here, g1, g2, and g3 are gains, which are real numbers equal to or greater than 0.
- The first term on the right side of equation (3) relates to the user's movement direction, and increases the degree of interest as the user moves closer to the display content.
- In FIG. 13, a movement vector drawn with a solid line is a vector that, when extended in the movement direction, intersects the screen or the plane obtained by extending the screen. On the other hand, a movement vector drawn with a dotted line is a vector that, even when extended in the movement direction, intersects neither the screen nor the plane extending the screen.
- The distance s from the intersection point to the center of the display content becomes smaller as the user moves toward the display content, and the first term then takes a larger value. Therefore, according to the first term, the degree of interest increases as the user's movement direction points closer to the center of the display content.
- Note that the distance s is shown only for user A and user C in FIG. 13, but the distance s of the other users is calculated in the same manner. Further, for a user whose movement vector, even when extended in the movement direction, intersects neither the screen nor the plane extending the screen (such as user B), the distance s is infinite (∞), and the first term is 0 regardless of the value of g1.
- Here, the distance s is the distance from the intersection of the user's movement vector with the screen to the center of the display content, but it may instead be the distance from that intersection to the center of the screen. In this case, the degree of interest increases as the user moves closer to the center of the screen.
- the distance s may be a distance from the intersection of the user's movement vector and the screen to the center of the image indicated by a part of the display content. In this case, the degree of interest increases as the user moves closer to the center of the image indicated by part of the display content.
- The second term on the right side of equation (3) relates to the user's speed, and increases the degree of interest as the user moves more slowly.
- The third term on the right side of equation (3) relates to the distance between the user's position and the center of the display content, and increases the degree of interest as the user's position is closer to the display content.
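- Equation (3) is not reproduced in this excerpt. The sketch below assumes one plausible three-term form that respects the behavior described above: each term grows as s shrinks, as the speed drops, and as the user's distance d to the content shrinks. The 1/(1 + x) shape is an assumption; only the monotonicity is taken from the text.

```python
import math


def interest_level(s: float, speed: float, d: float,
                   g1: float = 1.0, g2: float = 1.0, g3: float = 1.0) -> float:
    """Illustrative stand-in for equation (3): k = term1 + term2 + term3.

    s: distance from the intersection of the movement vector with the screen
       to the centre of the display content (math.inf when the vector never
       meets the screen, as for user B, making the first term 0),
    speed: the user's movement speed,
    d: distance from the user's position to the centre of the content,
    g1, g2, g3: non-negative gains.
    """
    term1 = 0.0 if math.isinf(s) else g1 / (1.0 + s)  # movement direction
    term2 = g2 / (1.0 + speed)                        # slower -> larger
    term3 = g3 / (1.0 + d)                            # closer -> larger
    return term1 + term2 + term3


# A user heading straight at the content scores higher than one walking past.
print(interest_level(0.2, 0.5, 1.0) > interest_level(math.inf, 0.5, 1.0))
```

Setting g1 = g2 = 0 reduces the sketch to the position-only estimate mentioned at the end of this section.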
- Next, the target user selection unit 24 selects target users (S208). Specifically, the target user selection unit 24 selects, for example, users whose interest level k is equal to or greater than a predetermined fourth threshold kTH as target users. For example, in FIG. 14A, when the fourth threshold kTH is set to "0.4", the target user selection unit 24 selects "user A", whose interest level is "0.7", and "user D", whose interest level is "0.4", as target users.
- the target user selection unit 24 may select a user whose distance d from the center of the display content is less than a predetermined fifth threshold as the target user. In this case, for example, in FIG. 14B, the target user selection unit 24 selects “user A” and “user B” whose distance d from the center of the display content is less than the fifth threshold value dt as target users.
- Next, the display control unit 26 calculates the average value of the magnitude of the rate of change of the interest level for the target users (S210). Specifically, at the current time t, the display control unit 26 calculates the rate of change of each target user's interest level over a predetermined time interval Δt. For example, when "user A" and "user B" are selected as target users in step S208, the display control unit 26 calculates the magnitude of the rate of change a_A(t) of "user A" and the magnitude of the rate of change a_B(t) of "user B" from the interest level k_a of "user A" and the interest level k_b of "user B", as shown in FIG. 15 and equation (4). Then, the display control unit 26 calculates the average value a_all of the magnitudes of the rates of change (a_A and a_B) calculated in this way, as the rate of change of the interest level of the target users.
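- Equation (4) is likewise not shown in this excerpt; a natural reading is a finite difference of each target user's interest level over the interval Δt, with a_all as the mean of the per-user magnitudes. The sketch below is an assumption in that spirit.

```python
def change_rate_magnitude(k_now: float, k_prev: float, dt: float) -> float:
    """|rate of change| of one user's interest level over the interval dt
    (a finite-difference reading of equation (4); the exact form is assumed)."""
    return abs(k_now - k_prev) / dt


def average_change_rate(samples, dt):
    """a_all: mean of the per-target-user magnitudes (e.g. a_A and a_B).

    samples is a list of (k_now, k_prev) pairs, one pair per target user.
    """
    return sum(change_rate_magnitude(now, prev, dt)
               for now, prev in samples) / len(samples)


# user A: 0.70 -> 0.72, user B: 0.40 -> 0.38 over dt = 1.0, so a_all is 0.02
a_all = average_change_rate([(0.72, 0.70), (0.38, 0.40)], 1.0)
print(round(a_all, 6))  # -> 0.02
```

Taking magnitudes before averaging means a rising user and a falling user do not cancel out, which matches comparing a_all against the first threshold as a measure of how much the interest levels are still moving.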
- Next, the display control unit 26 determines whether or not the definition or disclosure level of at least a part of the content displayed on the display unit 27 is lower than a predetermined definition or disclosure level (S212). When the definition or disclosure level has not been lowered (No in S212), the display control unit 26 determines whether or not the state in which the average value a_all of the magnitude of the rate of change is less than the first threshold has continued for a predetermined time or more (S214). When it has (Yes in S214), the display control unit 26 displays the content corresponding to the calculated interest level in a state where at least a part of its definition or disclosure level is lower than a predetermined definition or disclosure level (S216).
- the display control unit 26 refers to the content database 25 to acquire content and video effects corresponding to the calculated interest level.
- the display control unit 26 displays the acquired content on the display unit 27 and applies the acquired video effect to the content.
- This is because, when the average value a_all of the magnitude of the rate of change of the interest level remains less than the first threshold for a predetermined time or more, lowering the definition or disclosure level of the content is highly effective in raising the user's degree of interest (the degree of interest can be expected to rise).
- On the other hand, when the definition or disclosure level has already been lowered (Yes in S212), the display control unit 26 determines whether or not to increase the definition or disclosure level (S218). Specifically, for example, the display control unit 26 judges whether to increase the definition or disclosure level according to whether or not the magnitude of the rate of change of the interest level estimated by the interest level estimation unit 23 exceeds the second threshold. Alternatively, the display control unit 26 determines, for example, whether or not to increase the definition or disclosure level according to whether or not the interest level estimated by the interest level estimation unit 23 exceeds the third threshold.
- When it is determined that the definition or disclosure level should be increased (Yes in S218), the display control unit 26 increases the definition or disclosure level of the display content (S220), and ends the processing. Specifically, the display control unit 26 removes the video effect applied to the content displayed on the screen. On the other hand, when it is determined that the definition or disclosure level should not be increased (No in S218), the process ends.
- the information display device 20 can change the clarity or disclosure level of at least a part of the content according to the interest level of the target user. Note that the information display device 20 may repeatedly execute the processing from step S202 to step S220.
- FIG. 16 is a diagram showing a specific example of the operation of the information display device in Embodiment 2 of the present invention.
- an advertisement for the product “TV” is displayed on a part of the screen.
- the information display device 20 displays the content stored in the content database 25 illustrated in FIG. 10 on the screen according to the degree of interest of the “user A”. For example, in FIG. 16A, since the interest level of “user A” is in the range of 0 to k1, content of interest level “1” is displayed.
- The situation shown in FIG. 16(a) is one in which the increase in the degree of interest has slowed after the target user's degree of interest had been rising steadily. That is, the situation illustrated in FIG. 16(a) is one in which the rate of change of the target user's interest level is small and the interest level has stabilized. Such a situation indicates that the target user's interest in the display content has risen to some extent but is not rising further. That is, the target user is considered to have judged that the information presented by the display content will not further increase his or her interest. In such a situation, applying a video effect to all or part of the display content (advertisement) can give the user a new stimulus and attract further interest in the display content (advertisement).
- In FIG. 16(a), since the magnitude of the rate of change of the target user's interest level has remained less than the first threshold pc for a predetermined time, the display control unit 26 displays the content stored in the content database 25 in a state where a part or all of its definition or disclosure level is lower than a predetermined definition or disclosure level, as shown in FIG. 16(b). Specifically, the display control unit 26 switches the display content to the content of interest level "2", one level above the current interest level, and further applies the video effect "hiding (20)" to the application area "rectangle (20, 50, 500, 300)" of the display content. As a result, the disclosure level of the portion where the product "TV" is shown is lowered, so the user pays attention to the portion whose visibility has deteriorated.
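- The "hiding (20)" step in FIG. 16(b) can be sketched on a raw grayscale pixel grid. Reading the intensity "20" as "scale the region down to 20% brightness" is an assumption; real rendering would go through the screen drawing unit 26b.

```python
def apply_hiding(frame, x1, y1, x2, y2, strength):
    """Darken the rectangle with corners (x1, y1)-(x2, y2) in place.

    frame is a row-major grid of 0-255 gray values; strength is the kept
    brightness in percent, so "hiding (20)" keeps 20% of the original value.
    """
    for y in range(y1, min(y2, len(frame))):
        for x in range(x1, min(x2, len(frame[0]))):
            frame[y][x] = frame[y][x] * strength // 100


frame = [[200] * 8 for _ in range(6)]          # a tiny uniform gray frame
apply_hiding(frame, 2, 1, 6, 4, 20)            # hide a 4x3 block at 20%
print(frame[2][3], frame[0][0])  # -> 40 200  (inside vs outside the area)
```

Removing the effect later (FIG. 16(c)) simply means redrawing the region from the original content rather than inverting the scaling.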
- Thereafter, when the magnitude of the rate of change of the interest level exceeds the second threshold pc2, the display control unit 26 raises the lowered disclosure level as shown in FIG. 16(c). That is, the display control unit 26 removes the video effect applied to the content.
- the display control unit 26 may increase the degree of content disclosure when, for example, the degree of interest exceeds k1.
- Here, the interest level is in the range of k1 to k2 and, as in FIG. 16(a), the magnitude of the rate of change of the target user's interest level has remained less than the threshold pc for a predetermined time. Therefore, the display control unit 26 switches the display content to the content of interest level "3" as shown in FIG. 16(d), and applies the video effect "blur (30)" to the application area "all" of the content. As a result, the entire display content is temporarily blurred and its sharpness lowered, so the user pays attention to the display content again.
- Further, the display control unit 26 reduces the size of the display area of the display content (advertisement) as "user A" approaches the screen or the display content. As a result, "user A" can easily view the entire display content, and the interest of "user A" in the display content increases.
- In this way, the display control unit 26 preferably controls the position or size of the screen area in which the content is displayed according to the position of at least one target user. Specifically, for example, the display control unit 26 preferably displays the content such that the area in which the content is displayed becomes smaller as the position of at least one target user comes closer to the position where the content is displayed. Likewise, the display control unit 26 preferably displays the content such that the position of the area in which the content is displayed comes closer to the user as the position of at least one target user comes closer to the position where the content is displayed.
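- One plausible mapping for the size control just described, where the display area shrinks as the target user approaches the content. The linear interpolation and all constants are illustrative, not taken from the patent.

```python
def display_scale(user_distance: float, d_near: float = 0.5,
                  d_far: float = 3.0, s_min: float = 0.4,
                  s_max: float = 1.0) -> float:
    """Scale factor for the content's display area versus user distance.

    Full size (s_max) at d_far and beyond, shrinking linearly down to
    s_min at d_near and below, so a nearby user sees a smaller area that
    is easy to take in as a whole. All constants are assumptions.
    """
    if user_distance <= d_near:
        return s_min
    if user_distance >= d_far:
        return s_max
    t = (user_distance - d_near) / (d_far - d_near)
    return s_min + t * (s_max - s_min)


print(display_scale(5.0))  # a distant user sees the content at full size
```

Clamping at both ends avoids the area collapsing to nothing when the user stands at the screen, or growing without bound as they walk away.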
- As described above, the information display device 20 can change the definition or disclosure level of the content according to the user state of a target user selected from the users located in front of the screen, so the interest of the user selected as the target user can be increased efficiently.
- In addition, the information display device 20 can select, as the target user, a user who is highly likely to be attracted by the teasing effect, and can therefore increase the user's interest efficiently.
- Video effects are not limited to those exemplified above; other video effects may be used as long as they have the effect of temporarily lowering the definition or disclosure level of the display content.
- the video effect may be an effect using a sign or material that easily attracts the user's attention.
- the information display device does not necessarily display content related to an advertisement.
- the information display device may display other content that is effective by increasing the user's interest.
- the content is displayed in a partial area of the screen.
- the content may be displayed on the entire screen.
- one content is displayed on the screen.
- two or more contents may be displayed on the screen.
- the information display device 20 preferably executes the process shown in FIG. 12 for each content.
- the degree-of-interest estimation unit 23 estimates the degree of interest using all of the user's movement direction, movement speed, and position, but it may instead estimate the degree of interest using at least one of them. For example, if the gain g1 and the gain g2 are set to 0 in Equation (3), the interest level estimation unit 23 can calculate the interest level k based only on the distance between the user's position and the center of the display content.
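The weighted-sum structure of Equation (3) can be sketched as follows. The exact term shapes are not given in this excerpt, so the decreasing-exponential forms are illustrative assumptions; only the gain structure (g1 for movement direction, g2 for speed, g3 for distance) is taken from the text:

```python
import math

def interest_level(s, speed, distance, g1=1.0, g2=1.0, g3=1.0):
    """Illustrative interest level k as a weighted sum of three terms.

    s        -- distance quantifying how far the movement direction points
                away from the displayed content (smaller -> more interest)
    speed    -- user's movement speed (slower -> more interest)
    distance -- distance from the user to the center of the display content
    g1..g3   -- non-negative gains, as in Equation (3)
    """
    return (g1 * math.exp(-s)
            + g2 * math.exp(-speed)
            + g3 * math.exp(-distance))

# With g1 = g2 = 0, k depends only on the user-to-content distance,
# as the text notes:
k_near = interest_level(s=5.0, speed=2.0, distance=0.5, g1=0, g2=0)
k_far = interest_level(s=0.0, speed=0.0, distance=3.0, g1=0, g2=0)
```

Under this sketch a nearby user always scores higher than a distant one once the other gains are zeroed, matching the modification described above.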
- the information display device 20 in the modification of the second embodiment is different from the information display device 20 in the second embodiment in that the degree of interest is further estimated using the user's line-of-sight direction.
- this modification will be described with reference to the drawings, focusing on differences from the second embodiment.
- the information display device 20 in the present modification includes a user identification unit 21, a user state detection unit 22, an interest level estimation unit 23, a target user selection unit 24, a content database 25, a display control unit 26, and a display unit 27.
- since components other than the user state detection unit 22 and the interest level estimation unit 23 are the same as those in the second embodiment, their description is omitted.
- the user state detection unit 22 further detects the user's line-of-sight direction as the user state. A specific gaze direction detection method will be described later.
- the degree-of-interest estimation unit 23 estimates that the degree of interest is higher the closer the detected gaze direction of the user is to the direction from the user toward a position representative of the screen, or toward the position where the content is displayed. Specifically, the degree-of-interest estimation unit 23 calculates the degree of interest using Equation (5), obtained by adding to the right side of Equation (3) a fourth term related to the user's gaze direction.
- the fourth term of Equation (5) uses a gaze vector, a unit vector in the user's gaze direction. The gain g4, like g1, g2, and g3, is a real number of 0 or more.
- the distance t need not be the distance from the intersection of the gaze direction vector with the screen to the center of the display content; it may instead be the distance from that intersection to the center of the screen. In that case, the degree of interest increases as the user looks toward the center of the screen.
- the distance t may also be the distance from the intersection of the gaze direction vector with the screen to the center of an image shown in a part of the display content. In that case, the degree of interest increases as the user looks toward the center of that image.
- FIG. 18 is a flowchart showing a flow of processing relating to the detection of the line-of-sight direction in the modification of the second embodiment of the present invention. That is, FIG. 18 shows a part of the user state detection process (step S204 in FIG. 12) by the user state detection unit 22.
- the user state detection unit 22 detects the user's gaze direction (S550) from the result of a process of detecting the user's face orientation (S510) and the result of a process of detecting the relative gaze direction, that is, the gaze direction relative to the face orientation (S530).
- the user state detection unit 22 detects a face area from an image, captured by the user detection camera 60, of a user present in front of the screen (S512). Next, the user state detection unit 22 applies the facial part feature point areas corresponding to each reference face orientation to the detected face area, and cuts out a region image of the facial part feature points for each reference face orientation (S514).
- the user state detection unit 22 calculates the degree of correlation between each cut-out region image and a template image stored in advance (S516). Subsequently, the user state detection unit 22 obtains a weighted sum in which the angles indicated by the respective reference face orientations are weighted according to the calculated degrees of correlation, and detects this sum as the face orientation of the user corresponding to the detected face area (S518).
- the user state detection unit 22 detects the orientation of the user's face by executing the processing of steps S512 to S518.
- the user state detection unit 22 detects the three-dimensional positions of the inner corners (eye heads) of the user's left and right eyes using an image captured by the user detection camera 60 (S532). Subsequently, the user state detection unit 22 detects the three-dimensional positions of the centers of the user's left and right black eyes using an image captured by the user detection camera 60 (S534). Then, the user state detection unit 22 detects the relative gaze direction using the gaze direction reference plane obtained from the three-dimensional positions of the left and right eye heads, together with the three-dimensional positions of the centers of the left and right black eyes (S536).
- the user state detection unit 22 detects the relative line-of-sight direction by executing the processing of steps S532 to S536.
- the user state detection unit 22 detects the user's line-of-sight direction using the user's face orientation and the relative line-of-sight direction detected as described above.
- FIG. 19 is a diagram for explaining processing for detecting the orientation of the user's face in the modification of the second embodiment of the present invention.
- the user state detection unit 22 reads out the facial part feature point areas corresponding to each reference face orientation from the facial part area DB, which stores those areas. Subsequently, as shown in FIG. 19B, the user state detection unit 22 applies the facial part feature point areas to the face area of the captured image for each reference face orientation, and cuts out a region image of the facial part feature points for each reference face orientation.
- the user state detection unit 22 calculates the correlation between the clipped region image and the template image held in the face part region template DB for each reference face direction.
- the user state detection unit 22 calculates a weight for each reference face orientation according to the calculated degree of correlation. For example, the user state detection unit 22 calculates, as the weight, the ratio of the correlation degree of each reference face orientation to the sum of the correlation degrees over all reference face orientations.
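The correlation-weighted face orientation of steps S516 to S518 can be sketched as follows; the reference angles and correlation values in the usage line are made-up inputs for illustration:

```python
def face_orientation(reference_angles, correlations):
    """Estimate face orientation as a correlation-weighted sum of reference
    face orientations (steps S516-S518).

    reference_angles -- angle (degrees) of each reference face orientation
    correlations     -- correlation of the cut-out region image with the
                        template image for each reference orientation
    """
    total = sum(correlations)
    # Each weight is the ratio of that orientation's correlation to the sum.
    weights = [c / total for c in correlations]
    return sum(w * a for w, a in zip(weights, reference_angles))

# Templates at -20, 0, and +20 degrees; the 0-degree template matches best,
# so the weighted estimate stays near the front-facing orientation.
angle = face_orientation([-20.0, 0.0, 20.0], [0.2, 0.6, 0.2])
```

Because the estimate blends all reference orientations rather than picking the single best match, it varies smoothly as the face turns.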
- in this example the user state detection unit 22 calculates the degree of correlation for the region images of the facial part feature points, but it need not necessarily target those region images; it may instead calculate the degree of correlation for the image of the entire face area.
- as a method for detecting the face orientation, there is a method of detecting facial part feature points such as the eyes, nose, and mouth from the face image and calculating the face orientation from the positional relationship of those feature points.
- as a method of calculating the face orientation from the positional relationship of the facial part feature points, there is a method of rotating, enlarging, and reducing a prepared three-dimensional model of facial part feature points so that it best matches the feature points obtained from a single camera.
- there is also a method of calculating the three-dimensional positions of the facial part feature points, using the principle of stereo vision, from the images captured by the left and right cameras of a two-camera setup.
- the user state detection unit 22 first detects the gaze direction reference plane, then detects the three-dimensional position of the center of the black eye, and finally detects the relative gaze direction.
- FIG. 20 is a diagram for explaining the gaze direction reference plane.
- the user state detection unit 22 detects the gaze direction reference plane by detecting the three-dimensional positions of the left and right eye heads (end points on the nose side of the left eye and the right eye).
- the gaze direction reference plane is the plane that serves as the reference when detecting the relative gaze direction, and coincides with the left-right symmetry plane of the face as shown in FIG. 20. It should be noted that the position of the eye head fluctuates less with facial expressions, and is less prone to erroneous detection, than the positions of other facial parts such as the eye corners, the corners of the mouth, or the eyebrows. Therefore, in this modification, the user state detection unit 22 detects the gaze direction reference plane, the symmetry plane of the face, using the three-dimensional positions of the eye heads.
- the user state detection unit 22 detects the left and right eye head areas in each of two images captured simultaneously by the first user detection camera 60a and the second user detection camera 60b, using a face detection module and a facial part detection module. Then, the user state detection unit 22 detects the three-dimensional position of each of the left and right eye heads using the positional shift (parallax) between the detected eye head areas in the two images. Furthermore, as shown in FIG. 20, the user state detection unit 22 detects, as the gaze direction reference plane, the perpendicular bisector plane of the line segment whose end points are the detected three-dimensional positions of the left and right eye heads.
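The reference plane is simply the perpendicular bisector plane of the segment joining the two eye-head 3D positions. A minimal sketch, with illustrative coordinates in meters and hypothetical function names:

```python
import math

def gaze_reference_plane(left_eye_head, right_eye_head):
    """Return (point, unit_normal) of the perpendicular bisector plane of the
    segment joining the 3D positions of the left and right eye heads."""
    midpoint = tuple((l + r) / 2.0 for l, r in zip(left_eye_head, right_eye_head))
    diff = tuple(r - l for l, r in zip(left_eye_head, right_eye_head))
    norm = math.sqrt(sum(d * d for d in diff))
    normal = tuple(d / norm for d in diff)
    return midpoint, normal

def signed_distance(plane, point):
    """Signed distance from a 3D point to the plane; used later to measure
    how far the midpoint between the black-eye centers sits off the plane."""
    origin, normal = plane
    return sum(n * (p - o) for n, o, p in zip(normal, origin, point))
```

The signed distance computed here plays the role of the offset d used with Equation (6) below.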
- FIGS. 21A and 21B are diagrams for explaining the detection of the center of the black eye.
- a person visually recognizes an object when light from the object passes through the pupil, reaches the retina, is converted into an electrical signal, and that signal is transmitted to the brain. Therefore, the gaze direction can be detected using the position of the pupil.
- since the Japanese iris is black or brown, it is difficult to distinguish the pupil from the iris by image processing. On the other hand, the center of the pupil substantially coincides with the center of the black eye. Therefore, in the present modification, the user state detection unit 22 detects the center of the black eye when detecting the relative gaze direction.
- the user state detection unit 22 first detects the positions of the eye corners and the eye heads from the captured image. Then, the user state detection unit 22 detects a black eye area using the detected positions of the eye corners and eye heads.
- the user state detection unit 22 sets a black eye detection filter consisting of a first region and a second region, as shown in FIG. 21B, at an arbitrary position in the black eye area. Then, the user state detection unit 22 searches for the position of the black eye detection filter that maximizes the inter-region variance between the luminance of the pixels in the first region and the luminance of the pixels in the second region, and detects the position found as the center of the black eye. As above, the user state detection unit 22 then detects the three-dimensional position of the center of the black eye using the positional shift of the black eye center between the two images captured simultaneously.
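The variance-maximizing search can be sketched on a small grayscale image. The circular filter geometry below is an illustrative assumption; the patent's FIG. 21B filter shape is not reproduced here:

```python
def between_region_variance(pixels_a, pixels_b):
    """Inter-region variance between two pixel-luminance populations; large
    when the inner (dark iris) and surrounding (bright) regions differ."""
    n_a, n_b = len(pixels_a), len(pixels_b)
    mean_a = sum(pixels_a) / n_a
    mean_b = sum(pixels_b) / n_b
    mean = (sum(pixels_a) + sum(pixels_b)) / (n_a + n_b)
    return (n_a * (mean_a - mean) ** 2 + n_b * (mean_b - mean) ** 2) / (n_a + n_b)

def find_black_eye_center(image, radius=2):
    """Scan candidate centers in a 2D grayscale image (list of rows) and
    return the (x, y) position whose filter maximizes the variance between
    the inner region and the surrounding ring."""
    best, best_var = None, -1.0
    h, w = len(image), len(image[0])
    for cy in range(radius, h - radius):
        for cx in range(radius, w - radius):
            inner, outer = [], []
            for y in range(cy - radius, cy + radius + 1):
                for x in range(cx - radius, cx + radius + 1):
                    d2 = (y - cy) ** 2 + (x - cx) ** 2
                    (inner if d2 <= (radius // 2) ** 2 else outer).append(image[y][x])
            v = between_region_variance(inner, outer)
            if v > best_var:
                best, best_var = (cx, cy), v
    return best
```

A production implementation would restrict the scan to the previously detected black eye area rather than the whole image.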
- the user state detection unit 22 detects the relative gaze direction using the detected gaze direction reference plane and the three-dimensional positions of the centers of the black eyes. It is known that there is almost no individual difference in the diameter of the adult eyeball; for a Japanese adult, for example, it is about 24 mm. Therefore, if the position of the black eye center is known when the user faces a reference direction (for example, the front), the gaze direction can be calculated from the displacement between that position and the current position of the black eye center.
- this method utilizes the fact that, when the user faces the front, the midpoint of the line segment connecting the centers of the left and right black eyes lies on the center of the face, that is, on the gaze direction reference plane. That is, the user state detection unit 22 detects the relative gaze direction by calculating the distance between the midpoint of the line segment connecting the left and right black eye centers and the gaze direction reference plane.
- the user state detection unit 22 uses the eyeball radius R and the distance d between the midpoint of the line segment connecting the left and right black eye centers and the gaze direction reference plane, as shown in Equation (6), to detect the rotation angle θ in the left-right direction with respect to the face orientation as the relative gaze direction.
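Given the eyeball radius R (about 12 mm, from the roughly 24 mm adult diameter noted above) and the offset d of the black-eye-center midpoint from the reference plane, the geometry suggests sin θ = d / R. Equation (6) itself is not shown in this excerpt, so this reading is an assumption:

```python
import math

EYEBALL_RADIUS_MM = 12.0  # adult eyeball diameter is roughly 24 mm

def relative_gaze_angle(d_mm, r_mm=EYEBALL_RADIUS_MM):
    """Left-right rotation angle theta (degrees) of the gaze relative to the
    face orientation, from the offset d of the midpoint between the black-eye
    centers off the gaze direction reference plane.

    Assumes sin(theta) = d / R, an illustrative reading of Equation (6).
    """
    return math.degrees(math.asin(d_mm / r_mm))

theta = relative_gaze_angle(6.0)  # ~30 degrees when d = R / 2
```

Combining this angle with the detected face orientation yields the absolute gaze direction, as the following bullets describe.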
- as described above, the user state detection unit 22 detects the relative gaze direction using the gaze direction reference plane and the three-dimensional positions of the black eye centers. The user state detection unit 22 then detects the user's gaze direction using the detected face orientation and relative gaze direction.
- the information display device 20 in the present modification can easily detect the user's line-of-sight direction, and can estimate the degree of interest with high accuracy using the detected line-of-sight direction.
- the degree-of-interest estimation unit 23 estimates the degree of interest using all of the user's movement direction, movement speed, position, and gaze direction, but it need not use all of them. For example, if the gains g1, g2, and g3 are set to 0 in Equation (5), the interest level estimation unit 23 can calculate the interest level k based only on the user's gaze direction.
- the information display device 20 uses the distance s or the distance t to quantify the closeness between the movement direction or gaze direction and the direction from the user toward the position where the content is displayed, but this closeness may be quantified by other methods. For example, the information display device 20 may quantify it using the absolute value of the angle (-180 degrees to 180 degrees) formed between the movement direction or gaze direction and the direction from the user toward the position where the content is displayed.
- the information display device has been described based on the embodiments and their modifications, but the present invention is not limited to these embodiments or modifications. Forms obtained by applying various modifications conceived by those skilled in the art to the embodiments or their modifications, and forms constructed by combining components of different embodiments or modifications, are also included within the scope of the present invention as long as they do not depart from the gist of the present invention.
- the information display device includes a screen such as a plasma display panel or a liquid crystal panel, but it is not always necessary to include a screen.
- the information display device may be a projector that projects content onto a projection surface such as a screen or a wall surface.
- Each of the above devices is specifically a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- a computer program is stored in the RAM or the hard disk unit.
- Each device achieves its function by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- Each device is not limited to a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like, but may be a computer system including a part of them.
- a part or all of the constituent elements constituting each of the above devices may be constituted by one system LSI (Large Scale Integration).
- the system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on one chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- a computer program is stored in the RAM.
- the system LSI achieves its functions by the microprocessor operating according to the computer program.
- the system LSI may be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI; implementation using a dedicated circuit or a general-purpose processor is also possible. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- a part or all of the constituent elements constituting each of the above devices may be constituted by an IC card or a single module that can be attached to and detached from each device.
- the IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like.
- the IC card or the module may include the super multifunctional LSI described above.
- the IC card or the module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
- the present invention may be a method in which an operation of a characteristic component included in the information display device described above is used as a step. Further, the present invention may be a computer program that realizes these methods by a computer, or may be a digital signal composed of the computer program.
- the present invention may also be realized as the computer program or the digital signal recorded on a computer-readable recording medium such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc (registered trademark)), or a semiconductor memory.
- the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, or the like.
- the present invention may also be a computer system including a microprocessor and a memory.
- the memory may store the computer program, and the microprocessor may operate according to the computer program.
- the program or the digital signal may be recorded on the recording medium and transferred, or transferred via the network or the like, so that it can be executed by another independent computer system.
- the information display device according to the present invention can increase the user's interest in display content by changing the disclosure level or sharpness of that content, and is useful as an information display device for, for example, outdoor electronic advertising (digital signage) or large-screen televisions.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Neurosurgery (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
Description
The information display device 10 according to Embodiment 1 of the present invention is a device that displays content on a screen, and is characterized in that it changes the sharpness or disclosure level of the content displayed on the screen (hereinafter simply called the "display content") according to the user's degree of interest in the display content.
The information display device 10 according to Modification 1 of Embodiment 1 differs from the information display device 10 of Embodiment 1 in that, when the magnitude of the change rate of the user's degree of interest is small, it displays on the screen content different from the content currently displayed. This modification is described below with reference to the drawings, focusing on the differences from Embodiment 1.
If the sharpness or disclosure level is lowered uniformly for a predetermined period, the period during which the user is teased becomes too long, which can instead reduce the user's interest. The information display device 10 according to Modification 2 of Embodiment 1 is therefore characterized in that it dynamically determines, according to the user's degree of interest, the timing at which the sharpness or disclosure level of the content is raised. This modification is described below with reference to the drawings, focusing on the differences from Embodiment 1.
Next, Embodiment 2 of the present invention is described in detail with reference to the drawings.
The information display device 20 according to the modification of Embodiment 2 differs from the information display device 20 of Embodiment 2 in that it additionally estimates the degree of interest using the user's gaze direction. This modification is described below with reference to the drawings, focusing on the differences from Embodiment 2.
11, 22 User state detection unit
12, 23 Interest level estimation unit
13, 26 Display control unit
14, 27 Display unit
21 User identification unit
24 Target user selection unit
25 Content database
26a Application control unit
26b Screen drawing unit
60 User detection camera
60a First user detection camera
60b Second user detection camera
Claims (15)
- An information display device comprising: a display unit that displays first content on a screen; a user state detection unit that detects a user state, which is a physical state of a user located in front of the screen; an interest level estimation unit that estimates, based on the user state detected by the user state detection unit, an interest level indicating the user's degree of interest in the first content displayed on the screen by the display unit; and a display control unit that, when the magnitude of the change rate of the interest level estimated by the interest level estimation unit is less than a first threshold, causes the display unit to display second content in a state in which the sharpness or disclosure level of at least a part of the second content is lower than a predetermined sharpness or disclosure level.
- The information display device according to claim 1, wherein the display control unit further raises the sharpness or disclosure level of the second content displayed on the screen when the magnitude of the change rate of the interest level estimated by the interest level estimation unit exceeds a second threshold.
- The information display device according to claim 1, wherein the display control unit further raises the sharpness or disclosure level of the second content displayed on the screen when the interest level estimated by the interest level estimation unit exceeds a third threshold.
- The information display device according to claim 1, wherein the display control unit causes the display unit to display the second content when the magnitude of the change rate of the interest level estimated by the interest level estimation unit is less than the first threshold and that state has continued for a predetermined time.
- The information display device according to claim 1, wherein the user state detection unit detects the user's movement direction as the user state, and the interest level estimation unit estimates that the interest level is higher the closer the detected movement direction is to the direction from the user toward a position representative of the screen, or toward the position where the first content is displayed.
- The information display device according to claim 1, wherein the user state detection unit detects the user's movement speed as the user state, and the interest level estimation unit estimates that the interest level is higher the lower the detected movement speed.
- The information display device according to claim 1, wherein the user state detection unit detects the user's position as the user state, and the interest level estimation unit estimates that the interest level is higher the smaller the distance between the detected position and the position where the first content is displayed.
- The information display device according to claim 1, wherein the user state detection unit detects the user's gaze direction as the user state, and the interest level estimation unit estimates that the interest level is higher the closer the detected gaze direction is to the direction from the user toward a position representative of the screen, or toward the position where the first content is displayed.
- The information display device according to claim 1, further comprising a target user selection unit that selects a target user from among the users located in front of the screen, wherein the display control unit causes the display unit to display the second content when the magnitude of the change rate of the interest level of the target user selected by the target user selection unit is less than the first threshold.
- The information display device according to claim 9, wherein the target user selection unit selects, as the target user, a user whose interest level estimated by the interest level estimation unit is equal to or greater than a fourth threshold.
- The information display device according to claim 9, wherein the target user selection unit selects a user as the target user when the distance between the position where the first content is displayed and the position of that user is less than a fifth threshold.
- The information display device according to claim 1, further comprising a content database that stores at least an application area indicating the position of a portion of the second content that shows essential information, that is, information intended to be shown to the user, wherein the display control unit causes the display unit to display the second content in a state in which the sharpness or disclosure level of the application area stored in the content database is lower than the predetermined sharpness or disclosure level.
- The information display device according to claim 1, wherein the predetermined sharpness or disclosure level is the sharpness or disclosure level of the first content, and the display control unit causes the display unit to display the first content as the second content.
- An information display method comprising: a first display step of displaying first content on a screen; a user state detection step of detecting a user state, which is a physical state of a user located in front of the screen; an interest level estimation step of estimating, based on the user state detected in the user state detection step, an interest level indicating the user's degree of interest in the first content displayed on the screen in the display step; and a second display step of displaying second content in a state in which the sharpness or disclosure level of at least a partial area is lower than a predetermined sharpness or disclosure level when the magnitude of the change rate of the interest level estimated in the interest level estimation step is less than a first threshold.
- A computer-readable recording medium on which is recorded a program for causing a computer to execute the information display method according to claim 14.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP09833190.3A EP2360663B1 (en) | 2008-12-16 | 2009-12-15 | Information display device and information display method |
| US12/867,811 US8421782B2 (en) | 2008-12-16 | 2009-12-15 | Information displaying apparatus and information displaying method |
| CN2009801053677A CN101946274B (zh) | 2008-12-16 | 2009-12-15 | 信息显示装置以及信息显示方法 |
| JP2010525093A JP5518713B2 (ja) | 2008-12-16 | 2009-12-15 | 情報表示装置及び情報表示方法 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2008319228 | 2008-12-16 | ||
| JP2008-319228 | 2008-12-16 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010070882A1 true WO2010070882A1 (ja) | 2010-06-24 |
Family
ID=42268562
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2009/006884 Ceased WO2010070882A1 (ja) | 2008-12-16 | 2009-12-15 | 情報表示装置及び情報表示方法 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US8421782B2 (ja) |
| EP (1) | EP2360663B1 (ja) |
| JP (1) | JP5518713B2 (ja) |
| KR (1) | KR101596975B1 (ja) |
| CN (1) | CN101946274B (ja) |
| WO (1) | WO2010070882A1 (ja) |
Cited By (22)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011074198A1 (ja) * | 2009-12-14 | 2011-06-23 | パナソニック株式会社 | ユーザインタフェース装置および入力方法 |
| JP2012083669A (ja) * | 2010-10-14 | 2012-04-26 | Nippon Telegr & Teleph Corp <Ntt> | 情報提示装置、情報提示方法および情報提示プログラム |
| EP2475183A1 (en) * | 2011-01-06 | 2012-07-11 | Samsung Electronics Co., Ltd. | Display apparatus controlled by motion and motion control method thereof |
| WO2012105196A1 (ja) * | 2011-02-04 | 2012-08-09 | パナソニック株式会社 | 関心度推定装置および関心度推定方法 |
| JP2012186621A (ja) * | 2011-03-04 | 2012-09-27 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
| JP2013020136A (ja) * | 2011-07-12 | 2013-01-31 | Canon Inc | 表示制御装置、表示制御装置の制御方法、プログラム及び記憶媒体 |
| WO2012114203A3 (en) * | 2011-02-23 | 2013-03-14 | Ayuda Media Management Systems Inc. | Methods, apparatuses and systems for calculating an amount to be billed in respect of running an out-of-home advertisement during a period of time |
| JP2013114595A (ja) * | 2011-11-30 | 2013-06-10 | Canon Inc | 情報処理装置、情報処理方法及びプログラム |
| JP2014026040A (ja) * | 2012-07-25 | 2014-02-06 | Toshiba Tec Corp | 表示制御装置及び表示制御プログラム |
| JP2014153666A (ja) * | 2013-02-13 | 2014-08-25 | Mitsubishi Electric Corp | 広告提示装置 |
| JP2015064513A (ja) * | 2013-09-26 | 2015-04-09 | カシオ計算機株式会社 | 表示装置、コンテンツ表示方法及びプログラム |
| WO2015190093A1 (ja) * | 2014-06-10 | 2015-12-17 | 株式会社ソシオネクスト | 半導体集積回路およびそれを備えた表示装置並びに制御方法 |
| JP2016076259A (ja) * | 2015-12-21 | 2016-05-12 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
| JP2017090918A (ja) * | 2016-11-14 | 2017-05-25 | カシオ計算機株式会社 | コンテンツ出力装置及びプログラム |
| KR101744940B1 (ko) | 2016-01-20 | 2017-06-08 | 서강대학교산학협력단 | 디지털 사이니지 시스템에서 상황인지형 컨텐츠 추천장치 |
| WO2019044572A1 (ja) * | 2017-09-04 | 2019-03-07 | ソニー株式会社 | 情報処理装置および方法、並びにプログラム |
| JP2019531558A (ja) * | 2016-06-23 | 2019-10-31 | アウターネッツ、インコーポレイテッド | 対話式コンテンツ管理 |
| JP2020113075A (ja) * | 2019-01-11 | 2020-07-27 | 沖電気工業株式会社 | 情報処理装置、情報処理方法、プログラム、及び情報処理システム |
| JP2021125734A (ja) * | 2020-02-03 | 2021-08-30 | マルコムホールディングス株式会社 | 対話ユーザの感情情報の提供装置 |
| JP2022185120A (ja) * | 2019-09-18 | 2022-12-13 | デジタル・アドバタイジング・コンソーシアム株式会社 | プログラム、情報処理方法及び情報処理装置 |
| WO2024166619A1 (ja) * | 2023-02-07 | 2024-08-15 | パナソニックIpマネジメント株式会社 | 情報処理方法および情報処理システム |
| WO2025238978A1 (ja) * | 2024-05-14 | 2025-11-20 | パナソニックIpマネジメント株式会社 | 情報処理方法、情報処理装置及び情報処理プログラム |
Families Citing this family (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101694820B1 (ko) * | 2010-05-07 | 2017-01-23 | 삼성전자주식회사 | 사용자 위치 인식 방법 및 장치 |
| JP2012165181A (ja) * | 2011-02-07 | 2012-08-30 | Sony Corp | 映像再生装置と映像再生方法およびプログラム |
| US9251588B2 (en) * | 2011-06-20 | 2016-02-02 | Nokia Technologies Oy | Methods, apparatuses and computer program products for performing accurate pose estimation of objects |
| US9690368B2 (en) | 2011-12-30 | 2017-06-27 | Adidas Ag | Customization based on physiological data |
| US10402879B2 (en) | 2011-12-30 | 2019-09-03 | Adidas Ag | Offering a customized collection of products |
| ITMI20120061A1 (it) * | 2012-01-20 | 2013-07-21 | Palo Leonardo De | Sistema di acquisizione automatica del numero degli spettatori presenti davanti al televisore e inoltro ad un server per l elaborazione dei dati acquisiti. |
| US9654563B2 (en) | 2012-12-14 | 2017-05-16 | Biscotti Inc. | Virtual remote functionality |
| US9485459B2 (en) * | 2012-12-14 | 2016-11-01 | Biscotti Inc. | Virtual window |
| WO2014115387A1 (ja) * | 2013-01-28 | 2014-07-31 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
| KR101649896B1 (ko) * | 2014-08-18 | 2016-08-23 | 강민정 | 콘텐츠 제공장치 |
| US10129312B2 (en) * | 2014-09-11 | 2018-11-13 | Microsoft Technology Licensing, Llc | Dynamic video streaming based on viewer activity |
| JP6539966B2 (ja) * | 2014-09-18 | 2019-07-10 | カシオ計算機株式会社 | 情報出力装置、情報出力方法、およびプログラム |
| CN105824400A (zh) * | 2015-01-06 | 2016-08-03 | 索尼公司 | 电子设备的控制方法、控制装置以及电子设备 |
| US9521143B2 (en) * | 2015-02-20 | 2016-12-13 | Qualcomm Incorporated | Content control at gateway based on audience |
| CN107889466B (zh) | 2015-03-31 | 2021-01-22 | 飞利浦照明控股有限公司 | 用于提高人的警惕性的照明系统和方法 |
| US20160307227A1 (en) * | 2015-04-14 | 2016-10-20 | Ebay Inc. | Passing observer sensitive publication systems |
| JP6614547B2 (ja) * | 2015-08-17 | 2019-12-04 | パナソニックIpマネジメント株式会社 | 視聴状態検出装置、視聴状態検出システムおよび視聴状態検出方法 |
| US10405036B2 (en) | 2016-06-24 | 2019-09-03 | The Nielsen Company (Us), Llc | Invertible metering apparatus and related methods |
| US9984380B2 (en) | 2016-06-24 | 2018-05-29 | The Nielsen Company (Us), Llc. | Metering apparatus and related methods |
| CA3028702C (en) * | 2016-06-24 | 2023-08-08 | Timothy Scott Cooper | Invertible metering apparatus and related methods |
| US10178433B2 (en) | 2016-06-24 | 2019-01-08 | The Nielsen Company (Us), Llc | Invertible metering apparatus and related methods |
| WO2018035133A1 (en) | 2016-08-17 | 2018-02-22 | Vid Scale, Inc. | Secondary content insertion in 360-degree video |
| US10904615B2 (en) * | 2017-09-07 | 2021-01-26 | International Business Machines Corporation | Accessing and analyzing data to select an optimal line-of-sight and determine how media content is distributed and displayed |
| US11188944B2 (en) | 2017-12-04 | 2021-11-30 | At&T Intellectual Property I, L.P. | Apparatus and methods for adaptive signage |
| US20210023704A1 (en) * | 2018-04-10 | 2021-01-28 | Sony Corporation | Information processing apparatus, information processing method, and robot apparatus |
| KR102862424B1 (ko) * | 2019-11-22 | 2025-09-23 | 엘지전자 주식회사 | 사용자인식에 기반한 디바이스의 제어 |
| JP7554290B2 (ja) * | 2020-06-08 | 2024-09-19 | アップル インコーポレイテッド | 三次元環境におけるアバターの提示 |
| KR102284806B1 (ko) | 2021-04-29 | 2021-08-03 | (주)비상정보통신 | 복수의 동적객체인식 처리가 가능한 다중해상도 영상처리장치 및 방법 |
| CN114416272B (zh) * | 2022-02-07 | 2024-04-05 | 神策网络科技(北京)有限公司 | 图形组件显示方法、装置、存储介质及电子设备 |
| US12457301B2 (en) * | 2024-02-08 | 2025-10-28 | Lenovo (Singapore) Pte. Ltd. | Video processing adjustment based on user looking/not looking |
| CN118395503B (zh) * | 2024-06-24 | 2024-10-29 | 广东方天软件科技股份有限公司 | 基于同态加密的用户隐私保护方法 |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11296542A (ja) | 1998-04-14 | 1999-10-29 | Nec Corp | Information display device, display method therefor, and recording medium recording control program therefor |
| JP2001043384A (ja) * | 1999-07-29 | 2001-02-16 | Nippon Telegr & Teleph Corp <Ntt> | Input method and input device for a multimedia information space, input/output coordination method and device, and recording medium recording an input program and an input/output coordination program |
| JP2001216527A (ja) * | 2000-02-04 | 2001-08-10 | Nippon Telegr & Teleph Corp <Ntt> | Multimedia information space input/output device and method, and recording medium recording program therefor |
| JP2003271283A (ja) | 2002-03-19 | 2003-09-26 | Fuji Xerox Co Ltd | Information display device |
| JP2005157134A (ja) * | 2003-11-27 | 2005-06-16 | Nippon Telegr & Teleph Corp <Ntt> | Information output method, device, and program, and computer-readable storage medium storing information output program |
| JP2005250322A (ja) | 2004-03-08 | 2005-09-15 | Matsushita Electric Ind Co Ltd | Display device |
| JP2007272369A (ja) * | 2006-03-30 | 2007-10-18 | Advanced Telecommunication Research Institute International | Content presentation device |
| JP2009244553A (ja) * | 2008-03-31 | 2009-10-22 | Dainippon Printing Co Ltd | Electronic POP system, server, and program |
| JP2010027043A (ja) * | 2008-06-16 | 2010-02-04 | Dainippon Printing Co Ltd | Processing device, processing method, program, and sensor system |
Family Cites Families (48)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5943049A (en) * | 1995-04-27 | 1999-08-24 | Casio Computer Co., Ltd. | Image processor for displayed message, balloon, and character's face |
| JP3671258B2 (ja) | 1995-04-27 | 2005-07-13 | Casio Computer Co., Ltd. | Image processing device |
| JP2783212B2 (ja) | 1995-09-08 | 1998-08-06 | NEC Corporation | Information presentation device |
| JPH1013605A (ja) | 1996-06-19 | 1998-01-16 | Star Micronics Co Ltd | Electronic information display device |
| US5731805A (en) | 1996-06-25 | 1998-03-24 | Sun Microsystems, Inc. | Method and apparatus for eyetrack-driven text enlargement |
| JP2894307B2 (ja) | 1996-12-19 | 1999-05-24 | NEC Corporation | Information display device |
| JPH10187747A (ja) | 1996-12-26 | 1998-07-21 | Digital Vision Lab:Kk | Information presentation method and device, information display device, and audio output device |
| JPH11276461A (ja) | 1998-03-27 | 1999-10-12 | Suzuki Motor Corp | Attention measuring device and information presentation device using the same |
| JPH10320093A (ja) | 1998-05-22 | 1998-12-04 | Nec Corp | Message output control method |
| DE19951001C2 (de) * | 1999-10-22 | 2003-06-18 | Robert Bosch GmbH | Device for displaying information in a vehicle |
| JP2001306941A (ja) | 2000-02-15 | 2001-11-02 | Web Port:Kk | Advertising method and advertising system using a computer network, and recording medium |
| GB0012132D0 (en) * | 2000-05-20 | 2000-07-12 | Hewlett Packard Co | Targeted information display |
| KR100579121B1 (ko) | 2000-06-16 | 2006-05-12 | Kabushiki Kaisha Toshiba | Advertisement distribution method and advertisement distribution device |
| JP3707361B2 (ja) | 2000-06-28 | 2005-10-19 | Victor Company of Japan, Ltd. | Information providing server and information providing method |
| JP2002023685A (ja) | 2000-07-10 | 2002-01-23 | Matsushita Electric Industrial Co., Ltd. | Electric signboard display device |
| JP2002041537A (ja) | 2000-07-31 | 2002-02-08 | Nec Corp | Advertisement presentation system |
| JP2002063264A (ja) | 2000-08-17 | 2002-02-28 | Fujitsu Ltd | Advertisement information providing system and method, and computer-readable recording medium recording an advertisement information providing program |
| JP3770374B2 (ja) | 2000-10-16 | 2006-04-26 | Sharp Corporation | Image display device |
| JP2001273064A (ja) | 2001-01-24 | 2001-10-05 | Casio Computer Co., Ltd. | Image display control device and image display control method |
| US7068813B2 (en) * | 2001-03-28 | 2006-06-27 | Koninklijke Philips Electronics N.V. | Method and apparatus for eye gazing smart display |
| JP2003058086A (ja) | 2001-08-10 | 2003-02-28 | Event Kogaku Kenkyusho:Kk | Method of providing product, service, and advertisement information on a network, information providing system using the method, and computer program therefor |
| US7284201B2 (en) * | 2001-09-20 | 2007-10-16 | Koninklijke Philips Electronics N.V. | User attention-based adaptation of quality level to improve the management of real-time multi-media content delivery and distribution |
| JP2003216635A (ja) | 2002-01-17 | 2003-07-31 | Nippon Telegraph & Telephone East Corp | Agent information presentation device |
| JP2003263145A (ja) | 2002-03-07 | 2003-09-19 | Pioneer Electronic Corp | Information display device |
| JP3968778B2 (ja) | 2002-10-17 | 2007-08-29 | Yokogawa Electric Corporation | Alarm display device |
| JP2004185139A (ja) | 2002-11-29 | 2004-07-02 | Fuji Photo Film Co Ltd | Display system and display device control method |
| JP2004246709A (ja) | 2003-02-14 | 2004-09-02 | Fuji Xerox Co Ltd | Information visualization device, method, and program |
| JP2004310554A (ja) | 2003-04-08 | 2004-11-04 | Tokyo Electric Power Co Inc:The | Image data interface system, interface method, and interface program |
| US7872635B2 (en) * | 2003-05-15 | 2011-01-18 | Optimetrics, Inc. | Foveated display eye-tracking system and method |
| JP4328136B2 (ja) | 2003-06-19 | 2009-09-09 | Nara Institute of Science and Technology | Interest level estimation device |
| JP4140898B2 (ja) | 2003-08-20 | 2008-08-27 | Nippon Telegraph and Telephone Corporation | Information presentation device and method of using the same |
| JP2005115476A (ja) | 2003-10-03 | 2005-04-28 | Canon Inc | Information output device, link notification method, program, and storage medium |
| CN1910648A (zh) | 2004-01-20 | 2007-02-07 | Koninklijke Philips Electronics N.V. | Message board employing dynamic message relocation |
| JP4481682B2 (ja) | 2004-02-25 | 2010-06-16 | Canon Inc | Information processing device and control method therefor |
| JP4838499B2 (ja) * | 2004-05-21 | 2011-12-14 | Olympus Corporation | User support device |
| JP4911557B2 (ja) * | 2004-09-16 | 2012-04-04 | Ricoh Co Ltd | Image display device, image display control method, program, and information recording medium |
| JP2006236013A (ja) | 2005-02-25 | 2006-09-07 | Nippon Telegr & Teleph Corp <Ntt> | Environmental information presentation device, environmental information presentation method, and program for the method |
| JP2006330912A (ja) | 2005-05-24 | 2006-12-07 | Toshiba Corp | Information processing device and program |
| JP2007133305A (ja) | 2005-11-14 | 2007-05-31 | Nippon Telegr & Teleph Corp <Ntt> | Information display control device and information display control method |
| JP2007133304A (ja) | 2005-11-14 | 2007-05-31 | Nippon Telegr & Teleph Corp <Ntt> | Information display control device and information display control method |
| US20070150916A1 (en) * | 2005-12-28 | 2007-06-28 | James Begole | Using sensors to provide feedback on the access of digital content |
| JP4779673B2 (ja) | 2006-01-30 | 2011-09-28 | Toyota Motor Corporation | Information providing device |
| WO2007128057A1 (en) * | 2006-05-04 | 2007-11-15 | National Ict Australia Limited | An electronic media system |
| US20070271580A1 (en) * | 2006-05-16 | 2007-11-22 | Bellsouth Intellectual Property Corporation | Methods, Apparatus and Computer Program Products for Audience-Adaptive Control of Content Presentation Based on Sensed Audience Demographics |
| US20080147488A1 (en) * | 2006-10-20 | 2008-06-19 | Tunick James A | System and method for monitoring viewer attention with respect to a display and determining associated charges |
| JP2008141484A (ja) | 2006-12-01 | 2008-06-19 | Sanyo Electric Co Ltd | Image reproduction system and video signal supply device |
| US7556377B2 (en) * | 2007-09-28 | 2009-07-07 | International Business Machines Corporation | System and method of detecting eye fixations using adaptive thresholds |
| US20100107184A1 (en) * | 2008-10-23 | 2010-04-29 | Peter Rae Shintani | TV with eye detection |
2009
- 2009-12-15 KR KR1020107017932A patent/KR101596975B1/ko active Active
- 2009-12-15 EP EP09833190.3A patent/EP2360663B1/en active Active
- 2009-12-15 US US12/867,811 patent/US8421782B2/en active Active
- 2009-12-15 CN CN2009801053677A patent/CN101946274B/zh active Active
- 2009-12-15 WO PCT/JP2009/006884 patent/WO2010070882A1/ja not_active Ceased
- 2009-12-15 JP JP2010525093A patent/JP5518713B2/ja active Active
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP2360663A4 |
Cited By (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8830164B2 (en) | 2009-12-14 | 2014-09-09 | Panasonic Intellectual Property Corporation Of America | User interface device and input method |
| WO2011074198A1 (ja) * | 2009-12-14 | 2011-06-23 | Panasonic Corporation | User interface device and input method |
| JP2012083669A (ja) | 2010-10-14 | 2012-04-26 | Nippon Telegr & Teleph Corp <Ntt> | Information presentation device, information presentation method, and information presentation program |
| EP2475183A1 (en) * | 2011-01-06 | 2012-07-11 | Samsung Electronics Co., Ltd. | Display apparatus controlled by motion and motion control method thereof |
| WO2012105196A1 (ja) * | 2011-02-04 | 2012-08-09 | Panasonic Corporation | Interest level estimation device and interest level estimation method |
| US9538219B2 (en) | 2011-02-04 | 2017-01-03 | Panasonic Intellectual Property Corporation Of America | Degree of interest estimating device and degree of interest estimating method |
| US9183575B2 (en) | 2011-02-23 | 2015-11-10 | Ayuda Media Systems Inc. | Pay per look billing method and system for out-of-home advertisement |
| WO2012114203A3 (en) * | 2011-02-23 | 2013-03-14 | Ayuda Media Management Systems Inc. | Methods, apparatuses and systems for calculating an amount to be billed in respect of running an out-of-home advertisement during a period of time |
| AU2017239537B2 (en) * | 2011-02-23 | 2019-10-31 | Hivestack Inc. | Pay per look billing method and system for out-of-home advertisement |
| JP2012186621A (ja) | 2011-03-04 | 2012-09-27 | Sony Corp | Information processing device, information processing method, and program |
| US9344760B2 (en) | 2011-03-04 | 2016-05-17 | Sony Corporation | Information processing apparatus, information processing method, and program |
| JP2013020136A (ja) | 2011-07-12 | 2013-01-31 | Canon Inc | Display control device, control method for display control device, program, and storage medium |
| US9224037B2 (en) | 2011-11-30 | 2015-12-29 | Canon Kabushiki Kaisha | Apparatus and method for controlling presentation of information toward human object |
| JP2013114595A (ja) | 2011-11-30 | 2013-06-10 | Canon Inc | Information processing device, information processing method, and program |
| JP2014026040A (ja) | 2012-07-25 | 2014-02-06 | Toshiba Tec Corp | Display control device and display control program |
| JP2014153666A (ja) | 2013-02-13 | 2014-08-25 | Mitsubishi Electric Corp | Advertisement presentation device |
| JP2015064513A (ja) | 2013-09-26 | 2015-04-09 | Casio Computer Co., Ltd. | Display device, content display method, and program |
| US10855946B2 (en) | 2014-06-10 | 2020-12-01 | Socionext Inc. | Semiconductor integrated circuit, display device provided with same, and control method |
| WO2015190093A1 (ja) * | 2014-06-10 | 2015-12-17 | Socionext Inc. | Semiconductor integrated circuit, display device including the same, and control method |
| JPWO2015190093A1 (ja) * | 2014-06-10 | 2017-06-01 | Socionext Inc. | Semiconductor integrated circuit, display device including the same, and control method |
| JP2016076259A (ja) | 2015-12-21 | 2016-05-12 | Canon Inc | Information processing device, information processing method, and program |
| KR101744940B1 (ko) | 2016-01-20 | 2017-06-08 | Sogang University Industry-University Cooperation Foundation | Context-aware content recommendation device for a digital signage system |
| JP2019531558A (ja) | 2016-06-23 | 2019-10-31 | Outernets, Inc. | Interactive content management |
| JP2017090918A (ja) | 2016-11-14 | 2017-05-25 | Casio Computer Co., Ltd. | Content output device and program |
| WO2019044572A1 (ja) * | 2017-09-04 | 2019-03-07 | Sony Corporation | Information processing device and method, and program |
| JP2020113075A (ja) | 2019-01-11 | 2020-07-27 | Oki Electric Industry Co., Ltd. | Information processing device, information processing method, program, and information processing system |
| JP2022185120A (ja) | 2019-09-18 | 2022-12-13 | Digital Advertising Consortium Inc. | Program, information processing method, and information processing device |
| JP2021125734A (ja) | 2020-02-03 | 2021-08-30 | マルコムホールディングス株式会社 | Device for providing emotion information of an interacting user |
| JP7316664B2 (ja) | 2020-02-03 | 2023-07-28 | マルコムホールディングス株式会社 | Device for providing emotion information of an interacting user |
| WO2024166619A1 (ja) * | 2023-02-07 | 2024-08-15 | Panasonic IP Management Co., Ltd. | Information processing method and information processing system |
| WO2025238978A1 (ja) * | 2024-05-14 | 2025-11-20 | Panasonic IP Management Co., Ltd. | Information processing method, information processing device, and information processing program |
Also Published As
| Publication number | Publication date |
|---|---|
| EP2360663A1 (en) | 2011-08-24 |
| US8421782B2 (en) | 2013-04-16 |
| CN101946274A (zh) | 2011-01-12 |
| CN101946274B (zh) | 2013-08-28 |
| EP2360663B1 (en) | 2016-04-20 |
| US20110050656A1 (en) | 2011-03-03 |
| KR101596975B1 (ko) | 2016-02-23 |
| JP5518713B2 (ja) | 2014-06-11 |
| EP2360663A4 (en) | 2012-09-05 |
| JPWO2010070882A1 (ja) | 2012-05-24 |
| KR20110098988A (ko) | 2011-09-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5518713B2 (ja) | Information display device and information display method | |
| CN108027652B (zh) | Information processing device, information processing method, and recording medium | |
| EP2395420B1 (en) | Information display device and information display method | |
| CN102301316B (zh) | User interface device and input method | |
| US20120180084A1 (en) | Method and Apparatus for Video Insertion | |
| US20120011454A1 (en) | Method and system for intelligently mining data during communication streams to present context-sensitive advertisements using background substitution | |
| US20140267771A1 (en) | Gaze tracking and recognition with image location | |
| TWI620098B (zh) | 頭戴式裝置以及導覽方法 | |
| US12001746B2 (en) | Electronic apparatus, and method for displaying image on display device | |
| US12112449B2 (en) | Camera-based transparent display | |
| JP2007052304A (ja) | Video display system | |
| WO2016158001A1 (ja) | Information processing device, information processing method, program, and recording medium | |
| Turban et al. | Extrafoveal video extension for an immersive viewing experience | |
| US20190281280A1 (en) | Parallax Display using Head-Tracking and Light-Field Display | |
| CN106919246A (zh) | Display method and device for an application interface | |
| US11205405B2 (en) | Content arrangements on mirrored displays | |
| CN109255838B (zh) | Method and device for avoiding ghosting when viewing an augmented-reality display device | |
| JP2020101897A (ja) | Information processing device, information processing method, and program | |
| US12450844B2 (en) | Image signal processing based on occlusion culling | |
| TWI885862B (zh) | See-through display method and see-through display system | |
| Orlosky | Depth based interaction and field of view manipulation for augmented reality | |
| US20250322615A1 (en) | See-through display method and see-through display system | |
| CN120128743A (zh) | Method, apparatus, device, and storage medium for dynamically adjusting screen-casting display | |
| CN120378737A (zh) | Recommendation method, apparatus, electronic device, chip, and storage medium | |
| NL2005720C2 (en) | System and method for generating a depth map. |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 200980105367.7; Country of ref document: CN |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010525093; Country of ref document: JP |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09833190; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 20107017932; Country of ref document: KR; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 12867811; Country of ref document: US; Ref document number: 2009833190; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |