WO2016003100A1 - Method for displaying information and displaying device thereof - Google Patents

Info

Publication number: WO2016003100A1
Authority: WO (WIPO, PCT)
Prior art keywords: user, display device, eye, electronic device, display
Application number: PCT/KR2015/006316
Other languages: French (fr)
Inventors: Chang Hwan Kang, Jin Gwan Kim, Jung Ik Lee, Jae Hwan Park, Da Hye Hyoung, Sung Woo Nam, Ju Hyun Won
Original assignee: Alticast Corporation
Priority claimed from: KR1020140136980A (related publication KR102358921B1)
Application filed by: Alticast Corporation
Publication of: WO2016003100A1

Classifications

    • G02B 27/0093 - Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B 27/017 - Head-up displays; head mounted
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/0304 - Detection arrangements using opto-electronic means
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G02B 2027/0141 - Head-up displays characterised by the informative content of the display
    • G02B 2027/0187 - Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G08C 2201/92 - Universal remote control (wireless transmission of control signals)

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are a display device and a method of displaying, by a display device, information by tracking an eye of a user, the method including tracking the eye of the user from a user image in response to detecting the user, displaying, on a screen, a plurality of objects corresponding to a plurality of electronic devices connected to the display device over a home network, extracting an object matched with the tracked eye of the user from the plurality of objects displayed on the screen, and displaying detailed information associated with the object extracted from the plurality of objects.

Description

METHOD FOR DISPLAYING INFORMATION AND DISPLAYING DEVICE THEREOF
The present invention relates to a method and a display device for displaying information, and more particularly, to an information display method and a display device for extracting an object matched with an eye direction in which a user gazes by tracking an eye of the user and displaying the extracted object and detailed information or additional information of the extracted object.
Currently, with the advancement in data communication technology, an Internet-based home network function is also in development. In addition, a home automatic system capable of remotely controlling electronic devices connected over a home network has appeared.
The home network function refers to a function capable of connecting and remotely controlling various types of electronic devices present in a house through an Internet network, and is essentially required for home automation.
In general, the home network function has been configured in a form in which, when a resident goes out and is located outside the home, the resident checks states of various electronic devices present in the house by connecting to an Internet network using a mobile terminal and performs a simple control operation.
However, although the resident stays at home, the resident may have inconvenience to search and control electronic devices one by one using the mobile terminal with a small screen.
The present invention is conceived to address the aforementioned issues, and provides an information display method and a display device for displaying information associated with an object matched with an eye of a user in real time by tracking the eye of the user.
Also, the present invention provides an information display method and a display device for displaying differentiated information about an object matched with an eye of a user and an object not matched with the eye of the user by tracking the eye of the user.
According to an embodiment of the present invention, there is provided a method of displaying, by a display device, information by tracking an eye of a user, the method including tracking the eye of the user from a user image in response to detecting the user; displaying, on a screen, a plurality of objects corresponding to a plurality of electronic devices connected to the display device over a home network; extracting an object matched with the tracked eye of the user from the plurality of objects displayed on the screen; and displaying detailed information associated with the object extracted from the plurality of objects.
An information display method and a display device for performing the method according to embodiments of the present invention may provide information desired by a user without a specific input by tracking an eye of the user and thereby displaying information associated with an object matched with the tracked eye of the user in real time on a transparent display device.
Also, an information display method and a display device for performing the method according to embodiments of the present invention may provide detailed information and use information of an object matched with an eye of a user in real time by displaying detailed information or additional information about the matched object in real time.
FIG. 1 is a block diagram illustrating a display device for displaying information by tracking an eye of a user according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of displaying, by a display device, information by tracking an eye of a user according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating a method of providing, by a display device, an additional function using a user identification according to an embodiment of the present invention.
FIG. 4 illustrates an example of a method of displaying, by a display device, information by tracking an eye of a user according to an embodiment of the present invention.
FIGS. 5A and 5B illustrate an example of a method of displaying, by a display device, information in response to a movement of an eye of a user according to an embodiment of the present invention.
FIG. 6 illustrates an example of an operation method of a display device in response to a gesture according to an embodiment of the present invention.
FIG. 7 illustrates an example of an available service when a display device is applied to a kitchen according to an embodiment of the present invention.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. It should be noted that like reference numerals refer to like constituent elements throughout, even when they are illustrated in different drawings. Further, when a detailed description of a related known function or configuration would make the purpose of the present invention unnecessarily ambiguous, that detailed description is omitted here.
Various embodiments of the present invention are described below. It should be understood that the invention proposed herein may be embodied in various forms, and that the specific structures or functions proposed herein are provided only as examples. Thus, those skilled in the art will understand that one embodiment proposed herein may be configured independently of other embodiments, and that two or more embodiments may be combined in a variety of ways. For example, a device/apparatus may be configured, or a method may be implemented, using any number of the embodiments disclosed herein. Further, such a device/apparatus may be configured, or such a method may be implemented, using another structure or function, or another structure and function, in addition to or apart from one or more of the embodiments described herein.
Hereinafter, the exemplary embodiments of the present invention will be described with reference to the accompanying drawings.
FIG. 1 is a block diagram illustrating a display device for displaying information by tracking an eye of a user according to an embodiment of the present invention.
Referring to FIG. 1, the display device may include an image receiver 110, an eye tracker 120, a user identifier 130, a controller 140, a storage 150, and a display 160.
The image receiver 110 may acquire a peripheral image of a predetermined area. The peripheral image may include a video. The peripheral image received at the image receiver 110 includes a user image and may be acquired only when the user is detected through real-time detection. The image receiver 110 may be provided to the display device in the form of a camera. A plurality of image receivers 110 may be provided to improve accuracy.
The eye tracker 120 may be configured using a predetermined eye tracking method for extracting an eyeball movement of the user by analyzing the user image received at the image receiver 110 and tracking an eye of the user based on the extracted eyeball movement.
The user identifier 130 may be configured using a predetermined facial recognition program for recognizing and identifying a face of the user by analyzing the user image received at the image receiver 110.
The controller 140 may control a plurality of objects connected to the display device over a home network to be displayed on a screen. The controller 140 may extract an object matched with the eye of the user tracked through the eye tracker 120 from the plurality of objects displayed on the screen. The controller 140 may control detailed information associated with the object extracted from the plurality of objects to be displayed. Here, detailed information about the extracted object may include device information of an electronic device corresponding to the extracted object, a device use history, a device operation manual, and after-service (A/S) information. Also, for at least one object other than the extracted object among the plurality of objects, the controller 140 may control current state information, for example, an in-use or ON/OFF state, of the electronic device corresponding to that object to be displayed.
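As a non-limiting illustration (not part of the original disclosure), the controller's split between detailed information for the gazed-at object and brief state information for the remaining objects may be sketched as follows in Python; the HomeDevice model and the card format are assumptions made only for this sketch.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class HomeDevice:
    """An electronic device reachable over the home network (assumed model)."""
    name: str
    is_on: bool = False
    info: dict = field(default_factory=dict)         # device info, operation manual, A/S info
    use_history: list = field(default_factory=list)  # cumulative use history

def render_objects(devices: list[HomeDevice], gazed: Optional[HomeDevice]) -> list[str]:
    """Build one display card per object: detailed for the gazed device, brief state otherwise."""
    cards = []
    for dev in devices:
        if dev is gazed:
            # Detailed information for the object matched with the tracked eye.
            cards.append(f"{dev.name}: {dev.info} | recent history: {dev.use_history[-3:]}")
        else:
            # Brief current-state information only, e.g. ON/OFF.
            cards.append(f"{dev.name}: {'ON' if dev.is_on else 'OFF'}")
    return cards
```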
Meanwhile, when a specific user is identified by the user identifier 130, the controller 140 may control the storage 150 to cumulatively store an electronic device use history of the specific user. Also, the controller 140 may control an additional service customized for the specific user to be further provided by extracting the stored electronic device use history from the storage 150 and by using the extracted electronic device use history. The controller 140 may also control operations of the plurality of objects connected over the home network.
The storage 150 may cumulatively store, for each user, a use history associated with a use of an object matched with the tracked eye of the user. For example, the object includes an electronic device connected over the home network. The storage 150 may store a use history of the electronic device in addition to basic device information of the electronic device, a device operation manual, and device A/S information. Here, the storage 150 may cumulatively store the use history of the electronic device for each user.
The display 160 may provide a display screen according to a control of the controller 140. For example, the display 160 may be provided in a form of a transparent flexible display.
As described above, a display device according to an embodiment of the present invention may provide information associated with an object desired by a user without a specific input in real time, by tracking an eye of the user.
FIG. 2 is a flowchart illustrating a method of displaying, by a display device, information by tracking an eye of a user according to an embodiment of the present invention.
Referring to FIG. 2, a display device according to an embodiment of the present invention may provide information desired by a user in real time without a specific selection or input by tracking an eye of the user.
In operation S210, the display device may track the eye of the user on a display screen by detecting the user from a user image. The eye of the user may be tracked based on the eyeball or pupil movement of the user extracted from the user image. Also, the eye of the user may be tracked using a separate eye tracking method.
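The patent does not prescribe a particular eye tracking algorithm, so the following Python sketch only illustrates the general idea of mapping a pupil offset extracted from the user image to a point on the display screen; the linear mapping and the sensitivity constants are assumptions.

```python
import numpy as np

def gaze_point_on_screen(pupil_xy, eye_center_xy, screen_w, screen_h,
                         sensitivity=(90.0, 70.0)):
    """Map a pupil offset (pixels in the eye image) to screen coordinates.

    A deliberately simple linear model; practical eye trackers use calibrated
    geometric or appearance-based methods.
    """
    dx = pupil_xy[0] - eye_center_xy[0]
    dy = pupil_xy[1] - eye_center_xy[1]
    x = np.clip(screen_w / 2 + dx * sensitivity[0], 0, screen_w - 1)
    y = np.clip(screen_h / 2 + dy * sensitivity[1], 0, screen_h - 1)
    return float(x), float(y)
```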
In operation S220, the display device may extract a plurality of objects connected over a home network and may display the extracted objects on a screen. The display device may display, on the screen, a plurality of objects corresponding to a plurality of electronic devices connected to the display device over the home network. For example, the display device, as a transparent display device, may display the plurality of objects by allowing the plurality of electronic devices located at its rear to be seen through the transparent display device. In this instance, the display device may recognize an object corresponding to each electronic device visible through the transparent display device.
In operations S230, S240, and S250, the display device may extract an object matched with the tracked eye of the user from the plurality of objects displayed on the screen and may display detailed information associated with the extracted object. That is, the display device may recognize an electronic device matched with the eye of the user as being selected by the user and thereby extract the electronic device, and may display detailed information associated with the extracted electronic device on the screen. Here, the detailed information may include device information of the corresponding electronic device, a device use history, a device operation manual, and device A/S information.
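One simple way to realize the extraction step is to hit-test the tracked gaze point against the on-screen regions occupied by the displayed objects. The sketch below is illustrative only; the rectangle representation of an object region is an assumption.

```python
from typing import Optional

def extract_gazed_object(gaze_xy: tuple, object_regions: dict[str, tuple]) -> Optional[str]:
    """Return the object whose on-screen region contains the gaze point, if any.

    object_regions maps an object name to its bounding box (x0, y0, x1, y1)
    on the transparent display.
    """
    gx, gy = gaze_xy
    for name, (x0, y0, x1, y1) in object_regions.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            return name          # object matched with the tracked eye
    return None                  # no object selected; show brief state info only
```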
In operation S260, the display device may display only brief information, for example, state information indicating a current state such as ON/OFF, about electronic devices that are not matched with the eye of the user, that is, electronic devices recognized as not selected by the user and thus not extracted.
FIG. 3 is a flowchart illustrating a method of providing, by a display device, an additional function using a user identification according to an embodiment of the present invention.
Referring to FIG. 3, when a specific user identified by a display device according to an embodiment of the present invention is to use an electronic device, the display device may further provide a function that is frequently used by the specific user based on a previous use history.
In operation S310, the display device may identify a user by detecting the user from a user image. The user may be identified through a facial recognition program for recognizing and identifying a face of the user by analyzing the user image. For example, when the user is detected from an image captured from a camera, the display device may extract a facial image of the user from the captured image. A specific user may be identifiable based on whether the extracted facial image is pre-stored in a database. When the facial image of the specific user is not stored in the database, the facial image of the specific user may be registered as a new user and be used.
In operation S320, the display device may cumulatively store a device use history of the identified user. The device use history may include a preset number of recent use-history entries for an electronic device corresponding to an object extracted by tracking an eye of the user. When one of the recent use-history entries is selected, the display device may control the electronic device to be in an operating state corresponding to the selected entry. Here, the display device is connected to the electronic device over a home network and may transmit a control signal for activating the electronic device over the home network. Further, the device use history may include a use history for each of a plurality of users with respect to the electronic device corresponding to the object extracted by tracking the eye of the user. Accordingly, when one user is detected from the plurality of users, the display device may transmit a control signal to the electronic device so that the electronic device is in an operating state corresponding to a use history of the detected user. Here, the user may be detected by identifying the user through facial recognition, by a touch of the user, or by a hand shape of the user, for example, a recognized finger count when each user is designated by a number such as 1, 2, 3, or 4. That is, when a specific user pre-stored in or registered to the database uses an electronic device, the display device may cumulatively store the corresponding use-history information. For example, when the electronic device is an oven, the cumulatively stored use-history information may include the temperature and time information previously used for cooking with the oven. When the electronic device is a washing machine, the cumulatively stored use-history information may include use mode information of the washing machine.
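A minimal sketch of the cumulative per-user use history and of sending a control signal for a selected history entry is shown below. The JSON-over-TCP message format and the in-memory dictionary are assumptions; the patent only requires that a use history be stored per user and that a control signal reach the device over the home network.

```python
import json
import socket
from collections import defaultdict

# Cumulative use history keyed by (user, device); a real system would use persistent storage.
use_history: dict = defaultdict(list)

def record_use(user_id: str, device_id: str, settings: dict) -> None:
    """Cumulatively store one use-history entry, e.g. {'temp_c': 180, 'minutes': 25} for an oven."""
    use_history[(user_id, device_id)].append(settings)

def apply_history_entry(device_addr: tuple, settings: dict) -> None:
    """Send a control signal over the home network so the device enters the chosen operating state."""
    message = {"command": "set_state", "settings": settings}
    with socket.create_connection(device_addr, timeout=2.0) as sock:
        sock.sendall(json.dumps(message).encode())
```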
When a corresponding electronic device use history of the identified specific user is extracted, the display device may activate the corresponding electronic device by transmitting, to that electronic device, a control signal for setting a preference function frequently used by the specific user, in operations S330, S340, and S350. For example, in the case of a shared kitchen used by a plurality of users, the display device may recognize that an oven is selected by tracking an eye of the specific user and may display detailed information about the oven on a screen. The display device may identify the specific user by recognizing the face of the specific user, and may display a pre-stored oven use history of the identified specific user on the screen. When the oven is connected to the display device over the home network, the display device may activate the oven by automatically transmitting a control signal to it to set a preference function frequently used by the specific user based on the electronic device use history of the specific user.
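The patent does not define how the preference function frequently used by the specific user is derived from the stored history; a simple per-field majority vote is one possible heuristic, sketched here as an assumption.

```python
from collections import Counter

def preferred_settings(history: list) -> dict:
    """Pick the most frequently used value for each setting in a user's device history."""
    if not history:
        return {}
    prefs = {}
    for key in history[-1].keys():
        values = [entry[key] for entry in history if key in entry]
        prefs[key] = Counter(values).most_common(1)[0][0]
    return prefs

# Example: three previous oven uses -> preset the oven to the user's usual temperature and time.
oven_history = [{"temp_c": 180, "minutes": 25},
                {"temp_c": 180, "minutes": 30},
                {"temp_c": 200, "minutes": 25}]
print(preferred_settings(oven_history))   # {'temp_c': 180, 'minutes': 25}
```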
As described above, according to embodiments of the present invention, it is possible to provide information associated with an electronic device desired by a user without a separate input by tracking an eye of the user. Further, it is possible to provide an automated advanced home networking/home automation system by automatically setting an electronic device based on a function frequently used by an identified specific user or a use history of the identified specific user.
FIG. 4 illustrates an example of a method of displaying, by a display device, information by tracking an eye of a user according to an embodiment of the present invention.
Referring to FIG. 4, a display device according to an embodiment of the present invention may display detailed information of an object, for example, various types of electronic devices, matched with an eye of a user by tracking the eye of the user.
The display device 420 may be installed between a plurality of objects 430 and 440 and a user 410. The display device 420 may recognize an object, for example, the object 430, desired to be selected by the user 410 among the plurality of objects 430 and 440 by tracking an eye of the user 410 through a plurality of eye tracking modules 421, 422, and 423 mounted in a predetermined area. That is, the display device 420 may select the object 430 based on an eye direction of the user 410 recognized using the eye tracking modules 421, 422, and 423, and may display information about the selected object 430. The eye direction of the user 410 may be determined based on a face direction and a pupil location of the user 410.
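Determining the gazed-at object from a face direction and a pupil location amounts to casting a gaze ray from the eye and intersecting it with the plane of the transparent display. The following geometry sketch is an illustration under assumed conventions, not the patent's own method.

```python
import numpy as np

def gaze_hit_on_display(eye_pos, face_dir, pupil_offset, display_origin, display_normal):
    """Intersect a gaze ray (face direction corrected by a pupil offset) with the display plane.

    Returns the 3D intersection point, or None if the gaze never reaches the display.
    """
    direction = np.asarray(face_dir, dtype=float) + np.asarray(pupil_offset, dtype=float)
    direction /= np.linalg.norm(direction)
    eye_pos = np.asarray(eye_pos, dtype=float)
    normal = np.asarray(display_normal, dtype=float)
    denom = float(np.dot(normal, direction))
    if abs(denom) < 1e-6:
        return None   # gaze is parallel to the display plane
    t = float(np.dot(normal, np.asarray(display_origin, dtype=float) - eye_pos)) / denom
    if t < 0:
        return None   # display plane is behind the user
    return eye_pos + t * direction
```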
Accordingly, although information about an object matched with the tracked eye of the user, for example, the object 430, located at the rear of the display device 420 is displayed on the display device 420 made of a transparent material, the user 410 may experience an optical illusion as if the information is directly displayed on the object 430 located at the rear of the display device 420.
In the case of the object 430 matched with the tracked eye or eye direction of the user 410, the display device 420 may display latest detailed information associated with the object 430. In the case of another object, for example, the object 440 not matched with the eye or the eye direction of the user 410, the display device 420 may display only current state information.
FIGS. 5A and 5B illustrate an example of a method of displaying, by a display device, information in response to a movement of an eye of a user according to an embodiment of the present invention.
Referring to FIG. 5A, the display device 520 may display information associated with a refrigerator 530, an oven 540, and a microwave 550 located at the rear of the display device 520. The display device 520 may be disposed between a user 510 and a variety of electronic devices and may include a module 521 for tracking an eye of the user 510 in a predetermined area. That is, the user 510 may gaze at the refrigerator 530 among the variety of electronic devices, for example, the refrigerator 530, the oven 540, and the microwave 550, through the display device 520, which is made of a transparent material. In response thereto, the display device 520 may recognize that the eye of the user 510 is matched with the refrigerator 530 by tracking the eye of the user 510 using the module 521, and may display detailed information 531 associated with the refrigerator 530 on a screen. Here, the detailed information 531 may include device information of the corresponding device, a device use history, device operation manual information, and A/S information. On the contrary, the display device 520 may briefly display only state information 541 and 551 indicating current states, such as availability or ON/OFF, with respect to the other electronic devices excluding the refrigerator 530. The display device 520 may receive information associated with the variety of electronic devices in real time over a wired or wireless home network.
Referring to FIG. 5B, when the eye of the user 510 moves from the refrigerator 530 to another electronic device, for example, the oven 540, the display device 520 may display detailed information 541 mapped with the oven 540 selected in response to the eye of the user 510. In this example, with respect to other electronic devices, for example, the refrigerator 530 and the microwave 550 not matched with the eye of the user 510, the display device 520 may display only state information 531 and 551, such as ON/OFF, on the screen. A variety of information mapped with the electronic devices may be searched and received in real time based on the home network.
Here, an electronic device recognized by tracking the eye of the user 510 may correspond one-to-one to an object, for example, a screen area indicating information, displayed on the screen of the display device 520. When a movement of the eye of the user 510 to another object is detected in a state in which detailed information associated with an object extracted or recognized by tracking the eye of the user 510 is being displayed, the display device 520 may display detailed information associated with the other object matched with the moved eye of the user. That is, when detailed information associated with the other object is displayed, the detailed information associated with the object previously being displayed may be reduced on the screen or may be displayed as only state information briefly indicating a current state.
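The switch of focus when the eye moves from one object to another can be captured by a small state holder: the newly gazed object is promoted to the detailed view and the previous one falls back to a brief state card. This Python sketch is an assumption about structure, not the disclosed implementation.

```python
class FocusManager:
    """Track which object currently shows detailed information on the screen."""

    def __init__(self):
        self.focused = None

    def on_gaze(self, obj_name):
        """Promote the newly gazed object; demote the previously focused one to state-only."""
        if obj_name is None or obj_name == self.focused:
            return
        if self.focused is not None:
            print(f"{self.focused}: reduce to brief state information")
        self.focused = obj_name
        print(f"{obj_name}: show detailed information")
```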
As described above, a display device according to an embodiment of the present invention may selectively provide information of an object selected in response to a movement of an eye of a user in real time without a separate input by tracking the eye of the user, and may differentially provide information desired by the user and information undesired by the user.
FIG. 6 illustrates an example of an operation method of a display device in response to a gesture according to an embodiment of the present invention. Referring to FIG. 6, the display device 620 and a variety of electronic devices, for example, a refrigerator 630, an oven 640, and a microwave 650 may be connected using a home network method.
As illustrated in FIG. 6, the display device 620 may perform a preset operation in response to a gesture of a user 610. In particular, when the microwave 650 is selected by tracking the eye of the user 610, the display device 620 may extract an object corresponding to the microwave 650 and may display detailed information 651. In this instance, when a preset user gesture is detected in a state in which the detailed information 651 is being displayed through the display device 620, the display device 620 may terminate a display operation. Here, the detected user gesture may include a touch operation, or may include a hand shape and a motion preset for terminating the display device 620.
When a first user gesture is detected in a state in which the detailed information 651 associated with the object extracted by tracking the eye of the user 610 is being displayed through the display device 620, the display device 620 may transmit an activation signal to the microwave 650 so that the corresponding electronic device is in a standby state for use. In this example, the first user gesture may be a user hand shape preset to activate, for example, turn "ON", an electronic device corresponding to an object matched with the tracked eye of the user.
When a second user gesture is detected in a state in which the microwave 650 is activated by the activation signal, the display device 620 may transmit an inactivation signal to the microwave 650. In this example, the second user gesture may be a user hand shape preset to inactivate, for example, turn "OFF" or place on standby, an electronic device corresponding to an object matched with the tracked eye of the user. However, the aforementioned first user gesture and second user gesture are not limited to a hand shape and may be a motion using a body or a touch by a hand.
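Dispatching the recognized gestures to the preset operations may be sketched as below; the Gesture names and the device/display interfaces are placeholders introduced only for this illustration.

```python
from enum import Enum, auto

class Gesture(Enum):
    TERMINATE = auto()    # preset touch or hand shape that ends the display operation
    ACTIVATE = auto()     # first user gesture: put the gazed-at device in a standby/ON state
    INACTIVATE = auto()   # second user gesture: turn the gazed-at device OFF or back to standby

def handle_gesture(gesture: Gesture, device, display) -> None:
    """Map a recognized gesture to the corresponding preset operation."""
    if gesture is Gesture.TERMINATE:
        display.close()                             # terminate the display operation
    elif gesture is Gesture.ACTIVATE:
        device.send({"command": "activate"})        # activation signal over the home network
    elif gesture is Gesture.INACTIVATE:
        device.send({"command": "inactivate"})      # inactivation signal over the home network
```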
FIG. 7 illustrates an example of an available service when a display device is applied to a kitchen according to an embodiment of the present invention.
Referring to FIG. 7, the display device according to an embodiment of the present invention may be applied to and installed in a kitchen and may provide a service such as management of kitchen electric appliances and providing of a recipe.
Referring to FIG. 7, a user may activate a display device [S710]. For example, the display device may be configured in a form of a transparent display. The user may activate the display device by manipulating a button 711 formed on a table through a gesture input or a touch input.
A display screen of the display device may be unfolded [S720]. Here, the display device may be a flexible display device 721 that is freely bendable, and may be received and kept in a predetermined receiving space while being rolled around a small pulley 722.
Information associated with a variety of electronic devices present in the kitchen may be displayed through the display device [S730]. Here, information 732, 733, and 734 displayed through the display device 721 may include information associated with electronic devices located at the rear of the display device 721. Accordingly, the display device 721 may display information associated with an electronic device gazed at by the user by tracking an eye of the user. Here, the display device 721 may display detailed information, for example, the information 732, about an object matched with the eye of the user by tracking the eye of the user, and may provide brief state information, for example, the information 733 and 734, about the other objects excluding the matched object. For example, when the user gazes at a refrigerator, the display device 721 may display information associated with the refrigerator, for example, an inside temperature, an antibacterial state, and a list of articles stored in the refrigerator, as if the information were being displayed on the refrigerator itself, by tracking the eye of the user.
In addition, the display device 721 may further display electronic device use information through the home network [S740 and S750]. For example, when the electronic device is a refrigerator, the display device 721 may provide the user with a list of food reserves being kept in the refrigerator. The list of food reserves may be information stored or registered in advance by the user. When predetermined food reserves 752 are selected from the list of food reserves in response to a user input, for example, a gesture input, the display device 721 may provide a recipe 762 using the selected food reserves by connecting to the Internet over the home network.
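Looking up a recipe for the selected food reserves over the Internet could look like the sketch below; the recipe service URL, query format, and response schema are hypothetical and only illustrate the flow described above.

```python
import json
import urllib.parse
import urllib.request

def fetch_recipes(selected_reserves: list, api_url: str) -> list:
    """Fetch recipes that use the food reserves picked from the refrigerator's list."""
    query = urllib.parse.urlencode({"ingredients": ",".join(selected_reserves)})
    with urllib.request.urlopen(f"{api_url}?{query}", timeout=5.0) as resp:
        return json.loads(resp.read().decode())

# Hypothetical usage: fetch_recipes(["egg", "spinach"], "https://example.com/recipes")
```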
Although a few embodiments of the present invention have been shown and described, they are provided only to help the overall understanding of the invention and the present invention is not limited to the described embodiments. Instead, it would be appreciated by those skilled in the art that various changes and modifications may be made from the description without departing from the principles and spirit of the invention.
Therefore, the scope of the invention is defined not by the detailed description, but by the claims and their equivalents, and all variations within the scope of the claims and their equivalents are to be construed as being included in the invention.

Claims (13)

  1. A display device for displaying information by tracking an eye of a user, the display device comprising:
    an eye tracker configured to track the eye of the user from a user image in response to detecting the user;
    a controller configured to control a plurality of objects corresponding to a plurality of electronic devices, connected to the display device over a home network, to be displayed on a screen, to control an object matched with the tracked eye of the user to be extracted from the plurality of objects displayed on the screen, and to control detailed information associated with the object extracted from the plurality of objects to be displayed; and
    a display configured to provide the screen according to a control of the controller.
  2. The display device of claim 1, wherein the detailed information associated with the extracted object comprises information regarding at least one of device information of an electronic device corresponding to the extracted object, a device use history, a device operation manual, and after-sales service (A/S) information.
  3. The display device of claim 1, wherein the controller is configured to display, with respect to at least one object excluding the extracted object from the plurality of objects, current state information of an electronic device corresponding to the at least one object.
  4. The display device of claim 1, wherein the controller is configured to, in response to detecting a movement of the eye of the user in a state in which detailed information associated with the extracted object is displayed, display detailed information associated with another object matched with the moved eye of the user among the plurality of objects.
  5. The display device of claim 1, wherein the controller is configured to, in response to detecting a preset user gesture in a state in which detailed information associated with the extracted object is displayed, terminate a display operation of the display device.
  6. The display device of claim 1, wherein the controller is configured to, in response to detecting a first user gesture in a state in which detailed information associated with the extracted object is displayed, transmit an activation signal to an electronic device corresponding to the extracted object.
  7. The display device of claim 6, wherein the controller is configured to, in response to detecting a second user gesture in a state in which the electronic device is activated by the activation signal, transmit an inactivation signal to the electronic device.
  8. The display device of claim 2, wherein the device use history comprises a number of recent use histories set for the electronic device corresponding to the extracted object, and
    the controller is configured to, in response to selecting one of the recent use histories, transmit a control signal to the electronic device so that the electronic device is in an operating state corresponding to the selected use history.
  9. The display device of claim 2, wherein the device use history comprises a use history for each of a plurality of users with respect to the electronic device corresponding to the extracted object, and
    the controller is configured to, in response to detecting one user among the plurality of users, transmit a control signal to the electronic device so that the electronic device is in an operating state corresponding to a use history of the detected user.
  10. The display device of claim 1, further comprising:
    an image receiver configured to acquire a peripheral image of a predetermined area, the peripheral image comprising the user image; and
    a user identifier configured to recognize and identify a face of the user by analyzing the user image received at the image receiver.
  11. A method of displaying, by a display device, information by tracking an eye of a user, the method comprising:
    tracking the eye of the user from a user image in response to detecting the user;
    displaying, on a screen, a plurality of objects corresponding to a plurality of electronic devices connected to the display device over a home network;
    extracting an object matched with the tracked eye of the user from the plurality of objects displayed on the screen; and
    displaying detailed information associated with the object extracted from the plurality of objects.
  12. A non-transitory computer-readable medium storing a program to implement the method of claim 11.
  13. A computer program stored in a non-transitory computer-readable medium to implement the method of claim 11.
PCT/KR2015/006316 2014-06-30 2015-06-22 Method for displaying information and displaying device thereof WO2016003100A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2014-0081261 2014-06-30
KR20140081261 2014-06-30
KR10-2014-0136980 2014-10-10
KR1020140136980A KR102358921B1 (en) 2014-06-30 2014-10-10 Method for Displaying Information and Displaying Device Thereof

Publications (1)

Publication Number Publication Date
WO2016003100A1 true WO2016003100A1 (en) 2016-01-07

Family

ID=55019578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/006316 WO2016003100A1 (en) 2014-06-30 2015-06-22 Method for displaying information and displaying device thereof

Country Status (1)

Country Link
WO (1) WO2016003100A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100182232A1 (en) * 2009-01-22 2010-07-22 Alcatel-Lucent Usa Inc. Electronic Data Input System
KR20110101374A (en) * 2010-03-08 2011-09-16 이강수 Ubiquitous Remote Controller Using Eye Tracking Glasses
US20120256823A1 (en) * 2011-03-13 2012-10-11 Lg Electronics Inc. Transparent display apparatus and method for operating the same
US20130010207A1 (en) * 2011-07-04 2013-01-10 3Divi Gesture based interactive control of electronic equipment
US20140132511A1 (en) * 2012-11-14 2014-05-15 Electronics And Telecommunications Research Institute Control apparatus based on eyes and method for controlling device thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3071094A1 (en) * 2017-09-12 2019-03-15 Faurecia Interieur Industrie DISPLAY SYSTEM FOR A VEHICLE COCKPIT AND HABITACLE COMPRISING SUCH A SYSTEM
WO2022002685A1 (en) * 2020-07-03 2022-01-06 BSH Hausgeräte GmbH Dishwasher and method for operating a dishwasher

Similar Documents

Publication Publication Date Title
WO2018128475A1 (en) Augmented reality control of internet of things devices
US10796694B2 (en) Optimum control method based on multi-mode command of operation-voice, and electronic device to which same is applied
US10055094B2 (en) Method and apparatus for dynamically displaying device list
CN105138123B (en) Apparatus control method and device
US9760169B2 (en) Line-of-sight processing method, line-of-sight processing system and wearable device
WO2014200173A1 (en) Home appliance, mobile device, and control system for home appliance
WO2014042320A1 (en) Apparatus and method of providing user interface on head mounted display and head mounted display thereof
WO2019132564A1 (en) Method and system for classifying time-series data
WO2018155968A1 (en) System and method for automated personalization of an environment
KR20150108216A (en) Method for processing input and an electronic device thereof
CN106896920B (en) Virtual reality system, virtual reality equipment, virtual reality control device and method
WO2017176066A2 (en) Electronic apparatus and operating method thereof
CN113495617A (en) Method and device for controlling equipment, terminal equipment and storage medium
WO2013118987A1 (en) Control method and apparatus of electronic device using control device
CN105468144A (en) Intelligent device control method and apparatus
WO2019190076A1 (en) Eye tracking method and terminal for performing same
CN108700912A (en) Method and system for operating a device through augmented reality
WO2016003100A1 (en) Method for displaying information and displaying device thereof
US20140240226A1 (en) User Interface Apparatus
CN108153477B (en) Multi-touch operation method, mobile terminal and computer-readable storage medium
CN117858315A (en) Method and device for controlling intelligent lamp and storage medium
WO2015163742A1 (en) Device control method and apparatus in home network system
KR102358921B1 (en) Method for Displaying Information and Displaying Device Thereof
CN110109364A (en) Apparatus control method, device, video camera and storage medium
CN111290689A (en) Electronic equipment, main control device, control method and touch control sharing system thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15815678

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15815678

Country of ref document: EP

Kind code of ref document: A1