
WO2015057845A1 - Eye tracking system and method for developing content - Google Patents

Eye tracking system and method for developing content

Info

Publication number
WO2015057845A1
Authority
WO
WIPO (PCT)
Prior art keywords
content
user
developing
user according
representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2014/060706
Other languages
English (en)
Inventor
Jeff CLUNE
Hod Lipson
Jason Byron YOSINSKI
Nicholas CHENEY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cornell University
Original Assignee
Cornell University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cornell University filed Critical Cornell University
Publication of WO2015057845A1 publication Critical patent/WO2015057845A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements

Definitions

  • The invention relates to developing content in two dimensions (2-D) or three dimensions (3-D) using any automated design process. More specifically, the invention is a system and methods that enable a user to create an object using eye tracking technology, and that provide for the creation of objects in 3-D that can be printed using 3-D printing technology.
  • Automated design processes are known that adapt designs to user preferences.
  • One example of an automated design process is known as interactive evolution.
  • Interactive evolution uses human preference feedback to drive design, either of solutions to a particular problem or of open-ended creation where the only objective is aesthetic appeal.
  • Certain evolutionary algorithms present human users with potential solutions, and allow them to show a preference for things they like and discourage things they don't like.
  • The information provided by the user's feedback is used to create novel designs similar to previously preferred solutions, iteratively finding designs more and more preferential to the user.
  • Advances in eye-tracking-based interactive evolution may be driven by the interaction between several observations, for example: (1) unique stimuli (in their setup, unique rotational orientations) are likely to cause involuntary shifts in visual attention; (2) novel solutions may be a better driver of evolution than fitness (target-goal) based approaches, which often converge to local optima in complex problems; and (3) objects may vary in any attribute, such as shape, color, or complexity, from other objects on a display device (as well as from objects previously seen by the user).
  • CPPN: Compositional Pattern-Producing Network
  • NEAT: NeuroEvolution of Augmenting Topologies
  • Eye gaze and eye movements represent an inadvertent, subconscious stream of information.
  • Whereas traditional interactive evolution has made users explicitly choose from a number of potential choices, the invention does not require any active participation.
  • A user's preferences can be gathered based on where the user spends time looking.
  • A user's preferences can also be gathered based on the user's eye movement, for example, the eyes moving back and forth on the display device over a period of time.
  • The invention does not require users to perform any action (or even be aware that they are providing information).
  • The invention not only overcomes user fatigue, but also speeds up evaluations and increases evaluation quality by tracking where a user's eyes are looking to gauge their interest in content such as objects.
  • Herein, content is referred to specifically in the form of an object.
  • However, any form of content is contemplated, including any 2-D or 3-D tangible or intangible thing, design, or artifact.
  • Eye tracking technology can be used for direct design and free-form design of content.
  • Eye tracking technology is an attractive interface for interactive design, as it can also enable participation from new populations of users, such as those with physical disabilities, or those using devices which traditionally do not employ interactivity, such as televisions, but which could easily take advantage of passive or involuntary interactions.
  • Consumer devices such as computers and cell phones now include the capability to incorporate eye tracking, suggesting the possibility of including passive, preference-driven, customized design as part of everyday technological interactions.
  • The invention not only automates design; coupled with recent innovations in 3-D printing, such an effortless interface greatly increases the usefulness of in-home 3-D printers to the general public.
  • Eye-tracking enables an open-ended exploration of evolved designs.
  • The invention may incorporate other modes of detecting which object a user is interested in, such as a Kinect or similar device for determining the position of a user's head (which can improve the accuracy of eye tracking and indicate user interest in its own right), as well as brain-scanning technologies such as electroencephalogram (EEG) headsets, with which a user could control a mouse via mind control to indicate preferences, and from which a user's emotional state can be inferred to determine their feelings regarding displayed objects.
  • Natural, interesting objects are evolved and then printed on 3-D printers, enabling the objects to exist in the physical world.
  • Users may interactively evolve 3-D objects through fitness evaluations, such as those gleaned by eye-tracking technology, rather than by performing them manually.
  • The fitness of each object is a sum of the fractions of display time the user spends looking at that object as it is presented in a population, as sketched below.
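As a rough formalization of this fitness (the notation below is assumed for illustration, not taken from the patent), object $i$'s fitness over the display periods $g$ in which it appears is

```latex
f_i \;=\; \sum_{g} \frac{t_{i,g}}{T_g}
```

where $t_{i,g}$ is the time the user spends looking at object $i$ during display period $g$, and $T_g$ is the total display time of that period.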
  • The invention transforms interactive evolution from a tool that is rarely used due to user fatigue into one that allows the user to sit back and relax while content is morphed to match their preferences.
  • The future potential of eye tracking for interactive design is enormous, especially when one considers its potential for commercial use.
  • The ability to customize design without the need to formally describe it may open many doors for distributed design, which couples well with the increase in distributed fabrication as 3-D printers become commonplace appliances.
  • The invention may be embedded within existing programming, such as internet TV, so that users can create product designs as part of their traditional viewing process.
  • FIG. 1 illustrates a block diagram of an exemplary embodiment of the system according to the invention.
  • FIG. 2 illustrates a flow chart of an exemplary embodiment of the method according to the invention.
  • Eye tracking is the process of measuring either the point-of-regard (where one is looking) or the motion of an eye relative to the head. More specifically, the point-of-regard is a position in rendered content (also referred to as content representations) that the user is presumed to be viewing.
  • The dimensions of the point-of-regard may vary; for example, it may be a point (i.e., a line of sight), a range, or an area.
  • An exemplary system 100 according to the invention is shown in FIG. 1.
  • The exemplary system 100 as shown may be used to implement the methods according to the invention using one or more processor devices 108.
  • The system 100 includes a display device 102 and an eye tracking device 104 connected to a communication infrastructure 106, such as a bus, which forwards data from the communication infrastructure 106 to other components of the system 100.
  • The display device 102 may be, for example, a monitor, touch screen, or any other computer peripheral device, or any combination thereof, capable of entering and/or viewing data. It is also contemplated that the display device 102 may be a web-based interface accessible through the system 100. According to the invention, the system 100 may be a small-sized computer device including, for example, a personal digital assistant (PDA), smart hand-held computing device, cellular telephone, laptop or netbook computer, hand-held console or MP3 player, tablet, or similar hand-held computer device, such as an iPad®, iPod Touch® or iPhone®.
  • The eye tracking device 104 measures eye positions and eye movement. More specifically, the device 104 incorporates illumination, sensors, and processing to track eye movements and gaze point. The use of near-infrared light allows for accurate, continuous tracking regardless of surrounding light conditions. This technology is often referred to as pupil center corneal reflection (PCCR) eye tracking.
  • The eye tracking device 104 may be a remote eye tracker or a mobile eye tracker. It is contemplated that the eye tracking device 104 may be a camera, including a standard webcam.
  • The eye tracking device 104 can be calibrated using software in which the user focuses on a blue dot as it moves to a variety of different locations on the display device 102. More specifically, the eye tracking device 104 operates by shining infrared light into the eye of the user to create reflections that cause the pupil to appear as a bright, well-defined disc. The corneal reflection is also generated by the infrared light, appearing as a small but sharp glint outside of the pupil. The point being looked at by the user is then triangulated from the corneal reflection and the pupil center, as sketched below.
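A minimal sketch of PCCR gaze mapping under assumed interfaces (the patent describes the technique but not this code): the pupil-glint difference vector is mapped to screen coordinates by a polynomial whose coefficients are fit from the blue-dot calibration points. The function names here are illustrative assumptions.

```python
import numpy as np

def _features(pupil, glint):
    # Polynomial features of the pupil-glint vector (a common PCCR choice).
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return np.array([1.0, dx, dy, dx * dy, dx ** 2, dy ** 2])

def fit_calibration(pupils, glints, screen_points):
    # Least-squares fit of known screen (x, y) blue-dot positions against
    # the pupil-glint features measured while the user fixated each dot.
    X = np.array([_features(p, g) for p, g in zip(pupils, glints)])
    Y = np.array(screen_points)              # shape (n, 2)
    coeffs, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)
    return coeffs                            # shape (6, 2)

def estimate_gaze(pupil, glint, coeffs):
    # Map a new pupil/glint measurement to an on-screen point-of-regard.
    return _features(pupil, glint) @ coeffs
```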
  • The system 100 includes one or more processor devices 108, which may be a special purpose or a general-purpose digital signal processor device that processes certain information.
  • The system 100 also includes a main memory 110 and/or secondary memory 112.
  • Main memory 110 includes, for example, random access memory (RAM), read-only memory (ROM), a mass storage device, or any combination thereof.
  • Secondary memory 112 may include, for example, a hard disk unit, a removable storage unit, or any combination thereof.
  • Main memory 110 and secondary memory 112 may include databases 111 and 113, respectively.
  • The system 100 may also include a communication interface 114, for example, a modem, a network interface (such as an Ethernet card or Ethernet cable), a communication port, a PCMCIA slot and card, wired or wireless systems (such as Wi-Fi, Bluetooth, or Infrared), local area networks, wide area networks, intranets, etc.
  • Main memory 110 and/or secondary memory 112, including database 111 and database 113, or any combination thereof, function as a computer usable storage medium, otherwise referred to as a computer readable storage medium, to store and/or access computer software including computer instructions.
  • Computer programs or other instructions may be loaded into the system 100, such as through a removable storage device, for example, a ZIP disk, portable flash drive, optical disk such as a CD, DVD, or Blu-ray disc, or Micro-Electro-Mechanical Systems (MEMS) storage.
  • Computer programs, when executed, enable the system 100, particularly the processor device 108, to implement the methods of the invention according to computer software including instructions.
  • The system 100 may perform any one of, or any combination of, the steps of any of the methods according to the invention.
  • Communication interface 114 allows software, instructions and data to be transferred between the system 100 and external devices or external networks.
  • Software, instructions, and/or data transferred by the communication interface 114 are typically in the form of signals that may be electronic, electromagnetic, optical or other signals capable of being sent and received by the communication interface 114. Signals may be sent and received using wire or cable, fiber optics, a phone line, a cellular phone link, a Radio Frequency (RF) link, wireless link, or other communication channels.
  • The system 100 of FIG. 1 is provided only for purposes of illustration, such that the invention is not limited to this specific embodiment. It is appreciated that a person skilled in the relevant art knows how to program and implement the invention using any computer system or network architecture.
  • FIG. 2 is a flowchart 200 according to one embodiment of a method for developing content by a user.
  • The content the user desires to develop can be created using directed design methods or free-form design methods.
  • Directed design methods are those used by the user to develop an object pre-selected by the system, i.e., a target object.
  • Content can also be created without requiring a target object.
  • Free-form design methods are those used by the user to develop any object the user desires, i.e., a creative object.
  • A user's trace of the point-of-regard is processed by the system using a function (e.g., stochastic or machine learning) to produce a representation of the one or more preferences of the user. The user can simply sit back and look at the display device while the content changes.
  • The method is implemented with a user positioned in front of the system 100, specifically the display device 102, as described in reference to FIG. 1. For example, a user may be placed approximately 27 inches in front of the display device.
  • A plurality of content representations, or objects, is presented to the user on the display device.
  • Each content representation is displayed according to an attribute type. More specifically, each content representation is displayed according to an attribute value for that attribute type.
  • Attribute types include, for example: size, color, and shape.
  • For the attribute type size, the attribute values may include, for example: small, large.
  • For the attribute type color, the attribute values may include, for example: red, blue, green.
  • For the attribute type shape, the attribute values may include, for example: cone, oval, rectangle.
  • Any attribute type (e.g., texture, length, composition) and any attribute values (e.g., medium, purple, square) can be used according to the invention.
  • the invention is applicable to multi-part objects such as faces.
  • The plurality of content representations is shown to the user in the structure of an array, such as a 3 x 5 array, of 3-D objects.
  • Each content representation is shown rotating around its vertical axis.
  • An eye tracking device 104 calculates the point-of-regard data over a specified period of time for each content representation. If, for example, the eye tracking device 104 loses the signal because the user's pupils leave its capture range, or the point-of-regard appears off the display device, the system 100 pauses, resuming only upon the return of a valid, on-screen signal (e.g., when the eye tracking device recaptures the user's pupils). A sampling loop of this kind is sketched below.
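A minimal sketch of this sampling behavior under assumed interfaces (tracker.sample() and hit_test() are illustrative, not the patent's code): looking time is credited to whichever displayed object contains the gaze point, and no time is credited while the signal is invalid or off-screen, which implements the pause-and-resume behavior.

```python
import time

def on_screen(gaze, width=1920, height=1080):
    # True if the gaze point falls on the display; resolution is assumed.
    return 0 <= gaze[0] < width and 0 <= gaze[1] < height

def collect_gaze_time(tracker, object_ids, duration_s, hit_test):
    looked_ms = {obj_id: 0.0 for obj_id in object_ids}
    last = time.monotonic()
    end = last + duration_s
    while time.monotonic() < end:
        gaze = tracker.sample()               # None when the pupils are lost
        now = time.monotonic()
        if gaze is not None and on_screen(gaze):
            obj_id = hit_test(object_ids, gaze)  # which array cell, if any
            if obj_id is not None:
                looked_ms[obj_id] += (now - last) * 1000.0
        last = now  # invalid or off-screen samples accrue no time (pause)
    return looked_ms
```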
  • The processor device 108 (FIG. 1) records the point-of-regard data for each content representation.
  • Point-of-regard data is recorded only within the array of three-dimensional objects.
  • The processor device accumulates the point-of-regard data for each content representation to obtain accumulated data for each content representation at step 208.
  • The accumulated point-of-regard data is taken as a proxy for the user's affinity for the one or more content representations.
  • In one embodiment, data is accumulated using a non-weighted sum.
  • In another embodiment, data is accumulated using a weighted sum, for example, to discount any early attention paid to a surprising or different, yet ultimately uninteresting, object.
  • The processor device compares the accumulated data for each content representation to a pre-determined threshold value.
  • In one embodiment, the pre-determined threshold value is one second; however, any value is contemplated.
  • The processor device determines which content representations' accumulated data exceeds the pre-determined threshold value, obtaining one or more favored content representations. Of those, the content representation whose accumulated data exceeds the pre-determined threshold value by the greatest amount is selected at step 214. This accumulate-threshold-select logic is sketched below.
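A minimal sketch of steps 208-214 under assumed data shapes (per-refresh gaze samples in milliseconds for each content representation); the threshold and the example weighting are illustrative, not mandated by the patent.

```python
THRESHOLD_MS = 1000.0  # one second, per the exemplary embodiment

def accumulate(samples_ms, weights=None):
    # Non-weighted sum by default; a weighted sum can discount early glances
    # at surprising but ultimately uninteresting objects.
    if weights is None:
        return sum(samples_ms)
    return sum(w * s for w, s in zip(weights, samples_ms))

def select_favored(accumulated_ms):
    # Keep representations above the threshold (the favored set), then pick
    # the one exceeding the threshold by the greatest amount.
    favored = {k: v for k, v in accumulated_ms.items() if v > THRESHOLD_MS}
    if not favored:
        return None, favored
    return max(favored, key=favored.get), favored
```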
  • The selected content representation is illustrated on the display device at step 216. In one embodiment, the selected content representation is highlighted to designate to the user that it is the content representation the user selected. It is contemplated that the user may communicate to the system 100 (FIG. 1) whether or not the selected content representation illustrated on the display device is correct.
  • A CPPN is a way to encode designs in the same way nature encodes its designs (e.g., overlapping chemical gradients during the embryonic development of animals such as jaguars, hawks, or dolphins).
  • A CPPN is similar to a neural network, but its nodes contain one of multiple math functions, for example: sine, sigmoid, Gaussian, or linear.
  • Each voxel has x, y, and z coordinates that are input into the network, along with the voxel's distance from center, d.
  • An output of the network, queried at each geometric-coordinate location, specifies whether any material is present at that location.
  • The remaining output nodes are queried once (at the center point, d) and specify the red, green, blue (RGB) values that comprise the object's color.
  • CPPN-NEAT iteratively queries each voxel within a specified bounding area and produces output values as a function of the coordinates of that voxel. These outputs determine the shape and color of an object.
  • The voxel shape is then smoothed, for example, with a Marching Cubes algorithm to produce the final object. A sketch of this voxel-query loop follows.
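A minimal sketch of the CPPN voxel-query loop, assuming a cppn(x, y, z, d) callable that returns named outputs; the authors' encoding details may differ from this illustration.

```python
import math

def render_voxels(cppn, resolution=32):
    half = resolution / 2.0
    filled = set()
    for ix in range(resolution):
        for iy in range(resolution):
            for iz in range(resolution):
                # Normalize coordinates to [-1, 1] and compute the
                # distance-from-center input d.
                x = (ix - half) / half
                y = (iy - half) / half
                z = (iz - half) / half
                d = math.sqrt(x * x + y * y + z * z)
                if cppn(x, y, z, d)["material"] > 0.0:  # presence threshold
                    filled.add((ix, iy, iz))
    # Color outputs are queried once, at the center point.
    center = cppn(0.0, 0.0, 0.0, 0.0)
    rgb = (center["r"], center["g"], center["b"])
    return filled, rgb  # the voxel shape can then be smoothed (Marching Cubes)
```

Because the CPPN itself is a mathematical function of the coordinates, the same network can be re-queried at a higher resolution for printing, as the next bullet notes.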
  • Although the CPPN-NEAT network is queried at some finite resolution, it actually specifies a mathematical representation of the shape; thus, critically for high quality 3-D printing, it can be queried at arbitrarily high resolution.
  • The CPPN evolves according to the evolutionary algorithm NEAT.
  • A population size of 15 is used, such that the entire population is displayed to the user at each generation in a 3 x 5 array.
  • When the eye tracking device records the user's point-of-regard within the 3 x 5 array cell corresponding to a given object, that object gains the clock time since the last refresh loop of the algorithm (this value is typically a small fraction of a second).
  • This process lasts until one content representation accumulates one second (1000 milliseconds) of being looked at by the user.
  • At that point the generation ceases, and each object is assigned a fitness equal to the time it was looked at during that generation (in milliseconds).
  • The top object at each generation would thus have a fitness of 1000, while all other objects have a fitness between 1 (the minimum baseline fitness) and 999, depending on the time the user spent looking at each of the content representations during the given generation. This generation loop is sketched below.
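A minimal sketch of the generation loop just described (assumed interfaces, not the authors' code): objects accrue looked-at time each refresh until one reaches 1000 ms, and looked-at milliseconds then become NEAT fitness values, clamped between the baseline of 1 and the cap of 1000.

```python
GENERATION_CAP_MS = 1000.0
BASELINE_FITNESS = 1.0

def run_generation(sample_refresh, population_ids):
    # sample_refresh() -> (object_id or None, elapsed_ms) per refresh loop;
    # object_id is None when the gaze misses every 3 x 5 array cell.
    looked_ms = {obj_id: 0.0 for obj_id in population_ids}
    while max(looked_ms.values()) < GENERATION_CAP_MS:
        obj_id, elapsed_ms = sample_refresh()
        if obj_id is not None:
            looked_ms[obj_id] += elapsed_ms
    # Looked-at milliseconds become fitness, floored at the baseline and
    # capped so the top object scores exactly 1000.
    return {obj_id: min(GENERATION_CAP_MS, max(BASELINE_FITNESS, ms))
            for obj_id, ms in looked_ms.items()}
```

These fitness values would then be handed to NEAT for selection and reproduction in the next generation.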
  • The invention allows hands-free design by passively gathering user feedback via eye tracking technology, which accurately infers which object the user is looking at and can use that information to direct successful design sessions. Users can successfully design objects, both target and creative, and can develop interesting, novel shapes without touching a keyboard or mouse.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Eye tracking technology enables hands-free development of content, such as objects, by accurately inferring what a user is looking at. The content can be created in any form, including three dimensions (3-D). Using CPPN-NEAT to encode and evolve the object enables it to be printed using 3-D printing technology.
PCT/US2014/060706 2013-10-18 2014-10-15 Eye tracking system and method for developing content Ceased WO2015057845A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361892945P 2013-10-18 2013-10-18
US61/892,945 2013-10-18

Publications (1)

Publication Number Publication Date
WO2015057845A1 true WO2015057845A1 (fr) 2015-04-23

Family

ID=52828654

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/060706 Ceased WO2015057845A1 (fr) 2013-10-18 2014-10-15 Eye tracking system and method for developing content

Country Status (1)

Country Link
WO (1) WO2015057845A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3438810A1 (fr) * 2017-08-04 2019-02-06 XYZprinting, Inc. 3D printer and 3D printing method


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050024586A1 (en) * 2001-02-09 2005-02-03 Sensomotoric Instruments Gmbh Multidimensional eye tracking and position measurement system for diagnosis and treatment of the eye
US20050073136A1 (en) * 2002-10-15 2005-04-07 Volvo Technology Corporation Method and arrangement for interpreting a subjects head and eye activity
US20050175218A1 (en) * 2003-11-14 2005-08-11 Roel Vertegaal Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
US20060121436A1 (en) * 2004-12-03 2006-06-08 Elaine Kruse Graphical workspace for idea management
US20100283843A1 (en) * 2007-07-17 2010-11-11 Yang Cai Multiple resolution video network with eye tracking based control
WO2009022924A1 (fr) * 2007-08-15 2009-02-19 William Bryan Woodard Image generation system
US20110128223A1 (en) * 2008-08-07 2011-06-02 Koninklijke Phillips Electronics N.V. Method of and system for determining a head-motion/gaze relationship for a user, and an interactive display system
US20100323775A1 (en) * 2009-06-22 2010-12-23 University Of Central Florida Research Foundation, Inc. Systems and Methods for Evolving Content for Computer Games
US20120272179A1 (en) * 2011-04-21 2012-10-25 Sony Computer Entertainment Inc. Gaze-Assisted Computer Interface
US20120290517A1 (en) * 2011-05-11 2012-11-15 Affectivon Ltd. Predictor of affective response baseline values
US20130103624A1 (en) * 2011-10-20 2013-04-25 Gil Thieberger Method and system for estimating response to token instance of interest

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3438810A1 (fr) * 2017-08-04 2019-02-06 XYZprinting, Inc. 3D printer and 3D printing method
CN109383029A (zh) * 2017-08-04 2019-02-26 XYZprinting, Inc. Three-dimensional printing apparatus and three-dimensional printing method
US10632682B2 (en) 2017-08-04 2020-04-28 Xyzprinting, Inc. Three-dimensional printing apparatus and three-dimensional printing method

Similar Documents

Publication Publication Date Title
US11656677B2 (en) Planar waveguide apparatus with diffraction element(s) and system employing same
CN113383295B (zh) Biofeedback method of modulating digital content to invoke greater pupil radius response
US9671566B2 (en) Planar waveguide apparatus with diffraction element(s) and system employing same
KR20220080030A (ko) Contextual awareness of user interface menus
KR20210153151A (ko) Head-mounted display system configured to exchange biometric information
WO2015006784A2 (fr) Planar waveguide apparatus with diffraction element(s) and system employing same
US10719193B2 (en) Augmenting search with three-dimensional representations
KR20140011204A (ko) Method for providing content and display apparatus applying the same
KR20190066428A (ko) Apparatus and method for generating a machine-learning-based cybersickness prediction model for virtual reality content and adjusting its quantification
CN107562186A (zh) 3D campus guide method performing affective computing based on attention recognition
WO2015057845A1 (fr) Eye tracking system and method for developing content
US20250298642A1 (en) Command recommendation system and user interface element generator, and methods of use thereof
US20250278902A1 (en) Techniques for coordinating artificial-reality interactions using augmented-reality interaction guides for performing interactions with physical objects within a user's physical surroundings, and systems and methods for using such techniques
Cheney et al. Hands-free evolution of 3d-printable objects via eye tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14853764

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14853764

Country of ref document: EP

Kind code of ref document: A1