WO2022025921A1 - Change blindness detection via bio-analytics - Google Patents
- Publication number
- WO2022025921A1 (PCT/US2020/044502)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- scene
- subsystem
- cognitive load
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- Virtual reality systems utilize electronic displays to display a virtual scene to a user.
- Augmented reality systems utilize electronic displays that display virtual scene elements overlaid on real-world scenery.
- the real-world scenery may be captured via a camera and displayed via electronic display, together with the virtual scenery.
- the real-world scenery may be passively transmitted to the user with virtual scenery overlaid thereon.
- Mixed reality systems may combine virtual elements that can be occluded by real-world scenery.
- Extended reality is used to encompass virtual reality (VR), augmented reality (AR), and/or mixed reality (MR).
- Extended reality systems may utilize stereoscopic electronic display technologies to provide a more realistic depth perception of virtual elements and scenery.
- FIG. 1A illustrates a block diagram of an example system to detect change blindness using bio-analytic information from a physiological sensor.
- FIG. 1B illustrates a block diagram of an example system to provide a notification of change blindness based on information from a physiological sensor.
- FIG. 1C illustrates a block diagram of an example system to modify an XR scene when change blindness is detected.
- FIG. 1D illustrates a block diagram of an example system to detect change blindness using bio-analytic information from a physiological sensor and eye state information from an optics monitoring subsystem.
- FIG. 2 illustrates a flow chart of an example method for generating a notification that a user is experiencing change blindness using bio-analytic information from a physiological sensor.
- FIG. 3A illustrates a user operating an example virtual reality (VR) system in conjunction with a real-world object.
- FIG. 3B illustrates a user-view of virtual scenery displayed by the example VR system.
- FIG. 4A illustrates the user-view of the user’s virtual hand grabbing a left-most block.
- FIG. 4B illustrates the real-world movement of the user’s hand imperceptibly redirected to grab the real-world block in the center.
- FIG. 4C illustrates the user-view of the user’s virtual hand grabbing a right-most block.
- FIG. 4D illustrates the real-world movement of the user’s hand to grab the real-world block in the center.
- XR systems include, without limitation, virtual reality (VR) systems, augmented reality (AR) systems, mixed reality (MR) systems, and possible combinations and variations thereof.
- the XR system may detect change blindness to notify or trigger a subsystem to modify a virtual scene when the user is less likely to notice the change.
- the terms “imperceptible” and “unobtrusive” are used to describe modifications made when the user is experiencing change blindness.
- modifications to the scene can be made in a manner that is less disruptive and/or causes less interruption to the user experience than if the same modification were made when the user was not experiencing change blindness.
- An XR system may include an electronic display subsystem to display an XR scene to a user.
- an XR system may include headgear or glasses that include two distinct electronic displays to display unique images to a user as part of a three-dimensional (3D) virtual scene, such as a completely virtual scene or mixed reality virtual scene.
- the XR system may include a physiological sensor to measure a physiological condition of the user. Examples of physiological conditions that can be monitored via a physiological sensor include, but are not limited to, heart rate, change in heart rate, facial expressions, galvanic skin response, blood sugar levels, changes in blood sugar, brain wave activity, activity in specific portions of the brain, pace or gait of the user, skin coloration, facial flushing, breathing patterns, and the like.
- the XR system may alternatively (or additionally) include an optics monitoring system to monitor eye characteristics of the user, such as pupil dilation, creasing around the eye, squinting, eye movement, and the like.
- physiological sensors and physiological conditions encompass a large variety of possible non-eye related conditions, while terms such as optics monitoring sensors, eye conditions, and optics conditions are used to encompass eye-related conditions.
- the system may calculate a cognitive load of the user based on measured physiological conditions (e.g., changes in measured physiological conditions). Above certain thresholds of cognitive load, the user may experience change blindness. During periods of change blindness, the user may, for example, be hyper-focused on certain tasks to such an extent that changes can be made to the overall scene and/or to certain elements of the scene without disrupting or interrupting the user. In some instances, the system may determine that a modification or change to a scene is desirable or warranted and display elements to increase the cognitive load of a user to induce change blindness prior to making the scene change.
- the system may leverage change blindness to make imperceptible or unobtrusive changes to a virtual scene for any number of reasons.
- the system may leverage moments that the user is experiencing change blindness to manipulate psychomotor feedback to guide a user motion (e.g., walking, head movement, hand movement, finger movement, etc.) in the physical world.
- the system may leverage detected change blindness of a user to redirect walking or for haptic retargeting.
- the system may redirect walking when the user is experiencing change blindness to adjust the trajectory of the user in virtual space in an imperceptible or unobtrusive manner to redirect the user’s motion in physical space.
- a user may visualize walking in a straight or near-straight line in virtual space but, through imperceptible or unobtrusive scene modifications when the user is experiencing change blindness, actually traverse a curved or even circular path in physical space.
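The straight-virtual/curved-physical idea above can be sketched numerically. The disclosure does not give an algorithm; the sketch below assumes a simple per-step curvature gain applied to the physical heading only while change blindness is detected. The 2-degree gain, 0.7 m step length, and step count are illustrative values, not taken from this document.

```python
import math

def simulate_redirected_walking(steps, step_len, curvature_gain_deg, blind):
    """Map straight-line virtual walking onto a curved physical path.

    Each step advances the user in virtual space along +x; when the user
    is experiencing change blindness (blind[i] is True), the physical
    heading is additionally rotated by a small curvature gain before the
    corresponding physical step is taken.
    """
    virt_x, virt_y = 0.0, 0.0
    phys_x, phys_y = 0.0, 0.0
    phys_heading = 0.0  # radians; the virtual heading stays 0 (straight line)
    for i in range(steps):
        virt_x += step_len  # virtual path: straight ahead
        if blind[i]:
            phys_heading += math.radians(curvature_gain_deg)
        phys_x += step_len * math.cos(phys_heading)
        phys_y += step_len * math.sin(phys_heading)
    return (virt_x, virt_y), (phys_x, phys_y)

# Illustrative run: 180 steps of 0.7 m with a 2-degree gain on every step
# (change blindness assumed throughout) bends the physical path into a
# full circle while the virtual path remains straight.
virt, phys = simulate_redirected_walking(
    steps=180, step_len=0.7, curvature_gain_deg=2.0, blind=[True] * 180)
```

Under these assumed gains the user perceives walking 126 m straight ahead while physically walking a closed loop that returns near the starting point, keeping the real trajectory inside a bounded area.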
- the system may guide a portion of the user’s body in the physical world.
- the system may utilize a limited set of physical props or physical, real-world features for any number of virtual objects that appear to the user to be in different locations and in different scenes.
- the system may use eye-tracking sensors to detect blinking or saccades that may be indicative of change blindness.
- the system may then notify a third-party application or module that the user is experiencing change blindness.
- the third-party application or module may use the notification to modify the virtual scene as appropriate or helpful.
- the system may determine that the cognitive load value exceeds a change blindness threshold value, indicating that the user is experiencing change blindness.
- physiological sensors may detect physiological conditions of the user indicative of an elevated cognitive load that gives rise to change blindness.
- Either approach for detecting change blindness, or a combination thereof, may be used as part of a change blindness prediction engine that combines different signals to create a holistic and/or combined prediction of change blindness using any combination of sensor inputs and measurements.
- the system may determine that the user is experiencing change blindness based on a weighted analysis of the user’s heart rate, eye saccades, and breathing rhythm.
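The weighted analysis described above might be realized as a normalized weighted sum compared against a threshold. This is a minimal sketch: the signal names, weights, and the 0.7 threshold below are illustrative assumptions, not values from this disclosure.

```python
def change_blindness_score(signals, weights):
    """Combine normalized physiological signals into a single score.

    `signals` maps each signal name to a value normalized to [0, 1]
    (0 = resting baseline, 1 = the user's observed maximum).  The
    weighted sum is divided by the total weight so the score also
    falls in [0, 1].
    """
    total_weight = sum(weights.values())
    return sum(weights[name] * signals[name] for name in weights) / total_weight

def is_change_blind(signals, weights, threshold=0.7):
    """True when the weighted score crosses the change blindness threshold."""
    return change_blindness_score(signals, weights) >= threshold

# Illustrative weighting of heart rate, eye saccades, and breathing rhythm.
weights = {"heart_rate": 0.3, "saccade_rate": 0.5, "breathing_rhythm": 0.2}
focused = {"heart_rate": 0.8, "saccade_rate": 0.9, "breathing_rhythm": 0.6}
relaxed = {"heart_rate": 0.2, "saccade_rate": 0.1, "breathing_rhythm": 0.3}
```

The threshold could be personalized per user, as the disclosure suggests for the cognitive load comparison, rather than fixed.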
- the sensors, processing, analysis, electronic display subsystems, and other portions of the systems described herein may be discrete physical components, housed in a headset of an XR system, embodied as part of a laptop or other portable computer, integrated as part of a mobile phone or tablet, and/or integrated into other wearable technology and/or smart home devices.
- modules, systems, and subsystems are described herein as implementing functions and/or performing actions. In many instances, modules, systems, and subsystems may be divided into sub-modules, sub-systems, or sub-portions of subsystems. Modules, systems, and subsystems may be implemented in hardware, software, and/or combinations thereof.
- FIG. 1A illustrates a block diagram of an example system 100 to detect change blindness using bio-analytic information from a physiological sensor.
- the system 100 may include an electronic display subsystem 110 to display an XR scene to a user.
- the electronic display subsystem 110 may include a controller and stereoscopic displays to display images and/or video to the user.
- the system may also include, as described herein, a physiological sensor subsystem 120, a cognitive load estimation subsystem 130, and a change blindness detection subsystem 140.
- the physiological sensor subsystem 120 may include any number of physiological sensors and/or associated processing hardware.
- the physiological sensor subsystem 120 may include a heart rate monitor, a facial expression monitor, a visible light camera, an infrared camera, an electrodermal monitor, a blood sugar meter, a brain wave activity monitor, a speed monitor, a step counter, a pulse oximeter, a motion sensor, or the like.
- the physiological sensor subsystem 120 may monitor physiological conditions of a user, such as heart rate, changes in heart rate, facial expressions, galvanic skin response, blood sugar levels, changes in blood sugar, brain wave activity, activity in specific portions of the brain, pace or gait of the user, skin coloration, facial flushing, breathing patterns, and the like.
- a cognitive load estimation subsystem 130 may calculate a relative cognitive load, such as a cognitive load value of the user.
- the cognitive load estimation subsystem 130 may calculate the cognitive load value based on the measured physiological condition.
- the change blindness detection subsystem 140 may determine or detect that the user is experiencing change blindness based on the measured or calculated cognitive load value. For example, the change blindness detection subsystem 140 may compare the calculated cognitive load value with general or personalized threshold levels that cause or are anticipated to cause the user to experience change blindness.
- FIG. 1B illustrates a block diagram of an example XR system 100 to provide a notification of change blindness based on information from a physiological sensor.
- the illustrated example includes the electronic display subsystem 110, the physiological sensor subsystem 120, the cognitive load estimation subsystem 130, and the change blindness detection subsystem 140 as described in conjunction with FIG. 1A.
- a notification subsystem 150 may transmit a notification that the user is experiencing change blindness to a separate internal module or to an external or independent module.
- the notification subsystem 150 may transmit a notification that the user is experiencing change blindness to a scene modification subsystem.
- the XR system 100 may include a processor 190, memory 191, a non-transitory computer readable medium 192, and/or a data communication subsystem 193 (e.g., a bus or network communication device).
- the non-transitory computer readable medium 192 of the XR system 100 may include instructions stored thereon that, when executed by the processor, interact with, supplement, or implement the various functions and processes described in conjunction with the electronic display subsystem 110, the physiological sensor subsystem 120, the cognitive load estimation subsystem 130, and/or the change blindness detection subsystem 140.
- the non-transitory computer readable medium 192 may include instructions stored thereon that, when executed by the processor, cause the XR system 100 to display a scene (e.g., a virtual reality scene) via the electronic display subsystem 110.
- the instructions may further facilitate the XR system 100 in determining that a scene change is desired, would be beneficial, or is needed (generally referred to as “warranted”).
- the instructions may cause the XR system 100 to provoke an increase in the cognitive load of a user by displaying elements targeted to increase the user’s cognitive load.
- the instructions may cause the electronic display subsystem 110 to display a message for the user to read, flash a light, present a problem for the user to solve, or otherwise cause the user to focus attention for a period of time.
- the instructions may cause the XR system 100 to implement the scene change during the provoked increase in the cognitive load of the user.
- FIG. 1C illustrates a block diagram of an example XR system 100 to modify an XR scene when change blindness is detected.
- the illustrated example includes the electronic display subsystem 110, the physiological sensor subsystem 120, the cognitive load estimation subsystem 130, and the change blindness detection subsystem 140 as described in conjunction with FIG. 1A.
- the illustrated example XR system 100 further includes a scene modification subsystem 150 to modify the scene displayed via the XR system in response to the detection of change blindness or a transmitted (e.g., internally transmitted or externally transmitted) notification that the user is experiencing change blindness.
- FIG. 1D illustrates a block diagram of an example XR system to detect change blindness using bio-analytic information from a physiological sensor and eye state information from an optics monitoring subsystem.
- the illustrated example includes the electronic display subsystem 110, the physiological sensor subsystem 120, the cognitive load estimation subsystem 130, and the change blindness detection subsystem 140 as described in conjunction with FIG. 1A.
- the illustrated example XR system 100 further includes an optics monitoring subsystem 125 to monitor a characteristic of an eye of the user.
- the optics monitoring subsystem 125 may detect pupil dilation, changes in the rate of blinking, eye movement, focus direction or angle within a field of view, saccades, or the like.
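The saccades mentioned above are commonly detected from eye-tracking samples with a velocity threshold; a minimal sketch follows. The 300 deg/s threshold, 500 Hz sample interval, and the gaze trace are illustrative assumptions, not values taken from this disclosure.

```python
import math

def detect_saccades(gaze, dt, velocity_threshold=300.0):
    """Flag gaze samples whose angular velocity exceeds a threshold.

    `gaze` is a sequence of (azimuth, elevation) gaze angles in degrees,
    sampled every `dt` seconds.  Samples moving faster than the
    threshold (in deg/s) are flagged as part of a saccade; the first
    sample has no predecessor and is never flagged.
    """
    flags = [False]
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) / dt
        flags.append(velocity > velocity_threshold)
    return flags

# Illustrative 500 Hz trace: slow fixation jitter, a rapid 2-degree jump
# per sample (a saccade), then fixation again.
samples = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0),
           (2.2, 0.0), (4.2, 0.0), (4.3, 0.0)]
flags = detect_saccades(samples, dt=0.002)
```

Flags like these could feed the change blindness prediction engine alongside the physiological signals, since vision is suppressed during saccades and blinks.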
- FIG. 2 illustrates a flow chart of an example method 200 for generating a notification that a user is experiencing change blindness using bio-analytic information from a physiological sensor.
- An XR system, such as a VR system, AR system, or MR system, may measure, at 210, a physiological condition of a user via a physiological sensor subsystem.
- the physiological sensor subsystem may include any number of sensors, or a single sensor, to monitor non-eye related physiological conditions of the user viewing a scene, such as a virtual reality scene, an augmented reality scene, or a mixed reality scene.
- the system may estimate, at 220, a cognitive load value associated with the user based on the measured physiological condition(s).
- the system may use the estimated cognitive load value to determine, at 230, that the user is experiencing change blindness.
- the system may transmit, at 240, a notification that the user is experiencing change blindness to another portion of the XR system, third-party applications implemented on the XR system, and/or to an external system associated with or in communication with the XR system.
- the system may modify the scene displayed by the XR system while the user is experiencing change blindness, such that the modifications are imperceptible, unobtrusive, or otherwise non-disruptive to the user experience.
- the XR system may execute scene modifications for the purpose of haptic retargeting or redirected walking, as described herein.
- FIG. 3A illustrates a user 310 operating an example VR system 350 in conjunction with a real-world object 380.
- FIG. 3B illustrates a user-view of virtual scenery 351 displayed by the example VR system that includes virtual objects 381, 382, and 383 and a virtual hand 321.
- the VR system 350 may detect that the user is experiencing change blindness and use the opportunity to implement haptic retargeting by modifying the virtual scenery 351 such that the user’s real hand 320 interacts with the real object 380 while viewing the user’s virtual hand 321 interacting with the virtual object 383.
- FIG. 4A illustrates the user-view 451 of the virtual hand 421 grabbing a left-most virtual block 481.
- FIG. 4B illustrates the real-world movement of the real hand 420 of the user 410 imperceptibly redirected to grab the real-world block 480 in the center.
- the user’s real hand 420 can be haptically retargeted to interact with the real-world block 480 in the center despite the user 410 believing or perceiving to be interacting with a block in a spatial location corresponding to the perceived location of the left-most virtual block 481.
- FIG. 4C illustrates the user-view 451 of the user’s virtual hand 421 grabbing a right-most virtual block 483.
- FIG. 4D illustrates the real-world movement of the user’s hand 420 to grab the real-world block 480 in the center.
- the VR system 450 may modify displayed scenes for haptic retargeting such that the user 410 perceives to be interacting with the right-most virtual block 483 while, as before, actually interacting with the single, real block 480.
- a single, real-world block 480 is used for physical interactions by the user’s real hand 420 regardless of which of the virtual blocks 481, 482, and 483 is being touched by the user’s virtual hand 421.
- virtual scenes may be shifted or modified when a user is experiencing change blindness to adjust a physical or real trajectory (e.g., a walking trajectory, hand trajectory, arm trajectory, finger trajectory, head motion, etc.) without perceptibly modifying a virtual trajectory in a virtual scene.
- changes in a virtual scene may be made when a user is experiencing change blindness to give the user the perception of turning hard corners (e.g., walking through hallways or doorways) in a virtual scene while walking in circles or remaining within a bounded area in the real world (physical space).
- with haptic retargeting, a user may use physical tools while engaging in virtual interactions. Haptic retargeting when the user is experiencing change blindness may facilitate more realistic interactions. For example, a user may use real chopsticks while viewing a virtual scene with virtual chopsticks and virtual food items. The user may perceive the closure of the virtual chopsticks on the virtual food item in the virtual scene.
- the relative closure of the virtual chopsticks may be modified with respect to the real movement of the physical chopsticks such that the real ends of the chopsticks contact each other in the real world at the same time the virtual ends of the virtual chopsticks contact the virtual food item.
- Similar haptic retargeting techniques could be implemented to make the closed ends of real pliers contact each other at the same time virtual pliers close around a virtual bolt, such that the user experiences a realistic haptic sensation.
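One way to realize the haptic retargeting described above, where several virtual targets map onto one physical prop, is a linear body warp that blends an offset into the rendered hand as the real hand approaches the prop. The disclosure does not specify a warping scheme; the blend below and its coordinates are illustrative assumptions.

```python
def retarget_hand(real_hand, real_start, virtual_target, physical_target):
    """Warp the rendered hand so the real hand lands on the physical prop.

    As the real hand travels from `real_start` toward `physical_target`,
    the offset between the virtual and physical targets is blended in
    proportionally to the distance covered, so the rendered hand reaches
    the virtual object exactly when the real hand reaches the prop.
    Points are (x, y) tuples in metres.
    """
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    total = dist(real_start, physical_target)
    progress = min(1.0, dist(real_start, real_hand) / total) if total else 1.0
    offset_x = virtual_target[0] - physical_target[0]
    offset_y = virtual_target[1] - physical_target[1]
    return (real_hand[0] + progress * offset_x,
            real_hand[1] + progress * offset_y)

# Single real block at (0.0, 0.5); the user reaches for the left-most
# virtual block at (-0.3, 0.5) starting from (0.0, 0.0).
at_start = retarget_hand((0.0, 0.0), (0.0, 0.0), (-0.3, 0.5), (0.0, 0.5))
at_prop = retarget_hand((0.0, 0.5), (0.0, 0.0), (-0.3, 0.5), (0.0, 0.5))
```

At the start of the reach the rendered and real hands coincide; when the real hand arrives at the single physical block, the rendered hand arrives at the left-most virtual block, so the grasp is felt and seen at the same moment.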
Abstract
Systems and methods relate to extended reality (XR) systems, such as a virtual reality (VR) system, a mixed reality (MR) system, or an augmented reality (AR) system. Less obtrusive or imperceptible scene changes may be made while the user is experiencing change blindness. A physiological sensor may be used to measure a physiological condition of the user. A change blindness detection subsystem may determine that the user is experiencing change blindness based on a calculated cognitive load value exceeding a change blindness threshold value.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2020/044502 WO2022025921A1 (fr) | 2020-07-31 | 2020-07-31 | Change blindness detection via bio-analytics |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2020/044502 WO2022025921A1 (fr) | 2020-07-31 | 2020-07-31 | Change blindness detection via bio-analytics |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022025921A1 (fr) | 2022-02-03 |
Family
ID=80036034
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2020/044502 Ceased WO2022025921A1 (fr) | Change blindness detection via bio-analytics | 2020-07-31 | 2020-07-31 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022025921A1 (fr) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160274660A1 (en) * | 2014-05-09 | 2016-09-22 | Eyefluence, Inc. | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
| US20200103967A1 (en) * | 2018-09-28 | 2020-04-02 | Apple Inc. | Pupil Modulation As A Cognitive Control Signal |
Non-Patent Citations (1)
| Title |
|---|
| MARWECKI, SEBASTIAN; WILSON, ANDREW D.; OFEK, EYAL; GONZALE…: "Mise-Unseen: Using Eye Tracking to Hide Virtual Reality Scene Changes in Plain Sight", User Interface Software and Technology (UIST), ACM, 17-23 October 2019, pages 777-789, XP058479541, ISBN: 978-1-4503-6816-2, DOI: 10.1145/3332165.3347919 * |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20947493; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20947493; Country of ref document: EP; Kind code of ref document: A1 |