WO2018148076A1 - System and method for automated positioning of augmented reality content - Google Patents
System and method for automated positioning of augmented reality content
- Publication number
- WO2018148076A1 (PCT/US2018/016197)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- display
- render
- display device
- hmd
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
Definitions
- the AR device used for viewing the AR content is embedded with sensors capable of producing depth information from the environment.
- a sensor, or combination of sensors, may include RGB-D cameras, stereo cameras, infrared cameras, lidar, radar, sonar, and any other sort of sensor known to those skilled in the art of image and depth sensing. Combinations of sensor types and enhanced processing methods may be employed for depth detection.
- sensors collect point cloud data from the environment as the user moves the device and sensors through the environment. Sensor observations with varying points of view are combined to form a coherent 3D reconstruction of the complete environment.
- the AR device proceeds to send the reconstructed model to the server along with a request for AR content.
- the level of completeness may be measured, for example, as the percentage of the surrounding area covered, the number of discrete observations, the duration of sensing, or any similar quality value that may be compared against a threshold.
- 3D reconstruction of the environment may be carried out with any known reconstruction method, such as ones featured in KinectFusion or Point Cloud Library (PCL).
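As a rough illustration of the completeness check described above, the sketch below accumulates per-frame point clouds and triggers the AR content request once a simple grid-coverage metric passes a threshold. The coverage metric, the threshold value, and the helper names are assumptions for illustration only; a real implementation would rely on a reconstruction pipeline such as KinectFusion or PCL, as noted above.

```python
import numpy as np

def coverage_ratio(points, bounds_min, bounds_max, cell=0.25):
    """Fraction of horizontal grid cells (cell x cell metres) of the room
    footprint containing at least one reconstructed point; a crude proxy
    for the 'level of completeness' mentioned above."""
    bounds_min = np.asarray(bounds_min, dtype=float)
    bounds_max = np.asarray(bounds_max, dtype=float)
    nx = int(np.ceil((bounds_max[0] - bounds_min[0]) / cell))
    ny = int(np.ceil((bounds_max[1] - bounds_min[1]) / cell))
    occupied = np.zeros((nx, ny), dtype=bool)
    idx = np.floor((points[:, :2] - bounds_min[:2]) / cell).astype(int)
    keep = (idx[:, 0] >= 0) & (idx[:, 0] < nx) & (idx[:, 1] >= 0) & (idx[:, 1] < ny)
    idx = idx[keep]
    occupied[idx[:, 0], idx[:, 1]] = True
    return float(occupied.mean())

COVERAGE_THRESHOLD = 0.8  # assumed value; the text leaves the threshold open
accumulated_frames = []

def on_new_frame(frame_points, room_min, room_max, send_request):
    """Accumulate world-frame points from each sensor observation and send the
    reconstructed model plus an AR content request once coverage is sufficient."""
    accumulated_frames.append(frame_points)
    cloud = np.vstack(accumulated_frames)
    if coverage_ratio(cloud, room_min, room_max) >= COVERAGE_THRESHOLD:
        send_request(cloud)
```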
- at the beginning of an AR viewing session, the AR device starts continuously streaming RGB-D data to the server.
- the server performs the 3D reconstruction process using the received RGB-D data stream and stores the reconstructed environment model.
- as the server constructs the per-client environment model, it also begins to filter the available AR content by removing content that is not preferable given that client's environment model.
- the content selection processing becomes more efficient.
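A minimal sketch of the server-side filtering step described in the preceding items, assuming the per-client environment model is summarized into a few illustrative fields; the field, type, and content-type names are not from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class EnvironmentModel:
    # Hypothetical summary of a per-client reconstructed environment.
    free_wall_area_m2: float
    free_floor_area_m2: float
    supports_3d_placement: bool

@dataclass
class ARContentItem:
    content_id: str
    content_type: str           # e.g. "2d_planar", "3d_virtual", "360_immersive"
    min_wall_area_m2: float = 0.0
    min_floor_area_m2: float = 0.0

def filter_content(catalog: List[ARContentItem], env: EnvironmentModel):
    """Remove content that is not preferable given the client's environment,
    mirroring the server-side filtering step described above (illustrative logic)."""
    keep = []
    for item in catalog:
        if item.content_type == "3d_virtual" and not env.supports_3d_placement:
            continue
        if item.min_wall_area_m2 > env.free_wall_area_m2:
            continue
        if item.min_floor_area_m2 > env.free_floor_area_m2:
            continue
        keep.append(item)
    return keep
```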
- Another embodiment takes the form of a system that includes a communication interface, a processor, and data storage containing instructions executable by the processor for causing the system to carry out at least the functions described in the preceding paragraph.
- AR content comprises 3D virtual content. This includes, but is not limited to, virtual models of objects associated with the primary media, such as a racecar for an F1 event or a solar system for a Neil deGrasse Tyson show. User preferences may be used to look up which racer is the user's favorite; the system may then provide the 3D model of that racer's car. If the primary media is footage from a security camera, then the AR content may be a 3D virtual model of the secure building.
- An exemplary process described herein comprises analyzing the real-world environment to measure visual characteristics.
- this includes hardware components such as sensors as well as software components such as object classifiers working together.
- the form of analysis executed and the visual characteristics measured vary, as a direct result of the various optimizations that may be leveraged based on detectable differences in use-case scenarios.
- the analysis of the real-world environment may be carried out by the AR headset, an external sensor, an external computing device, or a combination thereof. For example, the analysis may not search for surfaces suitable for rendering virtual 3D content if the available AR content does not include any virtual 3D content types.
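The item above suggests skipping analyses that no available content type needs. Below is a small sketch of that idea using crude stand-in measurements (mean colour, mean brightness, and a flat-region proxy derived from depth); the metrics and names are illustrative assumptions, not the patent's method.

```python
import numpy as np

def analyze_environment(rgb_frame: np.ndarray, depth_frame: np.ndarray,
                        available_content_types: set) -> dict:
    """Measure only the visual characteristics needed for the content types
    on offer (a sketch; a real system would use proper CV components)."""
    characteristics = {
        # mean colour and brightness of the view as crude stand-ins
        "mean_color": rgb_frame.reshape(-1, 3).mean(axis=0),
        "mean_brightness": float(rgb_frame.mean()),
    }
    # Surface search is comparatively expensive, so it is skipped when no
    # virtual 3D content could be placed anyway (per the item above).
    if "3d_virtual" in available_content_types:
        # crude proxy: fraction of pixels whose depth is close to the median,
        # hinting at a large flat region in view
        median_d = np.median(depth_frame)
        characteristics["flat_region_fraction"] = float(
            np.mean(np.abs(depth_frame - median_d) < 0.05))
    return characteristics
```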
- generating AR content render parameters comprises comparing each AR content in the selection with colors around the display to avoid render locations with poor contrast.
- One example is not rendering a Christmas tree over a green wall.
- generating AR content render parameters comprises comparing each AR content in the selection with lighting conditions around the display to avoid render locations with poor contrast.
- One example is not rendering black text over a dark wall.
- generating AR content render parameters comprises comparing each AR content in the selection with a visual complexity around the display to avoid visually complex render locations.
- generating AR content render parameters comprises comparing each AR content in the selection with textures around the display to avoid render locations with poor textures (e.g., stone or brick walls and curtains).
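The contrast-related checks in the items above could be approximated with a simple luminance-based contrast ratio, as sketched below; the minimum-contrast threshold and the shape of `candidate_regions` (pairs of a region identifier and its background colour) are assumptions for illustration.

```python
import numpy as np

def relative_luminance(rgb):
    """Approximate relative luminance of an RGB colour with channels in [0, 255]."""
    r, g, b = np.asarray(rgb, dtype=float) / 255.0
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb_a, rgb_b):
    """WCAG-style contrast ratio between two colours (1.0 = no contrast)."""
    la, lb = relative_luminance(rgb_a), relative_luminance(rgb_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

def acceptable_locations(content_color, candidate_regions, min_contrast=3.0):
    """Keep candidate render locations whose background colour contrasts
    sufficiently with the content (e.g. rejects black text over a dark wall)."""
    return [region for region, bg_color in candidate_regions
            if contrast_ratio(content_color, bg_color) >= min_contrast]
```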
- generating AR content render parameters comprises (i) virtually testing the available AR content in a plurality of potential render locations with a plurality of potential render styles, (ii) generating AR content-location-style compatibility scores, and (iii) generating the AR content render parameters based on those compatibility scores, as sketched below.
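A compact sketch of steps (i)-(iii) above: every content/location/style combination is tested virtually via a caller-supplied scoring function, and the highest-scoring placement per content item becomes its render parameters. The data shapes and the idea of delegating the score to a callable are assumptions for illustration.

```python
from itertools import product
from typing import Callable, Dict, List, Tuple

def choose_render_parameters(
    contents: List[str],
    locations: List[str],
    styles: List[str],
    score: Callable[[str, str, str], float],
) -> Dict[str, Tuple[str, str, float]]:
    """Virtually test every content/location/style combination (step i),
    score each one (step ii), and keep the best placement per content item
    as its render parameters (step iii)."""
    params = {}
    for content in contents:
        loc, sty = max(product(locations, styles),
                       key=lambda ls: score(content, ls[0], ls[1]))
        params[content] = (loc, sty, score(content, loc, sty))
    return params
```

The `score` callable could, for example, combine the contrast, lighting, complexity, and texture checks sketched earlier into a single compatibility value.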
- Output of AR content per the render parameters 306 may comprise output for side-stream content, output for 2D planar content, output for 3D virtual content, and output for 360-degree immersive content.
- Content streaming and viewing 532 may commence.
- the AR content server 506 optimizes 524 the requested AR content by removing content that is not preferable for the present viewing conditions and viewing hardware.
- an optimized content stream 534 is sent from the AR content server 506 to the AR viewer client 504.
- Display content 536 may be displayed to the user 502 by the AR viewer client 504.
- FIG. 8 is a depiction of an example real-world environment 800 comprising a display 802 depicting a primary media content, in accordance with at least one embodiment.
- the real-world environment 800 is the inside of a room.
- the room is the user's chosen viewing location for watching TV supplemented with AR content.
- the room includes a TV display 802 depicting a soccer match, two blocks 804, 806 on the floor, and a window 808.
- Behind the left side of the display is a brick wall 810 and behind the right side of the display is a wall clock 812 mounted near the ceiling.
- FIG. 8 is a reference image for use with the subsequent descriptions of FIGs. 9-14.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
In one embodiment, the present invention relates to systems and methods that generate and display augmented reality (AR) content for a real-world environment in which a separate display is detected by a head-mounted display (HMD). AR content may be selected based on media content identified on the separate display. AR content may be displayed at locations near the separate display that are selected based on visual characteristics of those locations. AR content may be displayed with render parameters that increase the visibility of the AR content. An embodiment may track the position and orientation of the HMD and may select a location for displaying the AR content based on the position and orientation of the HMD. An embodiment may display virtual connectors between AR content and objects identified in the media content of the separate display. AR content may be displayed at locations that minimize intersections of the virtual connectors.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762457442P | 2017-02-10 | 2017-02-10 | |
| US62/457,442 | 2017-02-10 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018148076A1 (fr) | 2018-08-16 |
Family
ID=61244696
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2018/016197 (WO2018148076A1, ceased) | System and method for automated positioning of augmented reality content | 2017-02-10 | 2018-01-31 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018148076A1 (fr) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111986276A (zh) * | 2019-08-29 | 2020-11-24 | 芋头科技(杭州)有限公司 | Content generation in a visual enhancement device |
| CN112734941A (zh) * | 2021-01-27 | 2021-04-30 | 深圳迪乐普智能科技有限公司 | Method and apparatus for modifying attributes of AR content, computer device, and storage medium |
| US20210390765A1 (en) * | 2020-06-15 | 2021-12-16 | Nokia Technologies Oy | Output of virtual content |
| US20220230396A1 (en) * | 2021-01-15 | 2022-07-21 | Arm Limited | Augmented reality system |
| US20220237913A1 (en) * | 2019-05-22 | 2022-07-28 | Pcms Holdings, Inc. | Method for rendering of augmented reality content in combination with external display |
| EP4312108A1 (fr) * | 2022-07-25 | 2024-01-31 | Sony Interactive Entertainment Europe Limited | Identification device in a mixed reality environment |
| US12179091B2 (en) | 2019-08-22 | 2024-12-31 | NantG Mobile, LLC | Virtual and real-world content creation, apparatus, systems, and methods |
| CN119883172A (zh) * | 2025-03-27 | 2025-04-25 | 深圳市嘉润原新显科技有限公司 | Multi-screen display collaborative control method and system |
| JP2025079565A (ja) * | 2023-11-10 | 2025-05-22 | 円谷フィールズホールディングス株式会社 | Information processing device, head-mounted display, and program |
-
2018
- 2018-01-31 WO PCT/US2018/016197 patent/WO2018148076A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140132484A1 (en) * | 2012-11-13 | 2014-05-15 | Qualcomm Incorporated | Modifying virtual object display properties to increase power performance of augmented reality devices |
| US20140168262A1 (en) * | 2012-12-18 | 2014-06-19 | Qualcomm Incorporated | User Interface for Augmented Reality Enabled Devices |
| US20160147492A1 (en) * | 2014-11-26 | 2016-05-26 | Sunny James Fugate | Augmented Reality Cross-Domain Solution for Physically Disconnected Security Domains |
| EP3096517A1 (fr) * | 2015-05-22 | 2016-11-23 | TP Vision Holding B.V. | Wearable smart glasses |
Cited By (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11995578B2 (en) | 2019-05-22 | 2024-05-28 | Interdigital Vc Holdings, Inc. | Method for rendering of augmented reality content in combination with external display |
| US11727321B2 (en) * | 2019-05-22 | 2023-08-15 | InterDigital VC Holdings Inc. | Method for rendering of augmented reality content in combination with external display |
| US20220237913A1 (en) * | 2019-05-22 | 2022-07-28 | Pcms Holdings, Inc. | Method for rendering of augmented reality content in combination with external display |
| US12179091B2 (en) | 2019-08-22 | 2024-12-31 | NantG Mobile, LLC | Virtual and real-world content creation, apparatus, systems, and methods |
| CN111986276A (zh) * | 2019-08-29 | 2020-11-24 | 芋头科技(杭州)有限公司 | Content generation in a visual enhancement device |
| US11636644B2 (en) * | 2020-06-15 | 2023-04-25 | Nokia Technologies Oy | Output of virtual content |
| US20210390765A1 (en) * | 2020-06-15 | 2021-12-16 | Nokia Technologies Oy | Output of virtual content |
| US11544910B2 (en) * | 2021-01-15 | 2023-01-03 | Arm Limited | System and method for positioning image elements in augmented reality system |
| US20220230396A1 (en) * | 2021-01-15 | 2022-07-21 | Arm Limited | Augmented reality system |
| CN112734941A (zh) * | 2021-01-27 | 2021-04-30 | 深圳迪乐普智能科技有限公司 | Method and apparatus for modifying attributes of AR content, computer device, and storage medium |
| EP4312108A1 (fr) * | 2022-07-25 | 2024-01-31 | Sony Interactive Entertainment Europe Limited | Identification device in a mixed reality environment |
| JP2025079565A (ja) * | 2023-11-10 | 2025-05-22 | 円谷フィールズホールディングス株式会社 | Information processing device, head-mounted display, and program |
| JP7689173B2 (ja) | 2023-11-10 | 2025-06-05 | 円谷フィールズホールディングス株式会社 | Information processing device, head-mounted display, and program |
| CN119883172A (zh) * | 2025-03-27 | 2025-04-25 | 深圳市嘉润原新显科技有限公司 | Multi-screen display collaborative control method and system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018148076A1 (fr) | System and method for automated positioning of augmented reality content | |
| US20250036194A1 (en) | Virtual 3d methods, systems and software | |
| US11210838B2 (en) | Fusing, texturing, and rendering views of dynamic three-dimensional models | |
| US9460351B2 (en) | Image processing apparatus and method using smart glass | |
| CN108475180B (zh) | 在多个显示区域之间分布视频 | |
| US20180189550A1 (en) | Facial signature methods, systems and software | |
| US10313657B2 (en) | Depth map generation apparatus, method and non-transitory computer-readable medium therefor | |
| WO2015192585A1 (fr) | Procédé et appareil de lecture d'une publicité dans une vidéo | |
| US20120287233A1 (en) | Personalizing 3dtv viewing experience | |
| CN110249291A (zh) | 用于在预捕获环境中的增强现实内容递送的系统和方法 | |
| US20230152883A1 (en) | Scene processing for holographic displays | |
| US20120068996A1 (en) | Safe mode transition in 3d content rendering | |
| US10453244B2 (en) | Multi-layer UV map based texture rendering for free-running FVV applications | |
| KR20140082610A (ko) | 휴대용 단말을 이용한 증강현실 전시 콘텐츠 재생 방법 및 장치 | |
| CN110730340B (zh) | 基于镜头变换的虚拟观众席展示方法、系统及存储介质 | |
| US20230122149A1 (en) | Asymmetric communication system with viewer position indications | |
| CN108076359B (zh) | 业务对象的展示方法、装置和电子设备 | |
| US12231702B2 (en) | Inserting digital contents into a multi-view video | |
| WO2020193703A1 (fr) | Techniques de détection d'occlusion en temps réel | |
| US20180095347A1 (en) | Information processing device, method of information processing, program, and image display system | |
| US20200265622A1 (en) | Forming seam to join images | |
| KR20130134638A (ko) | 동영상 정보 제공 방법 및 서버 | |
| US12101529B1 (en) | Client side augmented reality overlay | |
| EP3287975A1 (fr) | Système de génération d'images publicitaires et son procédé | |
| TW202239201A (zh) | 影像合成系統及其方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18706022; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18706022; Country of ref document: EP; Kind code of ref document: A1 |