
WO2015123775A1 - Systems and methods for incorporating a real image stream into a virtual image stream - Google Patents

Systems and methods for incorporating a real image stream into a virtual image stream

Info

Publication number
WO2015123775A1
WO2015123775A1 (PCT/CA2015/050124)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
physical
image stream
camera
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CA2015/050124
Other languages
English (en)
Inventor
Jian Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sulon Technologies Inc
Original Assignee
Sulon Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sulon Technologies Inc filed Critical Sulon Technologies Inc
Publication of WO2015123775A1 publication Critical patent/WO2015123775A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
    • H04N5/2226Determination of depth image, e.g. for foreground/background separation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the following relates generally to wearable technologies, and more specifically to systems and methods for incorporating a real image stream in a virtual image stream.
  • AR augmented reality
  • VR virtual reality
  • a system for generating an augmented reality image stream combining virtual features and a physical image stream.
  • the system comprises: (a) an image camera mounted to a head mounted display, the image camera configured to capture the physical image stream of a physical environment within its field of view; (b) a depth camera mounted to the head mounted display, the depth camera configured to capture depth information for the physical environment within its field of view; (c) a processor configured to: (i) obtain the physical image stream and the depth information; (ii) align the physical image stream with the depth information; (d) a graphics engine configured to: (i) model the virtual features in a virtual map; (ii) model the physical features in the virtual map based on the depth information; (iii) capture the modelled virtual and physical features within a field of view of a virtual camera, the field of view of the virtual camera corresponding to the field of view of the image camera; and (e) a shader configured to generate a virtual image stream incorporating the physical image stream by: (i) colouring the visible portions of the modelled physical features by assigning colour values from corresponding portions of the physical image stream; and (ii) colouring the visible portions of the modelled virtual features according to parameters for the virtual map.
  • the system may further display the virtual image stream incorporating the physical image stream, wherein the head mounted display system further comprises a display to display the virtual image stream.
  • the depth camera and image camera may be jointly provided by a stereo camera, and the processor may be configured to determine depth information from the stereo camera. The processor may be further configured to provide the parameters for the virtual map to the shader.
  • a method for generating an augmented reality image stream combining virtual features and a physical image stream comprising: obtaining a physical image stream of a physical environment within a field of view of an image camera; obtaining depth information for the physical image stream; modelling the physical environment in the physical image stream to a virtual map according to the depth data; modelling virtual features to the virtual map; obtaining a virtual image stream of the virtual map within a field of view of a virtual camera having a field of view corresponding to the field of view of the image camera; colouring visible portions of the modelled physical environment in the virtual image stream by assigning colour values from corresponding portions of the physical image stream; and colouring the visible portions of the modelled virtual features according to parameters for the virtual map.
  • the method may further comprise aligning the depth information to the physical image stream.
  • the method may still further comprise translating the depth information from physical coordinates to virtual map coordinates.
  • a method for generating an augmented reality image combining virtual features and a physical image comprises: obtaining a physical image of a physical environment within a field of view of an image camera; obtaining depth information for the physical image; modelling virtual features to a virtual map; capturing the virtual features from a virtual camera having a field of view corresponding to the field of view of the image camera; pasting the physical image to a rear clipping plane in the field of view of the virtual camera; and rendering points of the virtual features for which the depth information indicates that no physical feature lies nearer to the virtual camera than those points.
  • a method for generating an augmented reality image combining virtual features and a physical image comprises: capturing a physical image of the physical environment in a physical field of view; obtaining depth information for the physical image; modelling at least one virtual feature to be placed in a virtual view frustum overlaying the physical field of view, the view frustum having a virtual depth limited by a far clipping plane; providing the physical image and the at least one virtual feature to a rendering engine defining a notional virtual camera having the virtual view frustum; and instructing the graphics engine to: (i) apply the physical image at the far clipping plane; and (ii) render points of the virtual feature for which the depth information indicates that no physical feature has a depth less than the virtual depth of the points of the virtual feature.
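By way of illustration only, the following is a minimal sketch of the far-clipping-plane approach summarised above: the physical image serves as the background, and a virtual pixel is shown only where the depth information indicates that no physical feature is nearer to the camera. The function and array names, the single aligned camera, and the use of NumPy arrays are assumptions made for this example and are not taken from the patent.

```python
import numpy as np

def composite_far_plane(physical_rgb, physical_depth, virtual_rgb, virtual_depth):
    """physical_rgb   : (H, W, 3) colour image from the image camera
    physical_depth : (H, W) depth of each physical pixel, in virtual-map units
    virtual_rgb    : (H, W, 3) rendered virtual features
    virtual_depth  : (H, W) depth of each rendered virtual pixel; np.inf where
                     no virtual feature was rendered"""
    out = physical_rgb.copy()                      # the physical image "pasted" at the far plane
    virtual_wins = virtual_depth < physical_depth  # virtual point shown only where nothing physical is nearer
    out[virtual_wins] = virtual_rgb[virtual_wins]
    return out

# tiny synthetic usage: a virtual feature 2 units away in front of physical features 3 units away
H, W = 4, 4
phys = np.full((H, W, 3), 200, dtype=np.uint8)
phys_d = np.full((H, W), 3.0)
virt = np.zeros((H, W, 3), dtype=np.uint8); virt[1:3, 1:3] = 255
virt_d = np.full((H, W), np.inf); virt_d[1:3, 1:3] = 2.0
frame = composite_far_plane(phys, phys_d, virt, virt_d)  # virtual feature occludes the physical image
```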
  • FIG. 1 illustrates a head mounted display for generating and displaying AR to a user thereof;
  • FIG. 2 is a schematic diagram of components of the head mounted display illustrated in FIG. 1;
  • FIG. 3 illustrates a field of view of a notional camera simulated by a graphics engine configured to render a virtual image stream depicting an AR for display by a head mounted display;
  • FIG. 4 illustrates an exemplary incorporation of a real image stream into a virtual image stream generated by a graphics engine;
  • FIG. 5 illustrates an exemplary scenario in which a graphics engine incorporates a physical image stream into a virtual image stream;
  • FIG. 6 illustrates a method for incorporating a physical image stream into a virtual image stream as exemplified in FIG. 5;
  • FIG. 7 illustrates another exemplary scenario in which a graphics engine incorporates a physical image stream into a virtual image stream; and
  • FIG. 8 illustrates another method for incorporating a physical image stream into a virtual image stream as exemplified in FIG. 7.
  • any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
  • Computer storage media may include volatile and non-volatile, removable and non- removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto.
  • any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified.
  • AR augmented reality
  • AR includes: the interaction by a user with real physical features and structures along with virtual features and structures overlaid thereon; and the interaction by a user with a fully virtual set of features and structures that are generated to include renderings of physical features and structures and that may comply with scaled versions of physical environments to which virtual features and structures are applied, which may alternatively be referred to as an "enhanced virtual reality".
  • VR virtual reality
  • Referring to FIG. 1, an exemplary HMD 12 configured as a helmet is shown.
  • the HMD 12 which may be worn by a user occupying a physical environment, may comprise: a processor 130 in communication with one or more of the following components: (i) a graphics processing unit 133 having a graphics engine to generate a virtual image stream representing AR; (ii) a memory 131 to store data used and generated by the processor 130; (iii) a depth camera 127 to capture depth information for the physical environment within its field of view; (iv) an image camera to capture a physical image stream of the physical environment within its field of view; (v) a display 121 to display the virtual image stream to the user; and (vi) a power source 103, such as, for example, a battery, to provide power to the components.
  • Fig. 2 illustrates the components of the HMD 12 shown in Fig. 1 in schematic form.
  • the memory 131 is accessible by the processor 130.
  • the processor communicates with the image camera 123 and the depth camera 127 to obtain, respectively, a physical image stream (i.e., a "real image stream") and depth information for the physical environment.
  • the processor is further in communication with a graphics processing unit 133 (GPU) having a graphics engine 137 and a graphics engine plugin 135.
  • the graphics engine plugin may facilitate communication between the processor 130 and the graphics engine 137.
  • the graphics engine 137 obtains the physical image stream and depth information associated with the physical image stream from the processor 130, either directly, or as data stored by the processor 130 to the memory 131.
  • the graphics engine 137 generates a virtual image stream and provides the virtual image stream to the display 122.
  • the foregoing components are powered by the power source 103.
  • although the power source 103 is shown as being electrically coupled to the processor 130, the power source may be electrically coupled directly to the remaining ones of the foregoing components.
  • the processor provides the depth information and the physical image stream to the graphics engine 137, for example, as a depth map and a pixel map, respectively, and the graphics engine 137 uses the depth information to generate models within a virtual environment of the captured physical environment alongside virtual features.
  • the shader 139 obtains the models from the graphics engine 137 and colours the models to provide a virtual image stream.
  • the image camera 123 may be any suitable image camera, such as, for example, a stereo camera or a monovision camera, suited to capture the physical environment within its field of view to generate a physical image stream.
  • the field of view of the image camera 123 is defined by parameters which may be continuously provided from the image camera 123 to the processor 130, or which may be predetermined and stored in the memory 131. For example, if the image camera 123 has a fixed field of view, the parameters of the field of view are fixed and may be stored in the memory 131 and accessible to the GPU 133 and processor 130.
  • the depth camera 127 may be any suitable depth camera or scanner, such as, for example, a range finder, a time-of-flight camera, a LIDAR scanner, radar scanner or scanning laser range finder operable to capture depth information for the physical environment surrounding the HMD and provide the depth information to the processor 130.
  • the field of view of the depth camera 127 intersects at least a region of the field of view of the image camera 123.
  • the field of view of the depth camera 127 substantially overlaps the field of view of the image camera 123.
  • the image camera 123 may be a stereo camera operable to provide depth information based on epipolar geometry, such that the depth camera 127 may be redundant.
  • the processor 130 may obtain sufficient depth information without requiring depth information from the depth camera 127, so that the depth camera 127 may not be required in such embodiments.
  • the processor 130 obtains the physical image stream from the image camera 123 and the depth information from the depth camera 127.
  • the processor 130 is configured to align the depth information from the depth camera 127 with the physical image stream captured by the image camera 123 such that, for any region, such as a pixel, within the physical image stream, the processor 130 may determine the corresponding position of the region in world coordinates relative to the image camera 123.
  • the processor 130 aligns the physical image stream with the depth information according to any suitable calibration technique.
  • the image camera 123 and the depth camera 127 are mounted to the HMD 12 at a fixed position and orientation with respect to each other.
  • the spatial relationship between the depth camera 127 and image camera 123 may be defined in the memory 131 for use by the processor 130 in performing the calibration.
  • Calibration may be according to any suitable calibration technique, such as the technique described in Canessa et al., "Calibrated depth and color cameras for accurate 3D interaction in a stereoscopic augmented reality environment", Journal of Visual Communication and Image Representation, Volume 25, Issue 1, January 2014, Pages 227-237. If the image camera 123 is a stereo camera, the processor 130 may not calibrate the depth camera 127 to the image camera 123, since the image camera 123 may provide sufficient depth information on its own.
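As a hedged illustration of such a calibration being put to use (not the Canessa et al. technique itself), the sketch below reprojects each depth-camera pixel into the image camera's frame using a known, fixed rotation R and translation t between the two mounted cameras. The pinhole-camera model, the intrinsic matrices and all names are assumptions for this example.

```python
import numpy as np

def register_depth_to_image(depth, K_depth, K_image, R, t):
    """depth   : (H, W) metric depth map from the depth camera 127
    K_depth : (3, 3) depth-camera intrinsics
    K_image : (3, 3) image-camera intrinsics
    R, t    : rotation (3, 3) and translation (3,) from the depth-camera frame
              to the image-camera frame (the fixed mounting relationship)
    Returns per-pixel image-camera coordinates (u, v) and depth z."""
    H, W = depth.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([us, vs, np.ones_like(us)], axis=-1).reshape(-1, 3).T   # 3 x N homogeneous pixels
    rays = np.linalg.inv(K_depth) @ pix                                    # rays on the unit depth plane
    pts_depth = rays * depth.reshape(1, -1)                                # 3D points, depth-camera frame
    pts_image = R @ pts_depth + t.reshape(3, 1)                            # 3D points, image-camera frame
    proj = K_image @ pts_image
    u = proj[0] / proj[2]                                                  # pixels with zero depth should be masked out beforehand
    v = proj[1] / proj[2]
    return u.reshape(H, W), v.reshape(H, W), pts_image[2].reshape(H, W)
```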
  • the processor 130 may transform the world coordinates associated to the pixels to the graphics engine coordinate system using a suitable transformation technique.
  • the processor 130 may store the transformed and/or untransformed coordinates and pixel values to the memory 131 for subsequent retrieval by the graphics engine 137.
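The change of coordinate system mentioned above might, for instance, amount to no more than an axis flip and a scale change; the snippet below is a sketch under that assumption (the particular axis conventions and the metres-to-units scale are illustrative, not specified by the patent).

```python
import numpy as np

# Example convention change: camera frame (x-right, y-down, z-forward) to an
# engine frame (x-right, y-up, z-forward), with an assumed scale of 1 engine unit per metre.
CAMERA_TO_ENGINE = np.diag([1.0, -1.0, 1.0])
METRES_TO_UNITS = 1.0

def to_engine_coordinates(points_camera):
    """points_camera: (N, 3) points expressed in the image camera's frame, in metres."""
    return (points_camera @ CAMERA_TO_ENGINE.T) * METRES_TO_UNITS
```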
  • the processor 130 calls the graphics engine 137, such as, for example, the Unity 3D™ engine, to generate a virtual image stream comprising rendered graphics representing a 3D virtual reality environment.
  • the display 122 of the HMD 12 obtains the virtual image stream and displays it to the user.
  • the virtual image stream may comprise computer generated imagery (CGI), such as, for example, virtual characters, virtual environments or virtual effects representing a 3D VR.
  • CGI computer generated imagery
  • the graphics engine 137 may simulate a notional camera 301 (referred to herein as a virtual camera) to capture the virtual image stream as it would be captured by a real camera occupying the virtual environment.
  • the virtual camera 301 may have the properties of a mono or stereo camera, in accordance with the image camera 123 of the HMD 12. It will be appreciated that a stereo virtual camera may render a virtual image stream which the user will perceive as 3D provided the display 122 of the HMD 12 is suitably configured to display a 3D image stream.
  • the virtual camera 301 further comprises a view frustum, which is defined as the region lying within the field of view between a far clipping plane 313 and a near clipping plane 311.
  • the virtual image stream only comprises virtual features lying within the view frustum.
  • the graphics engine 137 is configurable by the processor 130 to define a view frustum having specific properties.
  • the processor 130 preferably configures the graphics engine to define a view frustum defined by a field of view having parameters corresponding to the field of view of the image camera 123.
  • the parameters for the field of view of the image camera 123 may be obtained from the image camera 123, or may be predefined in the memory 131. In either event, the parameters are obtained by the processor 130 to configure the graphics engine 137 so that the graphics engine 137 accordingly defines the field of view of the virtual camera 301.
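One way such a correspondence could be realised, sketched below purely for illustration, is to build the virtual camera's perspective projection from the image camera's horizontal and vertical fields of view (OpenGL-style clip conventions are assumed here; an engine such as Unity 3D accepts an equivalent projection).

```python
import numpy as np

def projection_from_camera_fov(h_fov_deg, v_fov_deg, near, far):
    """Perspective projection whose field of view matches the image camera's."""
    r = near * np.tan(np.radians(h_fov_deg) / 2.0)   # half-width of the near clipping plane
    t = near * np.tan(np.radians(v_fov_deg) / 2.0)   # half-height of the near clipping plane
    return np.array([
        [near / r, 0.0,      0.0,                          0.0],
        [0.0,      near / t, 0.0,                          0.0],
        [0.0,      0.0,      -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0,      0.0,      -1.0,                         0.0],
    ])

P = projection_from_camera_fov(h_fov_deg=90.0, v_fov_deg=60.0, near=0.1, far=100.0)
```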
  • the graphics engine 137 may incorporate the physical image stream into the virtual image stream alongside virtual elements.
  • the processor 130 may therefore route the physical image stream from the image camera 123 to the graphics engine 137 for incorporation into the virtual image stream.
  • the physical image stream may depict features which are dispersed throughout 3 dimensions of the physical environment.
  • Certain available graphics engines incorporate the physical image stream by pasting the physical image stream to a far clipping plane as a 2D background texture. This has generally been done because the physical image stream has traditionally been used to display only the bounds of the physical environment.
  • FIG. 4 illustrates an exemplary scenario in which the incorporation of a physical image stream 401 into a virtual image stream 411 as a background texture pasted to the far clipping plane 313 may provide an inaccurate representation to a user.
  • the virtual feature 413 is shown as having a depth of one pixel, but it will be appreciated that a virtual feature may have any depth.
  • the physical image stream 401 depicts a person 403 standing in front of a background feature 405.
  • the distance in the physical environment of the person 403 relative to the background feature 405 corresponds to a distance in graphics engine coordinates that is greater than the distance between the virtual feature 413 and the rear clipping plane 313, such that the person 403 should appear in the virtual image stream 411 as though located closer to the virtual camera 301 than the virtual feature 413, which lies at Z = Zv, but further from the virtual camera 301 than the near clipping plane 311.
  • since the graphics engine 137 merely pastes the entire physical image stream 401 as a background texture on the far clipping plane 313, i.e., behind the virtual feature 413, the person 403 actually appears to be standing behind the virtual feature 413 in the virtual image stream 411.
  • the processor calls the graphics engine 137 to selectively display or not display pixels of the virtual feature 413 and the physical image stream 401 in the virtual image stream 511 so that, for any two or more pixels within the view frustum having identical X and Y coordinates, the graphics engine selects display of the pixel having the lower Z-value, according to the method illustrated in Fig. 6.
  • the physical image stream 401 may be understood as comprising a background element 405 and a person 403 standing in front of the background element 405 by a distance in world coordinates (as determined by the depth camera of the HMD), which translates to a distance in the graphics engine coordinates that is greater than the distance between the virtual feature 413 and the far clipping plane 313.
  • the representation 403' illustrates the relative position of the person 403 with respect to the background feature 405 in graphics system coordinates; however, the graphics engine pastes the entire physical image stream 401, including the person 403, to the far clipping plane 313.
  • the processor 130 or the memory 131 provides the Z coordinates corresponding to the elements within the physical image stream 401 to the graphics engine 137, as determined by the processor 130 based on the depth information, such that the graphics engine 137 can determine that the pixels representing the person 403 lie in the X-Y plane at Z = Zp, i.e., closer to the virtual camera 301 than the virtual feature 413. Therefore, within the view frustum, for any point (Xn, Yn), there may be a plurality of pixels.
  • for any point (Xn, Yn) having two or more pixels, the graphics engine 137 applies to that point the colour of whichever feature has the lowest Z value, i.e., of the pixel which is nearest the virtual camera 301. If, at the point (Xn, Yn), a physical feature is nearer the virtual camera 301 than a virtual feature, the graphics engine 137 obtains the colour of the physical feature at the corresponding location in the physical image stream 401 and assigns that colour to the point (Xn, Yn) in the virtual image stream.
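A per-point sketch of this selection is shown below; the tuple layout (colour, Z value, physical-or-virtual flag) is an assumption made for the example, and the tie-break in favour of the physical pixel anticipates the conflict resolution discussed at the end of this description.

```python
def select_nearest_pixel(candidates):
    """candidates: list of (colour, z, is_physical) tuples sharing one (Xn, Yn) point.
    Returns the colour of the candidate nearest the virtual camera; at equal
    depth the physical pixel is preferred."""
    nearest = min(candidates, key=lambda c: (c[1], 0 if c[2] else 1))
    return nearest[0]

# a physical pixel at Zp = 2.0 in front of a virtual pixel at Zv = 3.0
colour = select_nearest_pixel([((255, 0, 0), 3.0, False), ((10, 10, 10), 2.0, True)])
# colour == (10, 10, 10): the physical feature is the one displayed
```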
  • Fig. 6 illustrates steps in a method for incorporating the physical image stream 401 into the virtual image stream 411 as previously described with reference to Fig. 5.
  • the image camera 123 captures the physical image stream 401 depicting the physical environment within its field of view
  • the depth camera 127 captures depth information of the physical environment within its field of view.
  • the processor 130 obtains the physical image stream 401 and the depth information and aligns the physical image 401 and depth information, as previously described, to assign coordinates to features within the physical image stream 401.
  • the processor 130 translates the assigned coordinates into graphics engine coordinates if necessary.
  • the processor 130 calls the graphics engine to render the virtual image stream 411, to incorporate the physical image stream 401 into the virtual image stream 411, and to display whichever of any overlapping pixels is nearer the virtual camera 301 along the Z-axis.
  • the processor 130 also calls the graphics engine 137 to define a virtual camera 301 having a field of view corresponding to the field of view of the image camera 123.
  • the graphics engine 137 obtains the physical image stream 401 and the assigned coordinates while rendering virtual features 413.
  • the graphics engine 137 determines which features in the physical image stream 401 overlap with the virtual features 413 from the point of view of the virtual camera 301.
  • the graphics engine 137 determines which feature within the overlap is closer to the virtual camera 301 and includes the pixels for that feature in the virtual image stream while not including the overlapping pixels for the feature further from the virtual camera. At block 617 the graphics engine 137 provides the virtual image stream 411 to the display 122.
  • the processor 130 obtains depth information from the depth camera 127 for the physical environment captured within the physical image stream.
  • the processor 130 associates the depth information to corresponding regions within the physical image stream and either provides the associated information directly to the graphics engine 137 or stores it to the memory 131 for subsequent retrieval by the graphics engine 137.
  • the processor 130 calls the graphics engine 137 to model the physical environment captured within the physical image stream 401 in the virtual environment using the depth information for the physical image stream 401.
  • the graphics engine 137 models each feature from the physical image stream 401 within the virtual environment as a 3D model, such as, for example, a point cloud, polygonal mesh or triangular mesh.
  • the graphics engine further models the virtual feature 413 in the virtual environment so that all physical and virtual features are modelled in 3D within the virtual environment.
  • the graphics engine 137 models the person 403 as a 3D model 703, the feature 405 as a 3D model 705, and the virtual feature 413 as a 3D model 713.
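Purely as an illustration of one such 3D model, the sketch below back-projects an aligned depth map into a point cloud that a graphics engine could treat as the model of the physical environment (a polygonal or triangular mesh could then be built over neighbouring points). The pinhole model, array layout and names are assumptions for this example.

```python
import numpy as np

def depth_to_point_cloud(depth, K, colour_image=None):
    """depth        : (H, W) metric depth map aligned to the image camera
    K            : (3, 3) image-camera intrinsics
    colour_image : optional (H, W, 3) physical image used to colour each point
    Returns (N, 3) points, and (N, 3) colours when an image is supplied."""
    H, W = depth.shape
    us, vs = np.meshgrid(np.arange(W), np.arange(H))
    valid = depth > 0                          # ignore pixels with no depth reading
    z = depth[valid]
    x = (us[valid] - K[0, 2]) * z / K[0, 0]    # back-project through the pinhole model
    y = (vs[valid] - K[1, 2]) * z / K[1, 1]
    points = np.stack([x, y, z], axis=-1)
    if colour_image is not None:
        return points, colour_image[valid]
    return points
```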
  • the virtual camera 301 captures the virtual and physical features within its view frustum.
  • the graphics engine 137 may model the person 403 in the virtual environment in 3D based on the depth information provided by the depth camera.
  • the model 703 of the person 403 appears in grey for illustrative purposes.
  • the graphics engine 137 determines which regions of the modelled virtual and physical features would be visible when captured by the virtual camera 301.
  • the shader 139 assigns colours to the visible regions.
  • the shader 139 obtains the colour values associated to the locations by the processor 130, as previously described. For example, if the processor 130 associates the colour black to a pixel located in graphics engine coordinates at Xp, Yp, Zp, the shader 139 assigns the colour black to the surface of the model 703 of the person 403 where the model 703 intersects that point.
  • the shader 139 may further assign various colours to the visible surfaces of the model 713 of the virtual feature 413 captured by the virtual camera 301.
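A minimal sketch of colouring a visible surface point of a physical-feature model from the physical image stream is given below; the 4x4 engine-to-camera transform, the nearest-pixel sampling and all names are assumptions made for this illustration rather than the patent's shader.

```python
import numpy as np

def sample_physical_colour(point_engine, K, physical_rgb, engine_to_camera=np.eye(4)):
    """point_engine     : (3,) visible surface point in graphics engine coordinates
    K                : (3, 3) image-camera intrinsics
    physical_rgb     : (H, W, 3) frame of the physical image stream
    engine_to_camera : 4x4 transform from engine coordinates to the image-camera frame"""
    p = engine_to_camera @ np.append(point_engine, 1.0)
    if p[2] <= 0:
        return None                                      # point lies behind the image camera
    u = int(round(K[0, 0] * p[0] / p[2] + K[0, 2]))      # project into the physical image
    v = int(round(K[1, 1] * p[1] / p[2] + K[1, 2]))
    H, W, _ = physical_rgb.shape
    if 0 <= u < W and 0 <= v < H:
        return physical_rgb[v, u]
    return None  # not seen by the image camera; left to the virtual-map colouring
```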
  • the processor 130 may call the shader 139 to colour the virtual image stream 711 without reference to the colouring for the physical features from the physical image stream 401.
  • in such embodiments, the structure of the physical features may be depicted in the virtual image stream 711, while the surface colouration and, optionally, even texturing, of the physical elements may be depicted partially or entirely independently of the colouration and/or texture of the physical elements in the physical image stream 401.
  • the processor 130 may call the graphics engine 137 to alter the models of the physical elements or omit modelling other physical elements so that the structures in the virtual image stream are partially, but not entirely, related to physical features captured within the physical image stream 401.
  • Fig. 8 illustrates steps in a method for incorporating the physical image stream 401 into the virtual image stream 411 as previously described with reference to Fig. 7.
  • the image camera 123 captures the physical image stream 401 depicting the physical environment within its field of view
  • the depth camera 127 captures depth information of the physical environment within its field of view.
  • the processor 130 obtains the physical image stream 401 and the depth information and aligns the physical image 401 and depth information, as previously described, to assign coordinates to regions within the physical image stream 401.
  • the processor 130 translates the assigned coordinates into graphics engine coordinates if necessary.
  • the processor 130 calls the graphics engine to model the virtual features 413 and physical features as 3D models within a virtual environment.
  • the processor 130 also calls the graphics engine 137 to define a virtual camera 301 having a field of view corresponding to the field of view of the image camera 123.
  • the graphics engine 137 obtains the assigned coordinates and renders the 3D models.
  • the graphics engine 137 determines which regions of the models are visible to the virtual camera 301.
  • the shader 139 obtains the modelled features and colours the visible portions of the physical features according to their corresponding colours in the physical image stream 401 and the visible portions of the virtual features according to parameters for the virtual environment as defined, for example, by the processor 130.
  • the coloured depiction forms the virtual image stream 711.
  • the shader 139 provides the virtual image stream 711 to the display 122.
  • where a physical feature and a virtual feature would otherwise occupy the same point in the virtual image stream, the processor calls the shader 139 to resolve such conflicts in favour of displaying the pixel of the physical feature. This resolution may, for example, enhance user safety by prioritising display of physical features which may pose safety hazards or physical obstacles which the user must navigate when moving throughout a physical environment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to systems and methods for rendering an image stream combining real and virtual elements for display, as augmented reality, to a user wearing a head mounted display. The head mounted display comprises: an image camera for capturing a physical image stream of the environment surrounding the user; a depth camera for capturing depth information for the physical environment; a processor for receiving the depth information and the physical image stream, and for associating the depth information with regions within the physical image stream; a graphics processing unit having a graphics engine for rendering a virtual image stream comprising virtual elements together with the physical image stream; and a display for displaying the virtual image stream to the user. In use, the processor calls the graphics engine to incorporate the physical image stream such that the relative depths of the virtual and physical elements are represented.
PCT/CA2015/050124 2014-02-18 2015-02-18 Systems and methods for incorporating a real image stream into a virtual image stream Ceased WO2015123775A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461941063P 2014-02-18 2014-02-18
US61/941,063 2014-02-18

Publications (1)

Publication Number Publication Date
WO2015123775A1 true WO2015123775A1 (fr) 2015-08-27

Family

ID=53877478

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2015/050124 Ceased WO2015123775A1 (fr) 2014-02-18 2015-02-18 Systems and methods for incorporating a real image stream into a virtual image stream

Country Status (1)

Country Link
WO (1) WO2015123775A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020000986A1 (en) * 1998-02-17 2002-01-03 Sowizral Henry A. Mitigating the effects of object approximations
US20070038944A1 (en) * 2005-05-03 2007-02-15 Seac02 S.R.I. Augmented reality system with real marker object identification
US20120120200A1 (en) * 2009-07-27 2012-05-17 Koninklijke Philips Electronics N.V. Combining 3d video and auxiliary data
US20120139906A1 (en) * 2010-12-03 2012-06-07 Qualcomm Incorporated Hybrid reality for 3d human-machine interface
US20130120365A1 (en) * 2011-11-14 2013-05-16 Electronics And Telecommunications Research Institute Content playback apparatus and method for providing interactive augmented space

Cited By (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11009951B2 (en) 2013-01-14 2021-05-18 Facebook Technologies, Llc Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
US10528135B2 (en) 2013-01-14 2020-01-07 Ctrl-Labs Corporation Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US10067337B2 (en) 2014-06-25 2018-09-04 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
US9874744B2 (en) 2014-06-25 2018-01-23 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
US10012829B2 (en) 2014-06-25 2018-07-03 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
US10054788B2 (en) 2014-06-25 2018-08-21 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
US9766449B2 (en) 2014-06-25 2017-09-19 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
US9958682B1 (en) 2015-02-17 2018-05-01 Thalmic Labs Inc. Systems, devices, and methods for splitter optics in wearable heads-up displays
US9989764B2 (en) 2015-02-17 2018-06-05 Thalmic Labs Inc. Systems, devices, and methods for eyebox expansion in wearable heads-up displays
US10031338B2 (en) 2015-02-17 2018-07-24 Thalmic Labs Inc. Systems, devices, and methods for eyebox expansion in wearable heads-up displays
US10191283B2 (en) 2015-02-17 2019-01-29 North Inc. Systems, devices, and methods for eyebox expansion displays in wearable heads-up displays
US10613331B2 (en) 2015-02-17 2020-04-07 North Inc. Systems, devices, and methods for splitter optics in wearable heads-up displays
US10197805B2 (en) 2015-05-04 2019-02-05 North Inc. Systems, devices, and methods for eyeboxes with heterogeneous exit pupils
US10175488B2 (en) 2015-05-04 2019-01-08 North Inc. Systems, devices, and methods for spatially-multiplexed holographic optical elements
US10133075B2 (en) 2015-05-04 2018-11-20 Thalmic Labs Inc. Systems, devices, and methods for angle- and wavelength-multiplexed holographic optical elements
US10114222B2 (en) 2015-05-28 2018-10-30 Thalmic Labs Inc. Integrated eye tracking and laser projection methods with holographic elements of varying optical powers
US10180578B2 (en) 2015-05-28 2019-01-15 North Inc. Methods that integrate visible light eye tracking in scanning laser projection displays
US10488661B2 (en) 2015-05-28 2019-11-26 North Inc. Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays
US10139633B2 (en) 2015-05-28 2018-11-27 Thalmic Labs Inc. Eyebox expansion and exit pupil replication in wearable heads-up display having integrated eye tracking and laser projection
US10078220B2 (en) 2015-05-28 2018-09-18 Thalmic Labs Inc. Wearable heads-up display with integrated eye tracker
US10073268B2 (en) 2015-05-28 2018-09-11 Thalmic Labs Inc. Display with integrated visible light eye tracking
US10078219B2 (en) 2015-05-28 2018-09-18 Thalmic Labs Inc. Wearable heads-up display with integrated eye tracker and different optical power holograms
WO2017039348A1 (fr) * 2015-09-01 2017-03-09 Samsung Electronics Co., Ltd. Appareil de capture d'image et son procédé de fonctionnement
US10165199B2 (en) 2015-09-01 2018-12-25 Samsung Electronics Co., Ltd. Image capturing apparatus for photographing object according to 3D virtual object
US10488662B2 (en) 2015-09-04 2019-11-26 North Inc. Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses
US10890765B2 (en) 2015-09-04 2021-01-12 Google Llc Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses
US10877272B2 (en) 2015-09-04 2020-12-29 Google Llc Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses
US10718945B2 (en) 2015-09-04 2020-07-21 North Inc. Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses
US10705342B2 (en) 2015-09-04 2020-07-07 North Inc. Systems, articles, and methods for integrating holographic optical elements with eyeglass lenses
US10656822B2 (en) 2015-10-01 2020-05-19 North Inc. Systems, devices, and methods for interacting with content displayed on head-mounted displays
US9904051B2 (en) 2015-10-23 2018-02-27 Thalmic Labs Inc. Systems, devices, and methods for laser eye tracking
US10228558B2 (en) 2015-10-23 2019-03-12 North Inc. Systems, devices, and methods for laser eye tracking
US10606072B2 (en) 2015-10-23 2020-03-31 North Inc. Systems, devices, and methods for laser eye tracking
US10096149B2 (en) 2015-12-21 2018-10-09 Intel Corporation Direct motion sensor input to rendering pipeline
WO2017112138A1 (fr) * 2015-12-21 2017-06-29 Intel Corporation Application d'un signal de capteur de mouvement direct à l'entrée d'un pipeline de rendu
US10303246B2 (en) 2016-01-20 2019-05-28 North Inc. Systems, devices, and methods for proximity-based eye tracking
US10241572B2 (en) 2016-01-20 2019-03-26 North Inc. Systems, devices, and methods for proximity-based eye tracking
US10126815B2 (en) 2016-01-20 2018-11-13 Thalmic Labs Inc. Systems, devices, and methods for proximity-based eye tracking
US10437067B2 (en) 2016-01-29 2019-10-08 North Inc. Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display
US10451881B2 (en) 2016-01-29 2019-10-22 North Inc. Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display
US10151926B2 (en) 2016-01-29 2018-12-11 North Inc. Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display
US10365549B2 (en) 2016-04-13 2019-07-30 North Inc. Systems, devices, and methods for focusing laser projectors
US10365550B2 (en) 2016-04-13 2019-07-30 North Inc. Systems, devices, and methods for focusing laser projectors
US10365548B2 (en) 2016-04-13 2019-07-30 North Inc. Systems, devices, and methods for focusing laser projectors
US11036050B2 (en) 2016-04-29 2021-06-15 SZ DJI Technology Co., Ltd. Wearable apparatus and unmanned aerial vehicle system
CN107076998A (zh) * 2016-04-29 2017-08-18 深圳市大疆创新科技有限公司 可穿戴设备及无人机系统
CN107076998B (zh) * 2016-04-29 2020-09-01 深圳市大疆创新科技有限公司 可穿戴设备及无人机系统
WO2017212130A1 (fr) 2016-06-10 2017-12-14 Estereolabs Dispositif individuel d'immersion visuelle pour personne en mouvement
FR3052565A1 (fr) * 2016-06-10 2017-12-15 Stereolabs Dispositif individuel d'immersion visuelle pour personne en mouvement
US10277874B2 (en) 2016-07-27 2019-04-30 North Inc. Systems, devices, and methods for laser projectors
US10230929B2 (en) 2016-07-27 2019-03-12 North Inc. Systems, devices, and methods for laser projectors
US10250856B2 (en) 2016-07-27 2019-04-02 North Inc. Systems, devices, and methods for laser projectors
US10459221B2 (en) 2016-08-12 2019-10-29 North Inc. Systems, devices, and methods for variable luminance in wearable heads-up displays
US10459223B2 (en) 2016-08-12 2019-10-29 North Inc. Systems, devices, and methods for variable luminance in wearable heads-up displays
US10459222B2 (en) 2016-08-12 2019-10-29 North Inc. Systems, devices, and methods for variable luminance in wearable heads-up displays
US10215987B2 (en) 2016-11-10 2019-02-26 North Inc. Systems, devices, and methods for astigmatism compensation in a wearable heads-up display
US10345596B2 (en) 2016-11-10 2019-07-09 North Inc. Systems, devices, and methods for astigmatism compensation in a wearable heads-up display
US11138794B2 (en) 2016-11-11 2021-10-05 Sony Corporation Apparatus, computer program and method
GB2555841A (en) * 2016-11-11 2018-05-16 Sony Corp An apparatus, computer program and method
GB2556114B (en) * 2016-11-22 2020-05-27 Sony Interactive Entertainment Europe Ltd Virtual reality
GB2556114A (en) * 2016-11-22 2018-05-23 Sony Interactive Entertainment Europe Ltd Virtual reality
US10409057B2 (en) 2016-11-30 2019-09-10 North Inc. Systems, devices, and methods for laser eye tracking in wearable heads-up displays
US10459220B2 (en) 2016-11-30 2019-10-29 North Inc. Systems, devices, and methods for laser eye tracking in wearable heads-up displays
US10663732B2 (en) 2016-12-23 2020-05-26 North Inc. Systems, devices, and methods for beam combining in wearable heads-up displays
US10365492B2 (en) 2016-12-23 2019-07-30 North Inc. Systems, devices, and methods for beam combining in wearable heads-up displays
US10437073B2 (en) 2017-01-25 2019-10-08 North Inc. Systems, devices, and methods for beam combining in laser projectors
US10437074B2 (en) 2017-01-25 2019-10-08 North Inc. Systems, devices, and methods for beam combining in laser projectors
US10718951B2 (en) 2017-01-25 2020-07-21 North Inc. Systems, devices, and methods for beam combining in laser projectors
US10969740B2 (en) 2017-06-27 2021-04-06 Nvidia Corporation System and method for near-eye light field rendering for wide field of view interactive three-dimensional computer graphics
US11747766B2 (en) 2017-06-27 2023-09-05 Nvidia Corporation System and method for near-eye light field rendering for wide field of view interactive three-dimensional computer graphics
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US10901216B2 (en) 2017-10-23 2021-01-26 Google Llc Free space multiple laser diode modules
US11300788B2 (en) 2017-10-23 2022-04-12 Google Llc Free space multiple laser diode modules
WO2019185986A3 (fr) * 2018-03-28 2019-10-31 Nokia Technologies Oy Procédé, appareil et produit-programme informatique pour réalité virtuelle
US11218685B2 (en) 2018-03-28 2022-01-04 Nokia Technologies Oy Method, an apparatus and a computer program product for virtual reality
US11804019B2 (en) 2018-08-06 2023-10-31 Apple Inc. Media compositor for computer-generated reality
US11308696B2 (en) 2018-08-06 2022-04-19 Apple Inc. Media compositor for computer-generated reality
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
CN114155175A (zh) * 2020-09-07 2022-03-08 北京达佳互联信息技术有限公司 图像生成方法、装置、电子设备及存储介质
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
WO2023071586A1 (fr) * 2021-10-25 2023-05-04 腾讯科技(深圳)有限公司 Procédé et appareil de génération dimage, dispositif et support
CN114821001A (zh) * 2022-04-12 2022-07-29 支付宝(杭州)信息技术有限公司 基于ar的互动方法、装置及电子设备
CN114821001B (zh) * 2022-04-12 2024-04-19 支付宝(杭州)信息技术有限公司 基于ar的互动方法、装置及电子设备
CN116843819A (zh) * 2023-07-10 2023-10-03 上海随幻智能科技有限公司 一种基于虚幻引擎的绿幕无限延展方法
CN116843819B (zh) * 2023-07-10 2024-02-02 上海随幻智能科技有限公司 一种基于虚幻引擎的绿幕无限延展方法
WO2025024322A1 (fr) * 2023-07-21 2025-01-30 simpleAR, Inc. Développement d'applications de réalité mixte en relation avec un environnement de développement virtuel

Similar Documents

Publication Publication Date Title
WO2015123775A1 (fr) Systems and methods for incorporating a real image stream into a virtual image stream
JP7443602B2 (ja) 仮想コンテンツワーピングを伴う複合現実システムおよびそれを使用して仮想コンテンツを生成する方法
CN107564089B (zh) 三维图像处理方法、装置、存储介质和计算机设备
CN114785996B (zh) 虚拟现实视差校正
US10083540B2 (en) Virtual light in augmented reality
AU2013266187B2 (en) Systems and methods for rendering virtual try-on products
CN109829981B (zh) 三维场景呈现方法、装置、设备及存储介质
US11961250B2 (en) Light-field image generation system, image display system, shape information acquisition server, image generation server, display device, light-field image generation method, and image display method
CN116309854B (zh) 标定增强现实设备的方法、装置、设备、系统及存储介质
US20240046590A1 (en) Reconstruction of Essential Visual Cues in Mixed Reality Applications
CN105611267B (zh) 现实世界和虚拟世界图像基于深度和色度信息的合并
KR101208767B1 (ko) 곡면 투사를 이용한 입체 영상 생성 방법, 장치 및 시스템, 이를 위한 기록 매체
EP3622486B1 (fr) Illustration holographique des conditions météorologiques
CN115767068A (zh) 一种信息处理方法、装置和电子设备
EP4542344A1 (fr) Traitement de signal d'image basé sur l'élimination d'occlusion
US20230243973A1 (en) Real space object reconstruction within virtual space image using tof camera
Mori et al. Diminished hand: A diminished reality-based work area visualization
JP7261121B2 (ja) 情報端末装置及びプログラム
KR20190056694A (ko) 디지털 소묘 요소가 태깅된 2.5차원 오브젝트를 활용한 가상 전시 공간 제공 방법
de Sorbier et al. Depth Camera to Generate On-line Content for Auto-Stereoscopic Displays
CN119893065A (zh) 智能穿戴设备的图像渲染方法、装置、计算机设备及介质
CN113192208A (zh) 三维漫游方法及装置
JP2022067171A (ja) 生成装置、生成方法、及び、プログラム
Levin et al. Some aspects of the geospatial reality perception in human stereopsis-based defense display systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15752718

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15752718

Country of ref document: EP

Kind code of ref document: A1