WO2017012360A1 - Method for a virtual reality display device to respond to operation of a peripheral device - Google Patents
Method for a virtual reality display device to respond to operation of a peripheral device
- Publication number
- WO2017012360A1 WO2017012360A1 PCT/CN2016/076497 CN2016076497W WO2017012360A1 WO 2017012360 A1 WO2017012360 A1 WO 2017012360A1 CN 2016076497 W CN2016076497 W CN 2016076497W WO 2017012360 A1 WO2017012360 A1 WO 2017012360A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- range
- current position
- response
- coordinate
- position coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Definitions
- the present invention relates to the field of virtual reality technologies, and in particular, to a method for a virtual reality display device to respond to operation of a peripheral device.
- VR (Virtual Reality)
- VR glasses are a special device for realizing virtual reality environments.
- the image of the stereoscopic video is decomposed into left and right views, which the VR glasses deliver to the user's left and right eyes respectively, so that each eye sees the image from its own angle; the brain then synthesizes the two views into
- a stereoscopic image, yielding a coherent three-dimensional picture that gives the user a strong sense of "immersion".
- the peripheral device is added to enable the user to interact with the image in the virtual reality through the peripheral device.
- the interaction refers to a 2D input of the peripheral device being correspondingly expressed on the virtual reality display device; that is, in the stereoscopic image of the display device, as the position coordinates reported by the peripheral device move in real space,
- the response position coordinates corresponding to those position coordinates also move accordingly.
- the peripheral device is a mouse
- the position of the mouse arrow displayed on the virtual reality display device moves accordingly, thereby implementing operations such as "confirm" or "cancel" in the virtual reality stereo image.
- however, the response position coordinates of the 2D input may jump from the rightmost side of the screen to the leftmost side, or from the leftmost side to the rightmost side, in the virtual reality stereo image; this irregular jump affects the user's visual experience and gives the user an uncomfortable feeling.
- the embodiment of the present invention provides a method for a virtual reality display device to respond to the operation of a peripheral device, to solve the above technical problem. The embodiment of the present invention discloses the following technical solution:
- a method for a virtual reality display device to respond to operation of a peripheral device comprising two display screens, each of the display screens corresponding to a portion of a total interaction range, the method comprising:
- Obtaining a current position coordinate of the peripheral device; converting the current position coordinate by using a conversion manner corresponding to a predetermined condition to obtain a response position coordinate within a specified range, wherein the specified range is the interaction range corresponding to a specified one of the two display screens; and performing position interaction according to the response position coordinates.
- the step of converting the current position coordinates comprises:
- one vertex of the diagonal of the interaction range is the coordinate origin (0, 0), the opposite vertex is (Px, Py), and one edge lies on the x-axis; the rectangular area whose diagonal runs from the origin (0, 0) to (Px/2, Py), with one edge on the x-axis, is the first range, and
- the area of the interaction range outside the first range is the second range;
- the converting the current position coordinates by using a conversion manner corresponding to the predetermined condition, and obtaining the response position coordinates within the specified range further includes:
- the step of converting the current location coordinates according to the current location coordinates and the previous location coordinates comprises:
- one vertex of the diagonal of the interaction range is the coordinate origin (0, 0), the opposite vertex is (Px, Py), and one edge lies on the x-axis; the rectangular area whose diagonal runs from the origin (0, 0) to (Px/2, Py), with one edge on the x-axis, is the first range, and
- the area of the interaction range outside the first range is the second range;
- the maximum abscissa of the second range is taken as the abscissa of the response position
- the ordinate of the current position is taken as the ordinate of the response position.
- the step of converting the current location coordinates according to the current location coordinates and the previous location coordinates comprises:
- one vertex of the diagonal of the interaction range is the coordinate origin (0, 0), the opposite vertex is (Px, Py), and one edge lies on the x-axis; the rectangular area whose diagonal runs from the origin (0, 0) to (Px/2, Py), with one edge on the x-axis, is the first range, and
- the area of the interaction range outside the first range is the second range;
- the minimum abscissa of the first range is taken as the abscissa of the response position
- the ordinate of the current position coordinate is taken as the ordinate of the response position.
- before the step of acquiring the current location coordinates of the peripheral device, the method further includes:
- the virtual reality display device responds to the operation of the peripheral device by performing coordinate conversion on the acquired current position coordinates of the peripheral device to obtain response position coordinates within the specified range, thereby enabling
- the virtual reality display device to respond to the peripheral device operation and interact at the response position coordinates of the peripheral device.
- the converted response position coordinates are limited to a specified range within the interaction range, which prevents the response position coordinates of the 2D input from jumping in the stereoscopic image when the current position coordinates interact with the virtual reality display device, thereby overcoming the uncomfortable feeling that such jumps give the user during the virtual reality experience.
- FIG. 1 is a schematic flowchart of a method for a virtual reality display device to respond to operation of a peripheral device according to an embodiment of the present application
- FIG. 2 is a flowchart of another method for operating a virtual reality display device in response to a peripheral device according to an embodiment of the present disclosure
- FIG. 3 is a schematic diagram of interaction range division on a virtual reality display device according to the present application.
- FIG. 4 is a schematic diagram of movement of a 2D input position within a first range and a second range
- FIG. 5 is a specific flowchart of converting a current location coordinate in a method for operating a virtual reality display device in response to a peripheral device according to an embodiment of the present disclosure
- FIG. 6 is a schematic diagram of converting current position coordinates into coordinates in a first range according to an embodiment of the present application
- FIG. 7 is a schematic diagram of converting current position coordinates into coordinates in a second range according to an embodiment of the present application.
- FIG. 8 is a schematic flowchart of another method for responding to operation of a peripheral device by a virtual reality display device according to an embodiment of the present disclosure
- FIG. 9 is a schematic diagram of position coordinate conversion for a 2D input control position according to an embodiment of the present invention.
- FIG. 10 is a schematic diagram of another position coordinate conversion for a 2D input control position according to the embodiment.
- FIG. 1 is a flowchart of a method for a virtual reality display device to respond to operation of a peripheral device, the virtual reality display device being applied to a virtual reality simulation environment, for example, VR (Virtual Reality) glasses, according to an embodiment.
- the peripheral device interacts with the virtual reality display device through the 2D input.
- a cursor arrow, a mouse pointer arrow, and the like displayed on the stereoscopic image screen of the VR glasses are controlled through the 2D input of the peripheral device.
- the steps of the method for the virtual reality display device to respond to the operation of the peripheral device include:
- step 101 the current position coordinates of the peripheral device are obtained.
- step 101 before the acquiring the current location coordinates of the peripheral device, the method further includes:
- Step 1011 Connect the peripheral device and the virtual reality display device, so that the location coordinates reported by the peripheral device can interact with the virtual reality display device.
- the peripheral device includes a mouse, a keyboard, a game controller, etc., and is connected to the virtual reality display device, so that the peripheral device can input a 2D event on the virtual reality display device to enhance the experience effect of the virtual reality.
- the peripheral device is connected to the virtual reality display device including, but not limited to, a cable, Bluetooth or WIFI connection.
- Step 1012 determining a current position coordinate of the peripheral device.
- the virtual reality display device further includes a central processing unit for identifying and determining current position coordinates reported by the peripheral device.
- a central processing unit on a PCB can analyze and determine the range of motion of the peripheral device, as well as the current position coordinates of the peripheral device within that range.
- the step of acquiring the current location coordinates includes: the central processing unit represents the spatial extent of the peripheral device's activity as an area with a coordinate system, the position of the current peripheral device within that coordinate-system region is taken as the current location coordinates, and the current position coordinates are reported to the central processing unit.
- step 102 the current position coordinates are converted by using a conversion manner corresponding to a predetermined condition to obtain response position coordinates within a specified range, wherein the specified range is the interaction range corresponding to a specified one of the two display screens.
- the virtual reality display device includes two display screens, each corresponding to a portion of the total interaction range; the specified range is the interaction range corresponding to a specified one of the two display screens.
- For example, in the VR glasses, the position of the acquired current position coordinates is determined, the current position coordinates are converted, and the resulting response position coordinates fall within the specified range; the specified range may be the left display screen or the right display screen of the VR glasses.
- the predetermined condition includes converting the current position coordinates onto a preset one of the two display screens; the interaction range is the range of the display screens on the virtual reality display device.
- Step 103 Perform position interaction according to the response position coordinates.
- the location interaction refers to a 2D input of the peripheral device being correspondingly expressed on the virtual reality display device; that is, in the stereoscopic image of the display device, as the position coordinates reported by the peripheral device move in real space,
- the response position coordinates corresponding to those position coordinates also move accordingly.
- for example, the peripheral device held by the user is a gamepad;
- the position of the mouse pointer displayed in the stereoscopic image can be moved to the position of a "confirm" or "cancel" operation button, which is convenient for the user's selection operation.
- the location interaction further includes displaying text information on the display device of the virtual reality through a peripheral device such as a keyboard.
- the interaction further includes: connecting the peripheral device to the display device, reporting the current position coordinates from the peripheral device, responding on the display device, and causing the converted response position coordinates to change accordingly as the input conditions of the peripheral device change.
- the virtual reality display device provided by the present application responds to the operation of the peripheral device by performing coordinate conversion on the acquired current position coordinates of the peripheral device to obtain response position coordinates within the specified range, thereby enabling the virtual reality display device
- to respond to the peripheral device operation and interact at the response position coordinates of the peripheral device.
- the converted response position coordinates are limited to a specified range within the interaction range, which prevents the response position coordinates of the 2D input from jumping in the stereoscopic image when the current position coordinates interact with the virtual reality display device, thereby overcoming the uncomfortable feeling that such jumps give the user during the virtual reality experience.
- to prevent the response position coordinates of the 2D input from exceeding the interaction range of any one of the display screens, the displayed response position coordinates of the 2D input are kept consistent between the left and right sides of the stereoscopic image screen.
- for example, the virtual reality display device is a pair of VR glasses, and the interaction area of the left and right display screens on the VR glasses can be represented by a coordinate system.
- the display screen of the virtual reality display device is represented in a two-dimensional coordinate system that takes the upper-left corner of the display screen as the origin (0,0), the horizontal direction of the screen as the x-axis, and the vertical direction as the y-axis; the maximum x-axis and y-axis values of the screen are denoted Px and Py, respectively.
- the rectangular interaction range of the virtual reality display device takes one vertex of the diagonal of the display screen as the coordinate origin (0, 0), the opposite vertex as (Px, Py), and one edge on the x-axis; the rectangular area whose diagonal runs from the origin (0, 0) to (Px/2, Py), with one edge on the x-axis, is the first range, and
- the area of the interaction range outside the first range is the second range; any response position coordinate on the display screens can be represented within the interaction range.
- the first range and the second range represent two left and right display screens of the VR glasses.
- the left and right display screens in VR glasses correspond to two different angles of video images.
- the two images at different angles are fused by the brain into a stereoscopic image.
- each of the display screens corresponds to a portion of the total interaction range, and the 2D input position on each display screen of the VR glasses is displayed at the corresponding position on the synthesized stereoscopic image. For example, when the 2D input position in the first range is located in the upper left corner, the response position coordinate of the 2D input is also displayed in the upper left corner of the synthesized display; similarly, a 2D input position in the second range is displayed at the same corresponding position on the synthesized display.
- when the 2D input position is at the rightmost side of the first range, the response position coordinate produced by the interaction is displayed at the rightmost side of the stereoscopic image seen by the user.
- if the 2D input position then moves from the first range into the second range, the coordinate position of the 2D input displayed on the stereoscopic image jumps from the rightmost side of the screen to the leftmost side; such movement between the ranges causes the response position coordinate of the 2D input to jump from the rightmost side to the leftmost side of the stereoscopic image, or from the leftmost side to the rightmost side.
- this phenomenon gives the user an uncomfortable feeling when interacting in the virtual reality environment, and may even cause dizziness, affecting the user's visual experience.
- step 102 the step of converting the current location coordinates further includes:
- Step 1021 Acquire a rectangular interaction range of the virtual reality display device.
- One vertex of the diagonal of the interaction range is a coordinate origin (0, 0)
- another vertex is A (Px, Py)
- one edge is on the x-axis.
- the other edge is on the y-axis, where the origin (0,0) is a vertex, the other vertex is B(Px/2,Py), and the rectangular region formed by an edge on the x-axis is the first range.
- the area other than the first range within the interaction range is the second range.
- the display screen of the virtual reality display device divides the interaction range according to the foregoing embodiment; the first range is equal in area to the second range, and the two ranges respectively represent the two display screens on the virtual reality device. For example, as shown in FIG. 3, the first range represents the left lens display screen of the VR glasses, and the second range represents the right lens display screen.
- Step 1022 Determine whether the current location coordinate is within the first range.
- After obtaining the current location coordinates of the peripheral device, the display device determines whether the current location coordinates meet a preset condition, and further determines the interaction range in which the current location coordinates fall on the display device.
- Step 1023 if yes, that is, when the current position coordinate is within the first range of the interaction range, the current position coordinate is taken as the response position coordinate.
- Step 1024 if not, the maximum abscissa of the first range is taken as the abscissa of the response position, and the ordinate of the current position is taken as the ordinate of the response position.
- step 1025 the coordinates of the response position within the specified range are obtained.
- for example, the current position coordinate of the peripheral device is acquired as (x0, y0), and it is determined whether the current position coordinate (x0, y0) is within the first range; if so, the current position coordinate (x0, y0) is used as the response position coordinate (x0, y0), and the response position coordinate is displayed on the display screen of the virtual reality display device.
- if not, the current position coordinate (x0, y0) is converted, specifically by converting the abscissa x0 into the maximum abscissa Px/2 of the first range, while the ordinate y0 of the current position is taken as the ordinate of the response position.
- the response position coordinates are then (Px/2, y0), and the interaction responds at the converted response position coordinates (Px/2, y0).
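The first-range conversion of steps 1022 through 1025 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the example `px` value are assumptions:

```python
def to_first_range(x0, y0, px=1920):
    """Keep (x0, y0) if it already lies in the first range; otherwise
    clamp the abscissa to the first range's maximum Px/2 and keep the
    ordinate, mirroring steps 1023-1024."""
    half = px / 2
    if x0 <= half:           # current position already within the first range
        return (x0, y0)
    return (half, y0)        # abscissa clamped to Px/2, ordinate kept
```

For example, with `px = 1920`, the coordinate (1500, 50) converts to (960.0, 50), while (100, 50) is returned unchanged.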
- the method provided in the foregoing embodiment determines the position of the acquired current position coordinates and converts them according to the result of the determination, so that the converted response position coordinates are within the specified range, thereby preventing,
- when the peripheral device and the display device interact, the response position coordinates of the 2D input from moving from the first range to the second range, or from the second range to the first range, on the display screen.
- the response position corresponding to the 2D input on each display screen is thus reflected at the same position on the stereoscopic image.
- for example, a mouse pointer displayed in the upper left corner of the first range or of the second range corresponds to the same mouse pointer displayed in the upper left corner of the stereoscopic display. Converting the acquired current position coordinates into response position coordinates within the specified range enables the position of the 2D input to be reflected at the corresponding position on the display screen, thereby realizing the interaction between the virtual reality display device and the peripheral device.
- the step of converting the current position coordinates of the acquired peripheral device into the response position coordinates in the second range includes:
- Step 1026 Determine whether the current position coordinate is within the second range.
- in this case, the interaction range of the virtual reality is first divided; the detailed division steps are the same as in Step 1021, and the interaction range of the left and right display screens of the VR glasses is divided into the first range and the second range.
- Step 1027 If yes, the current position coordinate is taken as the response position coordinate.
- for example, the acquired current position coordinate is (x1, y1); when it is located within the second range, the response position coordinate is determined to be (x1, y1), the same as the acquired current position coordinate.
- Step 1028 If not, the minimum abscissa of the second range is taken as the abscissa of the response position, and the ordinate of the current position is taken as the ordinate of the response position.
- the minimum abscissa Px/2 of the second range is taken as the abscissa of the response position
- the ordinate y1 of the current position is taken as the ordinate of the response position.
- Step 1025 The coordinate of the response position within the specified range is (Px/2, y1).
- the current position coordinates are converted to obtain response position coordinates within the specified range, located in the second range; this prevents current position coordinates outside the second range
- from causing the response position coordinates of the 2D input to jump on the display screen.
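Steps 1026 through 1028 mirror the first-range case; a sketch under the same assumptions (illustrative function name and example `px` value, not from the patent):

```python
def to_second_range(x0, y0, px=1920):
    """Keep (x0, y0) if it already lies in the second range; otherwise
    clamp the abscissa to the second range's minimum Px/2 and keep the
    ordinate, mirroring steps 1027-1028."""
    half = px / 2
    if x0 >= half:           # current position already within the second range
        return (x0, y0)
    return (half, y0)        # abscissa clamped to Px/2, ordinate kept
```

With `px = 1920`, the coordinate (100, 50) converts to (960.0, 50), matching the (Px/2, y1) result of Step 1025.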
- step 102 the conversion of the current position coordinates is performed by using a conversion manner corresponding to a predetermined condition, and the coordinate of the response position within the specified range is obtained.
- Step 201 Acquire a previous position coordinate of the current position coordinate.
- the method of acquiring the previous position coordinates is the same as the foregoing method of acquiring the current position coordinates.
- Step 202 Convert the current position coordinate according to the current position coordinate and the previous position coordinate.
- Step 203 Obtain a coordinate of the response position within a specified range
- the virtual reality display device acquires the current position coordinates and the previous position coordinates reported by the peripheral device, and determines the positions of the two coordinates.
- when the 2D input position moves from the first range to the second range, the step of converting the current position coordinates is: taking the maximum abscissa of the second range as the abscissa of the response position, and the ordinate of the current position as the ordinate of the response position coordinate.
- the current position coordinate of the peripheral device is P1 (x1, y1)
- the obtained previous position coordinate is P2 (x2, y2)
- the current position coordinate P1 (X1, y1) is located in the second range
- the previous position coordinate P2 (x2, y2) is located in the first range
- the current position coordinate P1 (x1, y1) is converted to obtain a response position coordinate P3 (x3, y3), such that the abscissa x3 of the response position is the maximum abscissa Px of the second range
- and the ordinate y3 is the same as the ordinate y1 of the current position coordinate P1; the response position coordinate is P3 (Px, y1).
- it is determined, according to the positions of the two coordinates, whether the interactive 2D input position has moved from the first range to the second range; if so, the current position coordinates are converted so that the converted response position coordinates are within the specified range (that is, within the second range), preventing the movement of the 2D input position from the first range to the second range
- from making the response position coordinate of the 2D input reflected on the display screen jump from the rightmost side of the display screen to the leftmost side; this prevents the response position coordinates of the 2D input from jumping on the display screen when the display device interacts with the peripheral device, which would affect the user's experience.
- the minimum abscissa of the first range is taken as the abscissa of the response position
- the ordinate of the current position coordinate is taken as the ordinate of the response position.
- for example, the current position coordinate of the peripheral device is obtained as P1 (x1, y1) and the acquired previous position coordinate is P2 (x2, y2); it is determined that
- P1 (x1, y1) is located in the first range
- and the previous position coordinate P2 (x2, y2) is located in the second range;
- the current position coordinate P1 (x1, y1) is then converted to obtain a response
- position coordinate P3 (x3, y3) such that the abscissa x3 of the response position is the minimum abscissa 0 of the first range, and the ordinate y3 is the same as the ordinate y1 of the current position coordinate P1; the response position coordinate is P3 (0, y1).
- the current position coordinates and the previous position coordinates are obtained, and it is determined according to the positions of the two coordinates whether the interactive 2D input position has moved from the second range to the first range; if so, the current position coordinates are converted so that the converted response position coordinates are within the specified range (that is, within the first range), preventing the response position coordinate of the 2D input from jumping from the rightmost side to the leftmost side of the display screen when the 2D input position moves from the second range to the first range.
- Specific application example 3: when the acquired current position coordinates and the previous position coordinates are both in the same range, that is, both in the first range or both in the second range, the acquired current position coordinates are used as the response position coordinates, and the interaction event is responded to according to the response position coordinates.
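The conversion of steps 201 through 203, covering both crossing directions as well as the same-range case of application example 3, can be sketched as follows. The function name and the example `px` value are assumptions for illustration, not part of the patent:

```python
def convert_with_previous(curr, prev, px=1920):
    """Detect a crossing between the two ranges using the previous
    position coordinate, and pin the response abscissa at the matching
    range boundary so the cursor does not jump across the stereo image."""
    half = px / 2
    (x1, y1), (x2, _y2) = curr, prev
    if x2 <= half < x1:      # moved from the first range to the second
        return (px, y1)      # maximum abscissa of the second range
    if x1 <= half < x2:      # moved from the second range to the first
        return (0, y1)       # minimum abscissa of the first range
    return (x1, y1)          # same range: current coordinate used as-is
```

With `px = 1920`, a move from (900, 50) to (1000, 50) yields the response coordinate (1920, 50), and the reverse move yields (0, 50), matching the P3 (Px, y1) and P3 (0, y1) results of the two examples above.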
- the ordinate range of the 2D input position is determined to be from the coordinate origin to the maximum ordinate Py of the interaction range between the virtual reality display device and the peripheral device; in the foregoing conversion of the current position coordinates, only the abscissa is converted, and the ordinate is the ordinate of the acquired current position. If the display device and the peripheral device do not set an interaction range for the ordinate of the 2D input position, the active range of the ordinate of the response position of the 2D input needs to be set between the minimum ordinate and the maximum ordinate within the interaction range.
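Constraining the response ordinate to [0, Py] when no ordinate interaction range has been set could look like the following; this is a hypothetical helper, with `py` an assumed example value:

```python
def clamp_ordinate(y, py=1080):
    """Constrain the response ordinate to the interval [0, Py], the
    minimum-to-maximum ordinate of the interaction range."""
    return min(max(y, 0), py)
```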
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Disclosed is a method for a virtual reality display device to respond to an operation of a peripheral device. The virtual reality display device comprises two display screens, each screen corresponding to a portion of the total interaction range. The method comprises: obtaining the current position coordinates of a peripheral device; converting the current position coordinates by means of a conversion mode corresponding to a predetermined condition, so as to obtain response position coordinates within a specified range; and performing position interaction according to the response position coordinates. According to this method, coordinate conversion is performed on the current position coordinates, so that the obtained response position coordinates can respond to the peripheral device. Moreover, the converted response position coordinates are limited to a specified range within the interaction range, so that the 2D-input response position coordinates cannot jump in a stereoscopic image of a display device when the current position coordinates interact with the virtual reality display device.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510433732.4 | 2015-07-22 | ||
| CN201510433732.4A CN105159522B (zh) | 2015-07-22 | 2015-07-22 | A method for a virtual reality display device to respond to operation of a peripheral device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017012360A1 (fr) | 2017-01-26 |
Family
ID=54800399
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2016/076497 Ceased WO2017012360A1 (fr) | 2016-03-16 | Method for virtual reality display device to respond to operation of a peripheral device |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN105159522B (fr) |
| WO (1) | WO2017012360A1 (fr) |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105159522B (zh) * | 2015-07-22 | 2018-03-13 | 深圳多新哆技术有限责任公司 | A method for a virtual reality display device to respond to operation of a peripheral device |
| CN105824532A (zh) * | 2016-03-16 | 2016-08-03 | 无锡科技职业学院 | A hash positioning method for rectangular graphic recognition |
| CN106060670A (zh) * | 2016-06-02 | 2016-10-26 | 北京光子互动科技有限公司 | Multimedia processing method, device and system |
| CN106095238A (zh) * | 2016-06-08 | 2016-11-09 | 北京行云时空科技有限公司 | Cursor display method and device based on smart glasses |
| US10496353B2 (en) | 2016-09-29 | 2019-12-03 | Jiang Chang | Three-dimensional image formation and color correction system and method |
| CN106683152B (zh) | 2016-11-16 | 2019-09-20 | 腾讯科技(深圳)有限公司 | Three-dimensional visual effect simulation method and device |
| CN109427100A (zh) * | 2017-08-29 | 2019-03-05 | 深圳市掌网科技股份有限公司 | A virtual-reality-based accessory assembly method and system |
| CN107413048B (zh) * | 2017-09-04 | 2020-10-27 | 网易(杭州)网络有限公司 | Processing method and device in a VR game |
| CN107957781B (zh) * | 2017-12-13 | 2021-02-09 | 北京小米移动软件有限公司 | Information display method and device |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101859580A (zh) * | 2009-04-03 | 2010-10-13 | 索尼公司 | Information processing device, information processing method, and program |
| CN104285243A (zh) * | 2012-05-09 | 2015-01-14 | Nec卡西欧移动通信株式会社 | Three-dimensional image display device, cursor display method therefor, and computer program |
| CN104598035A (zh) * | 2015-02-27 | 2015-05-06 | 北京极维客科技有限公司 | Cursor display method, smart device and system based on 3D stereoscopic image display |
| CN105159522A (zh) * | 2015-07-22 | 2015-12-16 | 深圳多新哆技术有限责任公司 | A method for a virtual reality display device to respond to operation of a peripheral device |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103064514A (zh) * | 2012-12-13 | 2013-04-24 | 航天科工仿真技术有限责任公司 | Implementation method for a spatial menu in an immersive virtual reality system |
- 2015
  - 2015-07-22 CN CN201510433732.4A patent/CN105159522B/zh active Active
- 2016
  - 2016-03-16 WO PCT/CN2016/076497 patent/WO2017012360A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101859580A (zh) * | 2009-04-03 | 2010-10-13 | 索尼公司 | Information processing device, information processing method, and program |
| CN104285243A (zh) * | 2012-05-09 | 2015-01-14 | Nec卡西欧移动通信株式会社 | Three-dimensional image display device, cursor display method therefor, and computer program |
| CN104598035A (zh) * | 2015-02-27 | 2015-05-06 | 北京极维客科技有限公司 | Cursor display method, smart device and system based on 3D stereoscopic image display |
| CN105159522A (zh) * | 2015-07-22 | 2015-12-16 | 深圳多新哆技术有限责任公司 | A method for a virtual reality display device to respond to operation of a peripheral device |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105159522A (zh) | 2015-12-16 |
| CN105159522B (zh) | 2018-03-13 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017012360A1 (fr) | Method for virtual reality display device to respond to operation of a peripheral device | |
| US11275481B2 (en) | Collaborative augmented reality system | |
| US10657716B2 (en) | Collaborative augmented reality system | |
| US11195320B2 (en) | Feed-forward collision avoidance for artificial reality environments | |
| US10095458B2 (en) | Information processing apparatus, information processing method, non-transitory computer-readable storage medium, and system | |
| JP6186415B2 (ja) | Stereoscopic image display method and mobile terminal | |
| US9766793B2 (en) | Information processing device, information processing method and program | |
| US10990240B1 (en) | Artificial reality system having movable application content items in containers | |
| KR20250109800A (ko) | Methods for manipulating objects in an environment | |
| KR102837402B1 (ko) | Methods, systems, and media for rendering immersive video content with foveated meshes | |
| CN114138106B (zh) | Transitions between states in a hybrid virtual reality desktop computing environment | |
| US11373271B1 (en) | Adaptive image warping based on object and distance information | |
| JP2017514192A (ja) | Low-latency visual response to input and graphics processing unit input handling via pre-generation of alternative graphical representations of application elements | |
| US9986225B2 (en) | Techniques for cut-away stereo content in a stereoscopic display | |
| KR20160014601A (ko) | 다수의 3d 디스플레이들에 대해 오브젝트를 렌더링하기 위한 방법 및 장치 | |
| US20140089859A1 (en) | Equipment control device, operation reception method, and program | |
| CN108377361B (zh) | Display control method and device for surveillance video | |
| JP2007293429A (ja) | Image browsing device, computer control method, and program | |
| CN116095356A (zh) | Method, apparatus, device and storage medium for presenting a virtual scene | |
| KR101897789B1 (ko) | Method and system for providing a three-dimensional desktop | |
| US20230419439A1 (en) | Warping an input image based on depth and offset information | |
| JP2019032713A (ja) | Information processing apparatus, information processing method, and program | |
| US20230147561A1 (en) | Metaverse Content Modality Mapping | |
| JP2017134803A (ja) | Information processing apparatus, information processing method | |
| JP2025089960A (ja) | Display control device, display control method, and program | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 16827051; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | EP: PCT application non-entry in European phase | Ref document number: 16827051; Country of ref document: EP; Kind code of ref document: A1 |