
WO2018107923A1 - Positioning feature point identification method for use in a virtual reality space - Google Patents

Positioning feature point identification method for use in a virtual reality space

Info

Publication number
WO2018107923A1
WO2018107923A1 (application PCT/CN2017/109795, CN2017109795W)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
infrared
infrared point
point light
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/109795
Other languages
English (en)
Chinese (zh)
Inventor
李宗乘
党少军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
VR Technology Holdings Ltd
Original Assignee
VR Technology Holdings Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by VR Technology Holdings Ltd filed Critical VR Technology Holdings Ltd
Publication of WO2018107923A1 publication Critical patent/WO2018107923A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0325Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image

Definitions

  • The present invention relates to the field of virtual reality, and more particularly to a virtual reality spatial positioning feature point identification method.
  • Spatial positioning generally uses optical or ultrasonic means for measurement, and a model is then used to derive the spatial position of the object to be measured.
  • A typical virtual reality spatial positioning system uses infrared point light sources and a light-sensing camera to determine the spatial position of an object. The infrared point light sources are mounted at the front end of a near-eye display device; the light-sensing camera captures the positions of the infrared points and derives the user's physical coordinates from them. If the correspondence between at least three light sources and their projections is known, the PnP algorithm can be called to obtain the spatial position of the helmet.
  • The key to this process is determining the light source ID (identity, i.e. serial number) corresponding to each projection.
  • Current virtual reality spatial positioning schemes often determine the light source ID of a projection inaccurately, or only over long intervals, which degrades both the accuracy and the efficiency of the positioning.
  • The present invention therefore provides a virtual reality spatial positioning feature point identification method that determines projection IDs accurately and efficiently.
  • A virtual reality spatial positioning feature point identification method, which includes the following steps:
  • S1: confirming that all of the infrared point light sources on the virtual reality helmet are turned off, and turning off any that are still lit;
  • S2: illuminating one of the infrared point light sources on the virtual reality helmet, the processing unit recording the ID of the infrared point light source corresponding to the light spot on the image captured by the infrared camera;
  • S3: keeping the infrared point light sources lit in the previous frame in a lit state while illuminating one new infrared point light source, the processing unit determining the ID of the infrared point light source corresponding to the newly added spot on the image captured by the infrared camera;
  • S4: repeating S3 until all of the infrared point light sources are lit and the processing unit has determined the IDs of the infrared point light sources corresponding to all light spots on the image captured by the infrared camera.
  • The ID of the infrared point light source corresponding to a newly added light spot is determined by comparing the image difference between the current frame and the previous frame.
  • Using the historical information of the previous frame, the processing unit applies a slight translation to the light spots of the previous frame's image so that they can be put into correspondence with the light spots of the current frame's image; the ID of each matched spot on the current frame is then determined from this correspondence and the previous frame's history.
  • Any light spot on the current frame image that has no correspondence with the previous frame image is assigned the ID of the newly illuminated infrared point light source.
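The sequential-lighting identification loop described above (one new LED per frame, newly added spot found by differencing consecutive frames) can be sketched as follows. This is a minimal illustration under assumed conditions, not the patent's implementation: all function names, the brightness threshold, and the LED numbering scheme are hypothetical.

```python
import numpy as np

def find_new_spot(prev_frame, curr_frame, threshold=128):
    """Return the (row, col) centroid of the spot that appears in
    curr_frame but not in prev_frame, or None if nothing new lit up.
    Frames are 2D grayscale arrays from the infrared camera."""
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    new_pixels = np.argwhere(diff > threshold)  # pixels that brightened
    if new_pixels.size == 0:
        return None
    return tuple(new_pixels.mean(axis=0))  # centroid of the new spot

def identify_spots(frames):
    """frames[i] is the camera image after the i-th LED is lit
    (LED IDs are assigned 0, 1, 2, ... in lighting order)."""
    spot_ids = {}
    prev = np.zeros_like(frames[0])  # S1: all LEDs off -> dark reference
    for led_id, frame in enumerate(frames):
        centroid = find_new_spot(prev, frame)
        if centroid is not None:
            spot_ids[led_id] = centroid  # S2/S3: new spot <-> newly lit LED
        prev = frame
    return spot_ids
```

In a real system the centroid extraction would use blob detection on thresholded images rather than a raw pixel difference, but the frame-to-frame comparison is the same idea.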
  • By sequentially lighting the infrared point light sources and matching each to its light spot on the image captured by the infrared camera, the present invention provides an accurate and efficient method for determining spot IDs.
  • The ID corresponding to a newly added spot can be judged by comparing the two consecutive frames.
  • Newly added spots and their IDs are determined even under displacement, providing a method of identifying spot IDs while the virtual reality helmet is in a variety of motion states.
  • By monitoring the number of infrared point light sources, the number of spots on the image, and whether that number meets the number of points required by the PnP algorithm, positioning accuracy is ensured and deviations are prevented.
  • FIG. 1 is a schematic diagram of a virtual reality helmet of a virtual reality spatial positioning feature point recognition method according to the present invention
  • FIG. 2 is a schematic diagram showing the principle of a virtual reality spatial positioning feature point recognition method according to the present invention
  • FIG. 3 is an infrared point image taken by an infrared camera.
  • The present invention provides a virtual reality spatial positioning feature point identification method that determines projection IDs accurately and efficiently.
  • The virtual reality positioning feature point identification method of the present invention involves a virtual reality helmet 10, an infrared camera 20, and a processing unit 30, the infrared camera 20 being electrically connected to the processing unit 30.
  • The virtual reality helmet 10 includes a front panel 11; a plurality of infrared point light sources 13 are distributed on the front panel 11 and on the upper, lower, left, and right side panels of the virtual reality helmet 10, and each infrared point light source 13 can be lit or turned off as needed through the firmware interface of the helmet 10.
  • FIG. 3 shows an infrared point image taken by the infrared camera.
  • When the front panel 11 of the virtual reality helmet 10 faces the infrared camera (not shown), each lit infrared point light source 13 forms a spot projection on the image, while everything else forms a uniform background.
  • Thus each infrared point light source 13 on the virtual reality helmet 10 can form one spot of light on the image.
  • When ID identification starts, the virtual reality helmet 10 is in an initial state: one infrared point light source 13 on the helmet is lit, and the processing unit 30 records the correspondence between the lit infrared point light source 13 and the light spot on the image, that is, the ID of the infrared point light source 13 corresponding to the spot on the image captured by the infrared camera.
  • In the next frame, the virtual reality helmet 10 keeps the infrared point light source 13 lit in the previous frame in a lit state and lights one new infrared point light source 13; a newly added light spot can then be found on the image taken by the infrared camera 20, and the processing unit 30 determines the ID of the infrared point light source 13 corresponding to it.
  • The helmet then keeps the previously lit infrared point light sources 13 illuminated, lights another new infrared point light source 13, and the same method is used to determine the ID of the newly added spot on the image. One new infrared point light source 13 is lit per frame in this way until all of the infrared point light sources 13 are lit and each spot has been successfully matched to the ID of its light source, at which point the ID identification process ends.
  • The method by which the processing unit 30 determines the ID of the infrared point light source 13 corresponding to a newly added spot is as follows. In the initial state of the virtual reality helmet 10 there is no correspondence from a previous frame (or the previous frame's data has been lost and the correspondence must be determined again), so the present invention lights only one infrared point light source 13 initially; there is then at most one spot on the image, and the correspondence is trivially determined. By lighting one new infrared point light source 13 at a time, all of the infrared point light sources 13 can eventually be lit and the desired correspondences determined.
  • The light spot corresponding to a newly added infrared point light source 13 can be determined by comparing the image difference between the current frame and the previous frame, and that spot's ID is the ID of the newly lit infrared point light source 13.
  • The typical sampling interval is 30 ms, so in general the position difference between each spot of the previous frame and the corresponding spot of the current frame (excluding the newly added spot) is small. The processing unit 30 therefore uses the known historical information of the previous frame to apply a slight translation to the previous frame's spots so that they can be matched with the spots of the current frame; the ID of each matched spot on the current frame is determined from this correspondence and the previous frame's history. Likewise, a spot on the current frame that has no correspondence with the previous frame is assigned the ID of the newly lit infrared point light source.
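The frame-to-frame matching just described amounts to a nearest-neighbor search within a small displacement bound: each current-frame spot inherits the ID of the closest previous-frame spot, and any spot left unmatched must be the newly lit LED. A sketch under assumed data structures (the function name, the `max_shift` bound, and the greedy matching order are all hypothetical choices, not taken from the patent):

```python
import numpy as np

def match_spots(prev_spots, curr_spots, max_shift=5.0):
    """prev_spots: {led_id: (x, y)} centroids from the previous frame.
    curr_spots: list of (x, y) centroids in the current frame.
    Returns (matched {led_id: (x, y)}, list of unmatched current spots).
    Unmatched spots correspond to newly lit LEDs."""
    matched, unmatched = {}, []
    for c in curr_spots:
        best_id, best_d = None, max_shift
        for led_id, p in prev_spots.items():
            d = np.hypot(c[0] - p[0], c[1] - p[1])
            # accept only small translations, and each ID at most once
            if d < best_d and led_id not in matched:
                best_id, best_d = led_id, d
        if best_id is None:
            unmatched.append(c)
        else:
            matched[best_id] = c
    return matched, unmatched
```

A production tracker would typically solve the assignment globally (e.g. Hungarian algorithm) rather than greedily, but at a 30 ms sampling interval the spot displacements are small enough that a bounded nearest-neighbor match usually suffices.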
  • Once the IDs are determined, the processing unit 30 can call the PnP algorithm to obtain the spatial position of the helmet.
  • The PnP algorithm belongs to the prior art and is not described again here.
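Once every spot carries an ID, the identification phase's output can be turned into the 2D–3D correspondences a standard PnP solver consumes. The sketch below only assembles those correspondences; the LED layout is a hypothetical example, and in practice the arrays would be passed, together with the camera intrinsics, to an off-the-shelf solver such as OpenCV's `cv2.solvePnP`:

```python
import numpy as np

# Known 3D positions of the IR LEDs in the helmet's own frame, in
# metres (hypothetical layout for illustration only).
LED_MODEL = {
    0: (0.00, 0.00, 0.00),
    1: (0.10, 0.00, 0.00),
    2: (0.00, 0.08, 0.00),
    3: (0.05, 0.04, 0.03),
}

def build_pnp_correspondences(spot_ids):
    """spot_ids: {led_id: (u, v)} pixel centroids from the
    identification phase. Returns (object_points Nx3, image_points Nx2)
    arrays in a consistent order, ready for a PnP solver. PnP needs at
    least three correspondences, as the description notes."""
    ids = sorted(spot_ids)
    obj = np.array([LED_MODEL[i] for i in ids], dtype=np.float64)
    img = np.array([spot_ids[i] for i in ids], dtype=np.float64)
    return obj, img
```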
  • By sequentially lighting the infrared point light sources 13 and matching them to the corresponding spots on the images taken by the infrared camera 20, the present invention provides an accurate and efficient method for determining spot IDs.
  • When the virtual reality helmet 10 is stationary, the ID corresponding to the newly added spot can be judged directly by comparing the two consecutive frames.
  • When the helmet moves, the added spot and its ID are determined by accounting for the displacement, providing a method of identifying spot IDs while the virtual reality helmet is in a variety of motion states.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Disclosed is a positioning feature point identification method for use in a virtual reality space, comprising the following steps: confirming that all infrared point light sources on a virtual reality helmet have been turned off, and if not, turning off those that are lit; lighting one of the infrared point light sources on the virtual reality helmet, a processing unit recording the infrared point light source ID corresponding to a light spot on an image captured by an infrared camera; the virtual reality helmet keeping the infrared point light source lit in the previous frame in a lit state while also lighting a new infrared point light source, the processing unit determining the infrared point light source ID corresponding to the newly added light spot on the image captured by the infrared camera. By sequentially lighting the infrared point light sources and matching infrared point light source IDs to the light spots on an image captured by an infrared camera, the present method provides an accurate and efficient solution for determining light spot IDs.
PCT/CN2017/109795 2016-12-16 2017-11-07 Positioning feature point identification method for use in a virtual reality space Ceased WO2018107923A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611167337.7 2016-12-16
CN201611167337.7A CN106774992A (zh) 2016-12-16 2016-12-16 Virtual reality spatial positioning feature point identification method

Publications (1)

Publication Number Publication Date
WO2018107923A1 (fr) 2018-06-21

Family

ID=58891904

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/109795 Ceased WO2018107923A1 (fr) 2016-12-16 2017-11-07 Positioning feature point identification method for use in a virtual reality space

Country Status (2)

Country Link
CN (1) CN106774992A (fr)
WO (1) WO2018107923A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914716A (zh) * 2020-07-24 2020-11-10 深圳市瑞立视多媒体科技有限公司 主动光刚体识别方法、装置、设备及存储介质

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774992A (zh) * 2016-12-16 2017-05-31 深圳市虚拟现实技术有限公司 虚拟现实空间定位特征点识别方法
CN107390953A (zh) * 2017-07-04 2017-11-24 深圳市虚拟现实科技有限公司 虚拟现实手柄空间定位方法
CN107390952A (zh) * 2017-07-04 2017-11-24 深圳市虚拟现实科技有限公司 虚拟现实手柄特征点空间定位方法
CN107219963A (zh) * 2017-07-04 2017-09-29 深圳市虚拟现实科技有限公司 虚拟现实手柄图形空间定位方法和系统
CN115909115A (zh) * 2022-11-02 2023-04-04 歌尔科技有限公司 一种控制器定位方法、装置、设备及存储介质
CN115937725B (zh) * 2023-03-13 2023-06-06 江西科骏实业有限公司 空间交互装置的姿态显示方法、装置、设备及其存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662501A (zh) * 2012-03-19 2012-09-12 Tcl集团股份有限公司 光标定位系统、方法、被遥控装置及遥控器
CN104637080A (zh) * 2013-11-07 2015-05-20 深圳先进技术研究院 一种基于人机交互的三维绘图系统及方法
CN105867611A (zh) * 2015-12-29 2016-08-17 乐视致新电子科技(天津)有限公司 虚拟现实系统中的空间定位方法、装置及系统
US20160252976A1 (en) * 2015-02-26 2016-09-01 Konica Minolta Laboratory U.S.A., Inc. Method and apparatus for interactive user interface with wearable device
CN106200981A (zh) * 2016-07-21 2016-12-07 北京小鸟看看科技有限公司 一种虚拟现实系统及其无线实现方法
CN106200985A (zh) * 2016-08-10 2016-12-07 北京天远景润科技有限公司 桌面型个人沉浸虚拟现实交互设备
CN106774992A (zh) * 2016-12-16 2017-05-31 深圳市虚拟现实技术有限公司 虚拟现实空间定位特征点识别方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0813040A3 (fr) * 1996-06-14 1999-05-26 Xerox Corporation Mappage spatial à précision avec des signaux vidéo et infrarouges combinés
CN104834165B (zh) * 2012-03-21 2017-04-12 海信集团有限公司 一种投影屏幕上的激光点位置确定方法
CN103593051B (zh) * 2013-11-11 2017-02-15 百度在线网络技术(北京)有限公司 头戴式显示设备
CN105931272B (zh) * 2016-05-06 2019-04-05 上海乐相科技有限公司 一种运动对象追踪方法及系统

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102662501A (zh) * 2012-03-19 2012-09-12 Tcl集团股份有限公司 光标定位系统、方法、被遥控装置及遥控器
CN104637080A (zh) * 2013-11-07 2015-05-20 深圳先进技术研究院 一种基于人机交互的三维绘图系统及方法
US20160252976A1 (en) * 2015-02-26 2016-09-01 Konica Minolta Laboratory U.S.A., Inc. Method and apparatus for interactive user interface with wearable device
CN105867611A (zh) * 2015-12-29 2016-08-17 乐视致新电子科技(天津)有限公司 虚拟现实系统中的空间定位方法、装置及系统
CN106200981A (zh) * 2016-07-21 2016-12-07 北京小鸟看看科技有限公司 一种虚拟现实系统及其无线实现方法
CN106200985A (zh) * 2016-08-10 2016-12-07 北京天远景润科技有限公司 桌面型个人沉浸虚拟现实交互设备
CN106774992A (zh) * 2016-12-16 2017-05-31 深圳市虚拟现实技术有限公司 虚拟现实空间定位特征点识别方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914716A (zh) * 2020-07-24 2020-11-10 深圳市瑞立视多媒体科技有限公司 主动光刚体识别方法、装置、设备及存储介质
CN111914716B (zh) * 2020-07-24 2023-10-20 深圳市瑞立视多媒体科技有限公司 主动光刚体识别方法、装置、设备及存储介质

Also Published As

Publication number Publication date
CN106774992A (zh) 2017-05-31

Similar Documents

Publication Publication Date Title
WO2018107923A1 (fr) Positioning feature point identification method for use in a virtual reality space
CN110266916B (zh) 用于在眼睛追踪中处理眩光的方法和系统
US9406170B1 (en) Augmented reality system with activity templates
WO2002054217A1 (fr) Procede et dispositif de saisie de donnees en ecriture manuscrite, et procede et dispositif d'authentification
US20170004363A1 (en) Gaze tracking device and a head mounted device embedding said gaze tracking device
WO2018113433A1 (fr) Procédé de criblage et de localisation spatiale de points caractéristiques de réalité virtuelle
US11712619B2 (en) Handle controller
JP2017049762A (ja) システム及び方法
RU2012130358A (ru) Инструментальное средство освещения для создания световых сцен
JP6104143B2 (ja) 機器制御システム、および、機器制御方法
US8901855B2 (en) LED light illuminating control system and method
CN105467356B (zh) 一种高精度的单led光源室内定位装置、系统及方法
WO2019033322A1 (fr) Dispositif de commande portatif, et procédé et système de suivi et de localisation
TWI526879B (zh) 互動系統、遙控器及其運作方法
CN107219963A (zh) 虚拟现实手柄图形空间定位方法和系统
JP5799232B2 (ja) 照明制御装置
CN106599930B (zh) 虚拟现实空间定位特征点筛选方法
CN106648147A (zh) 虚拟现实特征点空间定位方法和系统
US9329679B1 (en) Projection system with multi-surface projection screen
JP4627052B2 (ja) 画像に連携した音声出力方法および装置
JP2014052813A (ja) 瞳孔検出装置および瞳孔検出方法
JP2016027448A (ja) 情報評価装置、情報評価方法、及びプログラム
US10691203B2 (en) Image sound output device, image sound output method and image sound output program
JP4409545B2 (ja) 三次元位置特定装置および方法、奥行位置特定装置
US12038159B2 (en) Method for creating XYZ focus paths with a user device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17880900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 04/11/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17880900

Country of ref document: EP

Kind code of ref document: A1