CN201220098Y - Head type controller for capturing and following virtual or remote target - Google Patents
- Publication number
- CN201220098Y (application CN 200820050747U)
- Authority
- CN
- China
- Prior art keywords
- infrared
- infrared receiver
- head
- controller
- control circuit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
- 230000001133 acceleration Effects 0.000 claims abstract description 17
- 230000035945 sensitivity Effects 0.000 claims abstract description 4
- 239000011521 glass Substances 0.000 claims description 16
- 239000004973 liquid crystal related substance Substances 0.000 claims description 7
- 239000000203 mixture Substances 0.000 claims description 2
- 210000003128 head Anatomy 0.000 description 13
- 230000033001 locomotion Effects 0.000 description 13
- 230000000007 visual effect Effects 0.000 description 8
- 238000010586 diagram Methods 0.000 description 6
- 238000000034 method Methods 0.000 description 6
- 230000005540 biological transmission Effects 0.000 description 3
- 238000001514 detection method Methods 0.000 description 3
- 210000000887 face Anatomy 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 238000012549 training Methods 0.000 description 2
- 239000013078 crystal Substances 0.000 description 1
- 238000007405 data analysis Methods 0.000 description 1
- 230000007812 deficiency Effects 0.000 description 1
- 230000004886 head movement Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000002093 peripheral effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
Landscapes
- Position Input By Displaying (AREA)
Abstract
The utility model discloses a head-mounted controller for capturing and following virtual or remote targets. The controller includes a control circuit characterized in that it is composed of a minimum control unit built around a single-chip microcontroller, an origin calibration unit for the system space coordinate system consisting of an infrared transmitter and an infrared receiver, a three-axis low-g acceleration sensor, and a standard RS-232 serial interface. The output of the infrared receiver, the X-, Y- and Z-axis acceleration signal outputs of the three-axis low-g acceleration sensor, and the sensor's sensitivity-selection inputs are each connected to I/O ports of the microcontroller; the RS-232 serial interface is connected to the microcontroller's read/write port. With the utility model, when the operator turns his or her face in a given direction, the "first-person" line of sight turns in the same direction, which greatly improves the success rate of capturing, tracking and monitoring targets in a virtual environment or in the environment of a remote robot.
Description
Technical Field
The utility model relates to a control device, and in particular to a follower controller designed to be operated by parts of the human body other than the hands and feet. It is suitable for capturing, tracking and monitoring targets in a game scene and targets within the visual range of a robot.
Background Art
In real life, people capture targets in their surroundings with the head working together with the visual system, but targets in a virtual environment must be captured with the hands or other limbs working together with the visual system. For example, movement in existing computer games is almost always operated and controlled through input devices such as a mouse, keyboard or gamepad; even target-capture tasks that in everyday life are performed by the head and visual system must instead be performed by the hands. This requires considerable training time and does little to train the whole-body coordination needed in real life. Another example is remote robot control: the operator spots a moving target in the video returned by a remote camera, but to make the robot track and monitor that target he must operate an input device such as a joystick or dial by hand. At that moment the operator loses the opportunity to control the robot's other actions or the other equipment the robot carries, which often causes the whole mission to fail. If the robot is one that performs capture or combat missions, it may even come under enemy attack.
These two examples show that to track and monitor a target in a virtual or remote environment, especially a moving target, the target must appear within the visual range of the "master" in that environment (such as the player in a first-person game or a remote-controlled robot). Besides keeping the "master" at the right distance from the target, the "master"'s line of sight must stay fixed on the target. The ideal way to achieve this is clearly to have the "master"'s line of sight change direction following the movement of the operator's head; the first technical problem to be solved is therefore detecting the spatial motion trajectory of the operator's head.
In the prior art, detection of the motion trajectory of the human body or its limbs is found mostly in the medical field. On February 23, 2005 the State Intellectual Property Office published "A method for detecting human motion trajectories" (publication number CN 1582851A). In that method, marker points are pasted on the surface of the body parts to be measured; differently colored markers are used on parts that are close together or cross during motion, and the marker colors differ from the color of the body surface; a color camera captures images of the subject walking on a track, a color image-acquisition card sends the captured data to a computer in real time, and a program on the computer identifies and tracks the markers and analyzes the data. The method of that application can thus only produce data for medical research; using the data for real-time follower control still has the following problems. First, the camera captures images of the target slowly, so using the processed data for follower control of target tracking in a virtual environment would clearly cause a large lag in the follow time of the "master" visual system. Second, the volume of image data to be processed and transmitted is enormous, further increasing that lag. Third, it is difficult to obtain an accurate three-dimensional trajectory of human limb motion with a single camera.
An article published by China Daily Global Online on February 28, 2008, "The U.S. launches a 'first-person' controlled combat robot", says that when the operator puts on a helmet fitted with a system called the "Head-Aimed Remote Viewer (HARV)", he sees through its eyepieces what the robot sees, and when he turns his face in one direction the camera mounted on the robot turns in the same direction. However, the article gives no description of the Head-Aimed Remote Viewer (HARV), so its control scheme cannot be verified.
Summary of the Utility Model
In view of the above shortcomings of the prior art, the technical problem the utility model aims to solve is to provide a control device that makes the "first-person" line of sight follow changes in the motion trajectory of the operator's head. The "first person" may be the player of a first-person game or a remote-controlled robot.
The technical solution of the utility model for the above problem is as follows:
A head-mounted controller for capturing and following virtual or remote targets, the controller including a control circuit, characterized in that the control circuit is composed of a minimum control unit built around a single-chip microcontroller, an origin calibration unit for the system space coordinate system consisting of an infrared transmitter and an infrared receiver, a three-axis low-g acceleration sensor, and a standard RS-232 serial interface, wherein:
the output of the infrared receiver, the X-, Y- and Z-axis acceleration signal outputs of the three-axis low-g acceleration sensor, and the sensitivity-selection inputs are each connected to I/O ports of the microcontroller;
the RS-232 serial interface is connected to the microcontroller's read/write port.
In the follower controller of the utility model, the serial interface can be connected through a local PC to a game server at the other end of the network, to other players' PCs, or to a robot, so that the "first-person" line of sight follows changes in the motion trajectory of the operator's head.
In the follower controller of the utility model, the infrared transmitter may be a single ordinary infrared light-emitting diode, or an ordinary infrared LED in series with an electronic switch. In the latter case the control terminal of the electronic switch is connected to an I/O port of the microcontroller, so that the microcontroller can add special coded information to the emission and improve interference immunity.
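The patent leaves the beacon coding open; as a sketch of how the microcontroller might add "special coded information" through the series switch, the following frames a payload behind a fixed preamble and a parity bit so that stray infrared sources are rejected. `PREAMBLE`, the frame layout, and both function names are illustrative assumptions, not part of the patent.

```c
#include <stdint.h>

#define PREAMBLE 0xA5u  /* hypothetical sync byte, not from the patent */

/* Pack a 16-bit frame: 8-bit preamble, 7-bit payload, even parity bit. */
uint16_t beacon_encode(uint8_t payload)
{
    payload &= 0x7Fu;
    uint8_t parity = 0;
    for (uint8_t b = payload; b; b >>= 1)
        parity ^= (uint8_t)(b & 1u);
    return (uint16_t)(((uint16_t)PREAMBLE << 8) | (payload << 1) | parity);
}

/* Return the payload (0-127) on success, -1 if preamble or parity fails. */
int beacon_decode(uint16_t frame)
{
    if ((frame >> 8) != PREAMBLE)
        return -1;
    uint8_t payload = (uint8_t)((frame >> 1) & 0x7Fu);
    uint8_t check = 0;
    for (uint8_t b = payload; b; b >>= 1)
        check ^= (uint8_t)(b & 1u);
    return (check == (uint8_t)(frame & 1u)) ? payload : -1;
}
```

A receiver that only accepts frames passing both checks would ignore uncoded infrared interference, which is the anti-interference benefit the paragraph above describes.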
The follower controller of the utility model may consist of a transmitting device together with a data helmet worn on the head, or of a transmitting device together with data glasses worn on the face. The infrared transmitter of the control circuit is placed in the transmitting device; the rest of the circuit is placed in the frame of the data glasses or in the shell of the data helmet, with the light-receiving face of the infrared receiver exposed through the frame or helmet shell, preferably facing straight ahead. With this design the transmitting device can be mounted on top of the computer monitor, so that the system can define the focal point of the two eyes, when the wearer faces straight ahead, as the origin of the system coordinates, making head movement while tracking a target closer to real life. A liquid crystal display may also be embedded in the frame of the data glasses, giving the player or robot operator a fully immersive experience.
Through the infrared transmitter and receiver, the follower controller of the utility model turns a particular posture of the head into the origin of the calibrated system space coordinate system, and uses a three-axis low-g acceleration sensor to sense the turning, tilting and pitching of the head, converting these motions into X-, Y- and Z-axis acceleration signals that control the movement of the game view or the sight direction of a remote robot. When the operator turns his face in one direction, the "first-person" line of sight turns in the same direction. This not only frees the hands but also matches human visual habits, greatly improving the success rate of capturing, tracking and monitoring targets in a virtual environment or in the environment of a remote robot. The follower controller also has a simple structure and low cost, and can be used in civilian video gaming as well as in public-security and military applications.
Brief Description of the Drawings:
Fig. 1 is a block diagram of the control circuit of one embodiment of the follower controller of the utility model;
Fig. 2 is a circuit schematic of the control circuit of one embodiment of the follower controller of the utility model;
Fig. 3 is a structural diagram of one embodiment of the follower controller; the dashed arrows indicate the direction of infrared transmission;
Fig. 4 is a structural diagram of another embodiment of the follower controller; the dashed arrows indicate the direction of infrared transmission;
Fig. 5 is a structural diagram of the data glasses 2 in Fig. 3;
Fig. 6 is a structural diagram of the data helmet 3 in Fig. 4.
Detailed Description of Embodiments:
Referring to Fig. 1, the control circuit of the follower controller of the utility model takes the control unit as its core, completed by peripheral unit circuits: the origin calibration unit of the system space coordinate system, the three-axis low-g acceleration sensor, and the standard RS-232 serial interface.
Referring to Fig. 2, the control unit consists of an ATMEL MEGA8 embedded microcontroller U1 together with a conventional oscillator formed by a 16 MHz crystal and capacitors. The infrared transmitter of the origin calibration unit consists of a commercially available infrared light-emitting diode (IR-LED) in series with an electronic switch formed by an NPN transistor; the infrared receiver of the origin calibration unit is an SFH506-38 integrated receiver-amplifier module. The signal output of the SFH506-38 is connected to port PB0 of U1, and the base of the NPN transistor is connected to port PB1. The three-axis low-g acceleration sensor U2 is a Freescale MMA7260Q; its X, Y and Z outputs are connected to ADC ports PC0-PC2 of U1, its sensitivity-selection inputs Select1 and Select2 to ports PB2 and PB3, and its sleep-mode control pin to port PD7. The RS-232 serial interface U3 uses the common MAX232, whose transceiver pair T1IN and T1OUT is connected to the read/write ports RXD and TXD of U1.
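A minimal sketch of the per-axis signal conversion implied by this wiring, using the MMA7260Q's datasheet nominals (1.65 V zero-g offset, 800 mV/g in the 1.5 g range) and assuming the MEGA8's 10-bit ADC is referenced to a 3.3 V supply; the patent itself gives no numbers, so treat the constants as assumptions.

```c
/* Convert a 10-bit ADC reading from one MMA7260Q axis to acceleration
 * in g, under the stated reference-voltage and sensitivity assumptions. */
double mma7260q_counts_to_g(unsigned counts)
{
    const double vref = 3.3;  /* assumed ADC reference voltage, V      */
    const double v0g  = 1.65; /* nominal zero-g output, V              */
    const double sens = 0.8;  /* nominal sensitivity at 1.5 g, V per g */
    double volts = (double)counts * vref / 1024.0;
    return (volts - v0g) / sens;
}
```

With these numbers a mid-scale reading of 512 counts corresponds to 0 g, and each g of acceleration moves the reading by roughly 248 counts.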
Referring to Figs. 3 and 5, in this example the follower controller consists of the transmitting device 1 and data glasses 2 worn on the face. The infrared transmitter of the control circuit is placed in the transmitting device 1; the control unit, the three-axis low-g acceleration sensor, the standard RS-232 serial interface and the infrared receiver are placed in the frame 2-1 of the data glasses 2, with the light-receiving face of the infrared receiver exposed through a small window 2-2 in the middle of the upper edge of the frame 2-1. A liquid crystal display 2-3 is also embedded in the frame 2-1, with its screen facing the temple side 2-4, so that the wearer can watch the game view or the scene sent back by the remote robot directly on the display 2-3.
Referring to Figs. 4 and 6, in this example the follower controller consists of the transmitting device 1 and a data helmet 3 worn on the head. The infrared transmitter of the control circuit is placed in the transmitting device 1; the control unit, the three-axis low-g acceleration sensor, the standard RS-232 serial interface and the infrared receiver are placed in the shell 3-1 of the data helmet 3, with the light-receiving face of the infrared receiver exposed through a small window 3-2 in the middle of the front of the shell 3-1.
Taking the follower controller of Figs. 3 and 5 as an example and referring also to Fig. 2, the following briefly describes how the follower controller of the utility model captures a target in a remote environment through the robot's vision system and eliminates accumulated system error.
In use, the data glasses 2 of Fig. 3 are worn on the face and connected to a computer network system, and the infrared transmitting device 1 is placed on any fixed object in front of the wearer, such as the top of the computer monitor. After power-up and initialization, the microcontroller U1 continuously polls the ICP port connected to the output of the SFH506-38 infrared receiver. As soon as the small window 2-2 on the data glasses is aligned with the infrared transmitting device 1 and the origin calibration signal of the system space coordinate system is received, U1 records the voltage values then present on the X, Y and Z outputs of U2 and uses them as the coordinate origin to construct the system space coordinate system. Thereafter, as the wearer captures a target appearing on the liquid crystal display 2-3, the head naturally turns left and right and pitches up and down; the X, Y and Z outputs of U2 output the corresponding voltages in step with the head motion, and U1 converts these voltages into relative changes within the system space coordinate system. Passed through the computer and network system, these relative changes control the remote robot's vision system so that its line of sight moves in synchrony with the head of the wearer of the data glasses 2.
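The patent does not specify the wire format used on the RS-232 link. Purely as an illustration, the relative changes could be framed as below, with a hypothetical header byte and XOR checksum; every constant and name here is an assumption.

```c
#include <stdint.h>
#include <stddef.h>

enum { PKT_LEN = 8, PKT_HDR = 0x55 };  /* illustrative frame constants */

/* Pack three signed 16-bit relative changes into an 8-byte frame. */
void pack_rel(uint8_t out[PKT_LEN], int16_t dx, int16_t dy, int16_t dz)
{
    out[0] = PKT_HDR;
    out[1] = (uint8_t)(dx >> 8); out[2] = (uint8_t)dx;
    out[3] = (uint8_t)(dy >> 8); out[4] = (uint8_t)dy;
    out[5] = (uint8_t)(dz >> 8); out[6] = (uint8_t)dz;
    uint8_t sum = 0;
    for (size_t i = 0; i < PKT_LEN - 1; i++) sum ^= out[i];
    out[7] = sum;  /* XOR checksum over header and payload */
}

/* Return 0 and fill dx/dy/dz on success, -1 on a bad header or checksum. */
int unpack_rel(const uint8_t in[PKT_LEN], int16_t *dx, int16_t *dy, int16_t *dz)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < PKT_LEN - 1; i++) sum ^= in[i];
    if (in[0] != PKT_HDR || sum != in[7]) return -1;
    *dx = (int16_t)(((uint16_t)in[1] << 8) | in[2]);
    *dy = (int16_t)(((uint16_t)in[3] << 8) | in[4]);
    *dz = (int16_t)(((uint16_t)in[5] << 8) | in[6]);
    return 0;
}
```

The host PC would run `unpack_rel` on each frame arriving from the serial port before forwarding the deltas to the game or robot over the network.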
During the above follower control the system inevitably accumulates error, degrading control accuracy. Another important function of the origin calibration unit of the system space coordinate system in the follower controller is to eliminate this accumulated error, as follows. After the origin has been calibrated, the wearer's head, as it turns left and right and pitches up and down, is bound to bring the small window 2-2 on the data glasses into alignment with the infrared transmitting device 1 again. Each time the two are aligned, that is, each time U1 detects the origin calibration signal, the relative changes within the system space coordinate system are reset to zero, returning to the state at origin calibration. For example, suppose that at origin calibration the wearer faced straight ahead, the window 2-2 was aligned with the transmitting device 1, and the robot's line of sight also pointed straight ahead of the robot. From then on, whenever the wearer of the data glasses 2 faces straight ahead again, the robot's line of sight automatically returns to straight ahead, whatever direction it was pointing at the time.
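The calibrate-then-reset behaviour described in the last two paragraphs can be sketched as a pure function, with the infrared detection and ADC reads abstracted into parameters. All names are illustrative; the patent describes the behaviour, not code.

```c
#include <stdbool.h>

/* State of the head tracker: the origin recorded at calibration and the
 * relative change that would be sent to the host each cycle. */
typedef struct {
    bool calibrated;
    int base_x, base_y, base_z;  /* counts recorded at the origin pose */
    int rel_x, rel_y, rel_z;     /* relative change reported to host   */
} tracker_state;

/* One polling cycle: origin_seen models the IR receiver detecting the
 * beacon; x, y, z are raw axis readings in ADC counts. */
void tracker_step(tracker_state *s, bool origin_seen, int x, int y, int z)
{
    if (origin_seen) {
        /* First detection calibrates the origin; every later detection
         * zeroes the relative change, cancelling accumulated error. */
        s->base_x = x; s->base_y = y; s->base_z = z;
        s->rel_x = s->rel_y = s->rel_z = 0;
        s->calibrated = true;
    } else if (s->calibrated) {
        s->rel_x = x - s->base_x;
        s->rel_y = y - s->base_y;
        s->rel_z = z - s->base_z;
    }
}
```

Re-basing on every beacon detection is what keeps drift from accumulating: the output is forced back to zero each time the wearer faces the transmitter, exactly as the paragraph above describes.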
The working process by which the follower controller of the utility model captures a target in a virtual environment can be analyzed by reference to the description above.
Claims (4)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN 200820050747 CN201220098Y (en) | 2008-07-16 | 2008-07-16 | Head type controller for capturing and following virtual or remote target |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN 200820050747 CN201220098Y (en) | 2008-07-16 | 2008-07-16 | Head type controller for capturing and following virtual or remote target |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN201220098Y true CN201220098Y (en) | 2009-04-15 |
Family
ID=40573618
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN 200820050747 Expired - Fee Related CN201220098Y (en) | 2008-07-16 | 2008-07-16 | Head type controller for capturing and following virtual or remote target |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN201220098Y (en) |
- 2008-07-16: application CN 200820050747 filed; granted as CN201220098Y (status: not active, Expired - Fee Related)
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101890719A (en) * | 2010-07-09 | 2010-11-24 | 中国科学院深圳先进技术研究院 | Robot remote control device and robot system |
| CN101890719B (en) * | 2010-07-09 | 2015-06-03 | 中国科学院深圳先进技术研究院 | Robot remote control device and robot system |
| CN102043410A (en) * | 2010-09-30 | 2011-05-04 | 清华大学 | Servo system for instructing pan-tilt system of unmanned aerial vehicle (UAV) by adopting head movement of operator |
| CN104750220A (en) * | 2011-07-20 | 2015-07-01 | 谷歌公司 | Determining Whether A Wearable Device Is Used |
| CN102358369A (en) * | 2011-07-22 | 2012-02-22 | 路海燕 | Intelligent tracking type electric remote-control wheelbarrow |
| WO2014121657A1 (en) * | 2013-02-07 | 2014-08-14 | Ma Kali | Prop for shooting game |
| CN107209009A (en) * | 2015-01-26 | 2017-09-26 | 普乐福尼克·迪特·布什股份公司 | Two main bodys are positioned by the calibration system with data glasses |
| WO2018058882A1 (en) * | 2016-09-30 | 2018-04-05 | 深圳市虚拟现实科技有限公司 | Method and system for automatic correction of attitude measurement device |
| WO2018058881A1 (en) * | 2016-09-30 | 2018-04-05 | 深圳市虚拟现实科技有限公司 | Method and system for automatic correction of attitude measurement device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN201220098Y (en) | Head type controller for capturing and following virtual or remote target | |
| US10191559B2 (en) | Computer interface for manipulated objects with an absolute pose detection component | |
| US7826641B2 (en) | Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features | |
| US9692990B2 (en) | Infrared tracking system | |
| US9892563B2 (en) | System and method for generating a mixed reality environment | |
| US9229540B2 (en) | Deriving input from six degrees of freedom interfaces | |
| KR101036403B1 (en) | Object detection using video input combined with tilt angle information | |
| AU2016407050A1 (en) | Multi -joint tracking combining embedded sensors and an external | |
| CN206224385U (en) | A kind of motion capture system with positioning function for reality environment | |
| CN108151738B (en) | Coding active light identification ball with posture resolving function | |
| CN102169366A (en) | Multi-target tracking method in three-dimensional space | |
| CN102221887A (en) | Interactive projection system and method | |
| US11944897B2 (en) | Device including plurality of markers | |
| CN103207667A (en) | Man-machine interaction control method and application thereof | |
| CN105749525A (en) | Basketball training device based on AR technology | |
| CN110038258A (en) | A kind of omnidirectional's treadmill and its virtual reality implementation method | |
| CN206819290U (en) | A kind of system of virtual reality multi-person interactive | |
| CN206741431U (en) | Desktop type space multistory interactive system | |
| Dong et al. | SEVAR: a stereo event camera dataset for virtual and augmented reality | |
| US20230349693A1 (en) | System and method for generating input data from pose estimates of a manipulated object by using light data and relative motion data | |
| Xu | A review on application of motion sensing technology in human-computer interaction | |
| CN206209591U (en) | Desktop type individual immerses virtual reality interactive device | |
| CN207503173U (en) | A kind of interactive holographic blackboard system | |
| CN108093222A (en) | A kind of hand-held VR imaging methods | |
| CN111158466A (en) | AI glasses sensing interaction method, system, medium and equipment suitable for intelligent chess |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | C14 | Grant of patent or utility model | |
| | GR01 | Patent grant | |
| | C17 | Cessation of patent right | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2009-04-15; Termination date: 2011-07-16 |