
TWM555001U - System, wearable device and flying device for providing semi-holographic images

Info

Publication number: TWM555001U
Authority: TW (Taiwan)
Application number: TW106215339U
Prior art keywords: space, wearable device, flying, flight, user
Other languages: Chinese (zh)
Inventor: 鍾現
Original Assignee: 華碩電腦股份有限公司 (ASUSTeK Computer Inc.)
Filing date: 2017-10-18
Priority date: 2017-10-18
Publication date: 2018-02-01
Application filed by 華碩電腦股份有限公司
Priority to TW106215339U
Publication of TWM555001U

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a system, a wearable device, and a flying device for displaying semi-holographic images. The system includes a wearable device and a flying device. The wearable device, operated in a first space, is configured to display a simulated scenario. The flying device, operated in a second space, is in communication with the wearable device. The flying device is configured to obtain images of a physical object from a first coordinate in the second space, wherein the first coordinate corresponds to a first position in the first space. When the wearable device moves from the first position to a second position in the first space, the flying device moves from the first coordinate to a second coordinate in the second space, wherein the second coordinate corresponds to the second position. While the flying device moves from the first coordinate to the second coordinate, it keeps obtaining images of the physical object in the second space, and the wearable device displays a virtual object corresponding to the physical object based on those images.

Description

Semi-holographic image display system, flying vehicle for providing semi-holographic images, and wearable device

The present disclosure relates to an image display system, and more particularly to a semi-holographic image display system.

A typical holographic projection effect is produced by reflecting multi-view planar images off a light-transmissive material. However, the size of objects that such a technique can display is limited, and the projected image cannot be viewed from certain angles. Another technique, which creates an on-stage holographic effect through refraction by a holographic film, requires a very large venue and displays the projected image well only in a dark environment. A further technique provides holographic effects through large-scale light-field capture, but it is not only constrained by venue: the amount of data to be transmitted and the hardware cost are very high, making it difficult to popularize. None of these techniques can therefore display a holographic image while also letting the user interact and communicate with it.

One embodiment of the present disclosure relates to a semi-holographic image display system, which includes a wearable device for displaying a virtual environment scene, and a flying vehicle communicatively coupled to the wearable device for photographing a physical object and displaying the virtual environment scene on the wearable device. When the wearable device moves from a first location to a second location in a first space, the flying vehicle correspondingly moves, in a second space, to a second position relative to a first position in the first space.

One embodiment of the present disclosure relates to a flying vehicle for providing semi-holographic images, adapted to be communicatively coupled to a wearable device through a network server. The flying vehicle includes: a flight assembly; an optical detector for photographing a physical object; a first processor, electrically coupled to the optical detector and the flight assembly, for generating object surface information corresponding to the physical object captured by the optical detector; and a first signal transceiver, electrically coupled to the first processor, for transmitting the object surface information to the network server and receiving a movement instruction from the network server. The first processor controls the flight assembly according to the movement instruction.

One embodiment of the present disclosure relates to a wearable device for providing semi-holographic images, adapted to be communicatively coupled to a flying vehicle through a network server, the flying vehicle being used to generate object surface information. The wearable device includes: a display interface for displaying a virtual environment scene; a motion detector for detecting movement of the wearable device in a first space; a second processor, electrically coupled to the display interface and the motion detector, for generating movement information corresponding to the movement detected by the motion detector; and a second signal transceiver, electrically coupled to the second processor, for transmitting the movement information to the network server and for receiving, through the network server, the object surface information transmitted from the flying vehicle. The second processor is further used to control the display interface, according to the object surface information, to display a virtual object in the virtual environment scene.

Accordingly, the technical content of the present disclosure alleviates the problems of the prior art, which is constrained by space and difficult to set up, and further remedies the prior art's inability to let the user interact and communicate with a holographic image.

100‧‧‧flying vehicle
101‧‧‧optical detector
102‧‧‧flight assembly
103, 203‧‧‧processor
104, 204‧‧‧signal transceiver
200‧‧‧wearable device
201‧‧‧display interface
202‧‧‧motion detector
300‧‧‧network server
A‧‧‧first space
B‧‧‧second space
F1‧‧‧first flying vehicle
F2‧‧‧second flying vehicle
G1‧‧‧first wearable device
G2‧‧‧second wearable device
U1, U2‧‧‧users
PN1‧‧‧first location
PN2‧‧‧second location
PN3‧‧‧third location
CN1‧‧‧first coordinate
CN2‧‧‧second coordinate
CN3‧‧‧third coordinate
VU1‧‧‧first virtual character
VU2‧‧‧second virtual character
D1, D3, D5, D8‧‧‧shooting directions
D2, D4, D6, D7‧‧‧pointing directions

FIG. 1 is a schematic diagram of a flying vehicle for providing semi-holographic images according to some embodiments of the present disclosure; FIG. 2 is a schematic diagram of a flying vehicle for providing semi-holographic images according to some embodiments of the present disclosure; FIG. 3 is a schematic diagram of a wearable device for providing semi-holographic images according to some embodiments of the present disclosure; FIG. 4 is a schematic diagram of a wearable device for providing semi-holographic images according to some embodiments of the present disclosure; FIG. 5 is a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure; FIG. 6 is a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure; FIG. 7 is a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure; and FIG. 8 is a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure.

As used herein, the terms "first", "second", and so on do not specifically denote an order or sequence, nor are they intended to limit the present disclosure; they merely distinguish elements or operations described with the same technical terms.

As used herein, "coupled" or "connected" may mean that two or more elements or devices are in direct or indirect physical contact with each other, or that two or more elements or devices operate or act on one another.

The semi-holographic image display system, the flying vehicle for providing semi-holographic images, and the wearable device for providing semi-holographic images disclosed herein are described below with reference to schematic diagrams of the semi-holographic image display system.

Please refer to FIG. 1, which is a schematic diagram of a flying vehicle for providing semi-holographic images according to some embodiments of the present disclosure. In one embodiment, the semi-holographic image display system includes a flying vehicle 100. As shown in the figure, the flying vehicle 100 is a quadcopter, but the present disclosure is not limited thereto; any aircraft that can circle and hover in the air may serve as the flying vehicle 100. The flying vehicle 100 has an optical detector 101, which is a camera capable of acquiring depth information, for example a camera equipped with dual lenses for depth sensing. It should be understood that the flying vehicle 100 shown in FIG. 1 is only one possible implementation of the present disclosure. In other embodiments, the flying vehicle 100 may carry a mobile device whose camera unit can acquire depth information; a flying vehicle 100 carrying such a mobile device achieves the same effect as the flying vehicle 100 of FIG. 1.

Please refer to FIG. 2, which is a schematic diagram of a flying vehicle for providing semi-holographic images according to some embodiments of the present disclosure. In one embodiment, the semi-holographic image display system includes a flying vehicle 100. As shown in FIG. 2, the flying vehicle 100 includes an optical detector 101, a flight assembly 102, a processor 103, and a signal transceiver 104. The appearance of the flying vehicle 100 is that of the quadcopter in the embodiment of FIG. 1. The optical detector 101 is a camera for capturing images with depth information and, as in the embodiment of FIG. 1, is disposed on one side of the flying vehicle 100. The processor 103 is electrically coupled to the optical detector 101 and the flight assembly 102 and receives and processes the depth images captured by the optical detector 101. The flight assembly 102 is controlled by the processor 103 to make the flying vehicle 100 hover, move, or rotate in the air; in this embodiment, the rotors of the flying vehicle 100 shown in FIG. 1 are part of the flight assembly 102. The signal transceiver 104 may be a wireless network transceiver or a Bluetooth transceiver; it is also electrically coupled to the processor 103, so the processor 103 can receive or send information through the signal transceiver 104.
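
The division of labour just described — the optical detector produces depth frames, the processor turns them into object surface information and obeys movement instructions, and the signal transceiver exchanges both with the network server — can be pictured as a simple control loop. The sketch below is illustrative only: the class names and the capture_depth_frame/send/receive/apply methods are hypothetical stand-ins, not part of the disclosure.

```python
import time
from dataclasses import dataclass

@dataclass
class DepthFrame:
    """One capture from optical detector 101: colour pixels plus per-pixel depth."""
    rgb: list          # placeholder for image data
    depth: list        # placeholder for per-pixel depth values (metres)
    timestamp: float

class FlyingVehicle:
    """Minimal sketch of flying vehicle 100 (hypothetical API)."""

    def __init__(self, optical_detector, flight_assembly, transceiver):
        self.optical_detector = optical_detector   # element 101
        self.flight_assembly = flight_assembly     # element 102
        self.transceiver = transceiver             # element 104

    def surface_info(self, frame: DepthFrame) -> dict:
        # Processor 103: turn a depth frame into "object surface information".
        return {"type": "surface_info", "rgb": frame.rgb,
                "depth": frame.depth, "t": frame.timestamp}

    def step(self):
        # 1. Photograph the physical object and send the derived surface information.
        frame = self.optical_detector.capture_depth_frame()
        self.transceiver.send(self.surface_info(frame))

        # 2. Receive a movement instruction relayed by the network server
        #    and hand it to the flight assembly.
        instruction = self.transceiver.receive(timeout=0.0)
        if instruction is not None:
            self.flight_assembly.apply(instruction)   # hover / move / rotate

    def run(self, rate_hz: float = 30.0):
        while True:
            self.step()
            time.sleep(1.0 / rate_hz)
```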

Please refer to FIG. 3, which is a schematic diagram of a wearable device for providing semi-holographic images according to some embodiments of the present disclosure. In one embodiment, the wearable device 200 is a head-mounted device with optical lenses (helmet-mounted display, HMD), but the present disclosure is not limited thereto; any wearable device that can present a virtual environment (such as augmented reality, virtual reality, or mixed reality) may serve as the wearable device 200. The wearable device 200 has a display interface 201 disposed on the optical lenses shown in the figure. When a user wears the wearable device 200, the virtual images to be presented by the wearable device 200 can be viewed through the display interface 201. Because the optical lenses of the display interface 201 have a certain transparency, ambient light from the real world also passes through them, so the user can view the real-world scene at the same time.

Please refer to FIG. 4, which is a schematic diagram of a wearable device for providing semi-holographic images according to some embodiments of the present disclosure. In one embodiment, the semi-holographic image display system includes a wearable device 200. As shown in the figure, the wearable device 200 includes a display interface 201, a motion detector 202, a processor 203, and a signal transceiver 204. The display interface 201 is used to display virtual images and, as in the embodiment of FIG. 3, is disposed on the optical lenses of the wearable device 200. The motion detector 202 is disposed inside the wearable device 200 to detect movement of the wearable device 200; for example, it can detect translation along the three spatial axes, rotation, or changes in magnetic field. The processor 203 is electrically coupled to the display interface 201 and the motion detector 202; it receives the movement information about the wearable device 200 acquired by the motion detector 202 and controls the display interface 201 to display the virtual images. The signal transceiver 204 may be a wireless network transceiver or a Bluetooth transceiver and is electrically coupled to the processor 203, so the processor 203 can receive or send information through the signal transceiver 204.
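
The wearable-device side mirrors the earlier sketch: the motion detector produces movement information that is pushed to the network server, while object surface information arriving from the remote flying vehicle drives what the display interface shows. Again, every class and method name here is a hypothetical placeholder used only to illustrate the data flow, not an API defined by the disclosure.

```python
class WearableDevice:
    """Minimal sketch of wearable device 200 (hypothetical API)."""

    def __init__(self, display, motion_detector, transceiver):
        self.display = display                   # display interface 201
        self.motion_detector = motion_detector   # element 202
        self.transceiver = transceiver           # element 204

    def step(self):
        # Processor 203: package the detected movement (translation along the
        # three axes, rotation, height change) and send it to the server.
        movement = self.motion_detector.read()   # e.g. {"pos": ..., "yaw": ...}
        self.transceiver.send({"type": "movement_info", **movement})

        # Receive object surface information relayed from the flying vehicle
        # and render the corresponding virtual object in the virtual scene.
        surface_info = self.transceiver.receive(timeout=0.0)
        if surface_info is not None and surface_info.get("type") == "surface_info":
            self.display.render_virtual_object(surface_info)
```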

Please refer to FIG. 5, a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure. As shown, the setting of this embodiment comprises two distinct spaces: a first space A, drawn on the left of the figure, and a second space B, drawn on the right. User U1 is in the first space A, wearing the first wearable device G1 and standing at a first location PN1, while user U2 is in the second space B, wearing the second wearable device G2 and standing at a second location PN2. In this embodiment, the appearance, internal structure, and basic operation of the first wearable device G1 and the second wearable device G2 are the same as those of the wearable device 200 shown in FIGS. 3 and 4, so reference is made to those embodiments and the details are not repeated here. In this embodiment, the first flying vehicle F1 operates in the first space A and the second flying vehicle F2 operates in the second space B; their appearance, internal structure, and basic operation are the same as those of the flying vehicle 100 shown in FIGS. 1 and 2, so reference is made to those embodiments and the details are not repeated here.

In this embodiment, the first wearable device G1, the second wearable device G2, the first flying vehicle F1, and the second flying vehicle F2 are all communicatively coupled to the network server 300. The first flying vehicle F1 and the second flying vehicle F2 send information to, and receive information from, the network server 300 through the signal transceiver 104 shown in FIG. 1 or 2, while the first wearable device G1 and the second wearable device G2 do so through the signal transceiver 204 shown in FIG. 3 or 4.
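
In this arrangement the network server 300 acts essentially as a relay: movement information from a wearable device is forwarded to the flying vehicle in the other space, and surface information from a flying vehicle is forwarded to the wearable device in the other space. The pairing table below is a minimal sketch of one such routing scheme; the disclosure does not spell out how the server routes messages, so this is an assumption made only for illustration.

```python
class RelayServer:
    """Hypothetical sketch of network server 300: forwards messages between
    the paired devices of the two spaces (G1 with F2, G2 with F1)."""

    def __init__(self):
        # Each device's counterpart operates in the *other* space.
        self.pairs = {"G1": "F2", "G2": "F1", "F1": "G2", "F2": "G1"}
        self.outboxes = {name: [] for name in self.pairs}

    def submit(self, sender: str, message: dict):
        # Queue the message for the sender's counterpart.
        self.outboxes[self.pairs[sender]].append(message)

    def poll(self, receiver: str):
        # Devices periodically fetch whatever has been relayed to them.
        queued, self.outboxes[receiver] = self.outboxes[receiver], []
        return queued

# Example: movement info reported by G1 ends up queued for F2.
server = RelayServer()
server.submit("G1", {"type": "movement_info", "pos": (1.0, 0.0, 1.6)})
print(server.poll("F2"))   # [{'type': 'movement_info', 'pos': (1.0, 0.0, 1.6)}]
```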

請參照第5圖的左半部,在本實施例中,當第一飛行載具F1運作時,其透過如第1或2圖所示的光學偵測器101偵測第一空間A的環境配置,例如第一空間A的房間形狀、裝潢、物件或人物等等,並據以產生關於第一空間A的空間資訊。第一飛行載具F1具有多向自由度的定位功能,例如六向自由度(Six Degrees of Freedom)或九向自由度(Nine Degrees of Freedom),故第一飛行載具F1的處理器103可利用同步定位與地圖構建(Simultaneous localization and mapping,SLAM)演算法運算空間資訊,以為第一空間A中的各個地點指派複數座標,所述座標可以是三維空間座標。當第一飛行載具F1完成座標指派後,其將可依據第一空間A中的各個座標為參照點進行移動。如第5圖的左半部及右半部所示,第一飛行載具F1懸停於第二座標CN2,此第二座標CN2係為對應第二空間B當中第二穿戴裝置G2所處的第二地點PN2,且其懸停於空中的拍攝方向D1亦對應第二穿戴裝置G2之指向方向D2。第一飛行載具F1可於其所在角度拍攝使用者U1的正面影像。 Referring to the left half of FIG. 5, in the present embodiment, when the first flying vehicle F1 is operated, it detects the environment of the first space A through the optical detector 101 as shown in FIG. 1 or 2. The configuration, for example, the room shape, the decoration, the object or the character of the first space A, and the like, and accordingly generates spatial information about the first space A. The first flying vehicle F1 has a multi-directional degree of freedom positioning function, such as Six Degrees of Freedom or Nine Degrees of Freedom, so the processor 103 of the first flying vehicle F1 can Use synchronous positioning and map construction (Simultaneous The localization and mapping (SLAM) algorithm computes spatial information to assign a plurality of coordinates to each location in the first space A, which may be a three-dimensional coordinate. When the first flight vehicle F1 completes the coordinate assignment, it will be able to move according to each coordinate in the first space A as a reference point. As shown in the left and right halves of FIG. 5, the first flying vehicle F1 is hovered over the second coordinate CN2, and the second coordinate CN2 is corresponding to the second wearing device G2 of the second space B. The second location PN2, and its shooting direction D1 hovering in the air also corresponds to the pointing direction D2 of the second wearing device G2. The first flying vehicle F1 can capture a frontal image of the user U1 at its angle.
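
One way to realise the "corresponding coordinate" relationship described here is to express each location as an offset from a reference origin established during the SLAM mapping pass and to reuse that offset in the other space's frame. The sketch below assumes each space simply has its own origin and heading; the actual correspondence rule is not specified by the disclosure, so the function name and the mapping convention are assumptions for illustration only.

```python
import math

def to_corresponding_coordinate(point, yaw, src_origin, src_heading,
                                dst_origin, dst_heading):
    """Map an (x, y, z) point and facing angle expressed in the source space's
    SLAM frame to the corresponding coordinate and shooting direction in the
    destination space's frame (assumed rule: same offset from each origin)."""
    # Offset of the point from the source-space origin, rotated into a
    # heading-neutral frame.
    dx, dy, dz = (p - o for p, o in zip(point, src_origin))
    c, s = math.cos(-src_heading), math.sin(-src_heading)
    nx, ny = c * dx - s * dy, s * dx + c * dy

    # Re-apply the destination space's heading and origin.
    c, s = math.cos(dst_heading), math.sin(dst_heading)
    mx, my = c * nx - s * ny, s * nx + c * ny
    corresponding = (dst_origin[0] + mx, dst_origin[1] + my, dst_origin[2] + dz)
    corresponding_yaw = yaw - src_heading + dst_heading
    return corresponding, corresponding_yaw

# The user wearing G1 stands 2 m in front of space A's origin at head height
# 1.6 m; the second flying vehicle F2 should hover at the matching spot of
# space B, whose frame is offset and rotated by 90 degrees in this example.
cn, d = to_corresponding_coordinate((2.0, 0.0, 1.6), 0.0,
                                    (0.0, 0.0, 0.0), 0.0,
                                    (5.0, 3.0, 0.0), math.pi / 2)
print(cn, d)   # (5.0, 5.0, 1.6) and a shooting direction of pi/2
```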

Continuing with the right half of FIG. 5, in the same way, when the second flying vehicle F2 operates it detects the environment configuration of the second space B through the optical detector 101 shown in FIG. 1 or 2 and generates spatial information about the second space B accordingly. The second flying vehicle F2 also has a multi-degree-of-freedom (for example six- or nine-degree-of-freedom) positioning capability, and its processor 103 likewise runs the simultaneous localization and mapping algorithm on the spatial information to assign coordinates to the locations in the second space B. Once the second flying vehicle F2 has completed the coordinate assignment, it can move using the coordinates of the second space B as reference points. As shown in the left and right halves of FIG. 5, the second flying vehicle F2 hovers at the first coordinate CN1, which corresponds to the first location PN1 where the first wearable device G1 is situated in the first space A, and its shooting direction D3 while hovering corresponds to the pointing direction D4 of the first wearable device G1. The second flying vehicle F2 can therefore capture a frontal image of user U2 from that angle.

Please refer to FIG. 6, a schematic diagram of a semi-holographic image display system according to some embodiments of the present disclosure; its setting continues the embodiment of FIG. 5. Referring to the left half of FIG. 6, in this embodiment, when user U1 wears the first wearable device G1, the first wearable device G1 displays a first virtual environment scene for user U1 to view. Referring also to the embodiment of FIG. 5, because the second flying vehicle F2 photographs user U2 from the first coordinate CN1 in the second space B and obtains a frontal image of user U2, that image, after being processed by the processor 103 of the second flying vehicle F2, can be sent as frontal image information of user U2 to the network server 300 through the signal transceiver 104 shown in FIG. 1 or 2. The first wearable device G1 can then obtain the frontal image information of user U2 from the network server 300 and display it in the first virtual environment scene as a second virtual character VU2. Because the frontal image of user U2 obtained by the second flying vehicle F2 carries depth information, the second virtual character VU2 that user U1 sees on the display interface 201 of the first wearable device G1 appears as a stereoscopic frontal view of user U2. In addition, in this embodiment the height at which the second flying vehicle F2 flies or hovers in the second space B corresponds to the height at which user U1 wears the first wearable device G1 in the first space A, so that the first wearable device G1 can display the first virtual environment scene to user U1 at the correct height.
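
The reason a depth-capable camera suffices to present the remote user as a volumetric, stereoscopic figure is that each pixel with a depth value can be back-projected into a 3-D point using the pinhole camera model. A small worked example follows; the pinhole intrinsics (focal lengths fx, fy and principal point cx, cy) are assumptions, since the disclosure does not specify camera parameters.

```python
def back_project(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) with depth 'depth_m' (metres)
    becomes a 3-D point (x, y, z) in the camera frame of optical detector 101."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return (x, y, depth_m)

# A pixel 100 columns to the right of the principal point, seen 2 m away with
# a 500-pixel focal length, sits 0.4 m to the camera's right: 100 / 500 * 2.
print(back_project(420, 240, 2.0, 500.0, 500.0, 320.0, 240.0))  # (0.4, 0.0, 2.0)
```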

Continuing with the right half of FIG. 6, in this embodiment, when user U2 wears the second wearable device G2, the second wearable device G2 displays a second virtual environment scene for user U2 to view. Referring again to the embodiment of FIG. 5, because the first flying vehicle F1 photographs user U1 from the second coordinate CN2 in the first space A and obtains a frontal image of user U1, that image, after being processed by the processor 103 of the first flying vehicle F1, can be sent as frontal image information of user U1 to the network server 300 through the signal transceiver 104. The second wearable device G2 can then obtain the frontal image information of user U1 from the network server 300 and display it in the second virtual environment scene as a first virtual character VU1, whose appearance corresponds to a stereoscopic frontal view of user U1. Likewise, in this embodiment the height at which the first flying vehicle F1 flies or hovers in the first space A corresponds to the height at which user U2 wears the second wearable device G2 in the second space B, so that the second wearable device G2 can display the second virtual environment scene to user U2 at the correct height.

Please refer again to the embodiments of FIGS. 5 and 6. In these embodiments, the first wearable device G1 worn by user U1 not only displays the first virtual scene through the display interface 201, it also collects sound in the first space A — in particular the voice of user U1 — through a microphone (not shown). Information corresponding to the sound in the first space A is transmitted to the network server 300 via the processor 203 and the signal transceiver 204 of the first wearable device G1. The second wearable device G2 in the second space B receives the information corresponding to the sound in the first space A through its signal transceiver 204, processes it with its processor 203, and plays the audio corresponding to the first space A through its loudspeaker (not shown). In this way, user U2 can not only see the first virtual character VU1 in the second virtual scene, but also hear the voice of user U1.

In these embodiments, the second wearable device G2 worn by user U2 can likewise collect sound in the second space B — in particular the voice of user U2 — through a microphone (not shown). Information corresponding to the sound in the second space B is transmitted to the network server 300 via the second wearable device G2, and the first wearable device G1 in the first space A receives it, processes it with its processor 203, and plays the audio corresponding to the second space B through its loudspeaker (not shown). User U1 can thus see the second virtual character VU2 in the first virtual scene and also hear the voice of user U2. The embodiments of the present disclosure thereby achieve an effect similar to communicating and interacting directly with a holographic image of a person. It should also be understood that the microphone and loudspeaker of the present disclosure are not limited to being disposed on the first wearable device G1 or the second wearable device G2; they may be disposed on the first flying vehicle F1 or the second flying vehicle F2, or set up independently in the first space A and the second space B. As long as the microphone can collect sound in the first space A or the second space B, and the loudspeaker can correspondingly play the audio of the first space A or the second space B, the arrangement falls within the scope of the present disclosure.

Please refer to FIGS. 7 and 8, both schematic diagrams of a semi-holographic image display system according to some embodiments of the present disclosure; their setting continues the embodiments of FIGS. 5 and 6. Referring to FIG. 5 and the left half of FIG. 7, in this embodiment user U1, wearing the first wearable device G1, moves from the first location PN1 in the first space A to a third location PN3. Because the first wearable device G1 also has a multi-degree-of-freedom (for example six- or nine-degree-of-freedom) positioning capability, it can detect the movement path of user U1 through the motion detector 202 shown in FIG. 4, and information corresponding to that movement path is sent to the network server 300 via the processor 203 and the signal transceiver 204 of the first wearable device G1. The second flying vehicle F2 can then obtain the movement path of user U1 from the network server 300, and its processor 103 can control the flight assembly 102 to move in accordance with the movement path of user U1 in the first space A. As shown in the right half of FIG. 7, the second flying vehicle F2 in the second space B moves along a path corresponding to the movement path of user U1 in the first space A, flying from the first coordinate CN1, which corresponds to the first location PN1, to a third coordinate CN3, which corresponds to the third location PN3.
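
Turning the movement path reported by the first wearable device G1 into motion of the second flying vehicle F2 amounts to converting each sampled position into the corresponding coordinate of the second space and feeding those coordinates to the flight assembly as waypoints. The rough sketch below is an assumption about how that could look in code; the function names, the message format, and the mapping callback (for instance the to_corresponding_coordinate helper sketched earlier) are all hypothetical.

```python
def follow_remote_path(path_in_space_a, map_to_space_b, flight_assembly):
    """Hypothetical waypoint follower for the second flying vehicle F2:
    'path_in_space_a' is the list of (position, yaw) samples reported by the
    first wearable device G1; 'map_to_space_b' converts each sample into the
    corresponding coordinate and shooting direction of the second space."""
    for position, yaw in path_in_space_a:
        coordinate, shooting_yaw = map_to_space_b(position, yaw)
        # Flight assembly 102 moves F2 from the coordinate matching PN1
        # towards the coordinate matching PN3, one waypoint at a time.
        flight_assembly.apply({"type": "goto",
                               "coordinate": coordinate,
                               "yaw": shooting_yaw})
```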

In this embodiment, the shooting direction D5 of the optical detector 101 of the second flying vehicle F2 corresponds to the pointing direction D6 of the first wearable device G1. During the movement, if user U1 turns his or her head to try to track the location of the second virtual character VU2 shown in FIG. 6, the first wearable device G1 can detect the head rotation angle of user U1 through the motion detector 202, and information corresponding to that head rotation angle is sent to the network server 300 via the processor 203 and the signal transceiver 204 of the first wearable device G1. The second flying vehicle F2 can then obtain the head rotation angle of user U1 from the network server 300, and its processor 103 can control the flight assembly 102 to rotate the second flying vehicle F2 so that the shooting direction D5 of its optical detector 101 keeps corresponding to the pointing direction D6 of the first wearable device G1.

As shown in the right half of FIG. 7, in this embodiment, while the second flying vehicle F2 flies from the first coordinate CN1 to the third coordinate CN3, it simultaneously rotates about its central axis according to the head rotation angle of user U1. In this way, as the second flying vehicle F2 moves through the second space B it can rotate at the same time so that the shooting direction D5 of its optical detector 101 faces the pointing direction D6 of the first wearable device G1, allowing the second flying vehicle F2 to capture, in the second space B, images from the viewing angle that user U1 wishes to see. Throughout this process, the optical detector 101 of the second flying vehicle F2 continuously captures frontal or side images of user U2 in the second space B and transmits them to the network server 300.
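
Keeping the shooting direction D5 locked to the pointing direction D6 while the vehicle is also translating can be treated as a simple yaw-tracking problem: the relayed head angle is the setpoint, the vehicle's current heading is the measurement, and the flight assembly receives a turn-rate command. The proportional controller below is only an illustration; the gain, the rate limit, and the command format are assumptions, not values given by the disclosure.

```python
import math

def yaw_rate_command(target_yaw, current_yaw, gain=1.5, max_rate=math.pi / 2):
    """Proportional yaw tracking for the flying vehicle: turn towards the
    head orientation relayed from the wearable device.  Angles in radians,
    returned rate in radians per second, clamped to 'max_rate'."""
    # Wrap the error into [-pi, pi) so the vehicle always turns the short way.
    error = (target_yaw - current_yaw + math.pi) % (2 * math.pi) - math.pi
    return max(-max_rate, min(max_rate, gain * error))

# The user's head (pointing direction D6) is at 170 degrees while the camera
# (shooting direction D5) is at -170 degrees: the short way round is -20 degrees.
print(math.degrees(yaw_rate_command(math.radians(170), math.radians(-170), gain=1.0)))
```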

Following on from the previous paragraph, the first wearable device G1 receives the frontal or side images of user U2 through the signal transceiver 204 shown in FIG. 4. The processor 203 of the first wearable device G1 generates corresponding virtual character information from those images and, through the display interface 201 of the first wearable device G1, displays the second virtual character VU2 corresponding to user U2 in the first virtual environment scene, so that user U1 can view a stereoscopic frontal or side image of the second virtual character VU2 on the display interface 201, as shown in the left half of FIG. 8. By moving and rotating in the second space B, the second flying vehicle F2 lets its optical detector 101 play a role similar to user U1's eyes: however user U1 moves or turns his or her head in the first space A, the second flying vehicle F2 moves and rotates correspondingly in the second space B so that its optical detector 101 points at the angle corresponding to user U1's line of sight and continuously captures real-time images of user U2 in the second space B, from which the first wearable device G1 displays the second virtual character VU2 corresponding to user U2 on the display interface 201. In this way, the first wearable device G1 can present to user U1 an effect similar to a projected holographic image of user U2.

Referring again to the right half of FIG. 7, in this embodiment user U2 wears the second wearable device G2, whose pointing direction is D7. As described above, user U1 wearing the first wearable device G1 moves from the first location PN1 in the first space A to the third location PN3, and the second flying vehicle F2 in the second space B correspondingly flies from the first coordinate CN1 to the third coordinate CN3 of the second space B. Referring also to FIG. 6 and the right half of FIG. 8, from the viewpoint of user U2 looking through the display interface 201 of the second wearable device G2, the first virtual character VU1 corresponding to user U1 moves in the second virtual scene along a path corresponding to the movement path of the second flying vehicle F2. As users ordinarily do, user U2 will turn his or her head to follow the movement of the first virtual character VU1. As shown in the right half of FIG. 7, because the second wearable device G2 also has a multi-degree-of-freedom (for example six- or nine-degree-of-freedom) positioning capability, the motion detector 202 in the second wearable device G2 detects the head rotation angle of user U2, and information corresponding to that head rotation angle is processed by the processor 203 of the second wearable device G2 and transmitted to the network server 300 via the signal transceiver 204 of the second wearable device G2.

Continuing from the previous paragraph, as shown in the left half of FIG. 7, the first flying vehicle F1 in the first space A receives the information corresponding to the head rotation angle of user U2 through its signal transceiver, and its processor 103 can control the flight assembly 102 to rotate the first flying vehicle F1 about its central axis according to that head rotation angle, so that the shooting direction D8 of the optical detector 101 of the first flying vehicle F1 corresponds to the pointing direction D7 of the second wearable device G2. Thus, although user U2 does not move within the second space B but only turns his or her head, the first flying vehicle F1 in the first space A still rotates according to the head rotation angle of user U2. Through this rotation of the first flying vehicle F1 in the first space A, the optical detector 101 of the first flying vehicle F1 plays a role similar to user U2's eyes: however user U2 turns his or her head in the second space B, the first flying vehicle F1 rotates correspondingly in the first space A so that its optical detector 101 points at the angle corresponding to user U2's line of sight and continuously captures real-time images of user U1 in the first space A, from which the second wearable device G2 displays the first virtual character VU1 corresponding to user U1 on the display interface 201. In this way, the second wearable device G2 can present to user U2 an effect similar to a projected holographic image of user U1.

In one embodiment, the first wearable device G1 and the second wearable device G2 of the present disclosure can also detect height changes of users U1 and U2, respectively, through the motion detector 202 and transmit information corresponding to the height changes to the network server 300, so that the corresponding first flying vehicle F1 and second flying vehicle F2 can rise or descend in accordance with the height changes of users U1 and U2. The optical detectors 101 of the first flying vehicle F1 and the second flying vehicle F2 can then accurately match the eye-level heights of users U1 and U2 in the other space, allowing the first wearable device G1 and the second wearable device G2 to present better first and second virtual characters VU1 and VU2 in the first and second virtual scenes.

Although the present disclosure has been described above by way of embodiments, they are not intended to limit the present disclosure. Anyone skilled in the art may make various changes and modifications without departing from the spirit and scope of the present disclosure; the scope of protection of the present disclosure is therefore defined by the appended claims.

A‧‧‧first space
B‧‧‧second space
F1‧‧‧first flying vehicle
F2‧‧‧second flying vehicle
G1‧‧‧first wearable device
G2‧‧‧second wearable device
U1, U2‧‧‧users
PN1‧‧‧first location
PN2‧‧‧second location
PN3‧‧‧third location
CN1‧‧‧first coordinate
CN2‧‧‧second coordinate
CN3‧‧‧third coordinate
D5, D8‧‧‧shooting directions
D6, D7‧‧‧pointing directions
300‧‧‧network server

Claims (10)

1. A semi-holographic image display system, comprising: a wearable device for displaying a virtual environment scene; and a flying vehicle, communicatively coupled to the wearable device, for photographing a physical object and displaying the virtual environment scene on the wearable device; wherein when the wearable device moves from a first location to a second location in a first space, the flying vehicle correspondingly moves, in a second space, to a second position relative to a first position in the first space.

2. The system of claim 1, further comprising: an audio playback device, disposed in the second space and communicatively coupled to the wearable device, for playing an audio signal received by the wearable device in the first space.

3. The system of claim 1, wherein when the wearable device detects a rotation angle, the flying vehicle photographs the physical object in accordance with the rotation angle.

4. The system of claim 1, wherein the height at which the flying vehicle flies in the second space corresponds to the height of the wearable device in the first space.

5. A flying vehicle for providing semi-holographic images, adapted to be communicatively coupled to a wearable device through a network server, the flying vehicle comprising: a flight assembly; an optical detector for photographing a physical object; a first processor, electrically coupled to the optical detector and the flight assembly, for generating object surface information corresponding to the physical object captured by the optical detector; and a first signal transceiver, electrically coupled to the first processor, for transmitting the object surface information to the network server and receiving a movement instruction from the network server; wherein the first processor controls the flight assembly according to the movement instruction.

6. The flying vehicle of claim 5, wherein the optical detector comprises a camera for acquiring depth information.

7. The flying vehicle of claim 5, wherein the optical detector is a mobile device having a camera unit capable of acquiring depth information.

8. The flying vehicle of claim 5, wherein the first signal transceiver comprises a wireless network signal transceiver.

9. A wearable device for providing semi-holographic images, adapted to be communicatively coupled to a flying vehicle through a network server, the flying vehicle being used to generate object surface information, the wearable device comprising: a display interface for displaying a virtual environment scene; a motion detector for detecting movement of the wearable device in a first space; a second processor, electrically coupled to the display interface and the motion detector, for generating movement information corresponding to the movement detected by the motion detector; and a second signal transceiver, electrically coupled to the second processor, for transmitting the movement information to the network server and for receiving, through the network server, the object surface information transmitted from the flying vehicle; wherein the second processor is further configured to control the display interface, according to the object surface information, to display a virtual object in the virtual environment scene.

10. The wearable device of claim 9, wherein the second signal transceiver comprises a wireless network signal transceiver.
Priority Applications (1)

Application Number: TW106215339U
Priority Date: 2017-10-18
Filing Date: 2017-10-18
Title: System, wearable device and flying device for providing semi-holographic images

Publications (1)

Publication Number: TWM555001U
Publication Date: 2018-02-01

Family

ID=62015711


Country Status (1)

Country | Link
TW (1) | TWM555001U (en)

Cited By (3)

* Cited by examiner, † Cited by third party

Publication Number | Priority Date | Publication Date | Assignee | Title
TWI667650B * | 2018-01-05 | 2019-08-01 | 美律實業股份有限公司 | Portable electronic device for acustic imaging and operating method for the same
TWI714054B * | 2018-04-16 | 2020-12-21 | 宏達國際電子股份有限公司 | Tracking system for tracking and rendering virtual object corresponding to physical object and the operating method for the same
US10993078B2 | 2018-04-16 | 2021-04-27 | Htc Corporation | Tracking system for tracking and rendering virtual object corresponding to physical object and the operating method for the same
