
WO2021210155A1 - Display device, display method, and program - Google Patents

Display device, display method, and program

Info

Publication number
WO2021210155A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
display device
users
display
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/016841
Other languages
English (en)
Japanese (ja)
Inventor
佐藤 隆
誉宗 巻口
正典 横山
高田 英明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NTT Inc
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to PCT/JP2020/016841 priority Critical patent/WO2021210155A1/fr
Publication of WO2021210155A1 publication Critical patent/WO2021210155A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/08 Cursor circuits

Definitions

  • The present invention relates to a display device, a display method, and a program.
  • Non-Patent Document 1 proposes a system in which a plurality of users indicate coordinates with laser pointers on a shared screen and perform operations such as drawing figures.
  • In that system, the laser pointer is varied for each user, and the indicated coordinates are acquired by a camera.
  • The present invention has been made in view of the above, and an object of the present invention is to provide a display device that allows each user to intuitively recognize his or her own input on a screen shared by a plurality of users.
  • The display device of one aspect of the present invention includes: a plurality of projection units that provide a different image to each of a plurality of users on a display surface shared by the plurality of users; a plurality of input units, associated with the plurality of projection units, that acquire each user's operation and move the corresponding cursor; and a control unit that causes each of the plurality of projection units to display, in the image provided by that projection unit, the first cursor operated by the input unit corresponding to that projection unit in a manner different from the second cursor group operated by the other input units.
  • According to the present invention, it is possible to provide a display device that allows each user to intuitively recognize his or her own input on a screen shared by a plurality of users.
  • FIG. 1 is a diagram for explaining an outline of the display device of the present embodiment.
  • FIG. 2 is a top view showing an example of the configuration of the display device.
  • FIG. 3 is a cross-sectional view showing an example of the configuration of the display device.
  • FIG. 4 is a top view showing an example of the configuration of another display device.
  • FIG. 5 is a diagram showing an example of the configuration of the information processing device.
  • FIG. 6 is a diagram showing an example of the sensor-projector correspondence held by the information processing device.
  • FIG. 7 is a flowchart showing a processing flow of the information processing apparatus.
  • FIG. 8 is a diagram showing an example of a cursor image.
  • FIG. 9 is a diagram showing an example of the hardware configuration of the information processing device.
  • The display device 1 of the present embodiment is a tubular display device having an opening on its upper surface, with a circular reflective screen 50 provided on the inner bottom of the main body.
  • Users 200A to 200C look down at the screen 50 on the bottom through the opening.
  • A plurality of cursors, operated by each of the users 200A to 200C, are displayed on the screen 50.
  • Users 200A to 200C can operate the cursors displayed on the screen 50 by pointing hand gestures.
  • For each user, the cursor operated by that user is highlighted in a manner different from the cursor group operated by the other users.
  • The display device 1 includes a plurality of projectors 20-1 to 20-4 on the upper portion of the housing 40.
  • The projectors 20-1 to 20-4 are fixed to the inside of the housing 40 and project images onto the screen 50.
  • The screen 50 is a reflective optical screen that couples the iris planes (planes corresponding to the lens apertures) of the projectors 20-1 to 20-4 at positions corresponding to the projection distance and the focal length.
  • The projectors 20-1 to 20-4 can provide a different image for each of the users 200A to 200C on the shared screen 50.
  • Each of the users 200A to 200C sees the image projected by the projector 20-1 to 20-4 arranged on the opposite side.
  • For example, the user 200A can see the image projected by the projector 20-3, while the user 200C, facing the user 200A, can see the image projected by the projector 20-1 arranged on the opposite side from the projector 20-3.
  • Sensors 30-1 to 30-4 are mounted on each of the projectors 20-1 to 20-4.
  • The sensors 30-1 to 30-4 sense the movements of the users 200A to 200C and transmit the sensing data to the information processing device 10 described later.
  • The information processing device 10 recognizes the pointing hand gestures of the users 200A to 200C based on the sensing data of the sensors 30-1 to 30-4 and operates the cursor of each of the users 200A to 200C.
  • The information processing device 10 associates each of the projectors 20-1 to 20-4 with the facing sensor 30-1 to 30-4. For example, the projector 20-3, which projects the image viewed by the user 200A, is associated with the sensor 30-1, which detects the pointing hand gestures of the user 200A.
  • The information processing device 10 calculates the coordinates of each cursor based on the sensing data of the sensors 30-1 to 30-4.
  • In the image supplied to each of the projectors 20-1 to 20-4, the information processing device 10 displays with emphasis the cursor operated via the sensor 30-1 to 30-4 corresponding to that projector.
  • For example, in the image supplied to the projector 20-3, the cursor operated based on the sensing data of the sensor 30-1 is displayed with more emphasis than the cursor group operated based on the sensing data of the sensors 30-2 to 30-4. That is, the user 200A sees an image in which the cursor operated by the user 200A is highlighted. The other users 200B and 200C likewise each see an image in which the cursor they operate is highlighted.
  • The mounting positions of the projectors 20-1 to 20-4 and the sensors 30-1 to 30-4 are examples and are not limited to these.
  • The arrangement position of the screen 50 is not limited to the bottom inside the main body of the display device 1.
  • The screen 50 may instead be arranged on a wall, with a plurality of projectors arranged facing the screen 50 so as to project images at different projection angles. In this case as well, different images can be seen depending on the viewpoint position. It is desirable that the numbers of projectors and sensors be equal to or greater than the number of users who use them at the same time. When multiple sensors respond to one user, one representative sensor is selected and associated with that user's cursor, which is then displayed in the image visible from that user's position.
  • Methods of selecting the representative sensor include selecting the sensor with the largest response and selecting the sensor located in the center of the responding sensor group (both strategies are sketched in code after this section).
  • In such a case, the cursors of the plurality of users must be displayed at the same time in an image that is commonly seen by the plurality of users.
  • FIG. 4 shows an example of another display device.
  • In this display device, 60 projectors 20 are arranged side by side in a circle.
  • Each projector 20 outputs one of the images of the subject captured from different angles around its entire circumference.
  • Linear blending is thereby realized optically, and a three-dimensional image that smoothly interpolates intermediate viewpoints can be presented.
  • The user can view the entire circumference of the three-dimensional image projected on the screen 50 by moving the viewpoint position along the outer circumference of the display device 1.
  • In this configuration, the sensors and the projectors 20 do not have a one-to-one correspondence.
  • A plurality of projectors may be associated with one sensor.
  • The display unit including the projectors 20-1 to 20-4 is not limited to the above-mentioned spatial imaging iris plane method; anything that can show different images depending on the viewpoint position is sufficient, such as an integral display, a parallax barrier display, or a 3D display using 3D glasses (optical shutter or polarizing plate).
  • The integral method arranges a lens array in front of the display surface and switches the displayed image according to the viewing angle.
  • The parallax barrier method arranges a barrier layer in front of the display surface and switches the displayed image according to the viewing angle.
  • The input unit including the sensors 30-1 to 30-4 is not limited to the above-mentioned hand gesture recognition; anything that allows each of the users 200A to 200C to operate a cursor individually is acceptable, such as a mouse, a joystick, or laser pointer recognition.
  • The information processing device 10 is a device that receives cursor operation information from the sensors 30-1 to 30-4 of the display device 1 and supplies images to the projectors 20-1 to 20-4.
  • The information processing device 10 shown in FIG. 5 includes coordinate input units 101-1 to 101-4, image output units 102-1 to 102-4, a control unit 103, and a storage unit 104.
  • The coordinate input units 101-1 to 101-4 receive the sensing data from the sensors 30-1 to 30-4 and calculate the coordinates of the corresponding cursors. That is, the coordinate input units 101-1 to 101-4 receive cursor operation information from each of the users 200A to 200C and calculate the cursor coordinates based on that information (one plausible coordinate calculation is sketched after this section).
  • The image output units 102-1 to 102-4 supply the images to be shown to the users 200A to 200C to the projectors 20-1 to 20-4, respectively.
  • The control unit 103 identifies the correspondence between the projectors 20-1 to 20-4 and the sensors 30-1 to 30-4 by referring to the information stored in the storage unit 104, and switches which cursor is highlighted for each of the image output units 102-1 to 102-4.
  • On each original image supplied to the projectors 20-1 to 20-4, the control unit 103 superimposes the highlighted cursor image at the coordinates of the cursor operated via the corresponding sensor 30-1 to 30-4.
  • The normal cursor image is superimposed at the coordinates of the other cursors.
  • The original video supplied to the projectors 20-1 to 20-4 may be input from the outside or may be read from the storage unit 104.
  • The storage unit 104 stores the correspondence between the sensors 30-1 to 30-4 and the projectors 20-1 to 20-4.
  • Two types of cursor images, A and B, are associated with each sensor-projector pair.
  • The cursor image A is the highlighted cursor image, and the cursor image B is the normal cursor image.
  • In the example of FIG. 6, the sensor 30-3 (sensor ID 3) corresponds to the projector 20-1 (projector ID 1).
  • In the image supplied to the projector 20-1, the "highlight 3" image is therefore used for the cursor operated based on the sensing data of the sensor 30-3, and the "simple 3" image is used for the other cursors.
  • In step S11, the information processing device 10 receives sensing data from each of the sensors 30-1 to 30-4 and obtains the cursor coordinates corresponding to each sensor.
  • A cursor may have no coordinates, in which case that cursor is not displayed.
  • The information processing device 10 then executes the following steps S12 and S13 for each of the projectors 20-1 to 20-4.
  • In step S12, the information processing device 10 displays the highlighted cursor at the cursor coordinates operated via the sensor 30-1 to 30-4 corresponding to the projector 20-1 to 20-4 being processed.
  • In step S13, the information processing device 10 displays the remaining cursor group normally and supplies the image with the cursors superimposed to the projector 20-1 to 20-4 being processed (a minimal sketch of this loop appears after this section).
  • For example, in the image supplied to the projector 20-1, the information processing device 10 superimposes the "highlight 3" image at the cursor coordinates operated via the sensor 30-3 and the "simple 3" image at the other cursor coordinates.
  • FIG. 8 shows an example of the cursor image 501 for normal display and the cursor images 511 to 515 for highlighting.
  • The cursor image 511 is an enlarged version of the normal cursor.
  • The cursor image 512 has a modified contour or an added shadow.
  • The cursor image 513 uses a different color. The shape of the cursor may also be changed.
  • The cursor image 514 changes the orientation of the cursor.
  • The cursor image 515 adds an annotation near the cursor.
  • When the cursor image has a defined top and bottom, or when characters are displayed near the cursor, the cursor image may be rotated according to the viewing angle of each of the users 200A to 200C, that is, according to the orientation of the displayed image, so that the cursor is easier for the users 200A to 200C to recognize (a sketch of this rotation appears after this section).
  • The normally displayed cursors, that is, the cursors operated by other users, may be switched between display and non-display. For example, when the user 200A sets his or her own cursor to be hidden, the cursor operated by the user 200A is not displayed in the images of the other users 200B and 200C.
  • As described above, the display device 1 of the present embodiment includes a plurality of projectors 20-1 to 20-4 that provide a different image for each of a plurality of users on the screen 50 shared by the users, and a plurality of sensors 30-1 to 30-4, associated with the projectors 20-1 to 20-4, that acquire the operations of each of the users and operate the corresponding cursors.
  • In the image projected by each of the projectors 20-1 to 20-4, the information processing device 10 displays the cursor operated via the sensor 30-1 to 30-4 corresponding to that projector in a manner different from the cursors operated via the other sensors 30-1 to 30-4. As a result, in the image projected on the shared screen 50, the cursor operated by each user is displayed differently from the cursor group operated by the other users, so each user can easily identify his or her own cursor.
  • The information processing device 10 described above can be implemented on a general-purpose computer system including, for example, a central processing unit (CPU) 901, a memory 902, a storage 903, a communication device 904, an input device 905, and an output device 906, as shown in FIG. 9.
  • The information processing device 10 is realized by the CPU 901 executing a predetermined program loaded into the memory 902.
  • This program can be recorded on a computer-readable recording medium such as a magnetic disk, an optical disk, or a semiconductor memory, or can be distributed via a network.
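
A minimal Python sketch of representative-sensor selection, referenced above. Both strategies named in the text are shown; this is an illustrative assumption rather than the patent's implementation, and it presumes each sensor reports a scalar response magnitude and that sensor IDs run consecutively around the housing:

    def largest_response(responses):
        """responses: sensor ID -> response magnitude (for example, a
        gesture-detection confidence). Returns the ID of the sensor
        with the largest response."""
        return max(responses, key=responses.get)

    def center_of_group(responding_ids):
        """responding_ids: IDs of the sensors that responded, assumed
        to be numbered consecutively around the housing. Returns the
        sensor located in the center of the responding group."""
        ordered = sorted(responding_ids)
        return ordered[len(ordered) // 2]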
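
A sketch of one plausible coordinate calculation for the coordinate input units 101-1 to 101-4, referenced above. The patent does not specify how a recognized pointing gesture becomes a point on the screen 50; the ray-plane intersection below is a standard technique, with the hand position and pointing direction assumed to come from the sensor's gesture recognition:

    import numpy as np

    def pointing_to_screen(origin, direction, screen_point, screen_normal):
        """Intersect a pointing ray with the plane of the screen 50.
        origin: 3D hand position; direction: 3D pointing direction
        (both assumed outputs of gesture recognition).
        screen_point / screen_normal: any point on the screen plane
        and its unit normal. Returns the 3D intersection, or None."""
        direction = direction / np.linalg.norm(direction)
        denom = float(np.dot(screen_normal, direction))
        if abs(denom) < 1e-9:
            return None  # ray is parallel to the screen plane
        t = float(np.dot(screen_normal, screen_point - origin)) / denom
        if t < 0:
            return None  # user is pointing away from the screen
        return origin + t * direction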
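
A minimal sketch of the per-projector rendering loop of steps S11 to S13, referenced above, driven by a correspondence table in the style of FIG. 6. The sensor 30-1/projector 20-3 and sensor 30-3/projector 20-1 pairings follow the text; the remaining pairings, the table layout, and the superimpose helper are assumptions:

    # Correspondence held by the storage unit 104 (cf. FIG. 6):
    # sensor ID -> (projector ID, highlight image A, normal image B).
    TABLE = {
        1: (3, "highlight 1", "simple 1"),
        2: (4, "highlight 2", "simple 2"),
        3: (1, "highlight 3", "simple 3"),
        4: (2, "highlight 4", "simple 4"),
    }

    def render_frames(cursor_coords, base_frames, superimpose):
        """cursor_coords: sensor ID -> (x, y) or None (step S11 output).
        base_frames: projector ID -> original image for that projector.
        superimpose: helper that draws a cursor image onto a frame."""
        frames = {}
        for sensor_id, (projector_id, highlight, normal) in TABLE.items():
            frame = base_frames[projector_id].copy()
            for other_id in TABLE:
                coords = cursor_coords.get(other_id)
                if coords is None:
                    continue  # a cursor without coordinates is not displayed
                # Step S12: the cursor of the associated sensor is highlighted;
                # step S13: every other cursor is drawn normally.
                image = highlight if other_id == sensor_id else normal
                frame = superimpose(frame, image, coords)
            frames[projector_id] = frame  # supplied to that projector
        return frames

Calling render_frames once per frame yields the images supplied to the projectors 20-1 to 20-4; each user then sees his or her own cursor drawn with the highlight image and all other cursors drawn with the normal image.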
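
A sketch of the viewing-angle-dependent cursor rotation, referenced above. It assumes each user's position around the circular screen 50 is known as an azimuth angle and uses Pillow's Image.rotate; the angle convention is an assumption:

    from PIL import Image

    def orient_cursor(cursor_img: Image.Image, user_azimuth_deg: float) -> Image.Image:
        """Rotate a cursor image with a defined top and bottom so that its
        top faces the viewing user. user_azimuth_deg: the user's position
        around the circular screen 50, in degrees from an arbitrary
        reference direction (assumed convention)."""
        # Image.rotate turns the image counterclockwise for positive angles;
        # expand=True keeps the rotated cursor fully inside the bitmap.
        return cursor_img.rotate(-user_azimuth_deg, expand=True)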

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to the present embodiment, a display device (1) comprises: a plurality of projectors (20-1 to 20-4) that provide a different video to each individual user among a plurality of users on a screen (50) shared by the plurality of users; a plurality of sensors (30-1 to 30-4) that are associated with the projectors (20-1 to 20-4), that acquire the operations of each individual user, and that operate the corresponding cursors; and a control unit (103) that, for each individual projector (20-1 to 20-4), causes the cursor operated by the sensor (30-1 to 30-4) corresponding to that projector to be displayed, in the video provided by that projector, in a manner different from the cursors operated by the other sensors (30-1 to 30-4).
PCT/JP2020/016841 2020-04-17 2020-04-17 Display device, display method, and program Ceased WO2021210155A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016841 WO2021210155A1 (fr) 2020-04-17 2020-04-17 Display device, display method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/016841 WO2021210155A1 (fr) 2020-04-17 2020-04-17 Display device, display method, and program

Publications (1)

Publication Number Publication Date
WO2021210155A1 true WO2021210155A1 (fr) 2021-10-21

Family

ID=78084300

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016841 Ceased WO2021210155A1 (fr) 2020-04-17 2020-04-17 Dispositif et procédé d'affichage ainsi que programme

Country Status (1)

Country Link
WO (1) WO2021210155A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07120835A (ja) * 1993-10-27 1995-05-12 Nippon Telegr & Teleph Corp <Ntt> Multilingual display device
JP2011197380A (ja) * 2010-03-19 2011-10-06 Seiko Epson Corp Display device, display system, and display method
JP2014089379A (ja) * 2012-10-31 2014-05-15 Seiko Epson Corp Image display system and control method of image display system
JP2015135572A (ja) * 2014-01-16 2015-07-27 Canon Inc Information processing apparatus and control method thereof
US20170212640A1 (en) * 2014-05-23 2017-07-27 Piqs Technology (Shenzhen) Limited Interactive display systems
JP2015146611A (ja) * 2015-03-17 2015-08-13 Seiko Epson Corp Interactive system and control method of interactive system

Similar Documents

Publication Publication Date Title
US12353646B2 (en) Augmented reality eyewear 3D painting
US9823764B2 (en) Pointer projection for natural user input
  • TWI559174B Gesture-based three-dimensional image manipulation technology
US9746989B2 (en) Three-dimensional image processing apparatus
  • JP7182920B2 Image processing device, image processing method, and program
US11284061B2 (en) User input device camera
  • CN103248810A Image processing device, image processing method, and program
  • JP2009278456A Video display device
US11675198B2 (en) Eyewear including virtual scene with 3D frames
  • CN108900829B Dynamic display system
  • CN105210144A Display control device, display control method, and recording medium
  • CN114647317A Remote touch detection enabled by a peripheral device
  • JP2017187667A Head-mounted display device and computer program
  • WO2025024469A1 Devices, methods, and graphical user interfaces for sharing content in a communication session
US12169968B2 (en) Augmented reality eyewear with mood sharing
  • JP2019146155A Image processing device, image processing method, and program
  • JP2016045588A Data processing device, data processing system, method for controlling data processing device, and program
  • JP2018112894A System and control method
US20250203061A1 (en) Augmented reality eyewear with x-ray effect
  • CN113196212B Mobile platform as a physical interface for interaction
  • WO2021210155A1 Display device, display method, and program
  • JP2019032713A Information processing device, information processing method, and program
  • JP5337409B2 Information presentation device
  • CN112535392A Article display system based on optical communication device, and information providing method, device, and medium
  • KR100845274B1 Interface method and apparatus for an exhibition system considering the viewer's gaze direction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20930919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20930919

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP