
US10613621B2 - Interactive display system and method for operating such a system - Google Patents

Interactive display system and method for operating such a system

Info

Publication number
US10613621B2
US10613621B2 (application US15/947,327; family code US201815947327A)
Authority
US
United States
Prior art keywords
display element
broadcasting
outer face
devices
graphic representation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/947,327
Other languages
English (en)
Other versions
US20180292886A1 (en)
Inventor
Lionel Chataignier
Geoffrey Chataignier
Léo Giorgis
Hugo Loi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixminds Distribution SAS
Original Assignee
ARK SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ARK SAS
Assigned to ARK: assignment of assignors interest (see document for details). Assignors: CHATAIGNIER, GEOFFREY; CHATAIGNIER, LIONEL; GIORGIS, LEO; LOI, HUGO
Publication of US20180292886A1
Application granted
Publication of US10613621B2
Assigned to PIXMINDS DISTRIBUTION SAS: assignment of assignor's interest. Assignor: ARK SAS
Legal status: Active

Classifications

    • G: PHYSICS
      • G06: COMPUTING OR CALCULATING; COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/002: Specific input/output arrangements not covered by G06F3/01-G06F3/16
              • G06F3/005: Input arrangements through a video camera
            • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
                • G06F3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                  • G06F3/042: Digitisers characterised by opto-electronic transducing means
                    • G06F3/0421: ... by interrupting or reflecting a light beam, e.g. optical touch-screen
      • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
          • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
            • G09G3/001: ... using specific devices not provided for in groups G09G3/02-G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
              • G09G3/002: ... to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
              • G09G3/003: ... to produce spatial visual effects
    • A: HUMAN NECESSITIES
      • A63: SPORTS; GAMES; AMUSEMENTS
        • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F13/20: Input arrangements for video game devices
              • A63F13/21: Input arrangements characterised by their sensors, purposes or types
                • A63F13/214: ... for locating contacts on a surface, e.g. floor mats or touch pads
            • A63F13/30: Interconnection arrangements between game servers and game devices; between game devices; between game servers
            • A63F13/50: Controlling the output signals based on the game progress
              • A63F13/53: ... involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
            • A63F13/55: Controlling game characters or game objects based on the game progress
              • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
              • A63F13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
                • A63F13/573: ... using trajectories of game objects, e.g. of a golf ball according to the point of impact
          • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F2300/10: ... characterized by input arrangements for converting player-generated signals into game device control signals
            • A63F2300/30: ... characterized by output arrangements for receiving control signals generated by the game device

Definitions

  • The invention relates to an interactive display system and to a method for operating such a system.
  • Interactive display systems generally comprise a device for broadcasting an image, or even a video, provided with a display element such as a touch screen.
  • Such a touch screen combines display functionality with that of a pointing device such as a mouse or a touchpad: by touching the screen with a finger, the user can perform the same operations as with a traditional pointing device.
  • Such touch screens most often rely on so-called "capacitive" technology.
  • In a capacitive touch screen, when the user's finger is in contact with the outer face of the screen, electrical charges are transferred to the finger. This charge transfer generates a loss of charge, which is detected by measurement systems placed at the four corners of the screen in order to identify the position of the contact point between the finger and the outer face, and thus to perform the operation intended by the user.
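  • As a concrete illustration of this corner-measurement principle, the following sketch interpolates the contact position from the four corner currents. The linear mapping is a common textbook approximation of surface-capacitive sensing, not a formula taken from the patent.

```python
# Illustrative sketch of surface-capacitive localization (not from the patent):
# the touch position is interpolated from the charge loss measured at the four
# corners. The corner currents and the linear mapping are assumptions.

def locate_touch(i_ul: float, i_ur: float, i_ll: float, i_lr: float,
                 width: float, height: float) -> tuple[float, float]:
    """Return the (x, y) contact point, origin at the lower-left corner."""
    total = i_ul + i_ur + i_ll + i_lr
    # The nearer the finger is to a corner, the larger that corner's share of
    # the total current, so the position is a current-weighted interpolation.
    x = width * (i_ur + i_lr) / total   # right-hand corners pull x rightward
    y = height * (i_ul + i_ur) / total  # upper corners pull y upward
    return x, y

# A touch toward the upper right yields coordinates above the screen centre:
print(locate_touch(0.3, 0.4, 0.1, 0.2, 1920, 1080))  # (1152.0, 756.0)
```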
  • Such capacitive technology has drawbacks, notably in the kinds of objects it can detect and in its cost at large screen sizes; the present invention aims to mitigate such drawbacks associated with the prior art.
  • The invention relates to an interactive display system configured to produce an immersion of a user in a virtual visual environment. The system comprises several devices for broadcasting at least one graphic representation, capable of partially or entirely covering the visual field of the user. Each broadcasting device is provided with an element for displaying said at least one graphic representation and with an interactive interface comprising a zone for receiving at least one impact, defined over all or part of an outer face of the display element; said interactive interface comprises a device for emitting infrared radiation and a device for capturing at least one image of the outer face of said display element.
  • The invention relates also to a method for operating such an interactive display system configured to produce an immersion of a user in a virtual visual environment, comprising the following steps: starting up the system, configuring the interactive interface, and identifying an activation of the interactive interface upon detection of at least one impact in the reception zone.
  • The detection substep comprises: a phase of acquiring images of the inner face of the display element, and a phase of comparing each acquired image with at least one reference image.
  • The invention relates also to a computer program comprising program code instructions for executing the steps of this method when said program is run by a processing unit of this interactive display system.
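  • To make the structure of the claimed system concrete, the following sketch models its components as plain data structures. All names and default values are hypothetical, chosen for illustration only.

```python
# Hypothetical data model of the claimed components (all names and defaults
# are illustrative, not from the patent): a system of broadcasting devices,
# each pairing a display element with an interactive interface.
from dataclasses import dataclass, field

@dataclass
class InteractiveInterface:
    ir_emitter_on: bool = False            # device 8 for emitting infrared radiation
    capture_rate_hz: float = 100.0         # capture device 9 frame rate
    reception_zone: tuple = (0.0, 0.0, 1.0, 1.0)  # impact zone on the outer face (normalized)

@dataclass
class BroadcastingDevice:
    planar: bool = True                    # display element 4: planar, or curved in a variant
    interface: InteractiveInterface = field(default_factory=InteractiveInterface)

@dataclass
class InteractiveDisplaySystem:
    devices: list                          # broadcasting devices 2
    closed_space: bool = True              # CAVE-like closed space 14 vs. open space 15

# A CAVE-like configuration: four walls, a ceiling and a floor.
system = InteractiveDisplaySystem(devices=[BroadcastingDevice() for _ in range(6)])
print(len(system.devices), "broadcasting devices")
```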
  • FIG. 1 is a schematic plan view of an interactive display system comprising broadcasting devices forming a closed space, according to an embodiment of the invention;
  • FIG. 2 is a schematic plan view of the interactive display system comprising broadcasting devices forming an open space, according to the embodiment of the invention;
  • FIG. 3 is a schematic view of an internal region of the broadcasting device, according to the embodiment of the invention;
  • FIG. 4 schematically represents a method for operating the interactive display system according to the embodiment of the invention.
  • The interactive display system 1 is configured to produce an immersion of a user 3 in a virtual visual environment; it comprises a processing unit 11 and several devices 2 for broadcasting at least one graphic representation, such as an image or several images forming a video.
  • Such a system 1 can be implemented in the fields of cinema, video gaming, sports training or education. It will be noted that this system 1 produces an immersion of its user 3 in a virtual visual environment that is non-intrusive.
  • These broadcasting devices 2 can be arranged so as to define a closed space 14.
  • The system 1 can then be a system of "CAVE" type, an acronym for "Cave Automatic Virtual Environment".
  • The user 3 is then situated in this closed space 14, in which he or she can move around and/or perform movements.
  • These broadcasting devices 2 can then together form, for example, a cube defining this closed space 14.
  • This cube can then correspond to a room whose walls are formed by such broadcasting devices 2.
  • This closed space 14 comprises walls 16, here four walls, a ceiling (not represented) and a floor 17, each of which comprises one of these broadcasting devices 2.
  • These broadcasting devices 2 entirely cover the visual field of the user 3, regardless of his or her motions and/or movements within this closed space 14; this configuration therefore contributes to producing a total immersion of the user 3 in a virtual visual environment.
  • Alternatively, these broadcasting devices 2 can be arranged so as to define an open space 15.
  • This open space 15 can be situated outdoors or in a room.
  • This open space 15 can correspond to a portion of a cube, defining a room of which only some walls are formed by these broadcasting devices 2.
  • For example, only three of the four walls, the ceiling and the floor of this room each comprise one of these broadcasting devices 2.
  • These broadcasting devices 2 partially cover the visual field of the user 3 as he or she moves around and/or performs motions in this open space 15; this configuration therefore contributes to producing a partial immersion of the user 3 in a virtual visual environment.
  • The processing unit 11, such as a computer, comprises hardware and/or software elements, more specifically at least one microprocessor cooperating with memory elements.
  • This processing unit 11 is capable of executing program code instructions implementing a computer program.
  • This program code can in particular comprise a digital image processing algorithm.
  • The memory elements of the processing unit 11 contain, in particular, data relating to at least one graphic representation of image or video type, which can be transmitted to the broadcasting devices 2.
  • This processing unit 11 is connected to the broadcasting devices 2 of the system 1 via a link element 12.
  • Each broadcasting device 2, as can be seen in FIGS. 1 and 2, comprises an element 4 for displaying said at least one graphic representation, an interactive interface 5 and a device 10 for projecting said at least one graphic representation.
  • The display element 4 is preferably planar, although in a variant it can be curved into a hemispherical or cylindrical form. This display element 4 separates an inner region 19 of the broadcasting device 2 from an outer region 18, also called the "user side", where the user 3 of the system 1 is preferably situated.
  • This display element 4 comprises an inner face 7b, arranged facing the inner region 19 of the broadcasting device 2, and an outer face 7a, facing the outer region 18.
  • Such a display element 4 can be translucent or transparent. It is preferably composed of two layers of material: a first layer 13a and a second layer 13b.
  • The first layer 13a comprises the outer face 7a of the display element 4.
  • This first layer 13a is a transparent layer, made of a transparent material chosen in particular to be passed through by infrared radiation of a wavelength substantially greater than 850 nm.
  • This first transparent layer 13a is, for example, made of a thermoplastic material such as PMMA (polymethyl methacrylate), better known under the trademark Plexiglas™. It can have a thickness of preferably at least 1500 µm, or even at least 3000 µm.
  • This first layer 13a exhibits high resistance to shocks and/or to scratches and abrasion. These resistance characteristics can be tuned in particular by varying the thickness of this first layer 13a. Furthermore, this first layer 13a gives the display element 4 an outer face 7a that is washable and unbreakable. As an example, such a first layer 13a can have a thickness of between 15 and 25 mm, preferably 20 mm, and a wear resistance at a pressure of between 600 and 1000 N/m², preferably 900 N/m².
  • The second layer 13b comprises the inner face 7b of the display element 4.
  • This second layer 13b is a layer of material chosen to refract light radiation, in particular light radiation originating from the projection device 10, through a defined angle that can lie between 0 and 70 degrees. It will be noted that an angle of 0 degrees corresponds to incident light radiation perpendicular to the inner face 7b of the display element 4.
  • Such a second layer 13b forms a zone for displaying said at least one graphic representation, in particular by acting as a back-projection screen or broadcaster of such a graphic representation.
  • This second layer 13b has a thickness of between 50 and 250 µm, preferably 150 µm.
  • The first and second layers 13a, 13b are joined to one another, preferably by bonding.
  • The interactive interface 5 of each broadcasting device comprises the following components: a device 8 for emitting infrared radiation, a device 9 for capturing at least one image of the outer face 7a of said display element 4, and a zone 6 for receiving at least one impact, defined over all or part of the outer face 7a of the display element 4.
  • This interactive interface 5 also involves the processing unit 11 of the system 1, to which the emission and capture devices of this interactive interface 5 are connected via a link element 12.
  • The reception zone 6 is designed to receive at least one impact, enabling an interaction between the user 3 and the system 1.
  • This impact can result from a brushing, a contact or even a collision between an object and the outer face 7a of the display element 4.
  • This object can be liquid, solid or non-solid, and with or without magnetic properties.
  • This object can, in a nonlimiting and nonexhaustive manner, be an element carried or worn by the user 3, a bouncing body that can have elastic properties, a liquid such as water, or a non-solid object such as snow or even sand. It will be noted that the impact can also be produced by a part of the body of the user 3.
  • The emission device 8 of this interactive interface 5 comprises several infrared light sources, arranged in this device 8 so as to generate a homogeneous and/or uniform light distribution on the display element 4, in particular on its inner face 7b.
  • These light sources are configured in this emission device 8 and chosen to ensure lighting that is uniform and/or homogeneous and constant, both spatially and over time.
  • These light sources are supplied constantly with electrical energy and are configured to emit at constant power.
  • The emission device 8 is positioned on the same side as the user 3, inside the closed space 14; it will be observed that the capture device 9 can operate with greater accuracy in this arrangement.
  • The emission device 8 can comprise, in a nonlimiting and nonexhaustive manner, between 4 and 200 infrared light sources. The number of light sources depends on their power as well as on their projection angle (a rough geometric illustration follows below). In the least costly optimal configuration, the emission device 8 can comprise four powerful light sources, arranged respectively at each corner of the display element 4; in this configuration, excessive lighting can be compensated by applying a shield over each of these light sources.
  • These light sources are high-intensity light-emitting diodes emitting in infrared wavelengths. These wavelengths are chosen to be different from, or even far from, the visible spectrum so that the light radiation from the projection device 10 does not interfere with the radiation from this emission device 8. These wavelengths are preferably greater than 850 nm.
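  • As a purely illustrative back-of-the-envelope sketch, the number of sources can be related to their projection half-angle and their distance behind the display element. The geometry, the overlap margin and all numbers below are assumptions, not values from the patent; source power is folded into the assumed coverage disc.

```python
# Illustrative scaling of the source count with projection half-angle and
# distance (all values and the disc-coverage model are assumptions).
import math

def sources_needed(screen_w: float, screen_h: float,
                   distance: float, half_angle_deg: float,
                   overlap: float = 1.5) -> int:
    """Each source lights a disc of radius distance*tan(theta); 'overlap'
    adds margin so neighbouring discs overlap for homogeneous lighting."""
    radius = distance * math.tan(math.radians(half_angle_deg))
    per_source = math.pi * radius ** 2
    return max(4, math.ceil(overlap * screen_w * screen_h / per_source))

# Wide-angle sources far from a 3 m x 2.5 m wall need only a handful:
print(sources_needed(3.0, 2.5, 1.0, 60))   # 4  (e.g. one powerful LED per corner)
# Narrow, close-mounted sources need many more, within the 4-200 range:
print(sources_needed(3.0, 2.5, 0.3, 30))   # 120
```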
  • The capture device 9 of this interactive interface 5 comprises at least one camera and/or one photographic sensor.
  • This camera or photographic sensor comprises a photosensitive electronic component used to convert visible, ultraviolet or infrared light radiation into an analog electrical signal, which is then amplified, digitized by an analog-to-digital converter, and finally processed to obtain at least one digital image stored in the memory elements of the processing unit 11.
  • The camera and the photographic sensor can each be associated with a physical or software infrared filter. When this filter is implemented in software, it forms part of the digital image processing algorithm executed by the processing unit 11.
  • Through each camera and/or photographic sensor, the capture device 9 is capable of capturing at least one image of the outer face 7a of said display element 4.
  • This device 9 is capable of capturing the light originating from the outer region 18 of the broadcasting device 2, in which the user 3 of the system 1 is situated. It will be noted that the capture device 9 can acquire images at high frequencies, preferably greater than 100 Hz (a minimal capture-loop sketch follows below).
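  • The following sketch shows what such a capture loop might look like, assuming OpenCV, a camera exposed as device 0, and a crude brightness threshold standing in for the software infrared filter. All of these are illustrative assumptions; the patent does not specify an implementation.

```python
# A minimal capture-loop sketch (assumptions: OpenCV, a camera on device 0,
# and a brightness threshold as a stand-in for the software IR filter).
import cv2

cap = cv2.VideoCapture(0)                 # capture device 9 of the interface
cap.set(cv2.CAP_PROP_FPS, 120)            # aim above the 100 Hz mentioned

while True:
    ok, frame = cap.read()                # one image of the display element
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Software stand-in for an IR band-pass filter: keep only bright pixels,
    # assuming the sensor's response there is dominated by infrared.
    _, ir_only = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    cv2.imshow("inner face 7b", ir_only)
    if cv2.waitKey(1) == 27:              # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```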
  • The projection device 10 comprises at least one video projector.
  • This projection device 10 is capable of emitting light radiation to project at least one graphic representation, such as an image or several images forming a video, onto the second layer 13b of the display element 4.
  • The use of a video projector makes it possible to project images of large size, which is particularly important for creating an immersion of the user in the virtual environment.
  • Unlike an assembly of juxtaposed screens, the projected image is not interrupted by a frame or a separator.
  • The image can also be projected into the corners between two walls, and can therefore be continuous between two walls forming a corner.
  • The image obtained is therefore of a single piece, which improves the immersive experience of the user.
  • The invention relates also to a method for operating the system 1.
  • This method is implemented by such an interactive display system 1.
  • This method comprises a step of starting up 20 the system 1, during which the broadcasting devices 2 are electrically powered, in particular the device 8 for emitting infrared radiation, the capture device 9 and the projection device 10.
  • The user 3 of the system 1 takes his or her place in the open space 15 or the closed space 14 defined by the mutual arrangement of the broadcasting devices 2.
  • This start-up step 20 comprises a substep of broadcasting 21 at least one graphic representation by the broadcasting devices 2 of the system 1. More specifically, during this substep 21, the processing unit 11 selects data archived in its memory elements relating to at least one graphic representation, such as an image or several images forming a video. These data are then transmitted by the processing unit 11 to the broadcasting devices 2, in particular to the projection device 10 of each of these broadcasting devices 2. On receipt of these data, the projection devices 10 emit light radiation, projecting said at least one graphic representation onto the display elements 4, in particular onto the second layer 13b of each of these elements 4.
  • The start-up step 20 also comprises a substep of activating 22 the generation of infrared radiation by the emission device 8 of each broadcasting device 2 of the system 1, as a function of a determination of a level Nd of infrared light in the outer region 18 of each broadcasting device 2.
  • The processing unit 11 determines, from one or more capture devices 9 of the broadcasting devices 2, the level Nd of infrared light present in the outer region 18 of these broadcasting devices 2. This determined level Nd of infrared light is then compared to a reference level Nr of infrared light (a sketch of this threshold logic follows below).
  • In a first case, when the determined level Nd is below the reference level Nr, the processing unit 11 triggers, in an emission phase 23, the generation of infrared radiation by the emission devices 8 of the broadcasting devices 2.
  • The level of infrared light in the outer region 18 of the broadcasting devices 2 is then considered to be low.
  • Such a first case is generally identified when the broadcasting devices 2 define a closed space 14.
  • In a second case, when the determined level Nd is above the reference level Nr, the processing unit 11 does not trigger, in a non-emission phase 24, the generation of infrared radiation by the emission devices.
  • The level of infrared light in the outer region 18 of the broadcasting devices 2 is then considered to be high.
  • Such a second case is generally identified when the broadcasting devices 2 define an open space 15.
  • During this substep 22, when infrared radiation is emitted, it is capable of passing through the display elements of the broadcasting devices 2, in particular the first and second layers 13a, 13b of each display element 4, toward the outer region 18 of the broadcasting devices 2.
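  • A minimal sketch of this activation logic follows. It assumes that Nd is estimated as a mean pixel intensity over captured frames and that Nr is a fixed threshold; both are assumptions, since the patent does not specify how Nd or Nr are computed.

```python
# Sketch of activation substep 22 (the mean-intensity estimate of Nd and the
# fixed value of Nr are illustrative assumptions).
import numpy as np

NR = 40.0  # reference level Nr of infrared light (illustrative units)

def infrared_level(frames: list) -> float:
    """Estimate Nd as the mean pixel intensity over the captured images."""
    return float(np.mean([f.mean() for f in frames]))

def activate_emission(frames: list) -> bool:
    nd = infrared_level(frames)
    if nd < NR:
        # Emission phase 23: ambient IR is low (typically a closed space 14),
        # so the emission devices 8 must supply the infrared themselves.
        return True
    # Non-emission phase 24: ambient IR is high (typically an open space 15).
    return False

dark = [np.full((480, 640), 10, dtype=np.uint8)]
print(activate_emission(dark))   # True -> trigger the emission devices 8
```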
  • The method then comprises a step of configuring 25 the interactive interface 5.
  • Linked to the capture device 9 of each broadcasting device 2, the processing unit 11 acquires at least one image of the inner face 7b of the display element 4 of each of these broadcasting devices 2.
  • It can alternatively acquire several images forming a video.
  • Such an acquisition of at least one image makes it possible to assess the level of infrared light present on the inner face 7b of the display element 4 of each broadcasting device 2 during the starting up of the system 1.
  • Said at least one image is then archived in the memory elements of the processing unit 11 and forms a reference image (see the sketch below).
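  • As a sketch of this configuration step: averaging several start-up frames into the reference is an assumption intended to reduce sensor noise; the patent only requires that at least one image be archived.

```python
# Sketch of configuration step 25 (frame averaging is an assumption).
import numpy as np

def build_reference(startup_frames: list) -> np.ndarray:
    """Average start-up frames of the inner face 7b into one reference image."""
    return np.mean(np.stack(startup_frames).astype(np.float32), axis=0)

frames = [np.full((480, 640), 118 + i, dtype=np.uint8) for i in range(4)]
reference = build_reference(frames)   # archived in the memory elements
print(reference.mean())               # 119.5
```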
  • The method then comprises a step of identifying an activation 26 of the interactive interface 5.
  • Such an activation takes place as soon as an object collides with the outer face 7a of the display element 4 of one of the broadcasting devices 2 of the system 1.
  • This activation step comprises a substep of detecting 27 an impact in the reception zone 6, defined over all or part of the outer face 7a of the display element 4 of one of the broadcasting devices 2 of the system 1.
  • This detection substep 27 comprises a phase of acquiring 28 images of the inner face 7b of the display element 4 of each broadcasting device 2.
  • This image acquisition phase 28 can be performed continuously.
  • These acquired images are each transmitted to the processing unit 11 as they are captured.
  • The detection substep 27 then provides a phase of comparing 29 each of these acquired images with said at least one reference image, through execution by the processing unit 11 of the digital image processing algorithm.
  • During this comparison phase 29, as soon as at least one difference is identified between one of these acquired images and said at least one reference image, an impact is detected.
  • Each difference corresponds to a variation of the level of infrared light between the acquired image of the inner face 7b of the display element and said at least one reference image.
  • When an object collides with the outer face 7a of the display element 4, it blocks the infrared light coming from the outer region 18, causing the level of infrared light in the image acquired upon the impact to drop relative to the level present in said reference image.
  • The processing unit 11 is then capable of locating, in the reception zone 6 defined over all or part of the outer face 7a of the display element 4, the position of this impact, by identifying the position of the variation of infrared light resulting from the comparison between the acquired image relating to the impact and said at least one reference image (a sketch of this detection and localization follows below).
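  • A minimal sketch of the comparison and localization follows, assuming a fixed intensity-drop threshold and centroid-based localization. Both are illustrative choices; the patent only requires that a difference be identified and its position located.

```python
# Sketch of detection substep 27: compare an acquired image with the reference
# and locate the darkened region left by the impact (threshold and centroid
# localization are illustrative choices).
import numpy as np

THRESHOLD = 30.0  # minimum drop in IR level counted as a difference (assumed)

def detect_impact(acquired: np.ndarray, reference: np.ndarray):
    """Return the (x, y) pixel position of an impact, or None."""
    # An object on the outer face 7a blocks ambient IR, so the acquired image
    # is darker than the reference where the impact occurred.
    drop = reference.astype(np.float32) - acquired.astype(np.float32)
    mask = drop > THRESHOLD
    if not mask.any():
        return None                      # comparison phase 29: no difference
    ys, xs = np.nonzero(mask)            # pixels where the IR level fell
    return float(xs.mean()), float(ys.mean())  # centroid locates the impact

reference = np.full((480, 640), 120.0)
acquired = reference.copy()
acquired[200:220, 300:330] -= 80         # simulated impact shadow
print(detect_impact(acquired, reference))  # (314.5, 209.5)
```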
  • The invention relates also to a computer program comprising program code instructions for executing the steps of this method when said program is run by the processing unit 11 of the interactive display system 1.
  • The invention thus provides an interactive display system 1 comprising several broadcasting devices 2, each capable of interacting with objects of any type, or with any part of the body of the user 3 of such a system 1.
  • This interactive display system 1 allows interactive virtual environments to be produced at lower cost. Indeed, the production cost of this system 1 decreases as the surface of the display element 4 increases.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Processing Or Creating Images (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)

Applications Claiming Priority (2)

Application Number | Priority Date | Filing Date | Title
FR1753048
FR1753048A (FR3065091B1) | 2017-04-07 | 2017-04-07 | Système d'affichage interactif et procédé de fonctionnement d'un tel système (Interactive display system and method for operating such a system)

Publications (2)

Publication Number Publication Date
US20180292886A1 (en) | 2018-10-11
US10613621B2 (en) | 2020-04-07

Family

ID: 59297019

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/947,327 (US10613621B2, Active) | Interactive display system and method for operating such a system | Priority: 2017-04-07 | Filed: 2018-04-06

Country Status (2)

Country Link
US (1) US10613621B2 (fr)
FR (1) FR3065091B1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102784317B1 (ko) 2019-07-30 2025-03-24 Samsung Display Co., Ltd. Display device
US11463789B2 (en) * 2020-04-23 2022-10-04 Harves Group, Llc Interactive multimedia structure


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1396781A2 (fr) 2002-09-05 2004-03-10 Sony Computer Entertainment Inc. Display system, display control apparatus, display apparatus, display method and user interface device
US20040125044A1 (en) 2002-09-05 2004-07-01 Akira Suzuki Display system, display control apparatus, display apparatus, display method and user interface device
US20100328306A1 (en) 2008-02-19 2010-12-30 The Board Of Trustees Of The Univer Of Illinois Large format high resolution interactive display
KR20100116970A (ko) 2009-04-23 2010-11-02 Jang Jun-ho Space escape game system
US20120156652A1 (en) * 2010-12-16 2012-06-21 Lockheed Martin Corporation Virtual shoot wall with 3d space and avatars reactive to user fire, motion, and gaze direction
US20130181901A1 (en) * 2012-01-12 2013-07-18 Kanye Omari West Multiple Screens for Immersive Audio/Video Experience
US20140048681A1 (en) 2012-08-16 2014-02-20 Pixart Imaging Inc Object tracking device and operating method thereof
US20140192087A1 (en) * 2013-01-09 2014-07-10 Northrop Grumman Systems Corporation System and method for providing a virtual immersive environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
French Search Report and Written Opinion dated Dec. 6, 2017 issued in counterpart application No. FR1753048; w/ English machine translation (14 pages).

Also Published As

Publication number Publication date
FR3065091B1 (fr) 2021-09-03
FR3065091A1 (fr) 2018-10-12
US20180292886A1 (en) 2018-10-11

Similar Documents

Publication Publication Date Title
US8605046B2 (en) System and method for providing multi-dimensional touch input vector
US8971565B2 (en) Human interface electronic device
US8587520B2 (en) Generating position information using a video camera
JP6270898B2 (ja) Non-contact input method
US10916025B2 (en) Systems and methods for forming models of three-dimensional objects
CN107526953A (zh) Electronic device supporting fingerprint authentication and operating method thereof
JP2008511069A (ja) User input apparatus, system, method and computer program for use with a screen having a translucent surface
JP2014517361A (ja) Camera-based multi-touch interaction apparatus, system and method
KR20110100582A (ko) Information processing apparatus, information processing method, and program
US20110261013A1 (en) Touch screen system based on image recognition
JP2014179032A (ja) Virtual key input device
US10613621B2 (en) Interactive display system and method for operating such a system
CN113557710A (zh) Biometric input device
US10325377B2 (en) Image depth sensing method and image depth sensing apparatus
KR100968205B1 (ko) Apparatus, method and screen device for spatial touch sensing using an infrared camera
CN104049747B (zh) Mouse device for controlling the cursor directly with a finger
TWI439906B (zh) Sensing system
TWI573043B (zh) Virtual two-dimensional positioning module of an input device
JP2013187804A (ja) Surveillance camera
JP3134814U (ja) Coordinate detection device
KR20160121963A (ko) Infrared touchscreen system capable of gesture recognition
US9430823B1 (en) Determining camera sensor isolation
CN218181501U (zh) Hand-motion-sensing input device
CN116199056B (zh) Non-contact elevator control system
JP7371382B2 (ja) Position indicating device, display system, position indicating method, and program

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: ARK, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHATAIGNIER, LIONEL;CHATAIGNIER, GEOFFREY;GIORGIS, LEO;AND OTHERS;REEL/FRAME:045718/0544

Effective date: 20180412

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

AS Assignment

Owner name: PIXMINDS DISTRIBUTION SAS, FRANCE

Free format text: ASSIGNMENT OF ASSIGNOR'S INTEREST;ASSIGNOR:ARK SAS;REEL/FRAME:072715/0058

Effective date: 20250924