WO2004098176A1 - Method for transmitting data representing the position in space of a video camera and system for implementing the method - Google Patents
- Publication number
- WO2004098176A1 WO2004098176A1 PCT/FR2004/000982 FR2004000982W WO2004098176A1 WO 2004098176 A1 WO2004098176 A1 WO 2004098176A1 FR 2004000982 W FR2004000982 W FR 2004000982W WO 2004098176 A1 WO2004098176 A1 WO 2004098176A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- camera
- subsystem
- data
- images
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/0009—Transmission of position information to remote stations
- G01S5/0018—Transmission from mobile station to base station
- G01S5/0027—Transmission from mobile station to base station of actual mobile position, i.e. position determined on mobile
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
Definitions
- the invention relates to a method for transmitting data representing the position in space of a moving video camera, more particularly to a method for transmitting these coordinates in real time.
- the invention also relates to a system for implementing the method.
- coordinates should be understood in a very general sense.
- the coordinates of the image plane are: the inclination of this plane in space, given by the azimuth, elevation and roll angles, as well as the position of the center of the image.
- This center can be given by three coordinate values, arbitrarily named x, y and z, of an orthonormal trihedron of axes X, Y and Z.
- first field of application: the characterization of the images of a video stream by the position of their plane in space;
- second field of application: the real-time visualization of an overview of the framing of a video camera in a virtual setting.
- the sets of a film are also, in whole or in part, composed of computer generated images.
- the three-dimensional decoration is modeled beforehand.
- the virtual image of the framing is rendered by three-dimensional rendering software in real time.
- the invention aims to overcome the drawbacks of the systems and methods of the known art, some of which have just been mentioned.
- the object of the invention is to provide a method for transmitting in real time the position coordinates in space of a moving video camera.
- the object of the invention is also to provide a system for implementing this method.
- the whole system is in the form of a lightweight system designed to be attached to a video camera.
- the system according to the invention essentially comprises two main subsystems:
- the first subsystem is intended to be made integral with a video camera. It advantageously consists of a rigid shell in which an inertial unit and its electronics are housed.
- the second subsystem, which can advantageously be carried in a bag by an operator, is intended to process the information of the first subsystem and supplies the entire system with energy. It comprises:
- a battery or any equivalent member, preferably for powering the entire system
- a light electronic console connected to the output of the inertial unit which records the position data of the video camera in space at a determined frequency, typically 100 Hz.
- the data recorded in the electronic console are indexed with respect to time.
- the data can be transmitted directly by cable or by wave to an additional data processing unit equipped with navigation software in a 3D model, so as to be processed there in real or near real time.
- this additional processing unit can be incorporated in a backpack, or a similar member, carried by the operator.
- additional organs can be added to one and / or the other of the two subsystems. Thanks to the method and system according to the invention, the precision of the measurements is independent of the image quality. In addition, the position coordinates are transmitted in real time.
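As a concrete illustration (not part of the patent text), the time-indexed 100 Hz position records described above might be represented as follows; the `PoseSample` structure, its field names and the helper functions are hypothetical assumptions made for this sketch only.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float          # timestamp in seconds (the data are indexed by time)
    x: float          # position coordinates in the reference frame
    y: float
    z: float
    azimuth: float    # inclination angles, in radians
    elevation: float
    roll: float

def sample_stream(duration_s, rate_hz=100.0):
    """Generate placeholder pose records at the unit's 100 Hz output rate."""
    n = int(duration_s * rate_hz)
    return [PoseSample(i / rate_hz, 0, 0, 0, 0, 0, 0) for i in range(n)]

def nearest_sample(samples, t):
    """Return the record whose timestamp is closest to t (samples sorted by t),
    e.g. to attach a pose to a video frame taken at time t."""
    i = bisect_left([s.t for s in samples], t)
    candidates = samples[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

stream = sample_stream(2.0)               # two seconds of 100 Hz records
frame_pose = nearest_sample(stream, 0.503)
print(len(stream), frame_pose.t)          # 200 records; nearest timestamp 0.5
```

Indexing the records by time in this way is what later allows each filmed image to be characterized by the position of the camera at the moment it was taken.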
- the main object of the invention is therefore a method of transmitting the position coordinates of a camera for taking a sequence of video images moving in space along a determined trajectory with respect to a determined frame of reference, characterized in that it comprises at least the following steps:
- a preliminary step consisting in securing said camera to a first subsystem comprising an inertial unit delivering data signals representative of the coordinates and of the instantaneous inclination of said camera relative to said reference frame;
- the invention also relates to the application of this method to the characterization of the images of a video stream by the position of their plane in space.
- the invention also relates to the application of this method to the real-time display of an overview of the framing of a video camera in a virtual setting.
- FIG. 1A schematically illustrates an example of a system for transmitting and processing data representing the position in space of a device for recording a sequence of video images, according to a first embodiment of the invention;
- FIGS. 1B to 1D schematically illustrate an example of a system for transmitting and processing data representing the position in space of a camera for capturing a sequence of video images, according to a second embodiment of the invention;
- FIG. 2 schematically illustrates an operating mode allowing the acquisition of the three-dimensional coordinates of the plane of an image, relative to an orthonormal reference trihedron linked to a scene;
- FIG. 3 is a block diagram describing the main steps of the method according to two embodiments of the invention. We will first describe an example of a system for transmitting in real time the position coordinates in space of a moving video camera, according to two embodiments of the invention, with reference to FIGS. 1A to 2.
- In FIG. 1A there is shown a system 1, which can be described as "minimalist".
- This system 1 includes a video camera.
- the method according to the invention makes it possible to decorrelate the quality of the images from the precision of the measurements supplied by the system. In other words, high measurement accuracy can be obtained even if the images are of poor quality.
- the system is therefore not sensitive to shooting conditions such as low light, blurring, etc.
- the system for real-time transmission of position coordinates in space of the moving video camera 10 comprises, as indicated two main subsystems:
- the first subsystem, 11, comprises an inertial unit and its electronics (not explicitly shown). It advantageously consists of a rigid shell 110 in which the inertial unit and its electronics are housed. This shell is intended to be secured to the video camera 10 by any suitable fixing device, definitive or unlockable (links, screws, etc.), symbolized by the reference 111.
- the second subsystem, a storage member 12, can advantageously be carried in a bag by an operator (not shown in FIG. 1A). This member 12 is intended to process the information supplied by the first subsystem 11. Preferably, it also supplies energy to the entire system 1. More specifically, the second subsystem 12 comprises:
- a light electronic console connected to the output of the inertial unit, which records the position data in space of the video camera 10 (via the subsystem 11 which is secured to it), at a typical frequency of around 100 Hz.
- FIG. 1B illustrates an example of a system for transmitting in real time the position coordinates in space of a moving video camera according to a second embodiment, which can be described as "complete". This system is now referenced 1′.
- the video camera and the first subsystem can be of a type identical to those shown in FIG. 1A. However, they are now referenced 10′ and 11′, because they can be supplemented, one and / or the other, in certain variant embodiments, by various members which will be described hereafter.
- the first subsystem 11′ is, as previously, secured to the video camera 10′ by any appropriate means, symbolized by the link 111′.
- a fundamental difference between the first and second embodiments consists in that the data generated by the first subsystem 11′ are transmitted to a stored-program data processing system, under the general reference 20.
- the latter is advantageously arranged in a backpack,
- FIG. 1C schematically illustrates a possible configuration of the data processing system 20, in the form for example of a microcomputer 4 disposed in the backpack 200 and communicating with the first subsystem 11′ via links 112′ of a type similar, if not identical, to those of FIG. 1A.
- the microcomputer 4 comprises a set of organs and circuits arbitrarily grouped in a chassis 40.
- the main organs and circuits of this microcomputer 4 are as follows: a battery for supplying electrical energy 400, for example 12 volts
- the power supply provides a set of output voltages and / or currents of appropriate amplitudes, intended for the various organs, under the general reference VS;
- a stored-program digital data processing unit 402 proper, comprising various conventional organs necessary for its proper functioning (not shown) such as a microprocessor, main memory, etc., and in particular a video card, advantageously of the 3D type;
- a mass memory 403, for example consisting of two magnetic disks of 80 GB each;
- a display unit 404 controlled by the aforementioned 3D video card;
- keyboard 405 and pointing 406 data entry devices (mouse, ball, etc.), for entering comments, data, synchronization instructions or processing orders;
- electronic circuits 407 for processing the signals generated by the first subsystem 11′ (FIG. 1B), interfaced with the digital data processing unit 402 by an appropriate electronic circuit board (serial port for example); and
- a connector 409 is also provided, making it possible to make a cable connection 112′ (FIG. 1B) with the first subsystem 11′.
- the cable 112′ can carry signals of several categories: electronic signals by wired links 1120′ (serial bus for example, or other), power supply signals 1121′, and optical signals by fiber optic links 1122′, for example from fiber optic gyros contained in the first subsystem 11′.
- a similar connection (not shown) is provided in the subsystem 11 '.
- radio links, for example according to the aforementioned "Bluetooth" protocol;
- direct optical links, for example signals modulating an infrared carrier.
- an external pocket electronic assistant can be provided (FIG. 1B), better known by the abbreviation "PDA" (for "Personal Digital Assistant").
- This "PDA" is connected to the microcomputer 4 by a cable 210 passing, for example, through the connector 409.
- the circuits 421 supply, via the links 1121′ and the connectors 409, the entire system 1′, that is to say the first subsystem 11′ and possibly the video camera 10′.
- the first and second subsystems, 11′ and 2, can respectively be completed and / or modified.
- the following organs, arranged in the video camera 10 ′, the first subsystem 11 ′ and / or the second subsystem 2 can be used:
- a gyrometer made up of three orthogonal coils of single-mode optical fiber, for example of diameter 80 mm each comprising
- GPS Global Positioning System
- GALILEO Global Positioning System
- the device 22 communicates with appropriate processing circuits, not shown in FIG. 1C, contained in the chassis 40 of the microcomputer 4;
- auxiliary camera 100 made integral with the video camera 10'.
- the second subsystem 2 can also be supplemented by a microphone 23 picking up comments from the operator OP and / or synchronization sound signals, this to facilitate and secure the editing of the various sequences of video images taken.
- an initialization terminal 3 can be implemented, as shown diagrammatically in FIG. 1 D.
- the video camera 10 ′ is originally placed on a support 30 which serves as the point of origin for the motion and location sensors.
- the video camera 10′ can in particular be provided with: - push buttons;
- a display screen, advantageously of the "LCD" type (for "Liquid Crystal Display"), for example with a resolution of 400 × 300 pixels.
- the operator OP can also be provided with virtual glasses which can replace the screen of the video camera 10 ′.
- the first subsystem, 11 or 11′ respectively, consists of a module comprising at least devices based on inertial position and instantaneous speed sensors (gyrometers and accelerometers).
- the first subsystem, 11 or 11 ' is attached to a video camera, 10 or 10'.
- the system according to the invention makes it possible in particular to carry out what has previously been called "motion control".
- the invention covers, as will be shown, two main series of applications: the characterization of the images of a video stream by the position of their plane in space, and the real-time visualization of an overview of the framing of a video camera in a virtual setting. These operations assume that a certain number of parameters linked to the shooting can be determined at all times, in particular the coordinates of the video camera, 10 or 10′, relative to a reference frame, and the focal length used.
- FIG. 2 schematically illustrates the operating mode allowing the acquisition of the three-dimensional coordinates, x, y and z, of the plane P of an image I, with respect to a reference orthonormal trihedron, XYZ, linked to a scene SC.
- the first subsystem 11′ being secured to the video camera 10′, it follows the movements of the latter, at the same speed of movement, and also follows its inclination (characterized by the elevation, azimuth and roll angles).
- the devices based on position and instantaneous speed sensors of inertial nature present in the first subsystem 11′ can therefore generate, during the above-mentioned displacement, position (coordinate), instantaneous speed and inclination data.
- the transmission of this data is carried out via the link 112 ′, typically at a frequency of 100 Hz.
- These data are recorded in the data processing system 2 for deferred processing and / or processed in real or near real time, preferably followed by a recording of the processing results for later use.
- Other parameters can also be acquired, for example the focal length used for the shooting.
- an initialization of the measuring devices present in the first subsystem 11′ can be carried out by placing the video camera 10′ on the support 30 of the initialization terminal, at a predetermined location on the scene SC, for example of coordinates xr, yr and zr.
- Step 1 (block 51): This is a preliminary step of system configuration.
- the first subsystem 11 '(position module) is assembled with the video camera 10'.
- the first subsystem 11 ' is secured to the video camera 10' by a screw provided for this purpose or by any other member 111 '.
- the first subsystem 11 ' is fixed via the thread generally provided for the base of the camera 10'.
- a configuration description is integrated into the second subsystem 2 in the form of a text-type system configuration file.
- This file provides information relating to the hardware and software used. Certain parameters can be reconfigured at this stage. Subsequently, as data are acquired, the models for predicting the behavior of the inertial sensors can be refined by algorithmic data processing methods.
- Step 2 (block 52): Initialization of the system.
- Step 3 (block 53): processing of inertial data
- This step makes it possible to process the location acquisition data in real time and to establish the kinematics of the video camera 10 '(including its tilt).
- the position of the video camera 10′ is determined by a location system of the inertial-unit type, essentially based on the use of gyros and accelerometers arranged in the first subsystem 11′.
- a terminal 3 which serves as a reference point.
- the deviation of the measurements from the reference point is noted. This difference is distributed over the various previous location measurements in order to compensate for the measurement error.
- Other location and distance sensors can also be used, such as those listed above: GPS, magnetometers, inclinometers, odometer inputs. These sensors make it possible to obtain redundancies and to improve the measurements. The estimate of the device position sensors drift can also be transmitted to the user.
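To make the compensation just described concrete (the deviation noted at the reference terminal being distributed over the previous location measurements), here is a minimal Python sketch; the linear distribution of the error and all names are assumptions of this illustration, not the patent's implementation.

```python
def redistribute_closure_error(positions, true_end):
    """Distribute the closure error observed when the camera returns to the
    reference terminal back over the preceding 1-D position measurements,
    assuming the drift accumulated linearly over time."""
    n = len(positions)
    error = positions[-1] - true_end   # deviation noted at the reference point
    return [p - error * i / (n - 1) for i, p in enumerate(positions)]

measured = [0.0, 1.02, 2.05, 3.10]     # drifting measurements along one axis
corrected = redistribute_closure_error(measured, true_end=3.0)
print(corrected[0], corrected[-1])     # first sample untouched, closure error removed
```

The same correction would be applied independently along each of the three axes of the reference trihedron.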
- the inclination of the video camera 10′ (elevation, azimuth and roll angles), and consequently the inclination in space of the plane P of the image I, is given mainly by a trihedron of three gyros placed along three distinct non-coplanar axes arranged in the first subsystem 11′.
- the electronics associated with the three gyrometric sensors provide rotation increments Δθx, Δθy and Δθz along the three axes. From these increments, the angular positions θx, θy and θz are calculated by a change of reference frame by means of quaternion arithmetic.
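As an illustrative sketch (the patent gives no formulas), the rotation increments delivered by the three gyrometric sensors can be accumulated into an attitude quaternion as follows; all function names and the small-angle update are assumptions of this rewrite.

```python
import math

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def integrate_increment(q, d_theta):
    """Apply a small rotation increment (dθx, dθy, dθz), in radians,
    to the attitude quaternion q."""
    dx, dy, dz = d_theta
    angle = math.sqrt(dx*dx + dy*dy + dz*dz)
    if angle == 0.0:
        return q
    s = math.sin(angle / 2) / angle
    dq = (math.cos(angle / 2), dx * s, dy * s, dz * s)
    return quat_mul(q, dq)

# Identity attitude, then 90 degrees about the z axis in 100 small increments,
# as a 100 Hz gyro stream would deliver them over one second.
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(100):
    q = integrate_increment(q, (0.0, 0.0, math.pi / 200))
yaw = 2 * math.atan2(q[3], q[0])
print(round(math.degrees(yaw), 3))   # ≈ 90.0
```

Composing small increments through the quaternion product avoids the singularities of direct Euler-angle integration, which is presumably why the patent relies on quaternion arithmetic for the change of reference frame.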
- the instantaneous position (coordinates x, y and z) of the image sensor of the video camera 10′, and consequently the position of the center C of the image I, is provided mainly by the trihedron of the three accelerometers.
- the distances traveled along the three axes are calculated by successive integrations using kinematics calculations using the acceleration data emitted by the trihedron of the three accelerometers.
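The successive integrations described above can be sketched for a single axis as follows; trapezoidal integration is an assumption of this illustration (the patent only speaks of kinematics calculations), and all names are hypothetical.

```python
def integrate_trajectory(accelerations, dt, v0=0.0, x0=0.0):
    """Successively integrate accelerations along one axis to obtain
    velocity, then traveled distance (trapezoidal rule)."""
    v, x = v0, x0
    velocities, positions = [v0], [x0]
    for a_prev, a_next in zip(accelerations, accelerations[1:]):
        v += 0.5 * (a_prev + a_next) * dt          # first integration: velocity
        velocities.append(v)
        x += 0.5 * (velocities[-2] + velocities[-1]) * dt  # second: position
        positions.append(x)
    return velocities, positions

# Constant 1 m/s^2 acceleration sampled at 100 Hz for one second:
# v = a*t = 1 m/s and x = a*t^2/2 = 0.5 m.
acc = [1.0] * 101
v, x = integrate_trajectory(acc, dt=0.01)
print(round(v[-1], 6), round(x[-1], 6))   # 1.0 0.5
```

The same double integration is carried out along each of the three accelerometer axes to obtain the distances traveled.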
- the accelerometers and gyros cooperate to calculate location data for the video camera 10 ′, typically at the frequency of the above 100 Hertz.
- the kinematics of the video camera 10′ is calculated continuously by means of known digital integration algorithms (for example of the "Runge-Kutta" and "Adams predictor-corrector" type) to interpolate intermediate points.
- the accelerometers are measured for a period greater than 84 minutes. This initialization time makes it possible to know the characteristics of the "Schuler" period for each accelerometer. From these characteristics, using synchronization by an electronic clock, the contribution of this period is subtracted at any time from the results given by the accelerometers. To compensate for the errors due to the "Schuler" and "Coriolis" laws, a predictive modeling based on filters of the known Kalman type is used.
- Measurements on gyros and accelerometers are carried out under different conditions. These measurements are entered in specific software whose function is to model the behavior of the gyrometers. This software is implemented in the second subsystem 2. The errors are compensated in real time by taking account of the model.
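For illustration only, here is the predict/update cycle of a Kalman-type filter reduced to a single scalar state, such as a slowly drifting sensor bias; the patent's actual state model and noise parameters are not disclosed, so q and r below are assumptions.

```python
def kalman_1d(measurements, q=1e-4, r=0.01):
    """Minimal scalar Kalman filter: estimate a slowly drifting bias from
    noisy observations. q is the process noise, r the measurement noise."""
    x, p = 0.0, 1.0            # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p += q                 # predict: the bias random-walks slightly
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update with the innovation
        p *= (1 - k)
        estimates.append(x)
    return estimates

noisy = [0.5, 0.52, 0.49, 0.51, 0.50, 0.48, 0.51]
est = kalman_1d(noisy)
print(round(est[-1], 2))       # converges near the true bias of 0.5
```

A real inertial filter would carry a multi-dimensional state (position, velocity, attitude, sensor biases), but the predict/update structure is the same.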
- the system is capable of interfacing with additional positioning systems:
- accelerometers are precise especially in a determined range of measurements (acceleration and frequency). It is therefore desirable to absorb variations in trajectories which exceed this preferred range. This procedure optimizes the quality of the measurements provided by the accelerometers. Another advantage is to smooth the curve of the path followed by the video camera 10′ and thus eliminate noise which corrupts the measurements.
- Step 4 (block 54): Data acquisition.
- This step consists in acquiring the necessary information. All of this information is time stamped.
- This information comes mainly from:
- Step 5 (block 55): Improvement of the localization measurements of the shots by image processing.
- the location of the position of the video camera 10 ′ by image processing essentially serves to corroborate and compensate for drift in the measurement of gyrometers and accelerometers.
- Redundant data are managed by the aforementioned “gradient descent” or “quadratic programming” algorithms which give the almost certain value of the parameters sought despite the measurement errors. These algorithms also make it possible to eliminate the outliers and to reconstruct the missing values as indicated above.
- "Tracking" is a software image processing method, well known per se, that automatically follows one or more remarkable points of a subject through a sequence of successive shots. This process is currently the subject of studies and industrial achievements; nevertheless, its reliability is often called into question in difficult cases, in particular when the remarkable points are not sufficiently distinct from the neighboring points. This process is made much easier and more reliable when the specific provisions of the method of the invention are implemented, because the characteristics of the image (coordinates of the image plane) are already known with a certain precision, by the physical means of location already defined. The software performs its "search" only in a well-defined image region.
- This "tracking" method allows assistance in locating images. It allows the automatic processing of a large number of reference points. It is easily understood that the more reference points are available, the more it becomes possible to refine the position measurements of the location of the images.
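The restriction of the tracking "search" to a well-defined image region can be sketched as follows; the frame representation, the single-pixel template and all names are simplifying assumptions of this illustration.

```python
def track_point(frame, template, predicted, radius):
    """Search for a remarkable point only inside a window centered on the
    position predicted from the camera coordinates, instead of scanning
    the whole image. frame is a 2-D list of pixel values; the template is
    a single pixel value here, for simplicity."""
    px, py = predicted
    best, best_err = None, float("inf")
    for y in range(max(0, py - radius), min(len(frame), py + radius + 1)):
        for x in range(max(0, px - radius), min(len(frame[0]), px + radius + 1)):
            err = abs(frame[y][x] - template)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# 10x10 frame with a remarkable point (value 255) at (6, 4); the inertial
# data predict it near (5, 5), so only a 5x5 window is searched.
frame = [[0] * 10 for _ in range(10)]
frame[4][6] = 255
found = track_point(frame, 255, predicted=(5, 5), radius=2)
print(found)   # (6, 4)
```

Because the search window is derived from the already-known camera coordinates, the matching cost drops and ambiguous matches outside the window are excluded, which is the reliability gain the text describes.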
- a laser rangefinder (not shown) is added to the video camera 10′ and / or to the first subsystem 11′, to which it is secured, and measures the distance separating it from a particular point of an object to be modeled. This point, whose distance is now known, is automatically taken as a remarkable point. It is then followed by "tracking".
- the analysis of the curves over time of the positions of the reference points makes it possible to know precisely the kinematics of the reference points relative to the video camera 10′. If the modeling relates to an immobile space, it becomes very easy to know the kinematics of the video camera 10′ itself. It is possible in this way to correct the measurements provided by the gyros and the accelerometers of the first subsystem 11′. It is also possible, here again, to use optical targets as reference points.
- a robust solution for obtaining reference points on images is to install optical targets around the subject to be modeled. These targets are equipped with emitting diodes generating signals. Their positions on the images are automatically recognized.
- Step 6 (block 56): Data synchronization.
- the inertial unit located in the first subsystem 11 ′ transmits data at the frequency of a few hundred Hertz.
- the difficulty is to synchronize the position data of the video camera 10 'transmitted by the system with the filmed images, the aim being that the images are characterized by the position of the video camera 10' at the time when they were filmed.
- some video cameras can transmit data digitally and in real time concerning the synchronization of sound, the position of the focal length (which constitutes a very important parameter, if not essential), the type of film, the measurements of lights, etc., data which is also useful to synchronize with respect to position data.
- Solution 1: The synchronization of image / position data is done according to the traditional method of image / sound synchronization used in the film industry. A "clap" is filmed. The frame of the film where the clapper closes is then aligned with the moment in the soundtrack when the noise of the clapper is heard.
- the second subsystem 2 is provided with a microphone 23 (FIG. 1B).
- the position data stream is synchronized with the sound (simultaneous recording). It only remains to find in the soundtrack the sound emitted by the clapper to obtain the synchronization of the images with the position data stream.
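A minimal sketch of this clap-based alignment; the threshold detection and all names are assumptions of this illustration (a real implementation would more likely correlate waveforms rather than threshold a single sample).

```python
def find_clap(audio, threshold):
    """Return the index of the first sample whose absolute amplitude
    exceeds the threshold — the sharp transient of the clapper."""
    for i, s in enumerate(audio):
        if abs(s) >= threshold:
            return i
    return None

def sync_offset(audio, audio_rate_hz, pose_rate_hz, clap_pose_index):
    """Offset (in pose samples) aligning the position-data stream with the
    image stream, given the clap transient found in the shared soundtrack."""
    clap_time = find_clap(audio, threshold=0.8) / audio_rate_hz
    return clap_pose_index - round(clap_time * pose_rate_hz)

# Clap transient at audio sample 4800 (0.1 s at 48 kHz); the pose stream
# recorded its own clap marker at sample 15 of its 100 Hz stream.
audio = [0.0] * 4800 + [0.95] + [0.0] * 100
offset = sync_offset(audio, 48000, 100, clap_pose_index=15)
print(offset)   # 5
```

Once the offset is known, every pose sample can be re-indexed so that each filmed image is paired with the camera position at the moment it was taken.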
- Solution 2: The video camera 10′ is provided with circuits generating an output signal which indicates the moment when the operator OP begins to record a new sequence of images. This signal is picked up by the processing unit 4 of the second subsystem 2, and the recording of the position data stream is executed immediately.
- Solution 3: A small additional video camera 100′, light and of low definition, is secured to the main video camera 10′ or to the first subsystem 11′; in this case, the recording of the images and of the position data stream is simultaneous.
- the direction of the lens of the video camera 100′ is the same as that of the main video camera 10′. Therefore, the small video camera 100′ films the same scenes from the same positions. It suffices to superimpose the two image streams to obtain the synchronization of the images of the main video camera 10′ with the data stream.
- Step 7 (block 57): Data storage.
- the images taken by the video camera 10 ′ are characterized by the six position coordinates but also by the following main parameters:
- time stamp: precise time when the images were taken;
- kinematics: acceleration, speed and trajectory;
- focal length of the video camera 10′: type of focal length used or zoom position if necessary;
- Step 8 (block 58): further processing of the characterized images.
- the operator OP can leverage the database of the characterized images to modify them a posteriori.
- the process of the invention is susceptible of various applications and in particular the following:
- Application 2 Films shot on cameras carried on the shoulder can undergo sudden variations in trajectory as well as shocks or vibrations. Knowing the image positions can be used to smooth the image sequence, in particular using known image interpolation software. Again the method of the invention achieves this goal.
- Application 3 To give slow motion effects without having jerky effects, special effects use software to create "virtual" images between two actually filmed images. Knowing the trajectory of the focal plane, as the method according to the invention allows, allows rapid and reliable image interpolation.
- Application 4 A traditional film special effect consists in superimposing images.
- a classic example is to film a person, then film them again in another part of the scene, and finally to superimpose the two images. The person filmed then seems to have split in two. This effect, relatively easy to perform when the shots are fixed, is made much more difficult when the shots are in motion. It is then necessary to synchronize two sequences of images whose image planes are located in the same places, which, again, the method according to the invention allows.
- the method according to the invention also allows navigation in a three-dimensional universe in real time.
- Step 9: the steps of this second mode of the method of the invention are similar, if not identical, to those of the first mode which has just been described, up to and including the fourth step. We then move on to what is arbitrarily referenced as "Step 9" in the block diagram of FIG. 3, since this step actually takes place after the fourth step, and which will be explained below.
- Step 9 (block 59): Obtaining a pointer in three-dimensional space for navigation in a virtual universe.
- the position data of the video camera 10 ′, the images taken, as well as possibly the focal length used are communicated in real time to the second subsystem 2, more precisely to the processing device 4, by an appropriate link 112 ′: wired cable , optical link or wave.
- the stored-program data processing device is provided with three-dimensional restitution software operating in real time. It can be, for example, the commercial software "Director" distributed by the company MACROMEDIA or "3DS max" distributed by the company DISCREET (all these names being registered trademarks).
- this second embodiment of the method can find application in the following field:
- the film sets are increasingly, in whole or in part, composed of synthetic images.
- the virtual image of the framing can then be restored by three-dimensional rendering software in real time.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US10/553,885 US8009198B2 (en) | 2003-04-24 | 2004-04-22 | System for the tracking and motion control of a professional video camera permitting the combination of both real and vitual images |
| EP04742559A EP1623567A1 (fr) | 2003-04-24 | 2004-04-22 | Procede de transmission de donnees representant la position dans l espace d une camera video et systeme de mise en oeuvre du procede |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| FR03/05068 | 2003-04-24 | ||
| FR0305068A FR2854301B1 (fr) | 2003-04-24 | 2003-04-24 | Procede de transmission de donnees representant la position dans l'espace d'une camera video et systeme de mise en oeuvre du procede |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2004098176A1 true WO2004098176A1 (fr) | 2004-11-11 |
Family
ID=33104382
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/FR2004/000982 Ceased WO2004098176A1 (fr) | 2003-04-24 | 2004-04-22 | Procede de transmission de donnees representant la position dans l'espace d'une camera video et systeme de mise en oeuvre du procede |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US8009198B2 (fr) |
| EP (1) | EP1623567A1 (fr) |
| FR (1) | FR2854301B1 (fr) |
| WO (1) | WO2004098176A1 (fr) |
Families Citing this family (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| ES2813401T3 (es) | 2003-01-16 | 2021-03-23 | Viva Healthcare Packaging Ltd | Métodos para formar artículos que tienen resistencia al agrietamiento por tensión ambiental |
| US8273826B2 (en) | 2006-03-15 | 2012-09-25 | Dow Global Technologies Llc | Impact modification of thermoplastics with ethylene/α-olefin interpolymers |
| CA2613998C (fr) * | 2005-06-08 | 2016-01-05 | Thomson Licensing | Procede, appareil et systeme d'insertion d'une image/video de remplacement |
| WO2007136745A2 (fr) | 2006-05-19 | 2007-11-29 | University Of Hawaii | Système de suivi de mouvement pour imagerie adaptative en temps réel et spectroscopie |
| US8595632B2 (en) * | 2008-02-21 | 2013-11-26 | International Business Machines Corporation | Method to monitor user trajectories within a virtual universe |
| US9267862B1 (en) * | 2009-02-18 | 2016-02-23 | Sensr Monitoring Technologies Llc | Sensor and monitoring system for structural monitoring |
| US8391563B2 (en) * | 2010-05-25 | 2013-03-05 | Sony Corporation | Using computer video camera to detect earthquake |
| US9041796B2 (en) * | 2010-08-01 | 2015-05-26 | Francis Ruben Malka | Method, tool, and device for determining the coordinates of points on a surface by means of an accelerometer and a camera |
| WO2013032933A2 (fr) | 2011-08-26 | 2013-03-07 | Kinecticor, Inc. | Procédés, systèmes et dispositifs pour correction de mouvements intra-balayage |
| KR20130084720A (ko) * | 2012-01-18 | 2013-07-26 | 삼성전기주식회사 | 영상 처리 장치 및 방법 |
| US8953024B2 (en) | 2012-02-21 | 2015-02-10 | Intellectual Ventures Fund 83 Llc | 3D scene model from collection of images |
| US20130215239A1 (en) * | 2012-02-21 | 2013-08-22 | Sen Wang | 3d scene model from video |
| US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
| US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
| US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
| EP2950714A4 (fr) | 2013-02-01 | 2017-08-16 | Kineticor, Inc. | Système de poursuite de mouvement pour la compensation de mouvement adaptatif en temps réel en imagerie biomédicale |
| WO2015148391A1 (fr) | 2014-03-24 | 2015-10-01 | Thomas Michael Ernst | Systèmes, procédés et dispositifs pour supprimer une correction de mouvement prospective à partir de balayages d'imagerie médicale |
| CN106714681A (zh) | 2014-07-23 | 2017-05-24 | 凯内蒂科尔股份有限公司 | 用于在医学成像扫描期间追踪和补偿患者运动的系统、设备和方法 |
| US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
| US10360718B2 (en) * | 2015-08-14 | 2019-07-23 | Samsung Electronics Co., Ltd. | Method and apparatus for constructing three dimensional model of object |
| EP3380007A4 (fr) | 2015-11-23 | 2019-09-04 | Kineticor, Inc. | Systèmes, dispositifs, et procédés de surveillance et de compensation d'un mouvement d'un patient durant un balayage d'imagerie médicale |
| GB2550854B (en) | 2016-05-25 | 2019-06-26 | Ge Aviat Systems Ltd | Aircraft time synchronization system |
| JP6657475B2 (ja) * | 2016-08-25 | 2020-03-04 | エルジー エレクトロニクス インコーポレイティド | 全方位ビデオを伝送する方法、全方位ビデオを受信する方法、全方位ビデオの伝送装置及び全方位ビデオの受信装置 |
| WO2018131813A1 (fr) * | 2017-01-10 | 2018-07-19 | Samsung Electronics Co., Ltd. | Procédé et appareil de génération de métadonnées pour des images 3d |
| CN106959460B (zh) * | 2017-05-16 | 2024-05-28 | 广州市度量行电子设备有限公司 | 一种高精度定位的gis采集器的目标点计算方法 |
| FR3142588B1 (fr) * | 2022-11-24 | 2025-01-10 | Idemia Identity & Security France | Dispositif et procédé de traitement d’image |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP0608945A1 (fr) * | 1993-01-27 | 1994-08-03 | Societe Anonyme D'etudes Et Realisations Nucleaires S.O.D.E.R.N. | Viseur d'étoile à matrice de DTC, procédé de détection, et application au recalage d'un engin spatial |
| FR2730837A1 (fr) * | 1995-02-22 | 1996-08-23 | Sciamma Dominique | Systeme d'insertion en temps reel ou differe de panneaux publicitaires ou informationnels virtuels dans des emissions televisees |
| US5850352A (en) * | 1995-03-31 | 1998-12-15 | The Regents Of The University Of California | Immersive video, including video hypermosaicing to generate from multiple video views of a scene a three-dimensional video mosaic from which diverse virtual video scene images are synthesized, including panoramic, scene interactive and stereoscopic images |
| US6100925A (en) * | 1996-11-27 | 2000-08-08 | Princeton Video Image, Inc. | Image insertion in video streams using a combination of physical sensors and pattern recognition |
| EP1168830A1 (fr) * | 2000-06-30 | 2002-01-02 | Wells & Verne Investments Ltd | Système de prise de vues supporté par ordinateur |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3021556B2 (ja) * | 1990-06-20 | 2000-03-15 | ソニー株式会社 | 映像情報処理装置とその方法 |
| US5913078A (en) * | 1994-11-01 | 1999-06-15 | Konica Corporation | Camera utilizing a satellite positioning system |
| US6282362B1 (en) * | 1995-11-07 | 2001-08-28 | Trimble Navigation Limited | Geographical position/image digital recording and display system |
| FR2836215B1 (fr) * | 2002-02-21 | 2004-11-05 | Yodea | Systeme et procede de modelisation et de restitution tridimensionnelle d'un objet |
| JP2004297478A (ja) * | 2003-03-27 | 2004-10-21 | Fuji Photo Film Co Ltd | デジタルカメラ |
2003
- 2003-04-24 FR FR0305068A patent/FR2854301B1/fr not_active Expired - Lifetime

2004
- 2004-04-22 WO PCT/FR2004/000982 patent/WO2004098176A1/fr not_active Ceased
- 2004-04-22 US US10/553,885 patent/US8009198B2/en not_active Expired - Fee Related
- 2004-04-22 EP EP04742559A patent/EP1623567A1/fr not_active Withdrawn
Also Published As
| Publication number | Publication date |
|---|---|
| FR2854301A1 (fr) | 2004-10-29 |
| FR2854301B1 (fr) | 2005-10-28 |
| US20060221187A1 (en) | 2006-10-05 |
| EP1623567A1 (fr) | 2006-02-08 |
| US8009198B2 (en) | 2011-08-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP1623567A1 (fr) | Procede de transmission de donnees representant la position dans l espace d une camera video et systeme de mise en oeuvre du procede | |
| FR2836215A1 (fr) | Systeme et procede de modelisation et de restitution tridimensionnelle d'un objet | |
| EP3658921B1 (fr) | Procédé de calibration d'un magnetometre | |
| EP1984696B1 (fr) | Dispositif de capture de mouvement et procede associe | |
| EP3278301B1 (fr) | Procede de determination d'une direction d'un objet a partir d'une image de l'objet | |
| EP1407214B1 (fr) | Dispositif, et procede associe, apte a determiner la direction d'une cible | |
| FR2628859A1 (fr) | Capteur stellaire | |
| FR2849223A1 (fr) | Systeme de mesure et de stabilisation pour vehicules et objets volants | |
| FR2980005A1 (fr) | Procede de controle d'un curseur par des mesures d'attitude d'un pointeur et pointeur mettant en oeuvre ledit procede | |
| FR3015072A1 (fr) | Procede de determination de l'orientation d'un repere capteur lie a un terminal mobile muni d'un ensemble capteur, porte par un utilisateur et comprenant au moins un capteur de mouvement lie en mouvement | |
| EP3060881A1 (fr) | Procede de localisation en interieur et exterieur et dispositif portatif mettant en oeuvre un tel procede. | |
| EP1168831B1 (fr) | Procédé de calibration de caméra | |
| EP0502771A1 (fr) | Procédé et système d'harmonisation autonome d'équipements à bord d'un véhicule, utilisant des moyens de mesure des champs de gravité et magnétique terrestres | |
| CA3100115C (fr) | Procede d'harmonisation de deux unites de mesure inertielle l'une avec l'autre et systeme de navigation mettant en oeuvre ce procede | |
| WO2004083767A2 (fr) | Dispositif de visee ou de pointage | |
| US11580713B2 (en) | Motion compensation for a SPAD array camera | |
| EP0502770B1 (fr) | Procédé et système d'harmonisation autonome d'équipements à bord d'un véhicule, utilisant des moyens de mesure du champ de gravité terrestre | |
| EP3211370A1 (fr) | Procede de filtrage des signaux issus d'un ensemble capteur comprenant au moins un capteur de mesure d'un champ physique vectoriel sensiblement constant dans le temps et l'espace dans un repere de reference | |
| EP3655725A1 (fr) | Procédé d'estimation du mouvement d'un objet évoluant dans un environnement et un champ magnétique | |
| EP4625022A1 (fr) | Dispositif et procédé d'assistance à la localisation d'objets célestes | |
| Kaarre | HARDWARE FOR MOBILE POSITIONING | |
| FR3064389A1 (fr) | Dispositif de modelisation et de representation, et procede correspondant | |
| WO2023083879A1 (fr) | Procede de navigation hybride inertielle/stellaire a indicateur de performance d'harmonisation | |
| FR3129232A1 (fr) | Interface de navigation en environnement virtuel | |
| FR3157574A1 (fr) | Procédé de commande d'un dispositif optronique embarqué et dispositif optronique embarqué correspondant |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 2004742559 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2006221187 Country of ref document: US Ref document number: 10553885 Country of ref document: US |
|
| WWP | Wipo information: published in national office |
Ref document number: 2004742559 Country of ref document: EP |
|
| WWP | Wipo information: published in national office |
Ref document number: 10553885 Country of ref document: US |