WO2017220601A1 - Dispositif de balayage permettant le balayage du fond d'étendues d'eau et procédé d'établissement de cartes sous-marines - Google Patents
- Publication number
- WO2017220601A1 (PCT/EP2017/065121)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- scanning device
- image data
- dimensional
- camera
- scanning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C13/00—Surveying specially adapted to open water, e.g. sea, lake, river or canal
- G01C13/008—Surveying specially adapted to open water, e.g. sea, lake, river or canal measuring depth of open water
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/87—Combinations of sonar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/30—Assessment of water resources
Definitions
- the present invention relates to a scanning device for scanning the bottom of bodies of water for the automatic generation of underwater maps, and to a method for creating underwater maps. For most bodies of water there are no underwater maps that would allow dives to be planned. Such maps should have a resolution of at least one meter, and preferably a higher resolution. These maps would also be useful for the water industry. For small areas around California such underwater maps are available; they have been produced by the Sea Floor Mapping Lab (SFML).
- The bottom of a body of water cannot be optically scanned by a satellite. Furthermore, satellite navigation systems do not function underwater, as satellite signals cannot be received there.
- From DeepVision AB, Sweden, a depth logger is known with which sonar depth data of the bottom, acquired by means of a sonar arranged on a boat, can be recorded together with corresponding satellite positioning data (GPS data).
- DeepVision AB also offers side-scan sonars, in which a sonar sensor is placed in a towed submersible body that is connected by a trailing cable to a computer located above it, in order to scan objects laterally from the submersible with sonar waves.
- US Pat. No. 5,432,712 discloses a stereo imaging method in which images which have been recorded with different camera modules of a stereo camera are compared with respect to predetermined features (here: edges) and assigned to one another.
- In Computer-Aided Geometric Design (CAGD), geometric objects can be represented using free-form curves and surfaces, such as Hermite curves, Bézier curves, spline curves or NURBS.
- In contrast, there are methods in computer graphics which work primarily or exclusively with polygons.
- the surface of an object is represented by polygons, which are often triangles.
- Polygonization techniques include marching cubes ("Marching Cubes: A High Resolution 3D Surface Construction Algorithm", W. E. Lorensen et al., Computer Graphics, Vol. 21, No. 4, 1987).
- the submersible has a multi-beam sonar to scan the subsurface of a body of water three-dimensionally.
- the submersible robot also has a single camera to optically scan the ground.
- the individual images captured with the camera are put together like a mosaic.
- the composite mosaic image can be used as a texture on a 3D structure captured by the multibeam sonar.
- a diving robot (ROV: remotely operated underwater vehicle) is disclosed which has a device with which it can determine its location coordinates in real time and transmit them to a base station.
- the base station can be provided with a GPS receiver, wherein the position of the diving robot with respect to the base station is determined and exchanged, so that the position of the diving robot can be determined on the basis of this combined information.
- the diving robot is equipped with a data acquisition device, such as an optical camera, designed to scan the bottom of a body of water.
- US 2006/0182314 A1 discloses a method for displaying three-dimensional objects, which are described by three-dimensional data, on a two-dimensional screen.
- 3D area data are generated, which are two-dimensional images, the individual pixels having a false color corresponding to the distance of the respective pixel from the camera, with which this image has been taken.
- Such a 3-D area image can be superimposed with a two-dimensional color image, wherein both images have preferably been taken with the same viewing direction.
- a camera arranged in the submersible body for generating image data describing the ground,
- a reference device which generates referencing information which is assigned to the image data such that the position of the section of the ground represented in the respective image data is defined,
- a 3D scanning device which is arranged either on a float floating on the water surface of a body of water or on a submersible body which is connected to the float via a towline.
- the arrangement of the camera in a submersible body makes it possible to position the camera close to the bottom of the water, so that the camera can capture image data of the ground at close range.
- high quality image data is obtained which allows to create a high resolution underwater map.
- the image data can be inserted as a texture in a three-dimensional height map, so that a visualizable underwater map is thereby generated.
- Texture is used here in the sense of computer graphics, as a covering for three-dimensional models to increase their level of detail without increasing the level of detail of the geometry.
- a texture thus refers to an image that is displayed on the surface of the three-dimensional model
- a pixel of the texture is called a "texel”.
- reference information is assigned to the image data defining the position of the portion of the ground represented in the respective image data.
- this reference information is the position of the submersible or the camera at the time the respective image has been detected, and the direction of the camera. Based on this reference information, in combination with a three-dimensional height map, the section of the ground which is described by the image data can be determined.
- the referencing information preferably also includes the size of the field of view of the camera, which can be indicated, for example, in angular ranges.
- the field of view can be variable, in which case the reference device also detects the respective field of view and associates it with the image data.
- the field of view is constant and does not have to be individually recorded by the reference device.
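- The referencing information can be thought of as a small record attached to each captured image. The following Python sketch is purely illustrative; the field names, types and units are assumptions and are not taken from the disclosure:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ReferencingInfo:
    position: Tuple[float, float, float]        # camera position (X, Y, Z), e.g. in metres
    view_direction: Tuple[float, float, float]  # unit vector of the camera's viewing direction
    fov_deg: Tuple[float, float]                # horizontal and vertical field of view in degrees
    timestamp: float                            # acquisition time of the image

@dataclass
class ReferencedImage:
    image: bytes          # encoded 2D image data
    ref: ReferencingInfo  # defines which section of the ground the image shows
```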
- a 3D scanning device which is arranged either on a float floating on the water surface of a body of water or on a submersible body connected to the float via a towline is thus located at the water surface or near the surface, since the diving depth of the submersible body is limited by the towline.
- a data and/or energy supply line can run along the towline, so that data from the submersible body can be continuously transmitted to the float and/or the submersible body can be supplied with energy by the float.
- Such a submersible body can be equipped with less computing power than an autonomous submarine, since the data collected with the submersible body only needs to be buffered and possibly pre-processed in the submersible body and then transmitted via the data line to the float, and/or the submersible body does not need its own energy supply.
- In a scanning device in which a submersible body is coupled to a float by means of a towline, a 3D scanning device is preferably provided both on the float and on the submersible body.
- the 3D scanning device on the float is arranged with the viewing direction substantially vertically downwards.
- the 3D scanning device on the submersible body can be aligned with its viewing direction in the horizontal direction. However, it may also be pivotally mounted on the submersible body so that its viewing direction can be adjusted between different horizontal directions and/or in the vertical direction. In this way, the bottom of a body of water can be scanned simultaneously with two 3D scanning devices which look at the same locations of the ground from different viewing directions.
- the reference device is preferably designed to detect the position of the camera and/or the scanning device in three-dimensional space. This position can be specified, for example, with Cartesian coordinates (X, Y, Z) or with coordinates from another coordinate system.
- the reference device can have a position log device with which the position and the viewing direction of the camera are detected and stored together with the respective image data.
- the position and the viewing direction of the camera can be stored in the immersion body together with the corresponding image data on a storage device arranged in the immersion body.
- Alternatively, the referencing information comprising the position and viewing direction of the camera can be stored in a control device located outside the submersible body.
- the position log device preferably has a pressure sensor arranged on the immersion body in order to determine the current depth of the immersion body below the water surface on the basis of the pressure measured with the pressure sensor and to consider this as a component of the reference information. With such a pressure sensor, the depth of the immersion body in the water can be determined very accurately. As a result, a coordinate in the vertical direction (Z direction) of the submersible is clearly defined.
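- As an illustration of how a pressure reading can be converted into a depth coordinate, the hydrostatic relation depth = (p - p_surface) / (rho * g) can be used; the constants below (fresh-water density, standard surface pressure) are assumed example values:

```python
def depth_from_pressure(p_measured_pa: float,
                        p_surface_pa: float = 101_325.0,
                        rho_water: float = 1000.0,
                        g: float = 9.81) -> float:
    """Depth below the water surface in metres from the measured absolute pressure.

    For sea water a density of roughly 1025 kg/m^3 would be used instead.
    """
    return (p_measured_pa - p_surface_pa) / (rho_water * g)

# Example: 2.0 bar absolute pressure corresponds to roughly 10 m of fresh water
print(round(depth_from_pressure(200_000.0), 2))  # ~10.06
```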
- the position log device may include at least one float, such as a buoy or a boat, which may float on the water surface of the water.
- the floating body is provided with a radio navigation device, such as a satellite navigation device or a mobile radio navigation device, for detecting the position of the float, and has a relative position device for detecting the relative position between the float and the submersible body.
- the relative position device may comprise one or more of the following devices:
- a towline with which the submersible body is attached to the float, the length of the towline determining the distance of the submersible body from the float; a rod with which the submersible body is fastened to the float, the length of the rod determining the distance of the submersible body from the float; a sonar on the float for detecting the submersible body; a position-determining device for exchanging sound signals between the position-determining device and the submersible body, wherein the transit time of the sound signals is measured.
- a towline has the advantage that it is very simple and inexpensive and at the same time serves for pulling or positioning the submersible body in the water. If the submersible body is pulled by a boat by means of the towline, then at a certain water speed of the boat, depending on the hydrodynamic design of the towline and the submersible body, a certain relative position of the submersible body with respect to the boat is established. These relative positions can be empirically recorded and stored once for different water speeds of the boat and thus allow a simple and cost-effective determination of the relative position of the submersible body with respect to the float in later operation. However, if there are underwater currents or if the boat changes direction and/or speed, the actual relative position may differ significantly from the calculated one.
- Determining the relative position via the towline is difficult if the submersible body is steerable, so that it can be steered to different depths.
- With a rod, the relative position of the submersible body with respect to the float is set very precisely. This is particularly interesting for scanning waters in the range up to a depth of about 10 m. This range is especially important for shipping.
- the rod length is preferably so long that the immersion body is approximately 5-7 m below the water surface.
- the floating body is typically a boat.
- the measured values are corrected with the aid of the orientation of the float.
- the orientation of the float can be detected with appropriate sensors (sensors for detecting inclination, pitching and rolling).
- measuring devices in particular a 2D camera and a 3D scanning device, can themselves also be arranged or suspended in such a way that they are stable in inclination.
- the location of the submersible body can be detected.
- the direction in which the submersible body is located with respect to the float can be detected very precisely.
- the detection of the submersible body by means of the sonar works well if the submersible body has a certain minimum size and is not too far away from the float.
- Preferably, only the relative direction of the immersion body with respect to the floating body is detected by means of the sonar and the distance determined by a towline.
- a position-determining device working with the exchange of sound signals is described in the German patent application DE 10 2016 106 214.6.
- a sound signal is exchanged, in which at least the time of sound generation is coded.
- the sound signal is decoded by the receiver, wherein the receiver also has a clock for detecting the time of reception of the sound signal. From the time of generation and the time of reception of the sound signal, the transit time of the sound signal is determined.
- the transit time of the sound signal corresponds to the distance of the submersible body from the float. This distance can be combined with a position determination of the float by means of a navigation system.
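- A minimal sketch of the transit-time-to-distance conversion described above; the assumed sound speed of 1480 m/s is only a typical value and in practice depends on temperature and salinity:

```python
SPEED_OF_SOUND_WATER = 1480.0  # m/s, typical value; varies with temperature and salinity

def distance_from_transit_time(t_sent: float, t_received: float,
                               c: float = SPEED_OF_SOUND_WATER) -> float:
    """Distance between transmitter and receiver from a one-way coded sound signal.

    The transmission time is encoded in the signal; the receiver's clock
    (synchronised with the transmitter's) supplies the reception time.
    """
    return (t_received - t_sent) * c

# A transit time of 20 ms corresponds to about 29.6 m
print(distance_from_transit_time(0.000, 0.020))  # 29.6
```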
- the position log device can also be formed independently of a float floating on the water surface.
- the position log device can have one or more inertial sensors for detecting the position of the submersible body under water. With such inertial sensors, the acceleration of the submersible body is detected. The position is determined from the integral of the acceleration. Prior to immersion of the submersible body in the water, it is preferably calibrated by means of a navigation device, that is, the current location of the submersible body is determined so that, by means of the inertial sensor or sensors, the position of the submersible body can be tracked relative to the calibrated position.
- the inertial sensor or sensors are provided in combination with a pressure sensor, wherein the pressure sensor is used to determine the depth, so that the depth detected by means of the inertial sensors can be corrected during the diving process.
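- The dead-reckoning idea (integrating the acceleration from a calibrated starting position, with the vertical component corrected from the pressure sensor) could be sketched as follows; the function and its interface are illustrative assumptions, not part of the disclosure:

```python
import numpy as np
from typing import Optional

def dead_reckon(acc: np.ndarray, dt: float,
                p0: np.ndarray, v0: np.ndarray,
                depth: Optional[np.ndarray] = None) -> np.ndarray:
    """Track the position of the submersible body by integrating acceleration twice.

    acc:    (N, 3) accelerations in a world-aligned frame [m/s^2]
    dt:     sample interval [s]
    p0, v0: position and velocity at the calibration point (e.g. a GPS fix before diving)
    depth:  optional (N,) pressure-derived depths used to correct the drifting Z estimate
    """
    vel = v0 + np.cumsum(acc, axis=0) * dt   # first integration: velocity
    pos = p0 + np.cumsum(vel, axis=0) * dt   # second integration: position
    if depth is not None:
        pos[:, 2] = -depth                   # Z axis up, depth measured downwards
    return pos
```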
- a clock is provided in the immersion body.
- time stamps can be generated, for example, to provide the acquired image data with a time stamp indicating the time at which the image data has been generated.
- the submersible body can be designed as a manned or unmanned submarine with its own drive for locomotion under water. If the submersible body is an unmanned submarine, then it is preferably designed to be remotely controllable.
- the remote control can be carried out by means of a connecting line, in which at least one data line is provided for transmitting the control data.
- the connecting line may also include a line for transmitting electrical power to the submarine.
- the remote control of an unmanned submarine can also be effected by means of sound signals in which the corresponding control signals are encoded.
- the scanning device may include both a 2D camera as the camera for generating two-dimensional image data of the ground, and a 3D scanning device for generating three-dimensional information describing the ground.
- the two-dimensional image data generated by the 2D camera is preferably used as a texture for a height map, and the three-dimensional information describing the ground may be used for generating the height map.
- the 2D camera and the 3 D scanning device can each be arranged in a submersible body. They can also be arranged in the same immersion body. Additionally or alternatively, a 2D camera and / or a 3D scanning device can also be arranged in a floating body, in particular a boat. Preferably, at least two 3D scanning devices are provided such that the ground can be scanned from different directions.
- a method of generating underwater maps comprising the steps of:
- two-dimensional image data of the bottom of a body of water are provided together with reference information, whereby these can be easily and reliably imaged as a texture onto the three-dimensional height map. This creates a visualizable underwater map.
- The use of polygonized height maps allows a three-dimensional contour to be captured with a very small amount of data: highly contoured areas can be represented precisely by small polygons, whereas in areas with little contour the amount of data can be kept very low by using correspondingly large polygons.
- the polygons form surfaces on which the texture can be easily imaged.
- such a polygonized height map can be supplemented in areas by detail geometry data.
- a method for generating underwater maps is provided, wherein the following steps are carried out:
- mapping the two-dimensional image data as a texture onto the three-dimensional height map by means of the referencing information, wherein the three-dimensional height map is provided by scanning the bottom of the body of water with a 3D scanning device disposed on a float or submersible body which is coupled to a radio navigation device such that position coordinates determined with the radio navigation device are assigned to the three-dimensional height map as referencing information.
- the typical applications for underwater maps, such as shipping or diving, require the most accurate information immediately below the water surface, for example in order to avoid a collision of a ship with the ground.
- the need for detailed profiles at great depths of, for example, more than 100 meters is extremely low.
- the inventors of the present invention have recognized this and accordingly found a solution with which underwater maps can be created in a very simple way, which have the necessary precision in the areas that are important for the user.
- the evaluation of the 3D data generated with such a 3D scanning device is much simpler than the evaluation of 3D data which are generated with a 3D scanning device provided on a submarine.
- the position of the submarine, and thus of the 3D scanning device, under water must be detected by means of one or more inertial sensors and updated from an initial position which is determined while the submarine is still at the surface. This is much more complicated, and deviations accumulate and increase progressively. Therefore, data acquired in this way must be aligned with one another afterwards. This can be done, for example, by extracting characteristic points in individual images, so that the individual images are subsequently assembled into a mosaic. As a result, inaccuracies in the localization of the submarine, and thus of the 3D scanning device, can be compensated. However, this is not necessary when the location of the float or the submersible body is determined precisely by means of radio navigation.
- the elevation maps or the 3D information is generated by means of a sonar, which is arranged on a floating body (boat or ship) or immersion body.
- the two-dimensional image data of the bottom of the body of water can be provided together with the reference information by a scanning device as explained above.
- the reference information is used to transform the two-dimensional image data into a texture space.
- Color values of points of the texture space are mapped to the associated points in the height map. It is also possible to associate a plurality of color values of different two-dimensional image data with a point of the height map, wherein the plurality of color values are interpolated or averaged.
- the bottom of a body of water can be scanned from two different directions. The information thus obtained is combined to produce the three-dimensional height map.
- the two different directions preferably enclose an angle of at least 30 ° or of at least 60 ° and in particular are approximately orthogonal to one another.
- the different viewing directions can also be offset from one another in the horizontal and in the vertical direction.
- the inventors have recognized that when scanning from one direction only, the problem is that with different slopes of the ground, the corresponding areas in a height map are displayed at different resolutions.
- a steep wall scanned only from above is only detected by a few sampling points. If the steep wall is scanned from the side, then it is detected with many points.
- scanning the ground from one direction only has the problem that, if the resolution of the height map is too coarse, the surfaces appear smoothed out, so that a texture displayed on them which shows fine structures appears very unrealistic. This problem is eliminated by scanning from different directions, because this allows the resolution to be kept approximately independent of the slope of the ground.
- Scanning from two different directions may be accomplished with any method of generating three-dimensional information, such as a sonar scan or a scan using a stereo camera or time-of-flight camera.
- the height map should have points with a maximum distance of 20 cm. Preferably, the distances are smaller, in particular 10 cm or 5 cm.
- the precision of the individual points should be at least 20 cm.
- a height map can be represented by a three-dimensional data cloud.
- the three-dimensional data cloud is a list of points in three-dimensional space, which are indicated by three coordinates (X, Y and Z coordinates), for example. These points each represent a point on the surface of the bottom of the area described by the elevation map. With such a data cloud, undercuts such as caves or the like can be displayed.
- the data points may also contain values, in particular vectors, which point to the surface of the ground. These values are preferably provided only at data points located adjacent to the surface of the ground.
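- Such a data cloud can be represented, for example, by a simple list of points with an optional surface vector per point; the structure below is only an illustrative assumption about one possible layout:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class CloudPoint:
    xyz: Tuple[float, float, float]                       # X, Y, Z coordinates of a surface point
    normal: Optional[Tuple[float, float, float]] = None   # vector indicating the ground surface,
                                                          # stored only for points near the surface

@dataclass
class HeightMapCloud:
    points: List[CloudPoint] = field(default_factory=list)
```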
- the two-dimensional image data can be weighted inversely proportionally to the deviation of the viewing direction from the normal of the area of the three-dimensional height map onto which the two-dimensional image data are mapped.
- the ground is captured multiple times from different directions in order to generate image data describing the ground.
- FIG. 1 shows, in a basic diagram, a first embodiment of a scanning device for scanning the bottom of bodies of water, with a submersible body located on a towline,
- FIG. 2 shows the submersible body of FIG. 1 schematically in a block diagram
- FIG. 3 shows a second embodiment of a scanning device, wherein the submersible body is attached to a boat by means of a rod,
- FIG. 4 shows a third exemplary embodiment of a scanning device with a floating body and a submersible, wherein the position of the submersible is determined by means of sonar signals,
- FIG. 5 shows schematically the arrangement of two floats and a submersible body in the water according to the third embodiment,
- FIG. 6 shows a method for generating underwater maps in a flowchart
- FIG. 7 shows a method for producing a texture
- FIG. 8 shows the method according to FIG. 7 on the basis of some example images.
- the invention relates to a method for producing underwater maps.
- a first aspect of the invention comprises the generation of information of a ground of a water body with a corresponding scanning device 1.
- a second aspect of the invention relates to the production of underwater maps, for which purpose the information obtained with the scanning device 1 according to the invention can be used.
- the underwater maps can also be generated from corresponding information from other sources of information.
- a first embodiment of a scanning device 1 comprises a float 2 and a submersible body 3 ( Figure 1).
- the floating body 2 is, in this embodiment, a boat or a ship that floats on the water surface 4 of a body of water.
- the submersible body 3 is connected to the float or boat 2 by a towline 5.
- the towline includes a two-wire cable 6 for supplying the submersible body 3 with electrical power in the form of direct current (Figure 2). In FIG. 2, only sections of the two wires of the cable 6 are shown schematically. This cable 6 is connected to all electrical devices of the immersion body 3 in order to supply them with electricity.
- This cable 6 is also used as a data line by feeding a frequency signal to the cable.
- the immersion body 3 has a bandpass filter 7, which is connected to the cable 6 and functions as a crossover, so that the frequency signal is decoupled from the cable 6 via the bandpass filter 7.
- a demodulator 8 is connected, which demodulates the frequency signal and generates an analog data signal.
- the analog data signal is converted with an A / D converter 9 into a digital data signal, which is fed to a central control device 10.
- the central controller 10 is connected to a D / A converter 11 which receives digital signals from the central controller 10 and converts them into analog signals.
- a modulator 12 is connected, which modulates the analog signals of the D / A converter to a predetermined frequency or a predetermined frequency range.
- the Modulator 12 is connected to cable 6 so that the modulated signal is fed to cable 6.
- the central control device 10 can thus send data via the D / A converter 11 and the modulator 12 via the cable 6.
- the immersion body 3 is provided with a 2D camera 13, which is a camera in the visible wavelength range in the present embodiment. Within the scope of the invention, the 2D camera can also be designed for other wavelength ranges, in particular as an infrared camera.
- the 2D camera can also be designed as a sonar. With the 2D camera, a two-dimensional image of the bottom 14 of the water is generated.
- the 2D camera 13 is arranged on the immersion body 3 with its viewing direction 15 directed downward.
- the 2D camera 13 can also be arranged pivotably on the immersion body 3, so that its viewing direction can be directed vertically downwards or horizontally to the side or in any position in between.
- the immersion body 3 can also have a plurality of 2D cameras 13, which are arranged with their viewing directions in different directions.
- In particular, the submersible body 3 can have at least two 2D cameras 13 whose viewing directions 15 are aligned mutually orthogonally.
- the immersion body 3 may also have an illumination device 16, which emits light in a wavelength range adapted to the sensitivity of the 2D camera. Such a lighting device 16 is useful when the immersion body 3 is to be used at depths to which little daylight penetrates.
- the 2D cameras 13 are connected to the central control device 10, so that it can receive the images captured by the 2D camera or cameras 13 and store them on a storage device 17 and/or optionally process them further.
- the floating body 2 has a first 3D scanning device 18 and a second 3D scanning device 19.
- the first 3D scanning device 18 is directed downward with its viewing direction 20 and the second 3D scanning device 19 is directed horizontally to the side with its viewing direction (perpendicular to the plane of the drawing in FIG. 2).
- the 3D scanning devices 18, 19 are each designed as a sonar. They thus each have a transmitter and a receiver for transmitting and receiving sonar signals.
- the 3D scanners 18, 19 are used to generate information about a three-dimensional surface structure.
- Instead of a sonar, other means for three-dimensional scanning of surfaces may be used, such as a stereo camera or a time-of-flight camera.
- With sonars, 3D information can be generated very reliably under water regardless of the turbidity of the water, which is why sonars are the preferred 3D scanning devices.
- the submersible body 3 can also have only a single 3D scanning device, in which case different viewing directions can be set by pivoting the 3D scanning device on the submersible body by means of a corresponding pivoting device or by controlling the rotational position about a horizontal longitudinal axis of the submersible body 3.
- the immersion body 3 has rudders 21, which are designed to control the depth and/or the rotational position about a horizontal longitudinal axis of the immersion body 3.
- A rudder adjustment device 22 is provided, which controls the individual rudders 21 with corresponding actuators 23.
- the rudder adjustment device 22 is connected to the central control device 10 and receives corresponding signals from it in order to change the attitude of the submersible body 3.
- a 3D scanning device 24 is provided to scan the bottom of the water and to generate three-dimensional information about the bottom of the water.
- the 3D scanning device 24 is arranged on the boat 2 with its viewing direction 25 aligned downward.
- the 3-D scanner 24 is a sonar. However, it can also be another suitable device for generating three-dimensional information of a surface.
- the boat 2 further comprises a submersible body scanning device 26, which is a sonar in the present embodiment.
- the submersible body scanning device 26 is directed with its line of sight from the boat 2 rearwardly and downwardly inclined, so that the submersible body 3, when being towed by the boat 2 by means of the towline 5, is located in the field of view of the submersible body scanning device 26.
- the submersible body scanning device 26 is a sonar in the present embodiment. With the sonar, the exact location of the submersible body 3 relative to the boat 2 can be determined.
- the float or boat 2 has a satellite navigation device 28, such as a GPS system, which can receive location signals from the satellite 29 so as to determine the position of the boat 2.
- the location of the boat 2 can thus be determined.
- the relative location of the immersion body 3 with respect to the boat 2 can be determined. Since the absolute location of the boat 2 is known, the absolute location of the submersible body 3 can be determined from the relative location.
- This location information is assigned to the respectively captured two-dimensional images or the respectively acquired 3D information as reference information. This assignment can be carried out, for example, by simultaneous acquisition of the two-dimensional images or 3D information and simultaneous determination of the location, the corresponding data then being linked to one another.
- the location of the immersion body 3 in the three-dimensional space can also be determined at predetermined time or spatial distances, it being possible to interpolate location coordinates in the area between two location determinations, if necessary.
- the sampling frequencies of the 2D camera, the 3D scanner and the radio navigation device may thus differ, with the respective location coordinates being individually interpolated for assignment to the two-dimensional images or for assignment to the three-dimensional data.
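- One straightforward way to realise this interpolation is to interpolate each coordinate of the logged position fixes linearly to the image timestamps; the sketch below is an illustrative assumption and presumes monotonically increasing timestamps:

```python
import numpy as np

def interpolate_positions(fix_times: np.ndarray, fix_xyz: np.ndarray,
                          image_times: np.ndarray) -> np.ndarray:
    """Linearly interpolate logged position fixes to the acquisition times of images.

    fix_times:   (M,) timestamps of the position fixes (increasing)
    fix_xyz:     (M, 3) corresponding X, Y, Z coordinates
    image_times: (K,) timestamps of the 2D images or 3D scans
    returns      (K, 3) interpolated coordinates used as referencing information
    """
    return np.column_stack([
        np.interp(image_times, fix_times, fix_xyz[:, i]) for i in range(3)
    ])
```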
- the means for determining the location information of the immersion body 3 thus serve as a referencing device and the location of the immersion body 3 as referencing information.
- the use of location information as referencing information allows a very simple, fast and precise assignment of the two-dimensional images to the three-dimensional height maps.
- the immersion body 3 can also have a tilt sensor and/or inertial sensor with which the inclination relative to the vertical can be determined. On the basis of this angle of inclination, in conjunction with the direction of travel of the boat 2 and thus the pulling direction with which the immersion body 3 is pulled through the water, the orientation of the immersion body 3 in three-dimensional space, and thus the viewing directions of the 2D camera or the 3D scanning devices, can be determined. These viewing directions can equally be used as referencing information.
- the 2D camera can be provided with a zoom lens. This allows the field of view of the camera to be changed.
- the setting of the zoom lens can be done by means of a magnification scale or the field of view can be described by means of an angular range. Values describing the field of view can equally be used as referencing information for the respective camera 13.
- With the above-described scanning device 1 it is thus possible to scan the bottom of a body of water and to generate image data describing the bottom, while simultaneously generating referencing information associated with the image data such that the position of the section of the ground shown in the respective image data is defined. This applies in particular to two-dimensional image data.
- the 3-D information can equally be referenced by the reference information.
- the thus generated two-dimensional image data can then be easily integrated into an existing height map.
- the 3D information can be used to create or refine an existing elevation map.
- the above-described referencing information is sufficient to map the two-dimensional images onto the height map. It is not necessary to extract characteristic points of the two-dimensional images and to align the two-dimensional images with each other, and/or to extract characteristic points of the individual three-dimensional images generated by the 3D scanning device in order to align them with each other, and/or to align the two-dimensional images with respect to the height map. Although such information may in principle be used in addition, it is not necessary.
- In a second embodiment of a scanning device 1 according to the invention, the submersible body 3 is attached to the floating body 2 by means of a rod 30 (FIG. 3).
- the float or boat 2 corresponds substantially in its design to the float or boat 2 according to the first embodiment.
- the immersion body 3 of the second embodiment substantially corresponds to the immersion body of the first embodiment.
- the cable 6 is guided along the rod 30 here.
- no immersion body scanning device is necessary, because the relative location of the immersion body 3 with respect to the float 2 is clearly determined by the rigid rod 30.
- the exact position of the immersion body 3 and its orientation can be determined on the basis of the inclination values obtained by means of the sensors for detecting the inclination, the pitching and the rolling. Accordingly, referencing information can be generated that is linked to the generated image data and 3D information.
- the second embodiment is particularly useful for scanning shallow waters with a maximum depth of 20 m, in particular not more than 10 m, or for scanning waters near the shore. Such water depths are of particular interest to shipping.
- In the third embodiment, the submersible body 3 is designed as a remotely controlled submarine or drone, which receives sound signals from two floats, on the basis of which the submersible body 3 can determine its exact position in three-dimensional space.
- the immersion body 3 is provided with a pressure sensor 40 and a temperature sensor 41, which are connected to a first sensor controller 42.
- A clock 43 is also connected to the first sensor controller 42.
- the first sensor controller 42 can pick up corresponding sensor signals at the pressure sensor 40 and at the temperature sensor 41 and convert them into a corresponding digital pressure value or temperature value.
- the first sensor controller 42 can provide the individual pressure values and temperature values with a time stamp.
- the immersion body 3 has a hydrophone 44 with which sound signals are received and converted into electrical signals.
- the hydrophone is connected to a second sensor controller 45, which can detect the electrical signals generated by the hydrophone.
- the second sensor controller 45 is configured to extract time and location information from the received sound signals.
- a transmitting device 46 which is explained in more detail below, generates a sound signal on which a digital signal is modulated, the digital signal containing the location and time information. This digital signal is extracted by the second sensor controller 45 and the corresponding location and time information is provided.
- the second sensor controller 45 is also connected to the clock 43 and can provide the received location and time information with a time stamp indicating the time when the corresponding sound signal has been received by means of the hydrophone 44.
- the first sensor controller 42 and the second sensor controller 45 are each connected to a microcontroller 47.
- the microcontroller 47 has a connection to an external interface 48, to which a computer can be connected.
- the immersion body 3 is arranged in a watertight housing 49.
- the sensors 40, 41 and 44 each extend through an opening of the housing 49 and are sealed watertight with respect to the respective opening.
- the immersion body 3 has an inertial sensor 50 with which the direction of movement of the immersion body 3 can be detected.
- the inertial sensor 50 is connected to the microcontroller 47.
- the transmitting device 46 has a GPS receiver 51, which is connected to an antenna 52 for receiving satellite signals.
- the transmitting device 46 has a clock 53.
- the GPS receiver 51 and the clock 53 are each connected to a transmission circuit 54, which generates a transmission signal which can be output by means of a hydrophone source 55.
- the transmission circuit 54 is designed such that it modulates both the time of signal generation and the location determined by the GPS receiver 51 on the transmission signal.
- the emitted sound signal thus contains the location and time information when and where the signal has been generated.
- the clock 43 of the immersion body 3 and the clock 53 of the transmitting device 46 are synchronized with each other. These clocks 43, 53 are preferably radio clocks, so that they are regularly synchronized with a central radio clock.
- the submersible body can also be provided with a hydro-sound source and the transmitting device 46 with a hydrophone, so that the submersible body 3 and the transmitting device 46 can exchange sound signals bidirectionally in order to synchronize the respective clocks 43, 53 with each other.
- a system with two transmitters 46/1 and 46/2 is used.
- the transmitting devices 46/1 and 46/2 receive with their respective GPS receivers 51 satellite signals from GPS satellites 56. Based on these satellite signals, the GPS receivers 51 respectively determine the location of the respective transmitting devices 46/1 and 46/2.
- the two transmission devices 46/1 and 46/2 each emit a sound signal in which the location of the respective transmission device 46/1 and 46/2 and the time of the sound generation are coded. As a result, each of these sound signals contains the information about the location and the time of its generation. With the respective hydro-sound sources 55, the sound signals are radiated into the water.
- the two transmitting devices 46/1 and 46/2 are each arranged on buoys, boats or ships, which float freely on a water surface.
- the transmitting devices 46/1 and 46/2 are preferably arranged at a distance of at least a few meters, preferably a few tens of meters from each other.
- the immersion body 3 receives by means of the hydrophone 44, the sound signals of the two transmitting devices 46/1 and 46/2.
- the sound signals are decoded by the second sensor controller 45 and provided with the time stamp, which indicates the time when the respective sound signal has been received by the immersion body 3.
- This information is forwarded by the second sensor controller 45 to the microcontroller 47.
- the microcontroller 47 determines the transit time of the sound signal from the sending time of the sound signal and the time of reception of the sound signal. Based on the speed of sound in the water, the transit time is converted into a distance. This is the distance d1 or d2 from the location encoded in the sound signal.
- In this way, the distances d1 and d2 to the respective transmitting devices 46/1 and 46/2 are known.
- From the measured pressure, the microcontroller 47 calculates the depth of the submersible body 3 with respect to the water surface. This depth defines a certain plane 58 which intersects the circle 57 at two points. Since the submersible body 3 must lie in this plane 58, the position of the submersible body is restricted to one of these two intersection points. These two intersection points are arranged mirror-symmetrically to a vertical plane of symmetry which runs through the two transmitting devices 46/1 and 46/2. If the submersible body 3 moves a little towards or away from this plane of symmetry, this can be detected with the inertial sensor 50. The inertial sensor 50 transmits the direction of movement to the microcontroller 47. The microcontroller 47 determines the component of motion perpendicular to the plane of symmetry.
- At least two positions of the immersion body 3 are determined with the aid of the sound signals and the pressure sensor. These positions are still ambiguous because they may be located on either side of the plane of symmetry. It also determines the timing of these two positions, so that the direction of movement of the submersible body 3 is determined by these two positions.
- For the two possible positions, the component of motion perpendicular to the plane of symmetry points in opposite directions away from the plane of symmetry.
- These directions of movement are compared with the direction of movement detected by the inertial sensor 50, and the position on that side of the plane of symmetry which yields the same direction of movement as the inertial sensor 50 is judged to be correct. As a result, the position of the immersion body 3 can be determined uniquely.
- This position is then determined by three coordinates (X, Y, Z) with respect to the two transmitting devices 46/1 and 46/2 or with respect to a coordinate system predetermined by the GPS satellite system.
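- The geometric step described above (two measured distances plus the pressure-derived depth yield two mirror-symmetric candidate positions) can be sketched as follows in a local frame in which transmitter 46/1 sits at the origin and transmitter 46/2 lies on the x axis; this frame and the function are illustrative assumptions:

```python
import math
from typing import Optional, Tuple

def candidate_positions(d1: float, d2: float, baseline: float, depth: float
                        ) -> Optional[Tuple[Tuple[float, float, float],
                                            Tuple[float, float, float]]]:
    """Two candidate positions of the submersible body in a local frame.

    Transmitter 1 is at the origin, transmitter 2 at (baseline, 0, 0), both on the
    surface z = 0; the depth comes from the pressure sensor (z = -depth).
    The two candidates are mirror images in the vertical plane through the
    transmitters; the sign of y is resolved with the inertial movement direction.
    """
    z = -depth
    # Subtracting the two sphere equations x^2+y^2+z^2=d1^2 and (x-L)^2+y^2+z^2=d2^2
    x = (d1**2 - d2**2 + baseline**2) / (2.0 * baseline)
    y_sq = d1**2 - x**2 - z**2
    if y_sq < 0.0:
        return None  # measurements are inconsistent
    y = math.sqrt(y_sq)
    return (x, y, z), (x, -y, z)
```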
- the coordinates and the corresponding time stamp of the respective position of the immersion body 3 are stored in a memory device 59 in a predetermined log data record 60.
- This immersion body 3, just like the immersion body 3 of the first exemplary embodiment according to FIG. 2, has a 2D camera 13 and two 3D scanning devices 18, 19.
- the scanning devices 13, 18, 19 can be designed in exactly the same way as in the first embodiment, for which reason reference is made to that description.
- rudders are again provided on the immersion body 3, which rudders are designed to control the diving depth and / or the rotational position about a horizontal and / or vertical longitudinal axis of the immersion body 3.
- A rudder adjustment device 22 is provided in the submersible body 3, which controls the individual rudders with corresponding actuators 23.
- the rudder adjustment device 22 is connected to the central control device 47 and receives corresponding signals from it in order to change the attitude of the submersible body 3. If the immersion body 3 is an autonomously operated immersion body, then it additionally has a drive mechanism (not shown) with which the immersion body 3 can be moved in the water.
- the images generated by the scanning devices 13, 18 and 19 are preferably provided with a time stamp by the microcontroller 47 and stored in the memory device 59 in an image data record 61.
- the location stored in the log record 60 serves as referencing information for the image data contained in the image data set 61.
- the referencing information of the log data set 60 is linked to the image data of the image data set 61 via the time stamp.
- the orientation of the immersion body 3 is stored in the log data record 60 so that the viewing direction of the camera 13 or the scanning device 18 can be assigned to the image data contained in the image data set 61.
- the immersion body 3 is coupled in this embodiment to the radio navigation device (GPS receiver) of the transmitting devices 46/1 and 46/2.
- the three-dimensional coordinates of the immersion body 3 can be determined at any time.
- Such a coupling of the immersion body 3 to the radio navigation device is not possible at arbitrary depths, since on the one hand the sound signals cannot be transmitted arbitrarily far and on the other hand the distance between the transmitting devices 46/1 and 46/2 limits the spatial resolution at greater depths.
- Such a coupling of the submersible body 3 to the radio navigation is very efficient and reliable mainly in the upper region of the body of water, down to a depth of e.g. 100 m.
- The resolution of this 3D height map is about 1 m.
- In step S3, 3D information acquired with the 3D scanning devices 18, 19, which are arranged on the immersion body 3, is assembled.
- the 3-D scanners 18, 19 scan the ground from different, especially mutually orthogonal, directions, thereby providing approximately uniform resolution of the 3D information regardless of the slope of the surface of the ground.
- This 3D information is associated with reference information describing the section of the ground shown. Based on this reference information, this 3D information is added to the 3D height map from step S2 and the shore lines from step S1 (step S4), so that a 3D height map with a resolution of less than 1 m is generated.
- In step S5, the 3D height map is polygonized. This can be done, for example, with the method of D. T. Lee et al. explained in the introduction ("Two Algorithms for Constructing a Delaunay Triangulation").
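- As an illustrative substitute for the cited algorithm, an off-the-shelf Delaunay triangulation (here SciPy's) can polygonize a 2.5D height map; this is only a sketch and not the method of the cited reference:

```python
import numpy as np
from scipy.spatial import Delaunay

def polygonize_height_map(points_xyz: np.ndarray) -> np.ndarray:
    """Triangulate a height map given as (N, 3) points; returns (M, 3) vertex indices.

    The triangulation is computed in the horizontal XY plane (a 2.5D height map);
    each resulting triangle is a polygon onto which a texture can later be mapped.
    """
    tri = Delaunay(points_xyz[:, :2])  # Delaunay triangulation of the XY projection
    return tri.simplices               # indices into points_xyz, one row per triangle
```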
- the data sources may include absolute point cloud data, relative point cloud data, or a polygonized object. Such data sources are in some cases available for corals, wrecks or other objects of great interest. Such data can also be recorded specifically for objects of this kind.
- One suitable method for obtaining high-resolution 3D information is photogrammetry. Photogrammetry comprises different methods for generating 3D information.
- the objects are scanned from different directions by means of a camera. The images of the objects produced in this way are subjected to a feature analysis. Based on the features, an assignment of the individual features takes place in the different images, from which the three-dimensional body or the three-dimensional object can be modeled.
- the 3D information can also be generated by means of a stereo camera and a corresponding method for evaluating stereo images.
- This detail geometry data may be added to the polygonized height map in step S7.
- In step S8, two-dimensional image data of the ground is read in.
- the two-dimensional image data includes reference information describing the portion of the ground represented by the respective image data.
- the two-dimensional image data can be added as a texture to the three-dimensional height map obtained in step S5 or S7, taking into account the referencing information associated with the height map (step S8).
- the reference information of the two-dimensional image data preferably contains in each case the location and the viewing direction of the camera with which the image data was recorded.
- the two-dimensional image data is weighted inversely proportionally to the deviation of the viewing direction from the normal of the area of the three-dimensional height map onto which the two-dimensional image data are mapped. In other words, the more the viewing direction deviates from the normal of the surface of the height map, the less the corresponding two-dimensional image data contributes to the texture. Image data whose viewing direction corresponds to the normal is adopted all the more strongly.
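- One possible realisation of this weighting, assumed here only for illustration, weights each colour sample by the cosine of the angle between the reversed viewing direction and the surface normal, so that the weight falls off as the deviation grows:

```python
import numpy as np

def blend_texel(colors: np.ndarray, view_dirs: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Weighted average of colour samples of the same ground point for one texel.

    colors:    (K, 3) RGB samples of the same ground point from K images
    view_dirs: (K, 3) unit viewing directions of the cameras (pointing towards the ground)
    normal:    (3,)   unit surface normal of the height-map area at that point

    Images whose viewing direction deviates strongly from the normal receive a
    small weight; images looking along the normal dominate.
    """
    weights = np.clip(-view_dirs @ normal, 0.0, None)  # cosine of the deviation angle, clipped
    if weights.sum() == 0.0:
        return colors.mean(axis=0)
    return (weights[:, None] * colors).sum(axis=0) / weights.sum()
```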
- In step S9, the final underwater map is thus generated, which is output in step S10.
- the individual steps can be executed fully automatically.
- The high-resolution 3D information used in step S3 and/or the two-dimensional image data used in step S8 allow the automatic integration of this information into the existing 3D height map.
- In steps S3 and S4, a rough and easy-to-produce 3D height map is refined to the point that its resolution is high enough that the integration of two-dimensional image data as a texture does not lead to unnatural impressions.
- The impression is judged to be natural if it corresponds to the appearance of the ground as a diver sees it with the human eye from a distance of about 2-5 m.
- the combination of 3D information acquired from two different directions, which are preferably orthogonal to each other, allows the generation of a height map of substantially uniform resolution, regardless of the slope of the ground being displayed.
- Using two-dimensional image data as a texture for a three-dimensional underwater map results in a 3D map that on the one hand reflects the contour of the ground with a resolution sufficiently accurate for diving or for shipping and on the other hand faithfully reproduces the appearance of the ground. If the two-dimensional image data is taken from different directions and weighted according to the deviation from the normal of the ground, as explained above, then a very lifelike representation is achieved, because distortions due to the viewing direction are significantly reduced. A method is explained below with reference to FIGS. 7 and 8 of how the two-dimensional image data read in in step S8 are converted into a texture which can be mapped onto the three-dimensional height map.
- In step S12, a two-dimensional image of the ground is read in.
- the two-dimensional image is a color image 31 (FIG. 8).
- This two-dimensional image contains reference information from which, in step S13, the camera position with which the two-dimensional image was taken and the size of the field of view of the real camera 32 are read out.
- the virtual object, which may be an object located on the ground, is known from the detail geometry data from step S6, or the virtual object is a portion of the ground described by the polygonized 3D height map according to step S5.
- the geometry of the three-dimensional object is thus present with high precision. Only the appearance of the surface of this three-dimensional object does not emerge from either the height map or the detail geometry data. Due to the precise representation of the three-dimensional object, it is now possible to generate a virtual two-dimensional image 33 of the object (step S14).
- A virtual camera 34 is positioned with respect to the virtual three-dimensional object in the same place as the real camera 32 when capturing the real two-dimensional image 31, and the same field of view is set as when taking the real two-dimensional image 31 with the real camera 32.
- the virtual two-dimensional image 33 thus generated is essentially identical to the real two-dimensional image 31 with regard to the perspective representation of the three-dimensional object in the two-dimensional image.
- The texture space 36 is also referred to as UV space.
- This texture space contains surface elements that correspond to surface sections of the object. In the case of the hut, these are, for example, the roof surfaces and side surfaces of the hut.
- the virtual image generated in step S14 is assigned to the texture space 36 in step S15. Since the texture space has been generated on the basis of the geometric description of the virtual object, there is an unambiguous assignment of the points of the surface of the virtual object to the points in texture space, called texels. Since the virtual two-dimensional image 33 has also been generated on the basis of the geometric description of the virtual three-dimensional object, there is also a clear relationship between the pixels of the virtual two-dimensional image and the points of the surface of the virtual three-dimensional object, and thus also a clear relationship between the pixels of the virtual two-dimensional image and the texels. Thus, a pixel 37 of the real two-dimensional image 31 is assigned in the texture space 36 to the texel 38 corresponding to the pixel 39 in the two-dimensional virtual image 33.
- For each pixel 37 of the real two-dimensional image 31, the corresponding pixel 39 in the virtual two-dimensional image 33 is determined.
- the corresponding pixel 39 is located in the virtual two-dimensional image 33 in the same place as the pixel 37 in the real two-dimensional image 31.
- For this pixel 39, the assignment to the corresponding texel 38 is fixed in the texture space 36, so that the pixel 37 of the real image 31 can be uniquely assigned to the texture space.
- the color values of the pixel 37 are assigned to the texel 38 or entered there. This assignment takes place very quickly because, as with a look-up table, the image points of the real image can be assigned to the corresponding texels.
- In step S16, it is checked whether there are further real two-dimensional images. If this is the case, the procedure returns to step S12 and the next real two-dimensional image is read in. Then steps S12 to S15 are executed in the same manner as explained above. It may be that several real two-dimensional images contain pixels 37 which are to be assigned to the same texel in the texture space 36. In this case, the color values of the different pixels are preferably averaged. It may also be expedient, in the averaging, to apply the above-explained weighting as a function of the deviation of the viewing direction of the camera from a normal to the surface of the object at the location of the respective pixel 37 or 39.
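- The per-texel accumulation and averaging can be sketched as follows; the data layout (a list of texel index, colour, weight triples produced by the pixel-to-texel lookup) is an assumption made only for illustration:

```python
from collections import defaultdict
import numpy as np

def build_texture(assignments, texture_shape):
    """Accumulate weighted colour values per texel and average them.

    assignments:   iterable of ((u, v), rgb, weight) tuples, one per real-image pixel,
                   where (u, v) is the texel index from the pixel-to-texel lookup
    texture_shape: (height, width) of the texture to produce
    """
    acc = defaultdict(lambda: [np.zeros(3), 0.0])   # texel -> [weighted colour sum, weight sum]
    for uv, rgb, w in assignments:
        acc[uv][0] += w * np.asarray(rgb, dtype=float)
        acc[uv][1] += w
    texture = np.zeros((*texture_shape, 3))
    for (u, v), (col_sum, w_sum) in acc.items():
        if w_sum > 0.0:
            texture[v, u] = col_sum / w_sum         # weighted average colour of the texel
    return texture
```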
- In step S17, the texture produced with steps S12 to S15 is mapped onto the object or the 3D height map. The process ends with step S18.
- The referencing information (position of the camera, viewing direction and field of view) is thus used to transform the pixels of the real two-dimensional images into the texture space.
- an underwater map can thus be generated which very precisely reproduces the contour of the ground and, moreover, looks true to life through the use of the texture.
- the invention is not limited to the generation of underwater maps.
- the invention can be used advantageously wherever 3D data of an object is available with high precision.
- the above-explained methods can, for example, be used in various medical applications.
- The teeth can, for example, be measured with a laser.
- Two-dimensional color images of the teeth can be taken with a special camera which, for example, has an optical angled element with which the backs of the teeth can be captured.
- From the two-dimensional image data, a texture is generated and mapped onto the three-dimensional model.
- In this way, 3D data is obtained which correctly reproduces both the contour of the teeth and their color appearance.
- List of reference signs (excerpt): submersible body; scanner 60; 56 GPS satellite
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Hydrology & Water Resources (AREA)
- Computer Networks & Wireless Communication (AREA)
- Acoustics & Sound (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Abstract
The invention relates to a scanning device for scanning the bottom of bodies of water for the automatic creation of underwater maps. The scanning device comprises a submersible body, a camera arranged in the submersible body which produces image data describing the bottom, and a referencing device which produces referencing information that is associated with the image data in such a way as to define the position of the sections of the bottom represented in the corresponding image data.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE112017003046.3T DE112017003046A5 (de) | 2016-06-20 | 2017-06-20 | Abtastvorrichtung zum Abtasten vom Grund von Gewässern und Verfahren zum Erzeugen von Unterwasserkarten |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102016111238.0 | 2016-06-20 | ||
| DE102016111238.0A DE102016111238A1 (de) | 2016-06-20 | 2016-06-20 | Abtastvorrichtung zum Abtasten vom Grund von Gewässern und Verfahren zum Erzeugen von Unterwasserkarten |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017220601A1 true WO2017220601A1 (fr) | 2017-12-28 |
Family
ID=59152873
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2017/065121 Ceased WO2017220601A1 (fr) | 2016-06-20 | 2017-06-20 | Dispositif de balayage permettant le balayage du fond d'étendues d'eau et procédé d'établissement de cartes sous-marines |
Country Status (2)
| Country | Link |
|---|---|
| DE (2) | DE102016111238A1 (fr) |
| WO (1) | WO2017220601A1 (fr) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108151715A (zh) * | 2018-02-09 | 2018-06-12 | 首都师范大学 | 一种浅水区水下地形测量装置及方法 |
| CN110764094A (zh) * | 2019-10-25 | 2020-02-07 | 南京奥达升智能科技有限公司 | 一种水下三维可视化探测系统及其探测方法 |
| CN114049414A (zh) * | 2021-11-04 | 2022-02-15 | 海南诺亦腾海洋科技研究院有限公司 | 一种图像生成方法、装置、电子设备及存储介质 |
| CN116625329A (zh) * | 2023-07-24 | 2023-08-22 | 新兴际华(北京)智能装备技术研究院有限公司 | 堰塞湖信息确定方法、系统、电子设备及存储介质 |
| CN118089676A (zh) * | 2024-04-24 | 2024-05-28 | 中交华南勘察测绘科技有限公司 | 一种水下地形勘测无人船数据采集及传输方法 |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109099854B (zh) * | 2018-08-29 | 2020-05-05 | 江苏省工程勘测研究院有限责任公司 | 一种水域深度的测量装置及测量方法 |
| DE102019118103A1 (de) * | 2019-07-04 | 2021-01-07 | Rwe Renewables Gmbh | Maritimer Schwebekörper |
| CN112285682B (zh) * | 2020-10-20 | 2024-08-13 | 水利部交通运输部国家能源局南京水利科学研究院 | 水工工程涵洞环境的360°多波束声呐扫描装置及方法 |
| DE102021101796A1 (de) | 2021-01-27 | 2022-07-28 | Jens Dirksen | Verfahren zur Bestimmung eines Gewässerprofils |
| CN116976679B (zh) * | 2023-09-20 | 2023-12-29 | 航天宏图信息技术股份有限公司 | 堰塞湖溃坝预警方法、装置、电子设备及可读存储介质 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5432712A (en) | 1990-05-29 | 1995-07-11 | Axiom Innovation Limited | Machine vision stereo matching |
| US6590640B1 (en) | 2000-07-20 | 2003-07-08 | Boards Of Regents, The University Of Texas System | Method and apparatus for mapping three-dimensional features |
| US20060182314A1 (en) | 2005-02-11 | 2006-08-17 | England James N | Method and apparatus for displaying a calculated geometric entity within one or more 3D rangefinder data sets |
| JP2010190726A (ja) * | 2009-02-18 | 2010-09-02 | Toa Harbor Works Co Ltd | 水底地形測量方法およびシステム |
| WO2012129612A1 (fr) | 2011-03-31 | 2012-10-04 | Ogburn Damian | Procédé et système pour surveiller ou réguler des formations sous-marines |
| DE102012103373A1 (de) | 2012-04-18 | 2013-10-24 | Jena-Optronik Gmbh | Verfahren zur Erstellung eines 3D-Modells urbaner Umgebungen |
| US20150301180A1 (en) | 2011-09-08 | 2015-10-22 | Advanced Scientific Concepts Inc. | Terrain mapping ladar system |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8437979B2 (en) * | 2007-01-20 | 2013-05-07 | Kcf Technologies, Inc. | Smart tether system for underwater navigation and cable shape measurement |
| DE102011116613A1 (de) * | 2011-10-20 | 2013-04-25 | Atlas Elektronik Gmbh | Unbemanntes Unterwasserfahrzeug und Verfahren zum Lokalisieren und Untersuchen eines am Gewässergrund eines Gewässers angeordenten Objekts sowie System mit dem unbemannten Unterwasserfahrzeug |
| WO2013192353A1 (fr) * | 2012-06-21 | 2013-12-27 | California Institute Of Technology | Systèmes de capteurs autonomes et pouvant être commandés et procédés d'utilisation de ces systèmes |
| DE102016106214A1 (de) | 2016-04-05 | 2017-10-05 | Ocean Maps GmbH | Tauchvorrichtung, Sendeeinrichtung und System und Verfahren zur Bestimmung der Position unter Wasser |
- 2016-06-20 DE DE102016111238.0A patent/DE102016111238A1/de not_active Withdrawn
- 2017-06-20 DE DE112017003046.3T patent/DE112017003046A5/de active Pending
- 2017-06-20 WO PCT/EP2017/065121 patent/WO2017220601A1/fr not_active Ceased
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5432712A (en) | 1990-05-29 | 1995-07-11 | Axiom Innovation Limited | Machine vision stereo matching |
| US6590640B1 (en) | 2000-07-20 | 2003-07-08 | Boards Of Regents, The University Of Texas System | Method and apparatus for mapping three-dimensional features |
| US20060182314A1 (en) | 2005-02-11 | 2006-08-17 | England James N | Method and apparatus for displaying a calculated geometric entity within one or more 3D rangefinder data sets |
| JP2010190726A (ja) * | 2009-02-18 | 2010-09-02 | Toa Harbor Works Co Ltd | 水底地形測量方法およびシステム |
| WO2012129612A1 (fr) | 2011-03-31 | 2012-10-04 | Ogburn Damian | Procédé et système pour surveiller ou réguler des formations sous-marines |
| US20150301180A1 (en) | 2011-09-08 | 2015-10-22 | Advanced Scientific Concepts Inc. | Terrain mapping ladar system |
| DE102012103373A1 (de) | 2012-04-18 | 2013-10-24 | Jena-Optronik Gmbh | Verfahren zur Erstellung eines 3D-Modells urbaner Umgebungen |
Non-Patent Citations (9)
| Title |
|---|
| "3D Reconstruction Based on Underwater Video from ROV Kiel 6000 Considering Underwater Imaging Conditions", OCEANS 2009-EUROPE, 2009, pages 1 - 10 |
| D.T. LEE ET AL.: "International Journal of Computer and Information Science, Vol. 9, No. 3", 1980, pages: 219 - 242 |
| H. HOPPE, SURFACE RECONSTRUCTION FROM UNORGANIZED POINTS, 1994 |
| HURTOS N ET AL: "Calibration of optical camera coupled to acoustic multibeam for underwater 3D scene reconstruction", OCEANS 2010 IEEE - SYDNEY, IEEE, PISCATAWAY, NJ, USA, 24 May 2010 (2010-05-24), pages 1 - 7, XP031777058, ISBN: 978-1-4244-5221-7 * |
| KUNZ: "Map Building Fusing Acoustic and Visual Information using Autonomous Underwater Vehicles", JOURNAL OF FIELD ROBOTICS, vol. 30, no. 5, 2013, pages 763 - 783 |
| MASSOT-CAMPOS: "Optical Sensors and Methods for Underwater 3D Reconstruction", SENSORS, vol. 15, 2015, pages 31525 - 31557 |
| PELAGOTTI: "Automated Multispectral Texture Mapping of 3D Models", 17TH EUROPEAN SIGNAL PROCESSING CONFERENCE, 2009, pages 1215 - 1219, XP032758904 |
| T. SCHENK, INTRODUCTION TO PHOTOGRAMMETRY, 2005 |
| W.E. LORENSEN ET AL.: "Computer Graphics, Vol. 21, No. 4", July 1987, pages: 163 - 169 |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN108151715A (zh) * | 2018-02-09 | 2018-06-12 | 首都师范大学 | 一种浅水区水下地形测量装置及方法 |
| CN108151715B (zh) * | 2018-02-09 | 2023-09-26 | 首都师范大学 | 一种浅水区水下地形测量装置及方法 |
| CN110764094A (zh) * | 2019-10-25 | 2020-02-07 | 南京奥达升智能科技有限公司 | 一种水下三维可视化探测系统及其探测方法 |
| CN114049414A (zh) * | 2021-11-04 | 2022-02-15 | 海南诺亦腾海洋科技研究院有限公司 | 一种图像生成方法、装置、电子设备及存储介质 |
| CN116625329A (zh) * | 2023-07-24 | 2023-08-22 | 新兴际华(北京)智能装备技术研究院有限公司 | 堰塞湖信息确定方法、系统、电子设备及存储介质 |
| CN116625329B (zh) * | 2023-07-24 | 2023-10-20 | 新兴际华(北京)智能装备技术研究院有限公司 | 堰塞湖信息确定方法、系统、电子设备及存储介质 |
| CN118089676A (zh) * | 2024-04-24 | 2024-05-28 | 中交华南勘察测绘科技有限公司 | 一种水下地形勘测无人船数据采集及传输方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102016111238A1 (de) | 2017-12-21 |
| DE112017003046A5 (de) | 2019-02-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017220599A2 (fr) | Procédé de production de données 3d d'un objet | |
| WO2017220601A1 (fr) | Dispositif de balayage permettant le balayage du fond d'étendues d'eau et procédé d'établissement de cartes sous-marines | |
| Johnson‐Roberson et al. | High‐resolution underwater robotic vision‐based mapping and three‐dimensional reconstruction for archaeology | |
| CN114488164B (zh) | 水下航行器同步定位与建图方法及水下航行器 | |
| Whitcomb et al. | Advances in underwater robot vehicles for deep ocean exploration: Navigation, control, and survey operations | |
| Roman et al. | Application of structured light imaging for high resolution mapping of underwater archaeological sites | |
| Negahdaripour et al. | Direct estimation of motion from sea floor images for automatic station-keeping of submersible platforms | |
| WO2017131838A2 (fr) | Systèmes et procédés de fusion de capteur sonar et de réalité virtuelle et augmentée basée sur un modèle | |
| DE112010002843T5 (de) | Oberflächenverfolgung auf Bildbasis | |
| Wang et al. | Acoustic camera-based pose graph slam for dense 3-d mapping in underwater environments | |
| Bruno et al. | Project VISAS: virtual and augmented exploitation of submerged archaeological sites‐overview and first results | |
| CN110053743A (zh) | 一种用于水下精准测量的遥控机器人 | |
| Williams et al. | Simultaneous localisation and mapping and dense stereoscopic seafloor reconstruction using an AUV | |
| CN116642468A (zh) | 基于无人机航空摄影和无人船水上水下一体化扫描方法 | |
| Yoerger et al. | Fine-scale three-dimensional mapping of a deep-sea hydrothermal vent site using the Jason ROV system | |
| Calantropio et al. | Photogrammetric underwater and UAS surveys of archaeological sites: The case study of the roman shipwreck of Torre Santa Sabina | |
| CN105488852B (zh) | 一种基于地理编码和多维校准的三维图像拼接方法 | |
| KR102186733B1 (ko) | 3차원 해저 지형 생성 방법 | |
| Mahon et al. | Reconstructing pavlopetri: Mapping the world's oldest submerged town using stereo-vision | |
| Yin et al. | Study on underwater simultaneous localization and mapping based on different sensors | |
| Wajs et al. | Development of low-cost Unmanned Surface Vehicle system for bathymetric measurements | |
| Prado et al. | 3D modeling of Rio Miera wreck ship merging optical and multibeam high resolution points cloud | |
| Hernández et al. | Autonomous seabed inspection for environmental monitoring | |
| EP3384480A1 (fr) | Procédé de simulation préparatoire d'une intervention militaire dans une zone d'intervention | |
| Allotta et al. | Thesaurus project: Design of new autonomous underwater vehicles for documentation and protection of underwater archaeological sites |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17732394 Country of ref document: EP Kind code of ref document: A1 |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: R225 Ref document number: 112017003046 Country of ref document: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17732394 Country of ref document: EP Kind code of ref document: A1 |