WO2013020158A1 - Inspecting geographically spaced features - Google Patents
Inspecting geographically spaced features
- Publication number
- WO2013020158A1 (PCT/AU2011/001506)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- sensor
- location
- feature
- location data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/53—Determining attitude
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/14—Receivers specially adapted for specific applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
Definitions
- the data is conveyed in the form of a serial NMEA string including the following comma-delimited fields:
- G Gyro solution * checksum
- time stamped direction data for four features could be:
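The serial NMEA-style sentence mentioned above can be parsed by splitting its comma-delimited body and verifying the standard NMEA `*` checksum (the XOR of the characters between `$` and `*`). The example fields below (GPS time, azimuth, tilt and a 'G' gyro-solution flag) are assumptions for illustration; the patent's exact field layout is not reproduced in this excerpt.

```python
def parse_sentence(sentence):
    """Split an NMEA-style sentence into comma-delimited fields, verifying
    the trailing '*' checksum (XOR of characters between '$' and '*')."""
    body, _, checksum = sentence.strip().lstrip('$').partition('*')
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    if checksum and int(checksum, 16) != calc:
        raise ValueError('checksum mismatch')
    return body.split(',')

# Build a hypothetical sentence with a valid checksum, then parse it.
# Fields assumed: GPS time, azimuth (deg), tilt (deg), solution flag.
body = '123519.00,254.7,-12.3,G'
checksum = 0
for ch in body:
    checksum ^= ord(ch)
fields = parse_sentence('$%s*%02X' % (body, checksum))
```

A sentence without a `*` section is accepted unchecked, mirroring the tolerant behaviour typical of serial NMEA readers.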
- the inventors have found that the GPS based azimuth is a significant advance over other possible direction measuring systems. Magnetic compasses were trialled, but worked poorly in the magnetically noisy environment about the helicopter.
- the direction measuring device 20 may produce a continuous stream of direction data, but it is preferred that the direction data 120 is produced substantially simultaneously with the sensor data.
- the direction measuring device 20 is responsive to the camera 10. This may be achieved via electrical connection to the flash mechanism of the camera 10. According to the described embodiment the measuring device 20 is connected directly to the hot shoe of the camera 10.
- the sensor data and the direction data may be stored in a storage device(s) on board the helicopter and then processed elsewhere after the flight, but preferably the data is processed in flight.
- the described apparatus includes a processor 40 carried aboard the helicopter 30.
- the processor 40 may be a Linux processor (and/or an off the shelf laptop computer) and cooperate with an external / removable hard drive 41 to store data.
- the processor 40 includes or cooperates with a computer readable medium (e.g. a hard disk) carrying computer executable instructions executable by processor 40 to perform the processing steps 101, 102 and 103 illustrated in Figure 3.
- the file 110 includes an image, a time stamp and location data 113 from the camera's location measuring device 13.
- the processor 40 receives the file 110, and names the file 110 with an adjusted photo name made up of a date code, a letter identifying the camera (e.g. in case there is more than one camera aboard the helicopter) and a 4 digit number - "yyyymmdd-camera letter-1234". The number is incremented by one each time a photo is taken and returns to 0000 after 9999.
- the processor 40 receives direction data 120 and location data 113. Based on this data and a preset estimate of stand-off distance the location of the feature (e.g. the power pole) can be determined. In other variants of the invention, accuracy might be improved by providing a distance measuring device to measure the stand-off distance.
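Step 102 can be sketched as projecting the camera's position along the measured direction by the stand-off distance. This is a minimal small-distance, spherical-earth approximation under assumed conventions (azimuth clockwise from north, negative tilt looking down); the patent does not specify the projection maths, and a production system might use proper geodesic routines.

```python
import math

EARTH_RADIUS_M = 6371000.0

def feature_location(lat_deg, lon_deg, azimuth_deg, tilt_deg, stand_off_m):
    """Estimate the feature's lat/lon from the camera's lat/lon, the
    direction data (azimuth and tilt) and the stand-off distance.
    Tilt reduces the slant stand-off to a horizontal ground distance."""
    ground = stand_off_m * math.cos(math.radians(tilt_deg))
    az = math.radians(azimuth_deg)
    dlat = (ground * math.cos(az)) / EARTH_RADIUS_M
    dlon = (ground * math.sin(az)) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

# Illustrative: 80 m stand-off, camera pointed due north, 10 deg down.
lat, lon = feature_location(-37.8, 145.0, 0.0, -10.0, 80.0)
```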
- Step 102 is preferably performed in real time so that the determined feature locations can be passed to a photographer's navigation system to produce a real time plot showing which features have been photographed.
- the processor 40 matches the time stamps of the determined feature location data to the time stamps of the EXIF files from step 101 to produce a text file (or a shape file) including a listing of paired determined feature location data & adjusted photo names.
- the text file may include comma-delimited text including: time of image capture, determined feature location, camera direction, local time, local date, adjusted photo name.
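The time-stamp matching of step 103 and the resulting comma-delimited rows can be sketched as below. The record layouts and the one-second tolerance are assumptions; only the idea of pairing by nearest GPS time stamp comes from the text.

```python
def pair_by_timestamp(images, locations, tolerance_s=1.0):
    """Match each image record to the determined-feature-location record
    with the nearest time stamp, emitting rows of: capture time, feature
    location, camera direction, local time, local date, photo name."""
    rows = []
    for img in images:
        best = min(locations, key=lambda loc: abs(loc['t'] - img['t']))
        if abs(best['t'] - img['t']) <= tolerance_s:
            rows.append(','.join(str(v) for v in (
                img['t'], best['latlon'], best['direction'],
                img['local_time'], img['local_date'], img['name'])))
    return rows
```

A real implementation would compare GPS times from the EXIF files and the direction data stream; here plain floats stand in for those time stamps.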
- the EXIF file retains the original camera location data 113. This provides a degree of redundancy. If the data in the text file is corrupted (potentially due to any number of upstream failures) the location data from the EXIF files can be used in line with the inventors' earlier approach. Of course it is also possible to modify the EXIF files to include the determined feature location data.
- the text file may later be compared to the actual locations of the power poles listed in a database including paired actual power pole locations and asset numbers (or other feature identifiers). In this way the pictures can be paired with the asset numbers.
- This comparison may be performed by a spatial search using GIS software to locate the closest feature to each determined feature location and then populate a DBF file with the results. The distance between the determined feature location and the actual feature location can also be reported. This distance provides an indication of the likelihood of a feature being mis-identified.
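The spatial search described above can be sketched as a simple nearest-neighbour match that also reports the separation distance. The equirectangular distance approximation and the record layouts are assumptions; real GIS software would handle map projections and indexing properly.

```python
import math

def match_assets(determined, assets):
    """For each (photo name, (lat, lon)) pair, find the closest
    (asset id, (lat, lon)) record and report the distance in metres,
    an indicator of possible mis-identification."""
    def dist_m(a, b):
        dlat = math.radians(b[0] - a[0])
        dlon = math.radians(b[1] - a[1]) * math.cos(math.radians(a[0]))
        return 6371000.0 * math.hypot(dlat, dlon)

    results = []
    for name, loc in determined:
        asset_id, actual = min(assets, key=lambda rec: dist_m(loc, rec[1]))
        results.append((name, asset_id, round(dist_m(loc, actual), 1)))
    return results
```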
- the DBF file may also include executable links to the images.
- Location data (including sensor location data, determined feature location data and actual feature location data) may be expressed in terms of latitude and longitude.
- the final output data may be combined into a single GIS based package including: actual feature locations, GIS features (such as streets and towns), determined feature locations, helicopter (or other vehicle) travel path, and images (or other sensor data).
- the camera 10 (including its location measuring device 13), direction measuring device 20 and the processor 40 together constitute an apparatus for inspecting geographically spaced features.
- the apparatus further includes a user interface 50 including status light 51 and operator display 52.
- the user interface 50 is driven by the processor 40.
- the status lights 51 include three separate lights. A red light is displayed to confirm that the direction measuring device is operating. It is also contemplated that an audible alarm might be generated if the measuring device 20 is not operating. A green light is displayed to confirm that an adequate GPS signal is being received. An orange light is displayed to indicate that the GPS signal has been lost and that the measuring device is now utilising the laser gyro.
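The indicator logic above can be summarised as a small state mapping; the precedence between lights (and that the alarm replaces them when the direction measuring device is down) is an assumption.

```python
def status_lights(direction_device_ok, gps_ok):
    """Map system state to (lights, audible_alarm) per the description:
    red = direction device operating, green = adequate GPS signal,
    orange = GPS lost (laser gyro in use)."""
    if not direction_device_ok:
        return ([], True)            # no confirmation light; sound alarm
    lights = ['red']                 # direction measuring device operating
    lights.append('green' if gps_ok else 'orange')
    return (lights, False)
```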
- the operator display 52 is a small daylight viewable screen showing system status including:
- an atmospheric sensor may be used to inspect a volume of atmosphere, e.g. a cloud.
Abstract
A method of inspecting geographically spaced features. The method includes obtaining sensor data 110, from a sensor 10, sensor location data 113 and direction data 120. The sensor data describes a feature at a distance and a direction from the sensor. The sensor location data describes the location of the sensor. The direction data describes the direction. The data is in a form suitable for determining, based on the sensor location data and the direction data, determined feature location data 102 describing the location of the feature; and producing an output 101, 103 including paired or pairable sensor data and determined feature location data.
Description
INSPECTING GEOGRAPHICALLY SPACED FEATURES
FIELD OF THE INVENTION
The invention relates to inspecting geographically spaced features.
The invention will be described in relation to the inspection of power poles although various embodiments of the invention may suit other applications.
BACKGROUND
The tops of power poles should be inspected regularly for faults. Faults at the tops of power poles can cause the wires to fall which may well start a fire. It is generally accepted that the tops of power poles should be inspected every 3 years.
Traditionally the tops of power poles have been inspected from the ground with the aid of a camera and/or a mirror atop a hand held pole. This is slow and misses faults.
The present inventors have previously inspected poles by taking digital photographs from a helicopter. Typically 3 or 4 photos of each pole are taken. Up to 800 poles can be photographed per day in this way.
After a flight the photos must be paired with an asset number identifying the pole. For this purpose the photos are GPS tagged using an inbuilt feature of the camera which pairs each digital image with location data in a common EXIF file. The location data identifies the location at which the photo was taken.
After the flight the EXIF files are compared to GIS asset data. The GIS asset data includes paired data identifying the asset number of each pole and its actual location. By comparing the recorded location data to the actual location of the poles the digital
images can be paired with the appropriate asset number. Thereafter the images are inspected by a qualified linesman to determine if maintenance is required.
This approach has drawbacks. The location data recorded in the EXIF file is the location of the camera rather than the location of the pole. This discrepancy leads to ambiguities when comparing the location data and the actual location data. This is particularly so in the case of complex networks and/or parallel powerlines etc. The degree of ambiguity increases with the distance between the camera and the power pole; this distance is referred to as 'stand-off distance'.
In some cases it is not possible to reduce the stand-off distance, e.g. due to surrounding structures impeding the safe manoeuvring of the helicopter, and in any case manoeuvring the helicopter to reduce stand-off distance adds additional time.
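The way ambiguity grows with stand-off distance can be made concrete with simple geometry: a fixed angular (heading) error displaces the apparent pole position laterally in proportion to the distance. The numbers below are illustrative only, not from the specification.

```python
import math

def lateral_error(stand_off_m, heading_error_deg):
    """Lateral position ambiguity (metres) introduced by a heading error
    at a given stand-off distance (small-angle geometry)."""
    return stand_off_m * math.tan(math.radians(heading_error_deg))

# Illustrative: a 1 degree heading error at 100 m stand-off shifts the
# apparent pole position by roughly 1.75 m; at 300 m it triples.
err_100 = lateral_error(100.0, 1.0)
err_300 = lateral_error(300.0, 1.0)
```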
It is an object of the invention to provide for improved inspection of geographically spaced features, or at least to provide an alternative for those concerned with inspecting geographically spaced features.
It is not admitted that any of the information in this patent specification is common general knowledge, or that the person skilled in the art could be reasonably expected to ascertain or understand it, regard it as relevant or combine it in any way at the priority date.
SUMMARY
In one aspect the invention provides a method of inspecting geographically spaced features including obtaining sensor data, from a sensor, describing a feature at a distance and a direction from the sensor;
sensor location data describing the location of the sensor; and direction data describing the direction; in a form suitable for determining, based on the sensor location data and the direction data,
determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
Preferably the method further includes determining, based on the sensor location data and the direction data,
determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
Alternatively the determining and producing could be performed at a later date and/or conditionally on the sensor data. By way of example the location of a power pole might only be determined if a photograph of the pole shows a fault.
Obtaining the sensor location data and/or the direction data preferably includes obtaining data from a satellite navigation system, which system is most preferably the Global Positioning System.
Direction data may be obtained using an inertial instrument, e.g. an accelerometer or, more preferably, a gyroscope, which may provide direction data as a backup should data from the satellite navigation system be unavailable or the relevant data receiving systems fail. 'Gyroscope' as used herein takes in conventional spinning wheel gyroscopes and equivalents such as laser gyros and chip-mounted MEMS gyroscopes.
According to preferred forms of the invention, each of the sensor data and the determined feature location data includes a time stamp by which they are pairable.
Preferably the obtaining includes manipulating the sensor by hand to direct the sensor toward the direction.
Optionally the sensor is a camera and the sensor data includes an image.
Preferably the sensor data is assigned a name and the determined feature location is paired with the name. Most preferably, the determined feature location is paired with the name in a text file or a shape file.
The method preferably includes comparing the determined feature location data to actual feature location data, of a collection of paired actual feature location data and feature identifiers, to pair the sensor data and a feature identifier; producing an output including paired sensor data and feature identifiers.
Alternatively the comparing and producing could be performed at a later date and/or conditionally on the sensor data. By way of example the location of a power pole might only be determined if a photograph of the pole shows a fault.
Another aspect of the invention provides an apparatus for inspecting geographically spaced features including a sensor for obtaining sensor data describing a feature at a distance and a direction from the sensor; a location measuring device for obtaining location data describing the location of the sensor;
a direction measuring device configured to obtain direction data describing the direction; wherein the sensor data, location data and direction data are each in a form suitable for determining, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
The apparatus may further include a processing device configured to receive sensor data, location data and direction data; determine, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and produce an output including paired or pairable sensor data and determined feature location data.
The devices may be integrated in a single unit and/or one or more of the devices might be integrated with the sensor. Preferably the sensor is configured to be directed toward the direction by hand.
Preferably the location measuring device and/or the direction measuring device is configured to obtain data from a satellite navigation system, which system is most preferably the Global Positioning System.
The direction measuring device preferably includes an inertial instrument, most preferably in the form of a gyroscope.
Another aspect of the invention provides a vehicle carrying the apparatus, which vehicle is preferably a helicopter.
Another aspect of the invention provides a computer readable medium carrying instructions executable by a processing device to receive sensor data describing a feature at a distance and a direction from a sensor, sensor location data describing the location of the sensor and direction data describing the direction; determine, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and produce an output including paired or pairable sensor data and determined feature location data.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a perspective view of a sensor and a direction measuring device in accordance with an embodiment of the invention;
Figure 2 diagrammatically illustrates an embodiment of the invention; and
Figure 3 diagrammatically illustrates processing steps.
DETAILED DESCRIPTION
Figure 1 illustrates a sensor in the form of camera 10 and a direction measuring device in the form of a GPS based azimuth 20. The camera 10 may be used to obtain sensor data 110 in the form of a digital picture of the feature (e.g. the top of a power pole) at a distance and direction from the camera 10.
The camera 10 includes a camera body 11 and a telephoto lens 12. In this case the camera 10 is a Nikon™ D3X with a 600mm telephoto lens. The camera 10 is mounted within helicopter 30 via a gyro stabilised mount (not shown). The camera 10 includes an inbuilt location measuring device 13 in the form of a 5Hz Garmin™ GPS unit. A 1 Hz Garmin™ GPS18 is also suitable. The mounting is such that the camera 10 is freely manipulable by hand to move relative to the helicopter 30 to direct the camera toward a feature of interest. The mounting may be omitted - the camera could simply be handheld.
The camera 10, GPS based azimuth 20 and location measuring device 13 produce data in an electronic form.
The GPS based azimuth 20 produces direction data 120 and includes a pair of GPS antennae 21 carried by a mount 22. The mount 22 connects the antennae 21 to the mount of the camera whereby the antennae are fixed relative to the camera. As such the antennae move in unison with the camera and by tracking their position the azimuth component of the direction at which the camera is pointed can be tracked. In
embodiments without the mount 22, the azimuth 20 may be mounted directly to the camera 10 to move with the camera 10.
In this embodiment the mount 22 includes a bar running parallel to the axis of the camera lens. This bar terminates at the centre of a 500mm long horizontal bar transverse to the axis of the camera lens. Each antenna 21 is carried at a respective end of the horizontal bar.
The GPS based azimuth further includes a tilt sensor (not shown) to measure the inclination of the direction at which the camera is pointed and a processing arrangement to bring together the azimuth and tilt components of the direction data 120. It is contemplated that workable embodiments may not include the tilt sensor, instead tilt could be a predetermined value.
The GPS based azimuth 20 further includes an inertial device in the form of a small laser gyroscope (not shown) to provide an indication of the direction of the camera if and when the antennae 21 lose signal. As such the accuracy of the system is substantially unaffected by short periods of signal loss. The inventors have found that sufficiently accurate direction data is available for up to 3 minutes without GPS signal.
In this embodiment the processing arrangement of the GPS azimuth takes information from the antennae, from the tilt sensor and the laser gyroscope to prepare direction data 120 describing the direction of the camera and time stamps the direction data by the inclusion of GPS time. The data is conveyed in the form of a serial NMEA string including the following comma delineated fields:
NMEA identifier, model, GPS time, bearing, tilt, pan, solution flag (N = normal GPS solution, G = gyro solution), *checksum
By way of example, the time stamped direction data for four successive readings could be:
$PSAT,HPR,191711.20,226.27,-1.70,-0.9,N*12
$PSAT,HPR,191711.40,226.43,0.06,-0.9,N*3B
$PSAT,HPR,191711.60,226.52,1.2,-0.9,G*05
$PSAT,HPR,191711.80,226.52,1.2,-0.9,G*0B
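By way of illustration only (not part of the claimed method), such a $PSAT,HPR string could be split into its fields and its trailing checksum verified as follows; the field layout and key names are assumptions based on the example strings above:

```python
def nmea_checksum_ok(sentence: str) -> bool:
    """Verify the trailing *checksum: XOR of all characters between $ and *."""
    body, _, given = sentence.lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return format(calc, "02X") == given.upper()


def parse_psat_hpr(sentence: str) -> dict:
    """Split the comma delineated fields described above into a record."""
    body, _, _ = sentence.lstrip("$").partition("*")
    fields = body.split(",")
    return {
        "model": fields[1],
        "gps_time": float(fields[2]),
        "bearing_deg": float(fields[3]),
        "tilt_deg": float(fields[4]),
        "pan_deg": float(fields[5]),
        "gyro_solution": fields[6] == "G",  # G = gyro solution, N = normal GPS
    }
```

The solution flag lets downstream processing note which fixes were produced by gyro dead-reckoning during GPS signal loss.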
The inventors have found that the GPS based azimuth is a significant advance over other possible direction measuring systems. Magnetic compasses were trialled, but worked poorly in the magnetically noisy environment about the helicopter.
It is possible that the direction measuring device 20 may produce a continuous stream of direction data, but it is preferred that the direction data 120 is produced substantially simultaneously with the sensor data. In this embodiment the direction measuring device 20 is responsive to the camera 10. This may be achieved via electrical connection to the flash mechanism of the camera 10. According to the described embodiment the measuring device 20 is connected directly to the hot shoe of the camera 10.
The sensor data and the direction data may be stored in a storage device(s) on board the helicopter and then processed elsewhere after the flight, but preferably the data is processed in flight. Accordingly the described apparatus includes a processor 40 carried aboard the helicopter 30. The processor 40 may be a Linux-based processor (and/or an off-the-shelf laptop computer) and cooperate with an external/removable hard drive 41 to store data.
The processor 40 includes or cooperates with a computer readable medium (e.g. a hard disk) carrying computer executable instructions executable by processor 40 to perform the processing steps 101, 102 and 103 illustrated in Figure 3. When a photo is taken a .jpg image file 110 with EXIF data is generated and the direction measuring device 20 generates direction data 120. The file 110 includes an image, a time stamp and location data 113 from the camera's location measuring device 13.
At processing step 101 the processor 40 receives the file 110 and names the file 110 with an adjusted photo name made up of a date code, a letter identifying the camera (e.g. in case there is more than one camera aboard the helicopter) and a 4 digit number - "yyyymmdd-camera letter-1234". The number is incremented by one each time a photo is taken and returns to 0000 after 9999.
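This naming scheme can be sketched in a few lines (the function name is illustrative only; the date code, camera letter and wrapping 4 digit counter are as described above):

```python
import datetime


def adjusted_photo_name(when: datetime.date, camera_letter: str, index: int) -> str:
    """Build the adjusted photo name "yyyymmdd-camera letter-1234".

    The 4 digit counter returns to 0000 after 9999, as described above.
    """
    return f"{when:%Y%m%d}-{camera_letter}-{index % 10000:04d}"
```

For example, the 42nd photo from camera A on 10 August 2011 would be named 20110810-A-0042.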
At step 102 the processor 40 receives direction data 120 and location data 113. Based on this data and a preset estimate of stand off distance, the location of the feature (e.g. the power pole) can be determined. In other variants of the invention, accuracy might be improved by providing a distance measuring device to measure the stand off distance.
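The determination at step 102 amounts to projecting a point from the sensor location along the measured direction by the stand off distance. A minimal sketch follows, using a flat-earth small-offset approximation; the specification does not prescribe a particular geodetic method, so this is an assumption for illustration:

```python
import math


def feature_location(lat, lon, bearing_deg, tilt_deg, standoff_m):
    """Estimate the feature's latitude/longitude from the sensor location,
    the measured direction and a preset stand off distance.

    Small-offset approximation: adequate for stand off distances of a few
    hundred metres, not a full geodesic solution.
    """
    horiz = standoff_m * math.cos(math.radians(tilt_deg))   # horizontal component
    north = horiz * math.cos(math.radians(bearing_deg))
    east = horiz * math.sin(math.radians(bearing_deg))
    dlat = north / 111_320.0                                # metres per degree latitude
    dlon = east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

A distance measuring device, as contemplated above, would simply replace the preset `standoff_m` with a measured value.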
Step 102 is preferably performed in real time so that the determined feature locations can be passed to a photographer's navigation system to produce a real time plot showing which features have been photographed.
At step 103 the processor 40 matches the time stamps of the determined feature location data to the time stamps of the EXIF files from step 101 to produce a text file (or a shape file) including a listing of paired determined feature location data & adjusted photo names.
The text file may include comma delineated text including:
Time of image capture, determined feature location, camera direction, local time, local date, adjusted photo name.
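The time stamp matching at step 103 might be sketched as below; the record shapes, the nearest-fix strategy and the matching tolerance are assumptions, and the local time/date fields are omitted for brevity:

```python
def pair_by_time_stamp(exif_records, direction_fixes, tolerance_s=0.5):
    """Match each photo's time stamp to the nearest determined feature fix.

    exif_records: list of (gps_time, adjusted_photo_name)
    direction_fixes: list of (gps_time, (lat, lon), bearing_deg)
    Returns comma delineated lines pairing determined feature location data
    with adjusted photo names.
    """
    lines = []
    for t_photo, name in exif_records:
        # Take the fix whose GPS time stamp is closest to the photo's.
        t_fix, loc, bearing = min(direction_fixes, key=lambda f: abs(f[0] - t_photo))
        if abs(t_fix - t_photo) <= tolerance_s:
            lines.append(f"{t_photo},{loc[0]},{loc[1]},{bearing},{name}")
    return lines
```

Each output line corresponds to one row of the text file (or shape file) described above.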
According to the present embodiment the EXIF file retains the original camera location data 113. This provides a degree of redundancy. If the data in the text file is corrupted (potentially due to any number of upstream failures), the location data from the EXIF files can be used in line with the inventors' earlier approach. Of course it is also possible to modify the EXIF files to include the determined feature location data.
After the flight the text file may be compared to the actual locations of the power poles listed in a database including paired actual power pole locations and asset numbers (or other feature identifiers). In this way the pictures can be paired with the asset numbers. This comparison may be performed by a spatial search using GIS software to locate the closest feature to each determined feature location and then populate a DBF file with the results. The distance between the determined feature location and the actual feature location can also be reported. This distance provides an indication of the likelihood of a feature being mis-identified. The DBF file may also include executable links to the images.
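The spatial search reduces to a nearest-neighbour match. A brute-force sketch follows (GIS software would typically use a spatial index; the flat-earth metre conversion and record shapes are assumptions for illustration):

```python
import math


def match_assets(determined, assets):
    """Pair each determined feature location with the closest asset.

    determined: list of (adjusted_photo_name, lat, lon)
    assets: list of (asset_number, lat, lon)
    Returns (adjusted_photo_name, asset_number, separation_m) tuples;
    a large separation hints that a feature may have been mis-identified.
    """
    results = []
    for name, lat, lon in determined:
        def separation_m(asset):
            _, alat, alon = asset
            dy = (alat - lat) * 111_320.0
            dx = (alon - lon) * 111_320.0 * math.cos(math.radians(lat))
            return math.hypot(dx, dy)
        best = min(assets, key=separation_m)
        results.append((name, best[0], separation_m(best)))
    return results
```

The resulting tuples correspond to rows of the DBF file described above.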
Location data (including sensor location data, determined feature location data and actual feature location data) may be expressed in terms of latitude and longitude.
It is contemplated that the final output data may be combined into a single GIS based package including:
actual feature locations;
GIS features (such as streets and towns);
determined feature locations;
helicopter (or other vehicle) travel path; and
images (or other sensor data).
It will be appreciated that the camera 10 (including its location measuring device 13), direction measuring device 20 and the processor 40 together constitute an apparatus for inspecting geographically spaced features. In this embodiment the apparatus further includes a user interface 50 including status lights 51 and operator display 52. The user interface 50 is driven by the processor 40.
The status lights 51 include three separate lights. A red light is displayed to confirm that the direction measuring device is operating. It is also contemplated that an audible alarm might be generated if the measuring device 20 is not operating. A green light is displayed to confirm that an adequate GPS signal is being received. An orange light is displayed to indicate that GPS signal has been lost and that the measuring device is now utilising the laser gyro.
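The light selection described above might be sketched as follows; the patent does not specify the control logic, so this mapping is an assumption for illustration:

```python
def status_lights(direction_device_ok: bool, gps_signal_ok: bool) -> list:
    """Select which of the status lights 51 to show.

    Red confirms the direction measuring device is operating; green confirms
    adequate GPS signal; orange signals GPS loss (laser gyro fallback).
    An empty result would accompany the contemplated audible alarm.
    """
    lights = []
    if direction_device_ok:
        lights.append("red")
        lights.append("green" if gps_signal_ok else "orange")
    return lights
```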
The operator display 52 is a small daylight viewable screen showing system status including:
On and ready;
Hard drive space;
No. of photos taken; and
System health.
An embodiment of the invention has been described. This description should not be interpreted as limiting the scope of the invention as defined in the claims. Whilst the use of a camera to photograph power poles has been described, various embodiments of the invention may suit other applications. By way of example, an atmospheric sensor may be used to inspect a volume of atmosphere, e.g. a cloud.
Claims
1. A method of inspecting geographically spaced features including obtaining sensor data, from a sensor, describing a feature at a distance and a direction from the sensor; sensor location data describing the location of the sensor; and from a satellite navigation system direction data describing the direction; in a form suitable for determining, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
2. The method of claim 1 further including determining, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
3. The method of claim 1 or 2 wherein the obtaining sensor location data includes obtaining data from the satellite navigation system.
4. The method of claim 1, 2 or 3 wherein the obtaining includes manipulating the sensor by hand to direct the sensor toward the direction.
5. The method of any one of claims 1 to 4 wherein the satellite navigation system is the Global Positioning System.
6. The method of any one of claims 1 to 5 wherein the obtaining direction data includes obtaining data from an inertial instrument.
7. The method of claim 6 wherein the inertial instrument is a gyroscope.
8. The method of any one of claims 1 to 7 wherein each of the sensor data and the determined feature location data includes a time stamp by which they are pairable.
9. The method of any one of claims 1 to 8 wherein the sensor is a camera and the sensor data includes an image.
10. The method of any one of claims 1 to 9 wherein sensor data is assigned a name and the determined feature location is paired with the name.
11. The method of claim 10 wherein the determined feature location is paired with the name in a text file or a shape file.
12. The method of any one of claims 1 to 11 wherein the feature is the top of a power pole.
13. The method of any one of claims 1 to 12 further including comparing the determined feature location data to actual feature location data, of a collection of paired actual feature location data and feature identifiers, to pair the sensor data and a feature identifier; producing an output including paired sensor data and feature identifiers.
14. An apparatus for inspecting geographically spaced features including a sensor for obtaining sensor data describing a feature at a distance and a direction from the sensor; a location measuring device for obtaining location data describing the location of the sensor; a direction measuring device configured to obtain from a satellite navigation system direction data describing the direction; wherein the sensor data, location data and direction data are each in a form suitable for determining, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and producing an output including paired or pairable sensor data and determined feature location data.
15. The apparatus of claim 14 further including a processing device configured to receive the sensor data, location data and direction data; determine, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and produce an output including paired or pairable sensor data and determined feature location data.
16. The apparatus of claim 14 or 15 wherein the location measuring device is configured to obtain sensor location data from the satellite navigation system.
17. The apparatus of claim 14, 15 or 16 wherein the sensor is configured to be directed toward the direction by hand.
18. The apparatus of any one of claims 14 to 17 wherein the satellite navigation system is the Global Positioning System.
19. The apparatus of any one of claims 14 to 18 wherein a direction measuring device includes an inertial instrument.
20. The apparatus of claim 19 wherein the inertial instrument is a gyroscope.
21. The apparatus of any one of claims 14 to 20 wherein each of the sensor data and the determined feature location data includes a time stamp by which they are pairable.
22. The apparatus of any one of claims 14 to 21 wherein the sensor is a camera and the sensor data includes an image.
23. The apparatus of any one of claims 14 to 22 wherein the sensor data is assigned a name and the determined feature location is paired with the name.
24. The apparatus of claim 23 wherein the determined feature location is paired with the name in a text file or a shape file.
25. The apparatus of any one of claims 14 to 24 wherein the feature is the top of a power pole.
26. A vehicle carrying the apparatus of any one of claims 14 to 25.
27. A helicopter carrying the apparatus of any one of claims 14 to 26.
28. A computer readable medium carrying instructions executable by a processing device to receive sensor data describing a feature at a distance and a direction from a sensor, sensor location data describing the location of the sensor, and from a satellite navigation system direction data describing the direction; determine, based on the sensor location data and the direction data, determined feature location data describing the location of the feature; and produce an output including paired or pairable sensor data and determined feature location data.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2013100417A AU2013100417A4 (en) | 2011-08-10 | 2013-04-04 | Inspecting Geographically Spaced Features |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2011903184 | 2011-08-10 | ||
| AU2011903184A AU2011903184A0 (en) | 2011-08-10 | Inspecting Geographically Spaced Features |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2013100417A Division AU2013100417A4 (en) | 2011-08-10 | 2013-04-04 | Inspecting Geographically Spaced Features |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013020158A1 true WO2013020158A1 (en) | 2013-02-14 |
Family
ID=47667760
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/AU2011/001506 Ceased WO2013020158A1 (en) | 2011-08-10 | 2011-11-21 | Inspecting geographically spaced features |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2013020158A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5894323A (en) * | 1996-03-22 | 1999-04-13 | Tasc, Inc. | Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data |
| US20050007450A1 (en) * | 2002-12-13 | 2005-01-13 | Duane Hill | Vehicle mounted system and method for capturing and processing physical data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11870593 Country of ref document: EP Kind code of ref document: A1 |
| NENP | Non-entry into the national phase |
Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 11870593 Country of ref document: EP Kind code of ref document: A1 |