
WO2016025585A1 - System and method for the use of an augmented reality display device in surface treatment procedures - Google Patents


Info

Publication number
WO2016025585A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
information
interface device
mobile interface
environment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2015/044838
Other languages
English (en)
Inventor
Brian Bare
Jonathan Martin
Patrick Ryan
Paul Sells
Mark Lawrence
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huntington Ingalls Inc
Original Assignee
Huntington Ingalls Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 14/695,636 (granted as US9734403B2)
Application filed by Huntington Ingalls Inc filed Critical Huntington Ingalls Inc
Publication of WO2016025585A1

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Definitions

  • This application relates generally to the use of augmented reality to display changes in dynamic environments and, more particularly, to the use of augmented reality to provide information and direction to users performing operations on or applying coatings to surfaces in such dynamic environments.
  • Augmented reality provides a view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, text, graphics, or video.
  • a data processor reviews a camera-captured image for cues that trigger the display of additional information and images along with a display of the captured image.
  • While AR is useful for conveying information via a static display, it is particularly useful in dynamic situations and environments where the AR information changes as the user's view or the environment changes.
  • the ability to provide constant updates to the AR information in response to changes in the environment and in the location and relative positioning of the user's display provides great utility in various applications including construction, repair, maintenance, and safety.
  • a particularly significant example of a dynamic spatial environment is the space on board a large ship. Not only does the ship itself move, its structure is flexible so that the position of a particular compartment, deck portion, supporting structure, or exterior surface in one part of the ship may change relative to other parts of the ship. Similar dynamic behavior can occur in tall buildings, construction sites, outdoor processing plants, roadways, bridges, etc.
  • An illustrative aspect of the invention provides a method for providing information to a mobile interface device user for use in conducting an operation on a surface of a target object in a dynamic structural environment.
  • the method comprises receiving on a central data processor from the mobile interface device over a communication network a request for target object surface information, determining a pose of the mobile interface device relative to the target object surface, and obtaining target object surface information for one or more measurable target object characteristics.
  • the method further comprises assembling AR target object surface information for transmission to and display on the mobile interface device.
  • the AR target object surface information is assembled using the target object surface information and is configured for viewing in conjunction with a real-time view of the target object captured by the mobile interface device.
  • the method also comprises transmitting the AR target object information to the mobile interface device.
  • Another illustrative aspect of the invention provides an automated system for providing information to a mobile interface device for use in conducting an operation on a surface of a target object in a dynamic structural environment.
  • the system comprises at least one mobile interface device configured for variable disposition within the dynamic structural environment, capturing information about the target object within the structural environment, and transmitting the target object information over a network.
  • the system also comprises a local positioning system in communication with the at least one mobile interface device via the network and configured to provide information usable to determine a mobile interface device pose relative to the target object.
  • the system further comprises a central processor comprising at least one data processing machine in communication with the at least one mobile interface device and the local positioning system via the network.
  • the central processor is configured for receiving from a requesting one of the at least one mobile interface device a request for target object surface information, the request including information usable to determine the mobile interface device pose relative to the target object, determining the pose of the requesting mobile interface device relative to the target object, and obtaining target object surface information for one or more measurable target object characteristics.
  • the central processor is further configured for assembling augmented reality (AR) target object surface information for transmission to and display on the mobile interface device.
  • the AR target object surface information is assembled using the target object surface information and is configured for viewing in conjunction with a real-time view of the target object captured by the mobile interface device.
  • the central processor is still further configured for transmitting the AR target object information to the mobile interface device.
  • Figure 1 is a schematic representation of a system for providing AR information to a mobile interface device according to an embodiment of the invention
  • Figure 2 is a flow diagram of a method of providing target object information to a mobile interface device in a dynamic structural environment according to an embodiment of the invention
  • Figure 3 is a schematic representation of a system for providing target object surface information to a mobile interface device according to an embodiment of the invention
  • Figure 4 is an illustration of a mobile interface device having a view of a target object and exemplary AR surface deviation information displayed thereon in accordance with an embodiment of the invention
  • Figure 5 is an illustration of a three dimensional target object having discrete surface regions for coating application
  • Figure 6 is an illustration of a mobile interface device having a camera-captured view of a target object and overlaid AR surface information displayed thereon in accordance with an embodiment of the invention.
  • Figure 7 is a flow diagram of a method of providing target object surface information to a mobile interface device according to an embodiment of the invention.
  • the present invention provides methods and systems for real-time display of AR information on a mobile device immersed in and movable within a dynamic environment.
  • the challenges presented by this scenario include determination of the location of and orientation of the mobile device within the environment, recognition of variations in the spatial geometry of the environment, and detection/identification of changes in other measurable parameters associated with the environment or objects within the environment.
  • the systems of the invention use AR as the primary medium for presenting environment-related information to a user.
  • AR allows presentation of such information on the mobile interface device in graphical or textual form overlaid or adjacent an environmental area or object as it appears in the camera-generated view on the device screen.
  • a generalized system 100 for generating and displaying real-time AR information according to an embodiment of the invention is illustrated in Figure 1.
  • the system 100 is configured for obtaining and storing information on a dynamic structural environment such as a ship or building and objects disposed within that environment.
  • the system 100 comprises a central processor 110 in communication with one or more mobile interface devices 101 via a communication network 102.
  • the central processor may include or be in communication with a relational database structure (not shown) as is described in the application referred to herein as the '650 Application.
  • the central processor 110 is configured to receive captured object information from the mobile interface devices 101 and to extract information relating to the environment or an object in the environment, generate AR information for display on a requesting mobile interface device, and transmit the AR information to the requesting mobile interface device 101.
  • the central processor 110 may include or be configured to receive information from a local positioning system 109 via the communications network 102 or a different network.
  • the central processor may be configured to use the information from the local positioning system 109 in conjunction with information from a requesting mobile interface device 101 and known/stored structural information (e.g., a three dimensional model) to determine the pose of the mobile interface device 101 within the environment.
  • pose means the position (x, y, z) and orientation (θ, φ, ζ) of an object in a particular physical space.
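  • As an illustrative, non-limiting sketch (not part of the original disclosure), such a six-degree-of-freedom pose could be represented in code as follows; the Pose class and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Position and orientation of an object in a physical space."""
    x: float      # position coordinates (e.g., meters)
    y: float
    z: float
    theta: float  # orientation angles (e.g., radians)
    phi: float
    zeta: float

# Example: a mobile interface device 2 m above the deck, rotated
# about one axis by 90 degrees.
device_pose = Pose(x=12.0, y=3.5, z=2.0, theta=0.0, phi=1.5708, zeta=0.0)
```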
  • the system is configured to resolve spatial differences between the coordinate system established based on the known structural information and the data received from the local positioning system 109 that result from changes in the dynamic structure.
  • the central processor 110 is also configured to receive information from an environment data system 103 via the network 102 or another network.
  • the environment data system 103 is configured for measurement or determination of parameters associated with the structural environment or an object or objects within the structural environment.
  • parameters may include, but are not limited to, spatially mapped or mappable data obtained from sensors (e.g., radiation or temperature sensors) with known locations in the structural environment, spatially mapped or mappable data (e.g., weight distribution or surface topography) associated with a particular object in the environment, and system or device status information (e.g., electrical circuit energization status). In some embodiments, the environmental data systems 103 may include a metrology system adapted to provide measurements of specific parameters for particular object types.
  • the central processor 110 is configured to process information from the environmental data systems 103 and use it with the pose information for the requesting mobile interface device 101 to generate AR information that can be transmitted to the mobile interface device 101 for display.
  • information processed by the central processor 110 may include asset location information from a global or local positioning system, visual or graphical information received from the mobile interface devices, observational information from users, and operational or other data from instrumentation systems associated with the environment or particular assets. Any or all of such information can be used by the central processor 110 to update object-related information and/or generate information for display via AR images that can be superimposed on the mobile device user's view of the environment or an object in the environment.
  • the mobile interface devices used in the systems of the invention can make use of AR in a variety of ways that allow the user to conduct inspection, maintenance, repair, and replacement tasks in relation to particular assets.
  • AR can also be used to assist a user in identifying safety hazards, locating objects, or simply navigating within the dynamic environment.
  • the central processor 110 may be configured for processing information it receives for a particular object or structure in the environment and comparing that information to desired specifications or previously obtained information for that object or structure. Such information can be used to determine if a maintenance or other corrective operation should be performed.
  • the central processor 110 may also be configured to generate AR representations of a deviation from the desired condition that can be displayed to a mobile device user charged with correcting the deviation. As will be discussed in more detail hereafter, this could include any of various maintenance operations or corrective machining operations.
  • the environment data systems 103 may be configured to provide periodic or continuous updates to the central processor 110 regarding the status or condition of the object or structure undergoing such operations. This allows the central processor 110 to provide condition updates to the mobile device operator in real-time.
  • the mobile interface device 101 may be any mobile computing solution that is used by a user to facilitate communication with and display information from the central processor 110.
  • the mobile interface device 101 may be, for example, a tablet computer, a smartphone, or a wearable heads-up display.
  • the mobile interface device 101 may have features including, but not limited to a processor, a display (such as a screen), a vision sensor (such as a camera), a microphone, one or more speakers, and wireless communications capabilities.
  • the mobile interface device 101 may be, in a particular embodiment, a wearable head-mounted device (HMD) such as that described in U.S. App. No. 14/210,730, filed March 14, 2014, the complete disclosure of which is incorporated herein by reference in its entirety.
  • the mobile interface device 101 is equipped or configured to display AR images/information to a user.
  • the mobile interface device 101 may include one or more accelerometers or other motion detection sensors.
  • Each mobile interface device 101 may include one or more unique identifiers.
  • some or all of the mobile interface devices 101 may include one or more local positioning receivers, image and object recognition, audio cues, or electromagnetic field (EMF) receivers or detectors (for GPS, WiFi, or RFID reception or light detection).
  • the mobile interface device 101 may be configured to receive information from the local positioning system 109 and to determine its own pose relative to the environment. This capability may be provided to the mobile device 101 in addition to or instead of configuring the central processor 110 to establish the pose of the mobile device 101.
  • the communication network 102 may be a wireless network, a wired network or any combination of wireless network and wired network.
  • the communications network 102 is a wireless communications network, allowing the mobile interface devices 101 to communicate wirelessly with the central processor 110.
  • the communication network 102 may, in particular, be or include a wireless LAN, a Global System for Mobile Communication ("GSM"), a Personal Communication Service ("PCS"), a Personal Area Network ("PAN"), D-AMPS, Wi-Fi, Fixed Wireless Data, IEEE 802.11a, 802.11b, 802.15.1, 802.11n, and 802.11g, or any other wired or wireless network for transmitting and/or receiving a data signal.
  • the central processor 110 may be or comprise one or more servers, data processing machines, or network-enabled computers and may host an AR operating system 104.
  • the AR operating system 104 may be configured to control the interaction of the hardware and software components of a relational database structure (not shown).
  • the relational database structure is configured to provide a logical framework that allows digital information to be associated with physical objects. This framework includes addresses for both tangible objects as well as individual point addresses within a coordinate system for the structural environment. In an exemplary embodiment, this coordinate system is based on a three dimensional (3D) structural model of the environment (e.g., the ship or building). Preferably, the 3D model provides a complete detail of the environment including every space, room or compartment where objects may be disposed.
  • the AR operating system 104 is configured to assemble AR information for transmission to and display by the mobile device 101.
  • the AR information is constructed using the processed environment data from the environment data systems 103 and the pose of the mobile device 101 using any of various techniques known in the art.
  • the AR information may be presented for display as text or as graphical images that can be superimposed over real-time images captured by the mobile device 101.
  • the AR information may be associated with specific parameters relating to the portion of the environment where the mobile device 101 is located or relating to an object or system near the mobile device 101 and/or with which the user of the mobile device 101 is interacting.
  • the AR information may include information on a target object that is usable by a mobile device user to conduct maintenance, construction, machining or other operations on the target object.
  • target object means an object or structure in a dynamic environment that can be identified by the system and associated with location, status, condition or other object-related information.
  • the AR information may include information on deviations from a desired status or condition.
  • the AR information may be presented on a mobile device as an AR image superimposed over a camera image of the target structure or object to show physical deviations to the user in a clear visual manner.
  • the central processor 110 may be configured or may comprise a processor or processing module and computer executable software (e.g., on a tangible computer-readable medium) configured to perform various processing functions relating to object recognition, including feature extraction to extract lines, edges, ridges, or other localized interest points from an image; detection or segmentation to select a specific set of interest points within an image or segment multiple image regions that contain a specific object of interest; image recognition to categorize a detected object into a particular category; noise reduction; contrast enhancement; and/or space scaling, for example.
  • the relational database structure may include a domain coordinate management system that maintains spatial addresses for all spaces within the domain of the structural environment.
  • the domain coordinate management system may be configured to receive spatial address information from both the local positioning system 109 and from the three dimensional structural model.
  • the domain coordinate management system is configured to resolve spatial differences between the coordinate system established by the 3D model of the structure and any available telemetry data received from the local positioning system 109 as a result of changes in the dynamic structure. Such differences may be particularly significant in, for example, a large vessel underway at sea. Ships (particularly large ships) are not rigid bodies.
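  • One conventional way to resolve such differences (offered here only as an illustrative sketch, not as the method required by the disclosure) is to fit a least-squares rigid transform between reference points in the 3D model and the same points as reported by positioning telemetry, e.g., with the Kabsch algorithm:

```python
import numpy as np

def fit_rigid_transform(model_pts, measured_pts):
    """Least-squares rotation R and translation t such that
    measured ~= R @ model + t (Kabsch algorithm). Inputs are
    (N, 3) arrays of corresponding reference points."""
    cm, cs = model_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (model_pts - cm).T @ (measured_pts - cs)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cs - R @ cm

# Demo: a deck section slightly rotated and shifted by hull flexure.
model = np.array([[0., 0., 0.], [4., 0., 0.], [0., 6., 0.], [0., 0., 2.]])
a = np.deg2rad(1.5)
Rz = np.array([[np.cos(a), -np.sin(a), 0.],
               [np.sin(a),  np.cos(a), 0.],
               [0., 0., 1.]])
measured = model @ Rz.T + np.array([0.02, -0.01, 0.0])
R, t = fit_rigid_transform(model, measured)
print(np.allclose(R, Rz), np.round(t, 3))   # True [ 0.02 -0.01  0.  ]
```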
  • the local positioning system 109 is a system (complete or composite) that facilitates the establishment or estimation of the pose of a mobile interface device 101 within the coordinate system described or approximated by the three dimensional model of the structural environment.
  • pose may be accurately established using vision science-based algorithms. Such algorithms may recognize one or more unique pre-identified visual tracking cues within a physical space.
  • the local positioning system 109 may be or include any system capable of establishing the position and/or orientation of a mobile interface device relative to a structural environment coordinate system. This coordinate system may be, or may be based on, for example, a predetermined reference system for a ship or other structure.
  • the local positioning system 109 may comprise a light positioning system that operates by using light points positioned throughout the physical spaces of the vessel. An example of a light positioning system is described in U.S. Patent No. 8,248,467, the complete disclosure of which is incorporated herein by reference in its entirety.
  • the local positioning system 109 may use electromagnetic or sound waves emanating from various points within the physical spaces in the structural environment.
  • examples of such electromagnetic or sound waves include radio frequency identification (RFID) signals, radio signals, WiFi signals, audio tones, and/or sound waves.
  • the local positioning system 109 may use unique images or recognizable visual features, quick response (QR) codes, bar codes, or reference points placed throughout the physical space of the structure.
  • the system 100 may use information from more than one local positioning system. Alternatively or in addition, the system 100 may receive and use information from a global positioning system (GPS) (not shown).
  • Components of the system 100 may be combined into a single processor or further subdivided into multiple processors or servers. It will be appreciated that in some cases, multiple instances of a particular component of the system 100 may be used. Moreover, the system 100 may include other devices not depicted in Figure 1.
  • the system 100 can be used to provide a mobile device user with real-time AR information on the characteristics or condition of target objects disposed in a dynamic environment.
  • a generalized method M100 for providing such AR information to a mobile device user begins at S105.
  • the user uses the mobile interface device 101 to capture a digital image of the target object and/or a portion of the environment in which the target object is disposed.
  • a request for target object information is sent from the mobile interface device to a central processor 110 over a network 102.
  • the request may include information usable by the central processor to determine the exact location and pose of the mobile interface device 101 relative to the target object. This information may include some or all of the digital image.
  • the central processor 110 uses the request information along with information from the local positioning system 109 to establish the exact location and pose of the mobile device 101 relative to the target environment.
  • the pose of the mobile device 101 may be determined by the mobile device 101 itself.
  • pose information may be transmitted by the mobile device 101 to the central processor 110.
  • the central processor 110 obtains information on one or more target object-related parameters. Some or all of this information may be received from the environment data systems 103 or determined from information received from the mobile device 101 and/or the mobile device user. The information could, for example, be determined from a digital image received from the mobile device 101 as part of the information request. Alternatively, the information may be data from sensors located near or on or associated with the target object. The information may also be or include information on the status of a system of which the target object is a part. It will be understood that the information from the environment data systems 103 may be provided to the central processor 110 on a constant basis or may instead be provided in response to a request or query submitted by the central processor 110. The central processor 110 may also obtain previously stored information associated with the target object from a data storage module.
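  • As a rough sketch of how these information sources might be merged on the central processor (all names here are hypothetical, not part of the disclosure):

```python
def gather_target_info(target_id, request, live_readings, stored_records):
    """Merge parameter data for a target object from stored records,
    live sensor readings, and the incoming request. Inputs other than
    the request are dicts keyed by target id."""
    info = dict(stored_records.get(target_id, {}))  # history first
    info.update(live_readings.get(target_id, {}))   # sensors override
    if "image" in request:                          # data in the request
        info["image"] = request["image"]
    return info

stored = {"panel-7": {"design_dft_mils": 4.0}}
live = {"panel-7": {"measured_dft_mils": 3.2, "surface_temp_c": 18.5}}
request = {"image": b"...captured jpeg bytes..."}
print(gather_target_info("panel-7", request, live, stored))
```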
  • the central processor may be required to process the information received from the mobile interface device 101 to specifically identify the target object and differentiate it from other objects in the environment. In some embodiments, this may be accomplished using object recognition techniques in conjunction with environment location information. In other embodiments, this may be accomplished through the use of unique identifiers or other indicia applied to or attached to the target object. Such indicia can be provided to the central processor 110 by the user or can be extracted from a captured digital image. Among other things, identification of the target object allows the central processor 110 to determine if previously stored information is available for that target object.
  • the target object parameter information may include information on a particular characteristic or status that can be evaluated to determine if an operation should be performed on the target object by the mobile device user. In some cases, if it is determined that a maintenance or corrective operation is required or if it is already known that such an operation is required, the target object information may be information required by the mobile device user to perform the required operation.
  • the central processor 110 uses the target object information to assemble AR information for transmission to the mobile interface device 101.
  • This action may include processing of raw data from the environment data systems into a form usable by the AR operating system 104 to construct the AR information.
  • the AR information includes a graphical representation of target object parameters as a visual display that can be superimposed over a real-time view of the target environment captured by the mobile interface device 101.
  • the central processor 110 may be configured to determine the deviation of a current target object status or characteristic from a desired target object status or characteristic. This could, for example, be a deviation of measured performance or geometric parameters from what is specified for the target object.
  • the central processor 110 may use this information to construct the AR information.
  • the AR information could include an AR image that visually shows the deviation in conjunction with a real-time view of the target object.
  • the AR information could be or include a visual representation of the desired target object condition to assist the mobile device user in conducting an operation intended to bring the target object in conformance with that desired condition.
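  • A minimal sketch of such a deviation computation (the grid sampling, units, and 0.5 mil tolerance below are assumptions for illustration, not values from the disclosure):

```python
import numpy as np

# Measured vs. specified coating thickness sampled on a grid over the
# target surface, in mils.
specified = np.full((4, 4), 4.0)             # uniform 4 mil requirement
measured = np.array([[3.9, 4.1, 4.0, 3.8],
                     [4.2, 5.0, 4.9, 4.0],
                     [4.0, 4.8, 4.7, 3.9],
                     [3.7, 4.0, 4.1, 4.0]])

deviation = measured - specified             # positive = too thick
out_of_tolerance = np.abs(deviation) > 0.5   # assumed tolerance band
print("cells needing rework:", int(out_of_tolerance.sum()))   # -> 4
```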
  • the central processor 110 transmits the AR information to the mobile interface device 101 where it is displayed to the mobile device user at S170.
  • the AR information may be presented as text displayable in conjunction with the visual display of the target object or as graphical imagery that can be superimposed over an appropriate portion of the visual display.
  • the graphical imagery could, for example, be or include one or more graphical representations of the parameters measured by the environmental data systems, a representation of desired characteristics, or the above-described deviation from desired characteristics.
  • some or all of the actions of the method M100 may be repeated to periodically or continuously provide real-time target object information to the mobile interface device 101. This assures that the user is aware of variations due to changes in the location and pose of the mobile device relative to the dynamic environment and changes in target object parameters. In some embodiments, the actions of obtaining target parameter data, assembling AR information and transmitting the AR information may be repeated after some or all of an operation on the target object has been accomplished. This allows the operator to monitor the effect of the operation being conducted on the object.
  • the methods of the invention are usable by individuals conducting virtually any operation associated with an object, including without limitation any form of machining, welding, construction, assembly, or maintenance operation. It may also include instances where a status of the object is changed. An example of this is an instance where the object is a component in an electrical circuit and the operator is required to effect a change in the connectivity or energization status of that component.
  • the present invention provides systems and methods for providing detailed AR information to mobile device users conducting, assisting or supervising surface protection operations including without limitation application or removal of paint and other coatings, chemical treatment, insulation application or removal, and cleaning tasks.
  • the AR information provided to a user may include visual maps, surface area and other surface characteristics, covering material information, coating thickness data, and flaw locations. It may also include task-specific information such as location and area of surfaces to be coated, surface preparation requirements, material requirements, thickness requirements, inspection requirements, task instructions, and troubleshooting tools.
  • the methods of the invention may be applied to any portion of a dynamic structure.
  • different compartments and exterior portions require different kinds of coatings including, but not limited to paint, insulation, and deck covering. Testing of these coatings is necessary during construction to assure that specifications are met and to identify where corrective measures are needed.
  • the present invention provides workers with easily viewed and understood representations of the design configuration for protective coatings, the as-built or measured configuration of such coatings, and the discrepancies between the two configurations.
  • a surface protection operation on a ship's rudder may require the application of a fairing compound (i.e., a thick epoxy putty) to smooth out contours to minimize any cavitation that could accelerate erosion of the rudder.
  • the system and methods of the present invention can be used to aid in determining the amount of putty to apply, where to apply it, and in what shape.
  • Figure 3 illustrates a surface protection operation display system 200 according to an embodiment of the invention that can be used to assist in various surface protection operations on a surface or portion of a surface of an object or structure in a dynamic environment.
  • the system 200 is essentially a specific variation of the generalized AR display system 100 of Figure 1. It is noted that system 200 illustrates only a single instance of each component. It will be appreciated that multiple instances of these components may be used and that the system 200 may include other devices not shown in Figure 3. It will be understood that in many environments, the system 200 will interface with other systems such as, for example, the operating system of a shipboard operations and maintenance platform as described in the '650 Application.
  • the system 200 comprises a central processor 210 in communication with one or more mobile interface devices 201 via a communication network 202.
  • the mobile interface device 201 may be any mobile computing solution that is integratable into the system 200.
  • the mobile interface device 201 may be, for example, a tablet computer, a smartphone, or a wearable heads-up display.
  • the central processor 210 may include or be in communication with a relational database structure (not shown) as described in the '650 Application.
  • the central processor 210 may be configured to receive information from a local positioning system 209 via the communications network 202 or via a different network and use such information along with information from a mobile device 201 to determine a pose of the device 201 relative to a target object.
  • the central processor 210 is configured to receive captured target object information from the mobile interface devices 201 and to extract information relating to the target object, generate AR information for display on a requesting mobile interface device, and transmit the AR information to the requesting mobile interface device 201.
  • the target object may be a discrete object in the environment or a structure or a portion of a structure in the environment.
  • the surface may be an exterior surface of the object or structure or, in some cases, an interior surface such as an interior wall of a ship compartment. In typical shipboard applications, all spaces within the ship will be included in the environment coordinate system and specific locations for surface protection operations (e.g., stud weld positions, putty on rudders, installations of insulation or zinc anodes, locations of dry film thickness (DFT) readings or other qualitative or quantitative measurements) will be readily identifiable.
  • the central processor 210 is also configured to receive surface measurement information via the network 202 (or another network) from a surface metrology system 203.
  • the surface metrology system 203 may be or include any one or more measurement systems for locating and measuring surface or surface coating parameters discretely or continuously.
  • the surface metrology system 203 may, for example, be or include a paint, insulation, and deck covering metrology system comprising sensors (e.g., electronic and magnetic pull-off gages) for measuring substrate/surface roughness and/or DFT of a dry coating. It may also comprise instrumentation such as digital photogrammetry, computerized theodolite systems, total stations, laser trackers, and coordinate measuring machines to make three-dimensional surface measurements.
  • the surface metrology system may provide information relating to substrate or surface structure (e.g., stud position) as well. Sensors and instrumentation of the surface metrology system may be automated or may be operated manually. Data provided by the surface metrology system 203 may be processed in real-time or later processed after all target points are measured. A translator (not shown) may be used to convert sensor readings into digital signals compatible for standard transmission over the communication network 202.
  • the surface metrology system 203 may use a combination of laser, theodolite, photographic, optical telescope and other data to map surface topography in three dimensions. This information can then be provided to the central processor 210 along with any other surface or coating measurements.
  • the central processor 210 may be configured to receive the surface and coating measurement information and compare it to specified surface parameters for the target object. The central processor is further configured to determine the deviation between the measured surface parameters and the desired surface parameters. These deviations may, for example, relate to coating thickness or location discrepancies.
  • the surface metrology system 203 may be used to determine surface measurement and/or discrepancy information and transmit such information directly to the one or more mobile interface devices 201 in addition to or instead of the central processor 210 via the communication network 202.
  • the location of particular measurements or other operations may be documented by manual input into the surface coating metrology system 203.
  • a graphic indicator may be used to document measurement locations by using a cursor to mark the location on an augmented reality map (e.g., on a mobile interface device 201), or if the screen is a smart screen, the location can be identified by marking the position with one's finger or a pointer.
  • the central processor may be configured to use the surface measurement and/or the surface discrepancy information along with pose information for the requesting mobile interface device 201 to generate AR surface information that can be transmitted to the mobile interface device 201 for display.
  • AR surface information may include, for example, an AR image of the desired surface or surface coating condition or a graphical representation of the differences between the desired and actual conditions.
  • the central processor may be configured to provide the AR information in a wide variety of forms.
  • Visual contour style displays may be provided that show out-of-tolerance areas by the intensity of color and/or the closeness of contour lines.
  • an AR image 20 of surface coating thickness deviations from the desired surface coating thickness for a target object (in this case, a planar floor panel 10) could take the form of a topographical map in which variations in predefined ranges are represented by color density variations.
  • the tone density of the illustrated regions is darker for larger deviations from the desired surface coating thickness.
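  • A small sketch of such range-based binning (the deviation ranges below are assumptions; larger bin index corresponds to a darker display density):

```python
import numpy as np

def deviation_to_density(dev_mils, ranges=(0.25, 0.5, 1.0, 2.0)):
    """Map absolute thickness deviations into discrete display
    densities (0 = transparent .. len(ranges) = darkest)."""
    return np.digitize(np.abs(dev_mils), ranges)

dev = np.array([0.1, -0.3, 0.7, 1.5, -2.5])
print(deviation_to_density(dev))   # -> [0 1 2 3 4]
```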
  • AR surface information may also be or include specific text or other information pertaining to the desired surface coating condition or to a sequence of operations needed to correct a discrepant condition.
  • the AR surface information may include a three-dimensional augmented image of the proposed design overlaid on the as-built design in the correct pose, thus allowing the worker to visualize in real-time or near real time where a surface coating is to be applied, removed, or modified.
  • an AR image may contain both graphical and textual surface information.
  • Figure 5 illustrates a ship structure 30 having a complex three dimensional surface divided into three areas 32, 34, 36 each having its own surface characteristics and coating requirements.
  • Figure 6 illustrates a camera-captured view 30' of the ship structure 30 displayed on a mobile interface device 201.
  • superimposed over the imaged surface areas 32', 34', 36' is a three dimensional AR image 40 comprising a variety of AR surface information, including graphical representations of hull waterlines 42 and coating thickness measurement locations 44, as well as textual information 46 relating to coating identification and requirements.
  • the illustrated example also includes an inset 48 showing a key to the displayed information.
  • different surface area portions (such as areas 32', 34', 36') may be displayed in different colors; these colors could be similar to or the same as those of the actual paint coatings to be applied.
  • the methods of the invention are usable in any type of surface protection operation to allow a worker to visualize the desired configuration and/or the difference between what was expected or required (e.g., by drawing or 3D model) and a measured condition.
  • the measured condition may be compared to a 3D model, or other representation of the intended design, for example, and the resulting information may be visually provided to a worker through augmented reality in real time so that corrections may be made during the process.
  • the present invention provides the ability for a worker to use a mobile interface device to visualize areas to which coatings are to be applied and/or out-of-tolerance or mislocated areas on the surface of an object without having to physically compare measurements to those set forth in drawings or other specifications.
  • the system 200 provides the worker with a visual representation in two or three dimensions that can be superimposed over a real-time camera view of the surface using augmented reality display techniques. Workers may use this visualization to determine where coatings are to be applied, corrections are to be made or removal/cleaning operations are to be conducted and what methods to follow to bring the surface to the desired condition.
  • an exemplary method M200 uses surface protection operation display system 200 to provide surface information to a mobile device user.
  • the method begins at S205.
  • the user uses the mobile interface device 201 to capture a digital image of a target object or a portion of a target object having a particular surface of interest.
  • a request for information on the target object surface is sent from the mobile interface device 201 to the central processor 210 over the network 202.
  • the request may include information usable by the central processor 210 to determine the target object and the surface of interest and to determine the exact location and pose of the mobile interface device 201 with respect to the target object.
  • the information sent from the mobile interface device 201 may include some or all of the captured digital image of the target object.
  • the central processor 210 uses the request information along with information from the local positioning system 209 to establish the exact location and pose of the mobile interface device 201 relative to the target object.
  • the pose of the mobile device 201 may be determined by the mobile device 201 itself.
  • pose information may be transmitted by the mobile device 201 to the central processor 210.
  • the central processor obtains information on the target object surface characteristics. This may include recalling previously stored surface topography information and/or obtaining updated information derived from surface or substrate measurements (i.e., the "as-built" or current surface configuration).
  • the information may include, in particular, existing coating material characteristics and/or thicknesses. It may also include information on a base/design configuration or a specified/desired surface configuration or topography.
  • the current surface characteristics may be established using contemporaneous surface measurements or based on measurements previously provided to the system. Such measurements may include for example thickness information at various locations on the surface.
  • the measured surface information can be provided in terms of the spatial location within the established environment coordinate system. Measurements may be taken dynamically using any suitable measuring system including, but not limited to, any of the surface metrology measurement systems previously disclosed.
  • the target surface information may be used to establish surface deviations from a desired surface configuration. Such deviations may be, for example, variations in thickness within a particular surface region or variations in the location and boundaries of a particular surface area region. This can be done for specific points or areas on the target object surface. It will be understood that there are some instances where it may be desirable to continuously or periodically re-measure current surface characteristics and re-determine the current deviations from a desired topography. This allows, for example, an operator to see the effect of his corrective operations on such deviations and/or the occurrence of new deviations caused by such corrective efforts.
  • the central processor may also obtain information on environmental conditions for the environment in which the target object is located.
  • information is received from an environment data system configured to measure and provide environmental parameter data in real time or near real time.
  • Of particular interest are environment parameters that could affect a particular surface operation.
  • Such parameters could include, for example, target object surface temperature, ambient air temperature, atmospheric pressure, wind direction and speed, relative humidity, and dew point.
  • Environment parameter information can be used by the central processor to construct graphical or textual AR environment information that can be displayed on the mobile interface device.
  • a rendering engine on the central processor 210 uses the target object surface information (measured characteristics, desired characteristics, and/or deviations from desired characteristics) along with mobile device pose information to construct an AR representation of the target object surface information of interest.
  • this includes graphically rendering a visual display that can be superimposed over the view of the target object surface captured by the mobile interface device 201 as illustratively shown in Figure 6.
  • the display may be in the form of a surface density diagram, topographical maps, colored areas varying in size and color to indicate the location and extent of discrepancies or target regions for particular operations, or other notional user views of surface area characteristics.
  • the AR representation may also include text as previously described.
  • AR surface information, which is or includes the AR representation constructed in S260, is transmitted to the mobile interface device 201.
  • AR environment information can be transmitted to the mobile interface device as well.
  • some or all of the AR surface information and/or AR environment information is displayed on the mobile interface device 201 as shown in Figure 6.
  • the graphical AR rendering of the surface information is superimposed over the displayed image of the target object surface so that the user is able to easily view the desired surface characteristics, the surface deviations, or other desired information.
  • the method ends at S295.
  • the present invention may utilize any of the aforementioned or other methods available for providing surface protection operation information to a worker in a maintenance or construction environment.
  • the paragraphs that follow describe particular operations in which these methods can be used and the form in which surface information may be acquired and used.
  • Paint information may be preprogrammed into the mobile interface device 201 or may be retrieved from storage by the central processor 210. Paint information may include various procedures and the chronological order of events for a particular preservation or surface protection process to be conducted.
  • the paint information may include, for example, information on the intended surface area, areas requiring protection during an operation, cleanliness requirements, required surface testing, environmental condition limitations, surface preparation procedures and materials, profile requirements, characteristics and requirements for the paint/coating to be applied, coating thickness, and flaw identification criteria. The following paragraphs describe these forms of information in more detail. It will be understood that these sub-operations are not necessarily in chronological order and that other operations may also be involved (e.g., a cleanliness inspection may be required at various stages).
  • Information: Boundaries, location, and surface area (in English or metric square units) of the area to be painted.
  • the surface information may include a visual representation of the area to receive a coating including the edges or boundaries to be coated and the specific coating to be used and may be supplied to the user through mobile interface device 201.
  • Validation: The dimensions and location of the area to be painted will be calculated and mapped in three dimensions.
  • the system 200 will calculate an accurate surface area to be painted, including bulkheads, stiffeners, piping, etc.
  • the calculated surface area will be displayed visually on the mobile interface device 201 and may be compared to the preprogrammed surface area to validate the accuracy of the area. If the area does not match within a specified allowance of error, the user may be alerted so corrections can be made.
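  • A sketch of that validation check (the 5% allowance is an assumed example; the actual allowance would come from the governing specification):

```python
def area_within_allowance(calculated_sqft, preprogrammed_sqft,
                          allowance=0.05):
    """True if the 3D-mapped surface area matches the preprogrammed
    area within the specified fractional allowance."""
    error = abs(calculated_sqft - preprogrammed_sqft) / preprogrammed_sqft
    return error <= allowance

# Example: mapped area of 1,040 sq ft against a planned 1,000 sq ft.
if area_within_allowance(1040.0, 1000.0):
    print("Area validated")
else:
    print("Alert user: measured area outside allowance")
```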
  • a visual map identifying specific items that need to be protected from blasting and coating may be provided by color coding or other means through the mobile interface device 201, and any requirements for foreign material exclusion devices (FMEDs) may be supplied. Other information may include cleanliness requirements for specified areas.
  • Validation: The location and surface characteristics of the target area are established as previously described. Once protective materials are in place, a visual representation of the target object or space may be overlaid on a camera-captured image of the target object or space using the mobile interface device 201.
  • the blast and coating protection and FMEDs may be color coded or have identifying shapes affixed, such as diamonds or squares that would be recognized in a scan as the protection required.
  • the protected area may be deemed satisfactory or unsatisfactory based on the placement of the protection in comparison to where the protection was planned to be placed. If the area is unsatisfactory, corrective action would be required prior to proceeding in the coating process.
  • Information: The requirements for cleanliness (e.g., per SSPC-SP 1, or potentially via Fourier Transform Infrared Spectroscopy (FT-IR) analysis) may be visually represented on the surface using the mobile interface device 201.
  • Validation: A visual inspection may be manually or wirelessly conducted and documented for a space or compartment using camera input. If FT-IR spectra are required to verify cleanliness of the surface, the surface coating metrology system 203 may be used to map the location of the FT-IR measurements taken manually or wirelessly in the space and verify the measurements are satisfactory. If a reading is unsatisfactory, the user may be warned visually through the mobile interface device 201 and may be required to perform additional cleaning prior to additional measurements being taken.
  • the conductivity and chloride limits on bare metal based on design specifications may be determined and provided, along with dust test requirements, at the mobile interface device 201. Requirements for the location of readings may also be visually displayed (e.g., an AR image may include text indicating that one conductivity reading must be taken for every 200 square feet for the first 1,000 square feet, etc.).
  • Validation: The results of conductivity or chloride readings along with dust tests may be wirelessly or manually mapped via the surface coating metrology system 203 so that the location in the compartment where the readings were taken is documented, along with the number of readings. The locations of the measurements must meet the requirements set forth by design requirements as in, for example, SSPC-SP standards. If the measurements are unsatisfactory, additional cleaning would be necessary, or the customer must approve the departure from specification.
  • the validation information would be communicated from the surface coating metrology system 203 to the mobile interface device 201.
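  • The stated sampling rule lends itself to a simple count; the sketch below implements only the tier quoted above (one reading per 200 square feet for the first 1,000 square feet) and defers anything beyond that to the governing specification:

```python
import math

def required_readings(area_sqft, per_sqft=200.0, first_tier_sqft=1000.0):
    """Readings required under the quoted rule for the first tier of
    surface area; the rule beyond 1,000 sq ft is not spelled out here."""
    n = math.ceil(min(area_sqft, first_tier_sqft) / per_sqft)
    remainder = max(0.0, area_sqft - first_tier_sqft)
    if remainder:
        print(f"Consult specification for remaining {remainder:.0f} sq ft")
    return n

print(required_readings(750.0))   # -> 4
```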
  • the required environmental conditions for a particular surface protection procedure may be provided to the mobile interface device 201.
  • the information may include, for example, surface temperature, ambient temperature, relative humidity, and dew point.
  • the environmental readings may be measured and recorded manually or automatically and transmitted wirelessly. Measurement locations may be mapped via the surface coating metrology system 203. The recorded temperatures, relative humidity, and dew points can be documented as satisfactory or unsatisfactory immediately. The user may be alerted through mobile interface device 201 of unsatisfactory conditions so corrective action can be taken.
  • the validation information could be communicated from the surface coating metrology system 203 or the central processor 210 to the mobile interface device 201.
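  • As an illustrative check of such conditions (the Magnus dew-point approximation and the common ~3 deg C / 5 deg F surface-to-dew-point margin are assumptions; the procedure's actual limits govern):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Dew point via the Magnus approximation."""
    a, b = 17.62, 243.12
    g = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * g) / (a - g)

def conditions_satisfactory(surface_temp_c, ambient_temp_c,
                            rel_humidity_pct, margin_c=3.0):
    """Flag whether the surface is sufficiently above the dew point."""
    return surface_temp_c >= dew_point_c(ambient_temp_c,
                                         rel_humidity_pct) + margin_c

print(conditions_satisfactory(20.0, 18.0, 85.0))   # True (dew pt ~15.4 C)
```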
  • Required surface preparation instructions or other information may be provided. This could include, for example, the requirements of specified standards such as SSPC-SP 2 on piping, SSPC-SP 11 for hard to access surfaces, SSPC-SP 10 on bulkheads and overheads, etc. Locations where specific surface preparation operations are required or have already been conducted may be mapped.
  • the surface protection operation display system 200 may provide a visual overlay of the required preparations on the physical area prepared to verify that the specified compartment requirements match with the physical compartment.
  • the validation information may be visualized using mobile interface device 201.
  • This information may include materials and characteristics of qualified products for blast media that can be presented on the mobile interface device 201.
  • a user may be provided information such as appropriate particle size of blast media, pressure, nozzle distance from surface, etc. that would result in the ideal surface profile to meet specifications.
  • the amount of blast media required would be estimated based on the square feet of the area to be blasted.
  • Validation: Prior to blast operations, information on the blast media may be obtained and stored.
  • the label of the blast material can be scanned and verified as included in the qualified products list via the surface coating metrology system 203.
  • a sample of the blast media may be observed under a microscope which is capable of calculating the particle size.
  • the results of the microscopic analysis will be entered into the surface coating metrology system 203, the central processor, or an associated data storage medium, and the media may be deemed satisfactory or unsatisfactory. If the results are unsatisfactory, new blast material will be required or an allowance to use the out-of-specification media must be provided to the worker.
  • the validation information may be communicated from the surface coating metrology system 203 or the central processor 210 to the mobile interface device 201.
  • a specified coating profile (e.g., 2-4 mils on SSPC-SP 10 surfaces or greater than 1 mil on SSPC-SP 11 surfaces) and a number of required measurements based on surface area may be provided to the mobile interface device 201, along with the contractual rules pertaining to the measurements. For example, the rules may require that one profile reading be taken for every 200 square feet for the first 1,000 square feet, etc. If profile measurements are not required on particular surfaces in a target area, that information may be provided. For example, no profile readings are required on the gasket seating surfaces for manholes.
  • Profile measurements are obtained and mapped using the surface coating metrology system. Locations where profile measurements should not be taken are avoided. Severely pitted areas, for example, are generally not used for taking profile measurements. Such areas may be identified photographically and their locations mapped and stored. Pitted surfaces can also be identified and their location logged by manual input in the three dimensional map of the space. Any of various measurement tools may be used to obtain profile data.
  • if profile tape is utilized to take the profile measurement, the area where the tape is applied can be mapped using the surface coating metrology system 203 and the profile measurement manually or wirelessly input. If a profile gage is used instead, the gage can wirelessly interact with the surface coating metrology system 203 to map the specific location where the profile measurement was taken and record the profile measurement.
  • the number and location of profile readings will be satisfactory or unsatisfactory based on design requirements. If the readings or locations are unsatisfactory, corrective action must be taken, or permission to proceed with out-of-specification profile readings must be obtained.
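As a minimal sketch of the contractual reading-count rule mentioned above: only the first tier (one reading per 200 square feet for the first 1,000 square feet) comes from the example rule; the rate assumed beyond 1,000 square feet is a hypothetical placeholder.

    import math

    def required_profile_readings(area_sq_ft):
        # First tier is from the example rule: 1 reading per 200 sq ft
        # for the first 1,000 sq ft.
        readings = math.ceil(min(area_sq_ft, 1000.0) / 200.0)
        # Beyond 1,000 sq ft: assumed 1 reading per 500 sq ft (placeholder).
        remainder = max(area_sq_ft - 1000.0, 0.0)
        readings += math.ceil(remainder / 500.0)
        return readings

    print(required_profile_readings(800))   # -> 4
    print(required_profile_readings(2400))  # -> 8 (5 + 3)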
  • The information collected in the surface coating metrology system 203 may be communicated to the user on the mobile interface device 201 in real time or stored for later communication.
  • The coating system to be used for a particular operation may be specified in accordance with military or commercial specifications and the availability of qualified products.
  • An estimate of the quantity of paint that is required based on the surface area to be painted may be provided in information supplied on the mobile interface device 201.
  • The information provided may include color requirements for the cured paint, such as L*a*b* values or another color specification.
  • The L* value indicates how light or dark the color is.
  • The a* value measures the amount of green and red, and the b* value measures the amount of blue and yellow.
  • (The L*a*b* values can be calculated from the X, Y, Z tristimulus values for colors, as in the sketch below.) Such information may be provided at the mobile interface device 201, along with the expected resulting Fourier Transform-Infrared (FT-IR) spectra for the coatings.
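The standard CIE conversion from XYZ tristimulus values to L*a*b* is shown below for illustration; a D65/2° white point is assumed, though a contract could specify a different illuminant.

    # CIE XYZ -> L*a*b*; white point defaults to D65/2 degrees (assumed).
    def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
        def f(t):
            delta = 6 / 29
            return t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29
        fx, fy, fz = (f(v / n) for v, n in zip((x, y, z), white))
        L = 116 * fy - 16      # lightness: 0 (black) to 100 (white)
        a = 500 * (fx - fy)    # green (-) to red (+)
        b = 200 * (fy - fz)    # blue (-) to yellow (+)
        return L, a, b

    # Pure sRGB red has XYZ of about (41.24, 21.26, 1.93):
    print(xyz_to_lab(41.24, 21.26, 1.93))  # -> approx. (53.2, 80.1, 67.2)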
  • Some target areas require different paints in different locations within the area.
  • The different paint system requirements may be identified on a map of the areas such as that shown in Figure 6.
  • Different types of piping on a ship may be color coded. For example, piping for jet fuel and potable water would be different colors, so that there is a visual indication of what is inside the pipes.
  • Other variations include bilges that have different paint systems and color in the bilge wells, and bulkhead colors that vary depending on how close the bulkhead is to the walkway grating.
  • In some cases, the funding source for a paint job is different, even though the same paint is being applied adjacently.
  • The mobile interface device 201 can clearly identify the funding source for a paint job to allow a worker to know what charge to use for the job. For example, the sea chests and the underwater hull on an aircraft carrier receive similar coating instructions, but the funding sources for the projects are different.
  • A portable FT-IR spectrometer could be utilized to obtain a spectrum of the coating that could be compared to the FT-IR spectrum specified in the preloaded design information.
  • An evaluation of the two spectra via the surface coating metrology system 203 would result in a satisfactory or unsatisfactory result (one screening approach is sketched below). If the color or FT-IR spectrum is unsatisfactory, the user can be notified on the mobile interface device 201 and corrective action taken, or acceptance of the out-of-specification condition would be required.
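A minimal sketch of one such pass/fail comparison follows. Scoring the measured spectrum against the reference by Pearson correlation is one common screening approach; the 0.95 acceptance threshold is a hypothetical illustration value, and the document does not specify how the evaluation is performed.

    # Compare two spectra sampled on the same wavenumber grid.
    def spectra_match(measured, reference, threshold=0.95):
        n = len(measured)
        if n != len(reference) or n < 2:
            raise ValueError("spectra must share the same sampling grid")
        mean_m = sum(measured) / n
        mean_r = sum(reference) / n
        cov = sum((m - mean_m) * (r - mean_r)
                  for m, r in zip(measured, reference))
        var_m = sum((m - mean_m) ** 2 for m in measured)
        var_r = sum((r - mean_r) ** 2 for r in reference)
        corr = cov / (var_m * var_r) ** 0.5  # Pearson correlation
        return corr >= threshold, corr

    ok, score = spectra_match([0.10, 0.80, 0.30, 0.05],
                              [0.12, 0.79, 0.28, 0.06])
    print(ok, round(score, 3))  # -> True 0.999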
  • The dry film thickness (DFT) requirements for each coat of paint, along with the expected total DFT, may be provided at the mobile interface device 201.
  • The number of necessary measurements based on the surface area will be available per the design specification.
  • The requirements pertaining to the locations of the measurements may be preprogrammed into the system. Such requirements could include, e.g., indications that a batch reading is the average of five spot measurements taken in a 100-square-foot area, or that for areas between 300 and 1,000 square feet, three 100-square-foot sections shall be randomly selected and measured, etc. (a sketch of this sampling rule follows).
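A minimal sketch of the sampling rule just quoted: for an area between 300 and 1,000 square feet, three 100-square-foot sections are selected at random, and a batch reading is the average of five spot measurements. The flattened-index layout of the sections is an illustration convenience, not anything from the document.

    import random

    def select_sections(area_sq_ft, section_size=100.0, sections_needed=3,
                        seed=None):
        # Returns starting offsets (sq ft, along a flattened area index)
        # of randomly chosen, non-overlapping measurement sections.
        if not 300 <= area_sq_ft <= 1000:
            raise ValueError("rule applies to areas of 300-1,000 sq ft")
        rng = random.Random(seed)
        slots = int(area_sq_ft // section_size)
        chosen = rng.sample(range(slots), sections_needed)
        return [s * section_size for s in sorted(chosen)]

    def batch_reading(spot_readings):
        # A batch reading is the average of exactly five spot measurements.
        if len(spot_readings) != 5:
            raise ValueError("a batch reading averages five spots")
        return sum(spot_readings) / 5

    print(select_sections(600, seed=1))              # three random offsets
    print(batch_reading([4.8, 5.1, 5.0, 4.9, 5.2]))  # -> 5.0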
  • The DFT measurements will be taken and their locations automatically or manually input into the surface coating metrology system 203 to allow three dimensional mapping.
  • The DFT measurements and the locations of the measurements may be identified as satisfactory or unsatisfactory for the user on the mobile interface device 201. If the DFT measurements are unsatisfactory, corrective action can take place, or the out-of-specification measurements can be accepted as-is with approval. In some cases, a map of the discrepancies may be provided to the mobile interface device 201.
  • The surface coating metrology system 203 may calculate the DFT of the individual subsequent coats by subtracting the average DFT of the previous coat from the total DFT measured (see the sketch below). The results of the calculations can then be supplied to the user on the mobile interface device 201.
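The per-coat bookkeeping described above reduces to a running subtraction, sketched here; the mil values in the example are hypothetical.

    # Thickness attributed to each coat = measured cumulative DFT minus
    # the average cumulative DFT recorded after the previous coat.
    def per_coat_dft(cumulative_dft):
        # cumulative_dft[i] = average total DFT (mils) after coat i+1.
        per_coat, previous = [], 0.0
        for total in cumulative_dft:
            per_coat.append(total - previous)
            previous = total
        return per_coat

    # Cumulative averages of 3.1, 5.4, and 7.2 mils after coats 1-3:
    print([round(v, 2) for v in per_coat_dft([3.1, 5.4, 7.2])])
    # -> [3.1, 2.3, 1.8]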
  • Paint items considered flaws during an inspection may be included in the information provided to the user on the mobile interface device 201. Flaws may be identified photographically or by direct observation.
  • Example flaws may include rust, bleeding rust, pinhole rust, checking, cracking, flaking, scaling, peeling, blistering, bleeding contaminants, puddling, holidays, scrapes, cuts, abrasions, mechanical damage, paint applied over dirt, paint applied over grit, paint applied over debris, pinholing, fish eyes, cratering, pitting, wrinkling, overspray, orange peel, cobwebbing, visual presence of amine bloom, visual presence of chlorides, runs, drips, sags, or other contractually invoked identified flaw. Flaw location and type information may be provided to the operator on the mobile interface device 201.
  • When a worker enters a space where an insulation operation (e.g., installation or removal) is to be performed, the worker may be provided with a visual representation of the insulation along with other process information using the mobile interface device 201.
  • The insulation information supplied and subsequent validation information may include the types of information described in the following paragraphs. It will be understood that the information types are not limited to the following and may be provided in any chronological order.
  • Some operations require preparation of surfaces for the welding or attachment of studs.
  • A visual representation of the location where the studs are supposed to be attached may be provided using the mobile interface device 201, as well as the required surface preparation, e.g., SSPC-SP 11.
  • A worker may work from a visual representation of the insulation to install and validate the surface and the surface preparation for stud installation, eliminating the need to take measurements along the bulkhead, overhead, or deck.
  • The area can be scanned or have surface characteristics measured by the surface coating metrology system 203. Alternatively, a picture can be taken and overlaid on a map to verify that the surface preparation for stud attachment is completed and in the correct location. The stud locations will be deemed satisfactory or unsatisfactory to the user through the mobile interface device. If any stud locations are unsatisfactory, they may be provided in an AR map that can be visually overlaid on a camera view of the physical environment.
  • Validation: The material used can be scanned and verified to be correct for the insulation installation via the surface coating metrology system 203.
  • Inspection criteria for the studs may also be provided on the mobile interface device 201.
  • Validation: Validation information obtained by the surface coating metrology system 203 can include a picture overlaid on or compared to a three dimensional map, or a comparison of a new scan of the target area with the desired configuration. In either case, the location of the studs within a predetermined allowance of error can be verified (a minimal sketch of such a tolerance check follows).
  • The inspection results for the studs may also be documented and stored by the surface coating metrology system 203, and the satisfactory or unsatisfactory result conveyed to the user on the mobile interface device 201. If the inspection is unsatisfactory, a representation of the discrepancy may be provided to the mobile device so that corrective action can be taken.
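As a minimal sketch of the tolerance check: each as-built stud position from the scan is compared with its designed position in the space's three dimensional map, and studs outside the allowance are flagged. The 0.25-inch allowance is a hypothetical illustration value.

    import math

    def check_studs(designed, as_built, allowance=0.25):
        # designed, as_built: paired lists of (x, y, z) points, in inches.
        # Returns indices of studs outside the predetermined allowance.
        return [i for i, (d, a) in enumerate(zip(designed, as_built))
                if math.dist(d, a) > allowance]

    designed = [(0.0, 0.0, 0.0), (12.0, 0.0, 0.0)]
    as_built = [(0.1, 0.0, 0.0), (12.0, 0.6, 0.0)]
    print(check_studs(designed, as_built))  # -> [1]; stud 1 is 0.6 in off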
  • Information: The type and amount of insulation required may be provided (e.g., acoustic, thermal, etc.) on the mobile interface device 201. Additionally, if glue and tape are required, the type and amount can be estimated and provided. Information on the qualified products for insulation, glue, and tape will be transmitted to the mobile interface device 201 for the target area.
  • Validation: The actual materials can be scanned using the mobile interface device and/or the surface coating metrology system 203 and compared with specified materials or material characteristics to ensure that the material meets the specifications and that it is not expired. The results of the scan will be provided to the user on the mobile interface device 201.
  • The type of insulation installed in a target area may depend on a variety of factors. On a ship, the insulation for a particular area can depend on the compartment and the adjacent compartment that shares the bulkhead. A single compartment could require different kinds of insulation, e.g., one kind on the wall and a different one on piping, or no insulation at all.
  • The system of the present invention can provide a three dimensional map viewable on the mobile interface device 201. This map provides a visual display of the specific insulation required on different portions of a compartment as well as where the caps to the studs should appear on the insulation types that require studs.
  • A visual representation (e.g., a photo captured by the mobile interface device 201) may be compared with the specified insulation configuration, and the results would be deemed satisfactory or unsatisfactory.
  • The user would be notified of the results on the mobile interface device 201, and if the results are unsatisfactory, corrective action could be taken or approval of the out-of-specification insulation would be required.
  • Deck covering information may be preprogrammed into the mobile interface device 201 or stored in or by the central processor 210, along with the chronological order of events for an installation process.
  • The deck covering information supplied and the validation used can include the following, not necessarily in chronological order.
  • The shape, size, material type, and quantity of deck material required may be provided for display to the mobile interface device 201. This can include, for example, tile, diamond plate, non-skid tape, or any other flooring material.
  • Validation: The material used may be scanned and verified to be the correct product type, color, and unexpired using the surface coating metrology system 203. The results of the validation would be communicated to the user through the mobile interface device 201.
  • Other portable analysis equipment, such as FT-IR, near-infrared spectroscopy (NIR), and x-ray fluorescence (XRF) instruments, may also be used.
  • Gas analyzers could be used for identification of gases in a confined space for safety, e.g., to ensure oxygen levels are adequate, or to identify and quantify the presence of gases sometimes found on a ship such as argon, hydrogen sulfide, or acetylene, etc.
  • A visual representation of the gas and any pertinent properties such as extent, intensity, and movement may be provided through the mobile interface device 201 as described in U.S. App. No. 14/686,427. Future portable surface or gas analysis equipment could also be linked.
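A minimal sketch of a confined-space gas check of the kind described above follows. The oxygen band of 19.5-23.5% by volume reflects common confined-space entry practice; the other ceilings are hypothetical illustration values, not contractual thresholds.

    # Safe ranges: oxygen band per common confined-space practice;
    # other limits are assumed for illustration only.
    SAFE_RANGES = {
        "oxygen_pct": (19.5, 23.5),       # percent by volume
        "hydrogen_sulfide_ppm": (0, 10),  # assumed ceiling
        "acetylene_ppm": (0, 100),        # assumed ceiling
    }

    def gas_alarms(readings):
        # Return the names of gases whose readings fall outside range.
        alarms = []
        for gas, value in readings.items():
            low, high = SAFE_RANGES[gas]
            if not low <= value <= high:
                alarms.append(gas)
        return alarms

    print(gas_alarms({"oxygen_pct": 18.9,
                      "hydrogen_sulfide_ppm": 2,
                      "acetylene_ppm": 0}))  # -> ['oxygen_pct']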
  • The methods of the invention may be extended to any type of surface protection or preservation operations to allow a worker to visualize desired information and, in some cases, the difference between what was expected or required and the as-built condition.
  • The as-built tolerances may be compared to a 3D model, or other representation of the intended design, and the resulting information may be visually provided to a worker through the mobile interface device 201 in real time so that corrections may be made during the fabrication process.
  • While the systems and methods of the present invention have particular value when applied to vessels and other mobile structures, they may also be applied to stationary buildings and other structures as well. In stationary environments, GPS and GIS information are typically available and may be used by the operating system.
  • The invention is, however, of particular utility in shipboard or other mobile/dynamic environments in which GPS information may be unreliable or unavailable.
  • The present invention also addresses the problem of dynamic changes in structure and relative positioning such as are often encountered in large vessels. It will be understood that in some embodiments, the invention may be usable in a setting in which the user is not inside a structure and that the term "structural environment" may encompass exterior structural settings.

Abstract

A method of providing target object surface information to a mobile device user is disclosed. The method includes receiving a request for target object surface information from a mobile device, determining the position of the mobile interface device relative to the target object, and obtaining target object surface information for one or more measurable target object surface parameters. The target object data are used to assemble augmented reality information configured for viewing in conjunction with a real-time view of the target object captured by the mobile interface device. The target object augmented reality surface information is then transmitted to the mobile device for display to the user.