
WO2025041144A1 - Pilot training system utilizing integrated real/virtual aircraft and object data - Google Patents


Info

Publication number
WO2025041144A1
Authority
WO
WIPO (PCT)
Prior art keywords
aircraft
virtual
training
positions
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IL2024/050848
Other languages
French (fr)
Inventor
Ninet HAIMOVITZ
Yaacov TAVGER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Israel Aerospace Industries Ltd
Original Assignee
Israel Aerospace Industries Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Israel Aerospace Industries Ltd filed Critical Israel Aerospace Industries Ltd
Publication of WO2025041144A1 publication Critical patent/WO2025041144A1/en
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00Simulators for teaching or training purposes
    • G09B9/02Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30Simulation of view from aircraft
    • G09B9/301Simulation of view from aircraft by computer-processed or -generated image
    • G09B9/302Simulation of view from aircraft by computer-processed or -generated image the image being transformed by computer processing, e.g. updating the image to correspond to the changing point of view
    • G09B9/44Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer providing simulation in a real aircraft flying through the atmosphere without restriction of its path

Definitions

  • The presently disclosed subject matter relates to pilot tactical training systems, and in particular to implementation of such systems operating in coordination with ground-based aircraft simulators.
  • A system of monitoring an airborne training aircraft comprising a processing circuitry configured to: a) receive, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receive, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) construct, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
  • The system can comprise feature (i) listed below:
  • (i) the processing circuitry is additionally configured to: d) display the constructed imaging on a display device.
  • A computer-implemented method of monitoring an airborne training aircraft comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
  • This aspect of the disclosed subject matter can further optionally comprise feature (i) listed above with respect to the system, mutatis mutandis.
  • A computer program product comprising a non-transitory computer readable storage medium retaining program instructions, which, when read by a processing circuitry, cause the processing circuitry to perform a computerized method of monitoring an airborne training aircraft, the method comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
  • Fig. 1 illustrates an example deployment scenario of an aircraft equipped with a pilot tactical training system operating in coordination with one or more ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
  • Fig. 2A illustrates a logical block diagram of an example on-board system configured for integrated support of virtual objects/virtual aircraft, optionally including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
  • Fig. 2B illustrates a logical block diagram of an example real-time training ground station (RTGS) configured for integrated support of virtual aircraft, including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
  • Fig. 3A illustrates a flow diagram of an example method of processing incoming sensor data, in accordance with some embodiments of the presently disclosed subject matter
  • Fig. 3B illustrates a flow diagram of an example method of processing incoming virtual object data, in accordance with some embodiments of the presently disclosed subject matter
  • Fig. 4 illustrates a flow diagram of an example method of emulating and/or tracking the progress of an emulated projectile, in accordance with some embodiments of the presently disclosed subject matter.
  • Fig. 5 illustrates a flow diagram of an example method of monitoring an airborne training aircraft in the real-time training ground station (RTGS), in accordance with some embodiments of the presently disclosed subject matter.
  • The terms “non-transitory memory” and “non-transitory storage medium” used herein should be expansively construed to cover any volatile or non-volatile computer memory suitable to the presently disclosed subject matter.
  • FIG. 1 illustrates an example deployment scenario of an airplane including a pilot tactical training system operating in coordination with one or more ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
  • Aircraft equipped with training system 100 can be an aircraft customized for tactical training of a pilot.
  • Aircraft equipped with training system 100 can be a comparatively low-cost airplane, and can lack actual high-cost equipment such as radar that can be present in an actual tactical aircraft, and can instead include a “virtual radar” display.
  • Aircraft equipped with training system 100 can include weapons systems adapted for use in such pilot training - in some such systems, the target tracking and projectile firing subsystems have been modified so that they do not fire projectiles, but rather perform simulations of projectile firings.
  • Aircraft equipped with training system 100 can receive data from remote systems implementing virtual aircraft and/or other virtual objects, as described below.
  • Peer aircraft 110 can be a tactical aircraft which is in communication with aircraft equipped with training system 100.
  • Aircraft equipped with training system 100 can - for example - participate in training exercises in coordination with peer aircraft 110.
  • Ground-based aircraft simulator 120 can be a suitable ground-based system which simulates operational behavior of an aircraft for pilot training purposes.
  • Ground-based aircraft simulator 120 includes a cockpit station with pilot-operated or pilot-utilized instruments.
  • Ground-based aircraft simulator 120 can communicate with e.g. real-time training ground station 130.
  • Real-time training ground station 130 can be a suitable ground-based system which, in some embodiments, at least partially manages and/or monitors (e.g. in real time) an aircraft equipped with training system 100 and/or ground-based aircraft simulator 120.
  • Fig. 2A illustrates an example logical block diagram of an on-board system configured for integrated support of virtual objects/virtual aircraft, optionally including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210A can include e.g. processor 220A, memory 230A, and network controller 240A.
  • Processor 220A can be a suitable hardware-based electronic device with data processing capabilities, such as, for example, a general-purpose processor, a digital signal processor (DSP), a specialized Application Specific Integrated Circuit (ASIC), one or more cores in a multicore processor, etc.
  • Processor 220A can also consist, for example, of multiple processors, multiple ASICs, virtual processors, combinations thereof, etc.
  • Memory 230A can be, for example, a suitable kind of volatile and/or non-volatile storage, and can include, for example, a single physical memory component or a plurality of physical memory components. Memory 230A can also include virtual memory. Memory 230A can be configured to, for example, store various data used in computation.
  • Processing circuitry 210A can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processing circuitry. These modules can include, for example, navigation data integration unit 280A, projectile launching data integration unit 290A, integrated real/virtual space management unit 250A, and integrated real/virtual data repository 260A.
  • Virtual aircraft 270 can be a hardware or software device (located e.g. on the ground) that generates data indicative of behavior and properties of an aircraft.
  • Virtual aircraft 270 can be a manned ground-based aircraft simulator, which generates data indicative of virtual objects (for example: data describing its own emulated location in 3D space, velocity, direction etc.).
  • Virtual aircraft 270 can generate the data indicative of virtual objects e.g. in response to actions by the simulator pilot.
  • Virtual aircraft 270 can generate the data indicative of virtual objects e.g., periodically.
  • Virtual aircraft 270 can transmit the generated data to the aircraft.
  • Virtual aircraft 270 can be an entirely software-based virtual aircraft executing inside a server computer, or be a different variation of a hardware or software device that generates data indicative of behavior and properties of an aircraft.
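As an illustration of the kind of state report a software-based virtual aircraft might generate periodically and transmit, the sketch below emulates a virtual aircraft flying a circular path. The message fields, flight model, and numeric values are assumptions for illustration only, not taken from the disclosure.

```python
# Hypothetical periodic state report of a software-based virtual aircraft.
# Field names ("id", "position", "heading_deg", ...) are illustrative assumptions.
import json
import math

def virtual_aircraft_report(t, speed=250.0, radius=5000.0, alt=3000.0):
    """Emulate a virtual aircraft on a circular path and emit its state at time t (s)."""
    omega = speed / radius                         # angular rate in rad/s
    x = radius * math.cos(omega * t)               # emulated 3D location
    y = radius * math.sin(omega * t)
    heading = math.degrees(omega * t + math.pi / 2.0) % 360.0  # tangent to the circle
    return json.dumps({
        "id": "virtual-7",
        "position": [round(x, 1), round(y, 1), alt],
        "speed_mps": speed,
        "heading_deg": round(heading, 1),
        "time_s": t,
    })

# At t=0 the aircraft is due east of the circle's centre, heading north:
print(virtual_aircraft_report(0.0))
```

Such a report could be generated e.g. every transmission slot and sent over the data link to the airborne training system and/or the ground station.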
  • Display system 205A can be a suitable type of display device for providing the pilot with situational awareness - for example a head-mounted device (HMD) or a head-up display (HUD).
  • Display system 205A can display information pertaining to objects on the ground and/or in the air.
  • Display system 205A displays information that is derivative of data provided by aircraft navigation sensors 215A.
  • Display system 205A displays information that is derivative of data provided by virtual aircraft 270.
  • Display system 205A displays information that is derivative of data provided by ground control station 275, as described in detail below.
  • Display system 205A displays information that is derivative of combinations of the aforementioned and/or other data sources.
  • Display system 205A can receive its data from - for example - navigation data integration unit 280A.
  • Navigation data integration unit 280A can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for display on display system 205A.
  • navigation data integration unit 280A can receive data (e.g. current radar data) from aircraft navigation sensors 215A.
  • navigation data integration unit 280A can receive data (e.g. data descriptive of current location and properties of a virtual aircraft 270) from network controller 240A.
  • Integrated real/virtual space management unit 250A can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for storage, and store the data in integrated real/virtual data repository 260A.
  • Integrated real/virtual space management unit 250A can - for example - store the integrated data in a manner that is indicative of a virtual three-dimensional (3D) space (herein termed a “dynamically-updating 3D map”), in which real and virtual objects are present and associated with 3D coordinates as well as properties such as size, velocity etc.
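The "dynamically-updating 3D map" described above can be pictured as a keyed store in which each new report for a real or virtual object overwrites that object's previous state, so real and virtual objects share one coordinate space. The following is a minimal sketch; all class names and field layouts are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of a dynamically-updating 3D map holding real and virtual objects.
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    object_id: str
    is_virtual: bool                          # True for simulator-generated objects
    position: tuple                           # (x, y, z) in a common 3D frame, metres
    velocity: tuple = (0.0, 0.0, 0.0)
    properties: dict = field(default_factory=dict)  # size, type, etc.

class DynamicMap3D:
    """Stores the latest known state of every real and virtual object."""
    def __init__(self):
        self._objects = {}

    def upsert(self, obj: TrackedObject):
        # A newer report for an existing object_id overwrites the stale state.
        self._objects[obj.object_id] = obj

    def snapshot(self):
        return dict(self._objects)

m = DynamicMap3D()
m.upsert(TrackedObject("trainer-1", False, (0.0, 0.0, 3000.0)))
m.upsert(TrackedObject("virtual-7", True, (1500.0, 200.0, 3100.0)))
# A later report for the same virtual aircraft replaces its entry:
m.upsert(TrackedObject("virtual-7", True, (1600.0, 210.0, 3120.0)))
print(len(m.snapshot()))
```

A real system would add timestamps and stale-entry expiry; the point here is only that real and virtual objects live side by side in one keyed 3D store.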
  • Integrated real/virtual space management unit 250A can receive data (e.g. current radar data) from aircraft navigation sensors 215A.
  • Integrated real/virtual space management unit 250A can receive data (e.g. current data descriptive of the location and properties of a virtual aircraft 270) from virtual aircraft 270 via network controller 240A.
  • Integrated real/virtual data repository 260A can be a suitable type of data storage.
  • Network controller 240A can be, for example, a type of communications controller suitable for bidirectional aircraft-to-ground communication. In some embodiments, network controller 240A can utilize a data link with suitable latency and/or jitter characteristics to ensure that the dynamically-updating 3D map accurately represents a synchronized map of real and virtual objects. In some embodiments, network controller 240A is software that controls network management, distributed slot allocation, and RF data correlation and prediction.
  • Network controller 240A can be in communication with ground control station 275 and can send/receive data to/from ground control station 275.
  • Network controller 240A can be in communication with virtual entities 270 and can send/receive data to/from virtual entities 270.
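One way a data link's latency could be compensated so that the 3D map stays synchronized (one possible reading of the "predictions" mentioned above) is simple dead reckoning: extrapolating each reported position forward by the link delay using the reported velocity. This is an illustrative assumption, not the disclosed correlation/prediction method.

```python
# Dead-reckoning sketch: extrapolate a position report forward by the link latency.
def predict_position(reported_pos, velocity, link_latency_s):
    """Return the reported (x, y, z) advanced along the reported velocity vector."""
    return tuple(p + v * link_latency_s for p, v in zip(reported_pos, velocity))

# An aircraft reported at (1000, 0, 3000) m flying 200 m/s along x, over a link
# with 0.1 s latency, is plotted roughly 20 m further along x:
print(predict_position((1000.0, 0.0, 3000.0), (200.0, 0.0, 0.0), 0.1))
```

More sophisticated schemes (e.g. filtering jittery reports before extrapolating) would build on the same idea: never plot a peer where it was, but where it should be now.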
  • Projectile launching data integration unit 290A can receive instructions from the aircraft pilot - such as, for example: a signal to launch a virtual projectile (e.g. a virtual heat-seeking missile) at a real or virtual target (such as a specific virtual aircraft 270). Projectile launching data integration unit 290A can emulate the behavior of the launched projectile. In some embodiments display system 205A displays the virtual projectile hitting or missing the real or virtual target.
  • Projectile launching data integration unit 290A can, in emulating the behavior of the launched projectile, utilize data from the dynamically updated 3D map stored in integrated real/virtual data repository 260A. In this manner, the projectile emulation can be performed according to the most up-to-date construction of the integrated real/virtual 3D space (e.g. including recent actions or maneuvers performed by a ground-based simulator pilot).
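To make the above concrete: because each emulation tick reads the target's current position from the 3D map, a maneuver by the ground-based simulator pilot immediately bends the emulated missile's course. The sketch below uses naive pure pursuit with arbitrary numbers; the guidance law and all names are assumptions, not the patent's emulation model.

```python
# One emulation tick of a virtual heat-seeking missile (naive pure pursuit):
# the missile moves straight toward the target's *current* mapped position.
import math

def pursuit_step(missile_pos, target_pos, missile_speed, dt):
    """Advance the emulated missile one time step; return (new_pos, remaining_distance)."""
    d = [t - m for m, t in zip(missile_pos, target_pos)]
    dist = math.sqrt(sum(c * c for c in d))
    travel = missile_speed * dt
    if dist <= travel:                       # the missile reaches the target this tick
        return tuple(target_pos), 0.0
    f = travel / dist                        # fraction of the line-of-sight covered
    new_pos = tuple(m + c * f for m, c in zip(missile_pos, d))
    return new_pos, dist - travel

# Missile at the trainer's position, target 5000 m away, 800 m/s, 0.1 s tick:
pos, remaining = pursuit_step((0.0, 0.0, 3000.0), (4000.0, 3000.0, 3000.0), 800.0, 0.1)
print(pos, remaining)
```

On the next tick the same function would be called with the target's freshly updated map position, which is what couples the emulation to simulator-pilot maneuvers.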
  • Fig. 2B illustrates an example logical block diagram of a real-time training ground station (RTGS) configured for integrated support of virtual aircraft, including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210B can include e.g. processor 220B, memory 230B, and network controller 240B.
  • Processor 220B can be a suitable hardware-based electronic device with data processing capabilities, such as, for example, a general-purpose processor, a digital signal processor (DSP), a specialized Application Specific Integrated Circuit (ASIC), one or more cores in a multicore processor, etc.
  • Processor 220B can also consist, for example, of multiple processors, multiple ASICs, virtual processors, combinations thereof etc.
  • Memory 230B can be, for example, a suitable kind of volatile and/or non-volatile storage, and can include, for example, a single physical memory component or a plurality of physical memory components. Memory 230B can also include virtual memory. Memory 230B can be configured to, for example, store various data used in computation.
  • Processing circuitry 210B can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processing circuitry. These modules can include, for example, network controller 240B, integrated real/virtual space management unit 250B, integrated real/virtual data repository 260B, and imaging construction unit 290B.
  • Display device 280B can be a monitor or other suitable type of display device for providing imaging informative of relative positions of real and virtual aircraft.
  • Integrated real/virtual space management unit 250B can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for storage, and store the data in integrated real/virtual data repository 260B.
  • Integrated real/virtual space management unit 250B can - for example - store the integrated data in a manner that is indicative of a virtual three-dimensional (3D) space (herein termed a “dynamically-updating 3D map”), in which real and virtual objects are present and associated with 3D coordinates as well as properties such as size, velocity etc.
  • Integrated real/virtual space management unit 250B can receive data (e.g. current 3D location and bearing (i.e. velocity/direction) data) from airborne training aircraft 260.
  • Integrated real/virtual space management unit 250B can receive data (e.g. current data descriptive of the location and properties, such as velocity/direction, of a virtual aircraft 270) from virtual aircraft 270 via network controller 240B.
  • Integrated real/virtual data repository 260B can be a suitable type of data storage.
  • Network controller 240B can be, for example, a type of communications controller suitable for bidirectional aircraft-to-ground communication. In some embodiments, network controller 240B can utilize a data link with suitable latency and/or jitter characteristics to ensure that the dynamically-updating 3D map accurately represents a synchronized map of real and virtual objects. In some embodiments, network controller 240B is software that controls network management, distributed slot allocation, and RF data correlation and prediction.
  • Network controller 240B can be in communication with airborne training aircraft 260 and can send/receive data to/from airborne training aircraft 260.
  • Network controller 240B can be in communication with virtual aircraft 270 and can send/receive data to/from virtual aircraft 270.
  • Imaging construction unit 290B can utilize up-to-date 3D map data from integrated real/virtual data repository 260B. Imaging construction unit 290B can construct synthetic images using the 3D map data to illustrate one or more of: a) a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and b) a depiction of at least one onboard instrument of the training aircraft, which shows positions of the virtual aircraft relative to the training aircraft.
  • Imaging construction unit 290B can display the constructed images on display device 280B. Imaging construction unit 290B can continually construct new images based on e.g. updated 3D map information and display them, thereby resulting in a periodically updating display. In some embodiments, the updating is sufficiently frequent so as to provide video illustrating the real and virtual airplane movements.
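A minimal sketch of the external-viewpoint imaging step: a top-down orthographic "camera" centered on the training aircraft maps each aircraft's 3D map position to 2D image coordinates, so real and virtual aircraft appear in one picture. The function name, scale, and image size are assumptions for illustration.

```python
# Orthographic top-down projection of world positions into image pixels,
# with the training aircraft at the centre of the image. Values are illustrative.
def to_pixels(world_xy, center_xy, metres_per_pixel=50.0, image_size=400):
    """Project a world (x, y) into image coordinates; center_xy maps to the image centre."""
    px = image_size / 2 + (world_xy[0] - center_xy[0]) / metres_per_pixel
    py = image_size / 2 - (world_xy[1] - center_xy[1]) / metres_per_pixel  # y grows upward
    return (round(px), round(py))

trainer = (1000.0, 2000.0, 3000.0)   # training aircraft's mapped position
virtual = (3000.0, 2500.0, 3100.0)   # a virtual aircraft's mapped position
print(to_pixels(trainer[:2], trainer[:2]))   # trainer lands at the image centre
print(to_pixels(virtual[:2], trainer[:2]))   # virtual aircraft offset from it
```

Re-running this projection on every 3D map update and pushing the frames to the display device yields the periodically updating (video-like) picture described above.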
  • Fig. 3A illustrates a method of processing incoming sensor data, in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210 can receive 310A sensor data (e.g. from aircraft navigation sensors 215).
  • Processing circuitry 210 can then update 320A display system 205 to display the received sensor data.
  • Processing circuitry 210 (e.g. integrated real/virtual space management unit 250) can then update the integrated real/virtual data repository with the received sensor data, thereby updating the dynamically-updating 3D map.
  • Navigation data integration unit 280 reads aircraft navigation sensors 215 e.g. every 20 milliseconds (ms) to obtain new sensor data.
  • Fig. 3B illustrates a method of processing incoming virtual object data, in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210 can receive 310B virtual aircraft or other virtual object data (e.g. from network controller 240, which can receive the data from e.g. virtual aircraft 270). Processing circuitry 210 (e.g. navigation data integration unit 280) can then update 320B display system 205 to display the received virtual aircraft or object data. For example: display system 205 can display the virtual aircraft (in its current virtual location) on an HMD together with velocity and/or other information.
  • Processing circuitry 210 can then update 330B integrated real/virtual data repository with the received virtual object data, thereby updating the dynamically-updating 3D map with the current virtual object data.
  • Navigation data integration unit 280 reads network controller 240 e.g. every 20 milliseconds (ms) to obtain new virtual object data.
  • Fig. 4 illustrates an example method of emulating and/or tracking the progress of an emulated projectile, in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210 can receive 410 an event indicating that a projectile launch should take place.
  • The pilot can perform an in-cockpit operation to launch a missile (e.g. a heat-seeking missile, or a programmed-path missile).
  • Processing circuitry 210 (e.g. projectile launching data integration unit 290) can then initiate 420 emulation of the projectile launch.
  • Processing circuitry 210 can then track 430 status and properties (e.g. 3D location, velocity etc.) of the emulated projectile in its emulated flight towards its real or emulated target.
  • When the target is a moving virtual aircraft (i.e. an “enemy” aircraft), the course of an emulated heat-seeking missile can change accordingly.
  • Processing circuitry 210 can then evaluate 440 whether the projectile course has completed, and when the projectile course has completed, processing circuitry 210 (e.g. projectile launching data integration unit 290) can determine 450 the status of the projectile, e.g. whether it hit or missed its target.
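The tracking-and-scoring steps (430-450) can be pictured as a loop that advances the emulated projectile toward the target's currently reported position each tick, then scores a hit (inside a lethal radius) or a miss (e.g. flight time expired, or the target track ends). The guidance law, radius, and timings below are all illustrative assumptions.

```python
# Sketch of a fly-out loop: consume one target position sample per tick,
# move the emulated projectile toward it, and score hit/miss at the end.
import math

def fly_out(missile, target_path, speed=800.0, dt=0.1, lethal_m=50.0, max_t=60.0):
    """Return ("hit" | "miss", elapsed_seconds) for an emulated projectile."""
    t = 0.0
    pos = list(missile)
    for target in target_path:               # target_path: target position per tick
        d = [c - p for p, c in zip(pos, target)]
        dist = math.sqrt(sum(x * x for x in d))
        if dist <= lethal_m:                 # step 440/450: course complete -> hit
            return "hit", t
        step = min(speed * dt, dist) / (dist or 1.0)
        pos = [p + x * step for p, x in zip(pos, d)]   # step 430: advance one tick
        t += dt
        if t > max_t:                        # flight time expired -> miss
            break
    return "miss", t

# A target hovering 2000 m away is reached well within the time limit:
result, elapsed = fly_out((0.0, 0.0, 3000.0), [(2000.0, 0.0, 3000.0)] * 40)
print(result, elapsed)
```

Feeding `target_path` from the dynamically-updating 3D map (rather than a fixed list) is what would make an evading simulator pilot able to defeat the shot.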
  • Fig. 5 illustrates an example method of monitoring an airborne training aircraft in the real-time training ground station (RTGS), in accordance with some embodiments of the presently disclosed subject matter.
  • Processing circuitry 210 can receive 510 position/bearing data of virtual aircraft from the ground-based simulators.
  • Processing circuitry 210 can receive 520 training aircraft position/bearing data.
  • Processing circuitry 210 can then update 530 the status/properties of data (e.g. 3D map data) stored in integrated real/virtual data repository 260B.
  • Processing circuitry 210 can next construct 540, for a given point of reference, a view of the relative locations of the training aircraft and the one or more virtual aircraft.
  • Processing circuitry 210 can construct one or more of: a) a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and b) a depiction of at least one onboard instrument of the training aircraft, wherein the depiction shows positions of the virtual aircraft relative to the training aircraft.
  • Processing circuitry 210 can then display 550 the constructed view and/or depiction on e.g. display device 280B.
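The monitoring cycle of Fig. 5 (steps 510-550) can be sketched end to end: virtual-aircraft reports and the training-aircraft report are merged into one picture, and a simple text "external view" of relative offsets is constructed for display. The message shapes and the text rendering are assumptions for illustration.

```python
# End-to-end sketch of one RTGS monitoring cycle (steps 510-550), under
# assumed report shapes: {"id": ..., "pos": (x, y, z)}.
def rtgs_cycle(virtual_reports, trainer_report):
    picture = {r["id"]: r for r in virtual_reports}      # 510: virtual positions
    picture[trainer_report["id"]] = trainer_report       # 520/530: merge trainer state
    tx, ty, tz = trainer_report["pos"]
    lines = []                                           # 540: construct a view
    for oid, r in sorted(picture.items()):
        dx, dy, dz = (r["pos"][0] - tx, r["pos"][1] - ty, r["pos"][2] - tz)
        lines.append(f"{oid}: offset ({dx:+.0f}, {dy:+.0f}, {dz:+.0f}) m")
    return "\n".join(lines)                              # 550: hand off for display

view = rtgs_cycle(
    [{"id": "virtual-7", "pos": (1500.0, 0.0, 3100.0)}],
    {"id": "trainer-1", "pos": (0.0, 0.0, 3000.0)},
)
print(view)
```

In a real RTGS the "view" would of course be rendered imaging on display device 280B rather than text, but the data flow per cycle is the same.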
  • The system according to the invention may be, at least partly, implemented on a suitably programmed computer.
  • The invention contemplates a computer program being readable by a computer for executing the method of the invention.
  • The invention further contemplates a non-transitory computer-readable memory tangibly embodying a program of instructions executable by the computer for executing the method of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

There is provided a system of monitoring an airborne training aircraft, the system comprising a processing circuitry configured to: a) receive, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receive, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) construct, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.

Description

PILOT TRAINING SYSTEM UTILIZING INTEGRATED REAL/VIRTUAL AIRCRAFT AND OBJECT DATA
TECHNICAL FIELD
The presently disclosed subject matter relates to pilot tactical training systems, and in particular to implementation of such systems operating in coordination with ground-based aircraft simulators.
BACKGROUND
Problems of implementation in pilot tactical training systems have been recognized in the conventional art and various techniques have been developed to provide solutions.
GENERAL DESCRIPTION
According to one aspect of the presently disclosed subject matter there is provided a system of monitoring an airborne training aircraft, the system comprising a processing circuitry configured to: a) receive, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receive, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) construct, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
In addition to the above features, the system according to this aspect of the presently disclosed subject matter can comprise feature (i) listed below:
(i) the processing circuitry is additionally configured to: d) display the constructed imaging on a display device.
According to another aspect of the presently disclosed subject matter there is provided a computer-implemented method of monitoring an airborne training aircraft, the method comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft. This aspect of the disclosed subject matter can further optionally comprise feature (i) listed above with respect to the system, mutatis mutandis.
According to another aspect of the presently disclosed subject matter there is provided a computer program product comprising a non-transitory computer readable storage medium retaining program instructions, which, when read by a processing circuitry, cause the processing circuitry to perform a computerized method of monitoring an airborne training aircraft, the method comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
This aspect of the disclosed subject matter can further optionally comprise feature (i) listed above with respect to the system, mutatis mutandis.

BRIEF DESCRIPTION OF THE DRAWINGS
In order to understand the invention and to see how it can be carried out in practice, embodiments will be described, by way of non-limiting examples, with reference to the accompanying drawings, in which:
Fig. 1 illustrates an example deployment scenario of an airplane including a pilot tactical training system operating in coordination with one or more ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
Fig. 2A illustrates a logical block diagram of an example on-board system configured for integrated support of virtual objects/virtual aircraft, optionally including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
Fig. 2B illustrates a logical block diagram of an example real-time training ground station (RTGS) configured for integrated support of virtual aircraft, including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter;
Fig. 3A illustrates a flow diagram of an example method of processing incoming sensor data, in accordance with some embodiments of the presently disclosed subject matter;
Fig. 3B illustrates a flow diagram of an example method of processing incoming virtual object data, in accordance with some embodiments of the presently disclosed subject matter;
Fig. 4 illustrates a flow diagram of an example method of emulating and/or tracking the progress of an emulated projectile, in accordance with some embodiments of the presently disclosed subject matter; and
Fig. 5 illustrates a flow diagram of an example method of monitoring an airborne training aircraft in the real-time training ground station (RTGS), in accordance with some embodiments of the presently disclosed subject matter.

DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the presently disclosed subject matter.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "comparing", "encrypting", "decrypting", "determining", "calculating", "receiving", "providing", "obtaining", "emulating" or the like, refer to the action(s) and/or process(es) of a computer that manipulate and/or transform data into other data, said data represented as physical, such as electronic, quantities and/or said data representing the physical objects. The term "computer" should be expansively construed to cover any kind of hardware-based electronic device with data processing capabilities including, by way of non-limiting example, the processor, mitigation unit, and inspection unit disclosed in the present application.
The terms "non-transitory memory" and “non-transitory storage medium” used herein should be expansively construed to cover any volatile or non-volatile computer memory suitable to the presently disclosed subject matter.
The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a non-transitory computer-readable storage medium.
Embodiments of the presently disclosed subject matter are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the presently disclosed subject matter as described herein.

Fig. 1 illustrates an example deployment scenario of an airplane including a pilot tactical training system operating in coordination with one or more ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
Aircraft equipped with training system 100 can be an aircraft customized for tactical training of a pilot. Aircraft equipped with training system 100 can be a comparatively low-cost airplane, and can lack actual high-cost equipment such as radar that can be present in an actual tactical aircraft, and can instead include a "virtual radar" display. Aircraft equipped with training system 100 can include weapons systems adapted for use in such pilot training - in some such systems, the target tracking and projectile firing subsystems have been modified so that they do not fire projectiles, but rather perform simulations of projectile firings. Aircraft equipped with training system 100 can receive data from remote systems implementing virtual aircraft and/or other virtual objects, as described below.
Peer aircraft 110 can be a tactical aircraft which is in communication with aircraft equipped with training system 100. Aircraft equipped with training system 100 can - for example - participate in training exercises in coordination with peer aircraft 110.
Ground-based aircraft simulator 120 can be a suitable ground-based system which simulates operational behavior of an aircraft for pilot training purposes. In some embodiments, ground-based aircraft simulator 120 includes a cockpit and pilot-operated or pilot-utilized instruments. Ground-based aircraft simulator 120 can communicate with e.g. real-time training ground station 130.
Real-time training ground station 130 can be a suitable ground-based system which, in some embodiments, at least partially manages and/or monitors (e.g. in real time) an aircraft equipped with training system 100 and/or ground-based aircraft simulator 120.
Fig. 2A illustrates an example logical block diagram of an on-board system configured for integrated support of virtual objects/virtual aircraft, optionally including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210A can include e.g. processor 220A, memory 230A, and network controller 240 A.
Processor 220A can be a suitable hardware-based electronic device with data processing capabilities, such as, for example, a general purpose processor, a digital signal processor (DSP), a specialized Application Specific Integrated Circuit (ASIC), one or more cores in a multicore processor, etc. Processor 220A can also consist, for example, of multiple processors, multiple ASICs, virtual processors, combinations thereof, etc.
Memory 230A can be, for example, a suitable kind of volatile and/or non-volatile storage, and can include, for example, a single physical memory component or a plurality of physical memory components. Memory 230A can also include virtual memory. Memory 230A can be configured to, for example, store various data used in computation.
Processing circuitry 210A can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processing circuitry. These modules can include, for example, navigation data integration unit 280A, projectile launching data integration unit 290A, integrated real/virtual space management unit 250A, and integrated real/virtual data repository 260A.
Virtual aircraft 270 can be a hardware or software device (located e.g. on the ground) that generates data indicative of behavior and properties of an aircraft. For example, virtual aircraft 270 can be a manned ground-based aircraft simulator, which generates data indicative of virtual objects (for example: data describing its own emulated location in 3D space, velocity, direction etc.). Virtual aircraft 270 can generate the data indicative of virtual objects e.g. in response to actions by the simulator pilot. Virtual aircraft 270 can generate the data indicative of virtual objects e.g., periodically. Virtual aircraft 270 can transmit the generated data to the aircraft. There can be one or more instances of virtual aircraft 270. In some other embodiments, virtual aircraft 270 can be an entirely software-based virtual aircraft executing inside a server computer, or be a different variation of a hardware or software device that generates data indicative of behavior and properties of an aircraft.
Display system 205A can be a suitable type of display device for providing the pilot with situational awareness - for example a head-mounted device (HMD) or a head-up display (HUD). Display system 205A can display information pertaining to objects on the ground and/or in the air. In some embodiments, display system 205A displays information that is derivative of data provided by aircraft navigation sensors 215A. In some embodiments, display system 205A displays information that is derivative of data provided by virtual aircraft 270. In some embodiments, display system 205A displays information that is derivative of data provided by ground control station 275, as described in detail below. In some embodiments, display system 205A displays information that is derivative of combinations of the aforementioned and/or other data sources. Display system 205A can receive its data from - for example - navigation data integration unit 280A.
Navigation data integration unit 280A can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for display on display system 205A. By way of non-limiting example, navigation data integration unit 280A can receive data (e.g. current radar data) from aircraft navigation sensors 215A. By way of further non-limiting example, navigation data integration unit 280A can receive data (e.g. data descriptive of the current location and properties of a virtual aircraft 270) from network controller 240A.
Integrated real/virtual space management unit 250A can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for storage, and store the data in integrated real/virtual data repository 260A. Integrated real/virtual space management unit 250A can - for example - store the integrated data in a manner that is indicative of a virtual three-dimensional (3D) space (herein termed a "dynamically-updating 3D map"), in which real and virtual objects are present and associated with 3D coordinates as well as properties such as size, velocity etc.
By way of non-limiting example, integrated real/virtual space management unit 250A can receive data (e.g. current radar data) from aircraft navigation sensors 215A. By way of further non-limiting example, integrated real/virtual space management unit 250A can receive current data descriptive of the location and properties of a virtual aircraft 270, from virtual aircraft 270 via network controller 240A.
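By way of illustration, the dynamically-updating 3D map held in integrated real/virtual data repository 260A can be sketched as a store keyed by object identifier, retaining the newest report for each real or virtual object. The class names, fields, and units below are assumptions for illustration only, not part of the disclosed system:

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    """Latest reported state of one real or virtual object in the shared 3D space."""
    object_id: str
    is_virtual: bool          # True for e.g. a virtual aircraft 270
    position: tuple           # (x, y, z) metres in an agreed local frame
    velocity: tuple           # (vx, vy, vz) metres/second
    timestamp: float          # seconds; time at which the state was reported

class RealVirtualRepository:
    """Minimal sketch of an integrated real/virtual data repository."""

    def __init__(self):
        self._objects = {}

    def upsert(self, state):
        # Keep only the newest report for each object; discard stale updates
        # that arrive out of order over the data link.
        current = self._objects.get(state.object_id)
        if current is None or state.timestamp >= current.timestamp:
            self._objects[state.object_id] = state

    def snapshot(self):
        # Consistent view of the 3D map for display or projectile emulation.
        return list(self._objects.values())
```

Under this sketch, sensor-derived reports and virtual-object reports would both be fed through upsert(), so that a snapshot reflects one merged real/virtual space.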
Integrated real/virtual data repository 260A can be a suitable type of data storage.
Network controller 240A can be, for example, a type of communications controller suitable for bidirectional aircraft-to-ground communication. In some embodiments, network controller 240A can utilize a data link with suitable latency and/or jitter characteristics to ensure that the dynamically-updating 3D map accurately represents a synchronized map of real and virtual objects. In some embodiments, network controller 240A is software that controls network management, distributed slot allocation, and RF data correlation and prediction.
Network controller 240A can be in communication with ground control station 275 and can send/receive data to/from ground control station 275.
Network controller 240A can be in communication with virtual entities 270 and can send/receive data to/from virtual entities 270.
Projectile launching data integration unit 290A can receive instructions from the aircraft pilot - such as, for example: a signal to launch a virtual projectile (e.g. a virtual heat-seeking missile) at a real or virtual target (such as a specific virtual aircraft 270). Projectile launching data integration unit 290A can emulate the behavior of the launched projectile. In some embodiments, display system 205A displays the virtual projectile hitting or missing the real or virtual target.
Projectile launching data integration unit 290A can, in emulating the behavior of the launched projectile, utilize data from the dynamically updated 3D map stored in integrated real/virtual data repository 260A. In this manner, the projectile emulation can be performed according to the most up-to-date construction of the integrated/real 3D space (e.g. including recent actions or maneuvers performed by a ground-based emulator pilot).
Fig. 2B illustrates an example logical block diagram of a real-time training ground station (RTGS) configured for integrated support of virtual aircraft, including one or more manned ground-based aircraft simulators, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210B can include e.g. processor 220B, memory 230B, and network controller 240B.
Processor 220B can be a suitable hardware-based electronic device with data processing capabilities, such as, for example, a general purpose processor, a digital signal processor (DSP), a specialized Application Specific Integrated Circuit (ASIC), one or more cores in a multicore processor, etc. Processor 220B can also consist, for example, of multiple processors, multiple ASICs, virtual processors, combinations thereof, etc.
Memory 230B can be, for example, a suitable kind of volatile and/or non-volatile storage, and can include, for example, a single physical memory component or a plurality of physical memory components. Memory 230B can also include virtual memory. Memory 230B can be configured to, for example, store various data used in computation.
Processing circuitry 210B can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processing circuitry. These modules can include, for example, network controller 240B, integrated virtual space management unit 250B, integrated virtual data repository 260B, and imaging construction unit 290B.
Display device 280B can be a monitor or other suitable type of display device for providing imaging informative of relative positions of real and virtual aircraft.
Integrated real/virtual space management unit 250B can, for example, receive situational awareness data from different sources, and can then integrate the data and/or otherwise prepare it for storage, and store the data in integrated real/virtual data repository 260B. Integrated real/virtual space management unit 250B can - for example - store the integrated data in a manner that is indicative of a virtual three-dimensional (3D) space (herein termed a "dynamically-updating 3D map"), in which real and virtual objects are present and associated with 3D coordinates as well as properties such as size, velocity etc.
By way of non-limiting example, integrated real/virtual space management unit 250B can receive data (e.g. current 3D location and bearing (i.e. velocity/direction) data) from airborne training aircraft 260. By way of further non-limiting example, integrated real/virtual space management unit 250B can receive current data descriptive of the location and properties (such as velocity/direction) of a virtual aircraft 270, from virtual aircraft 270 via network controller 240B.
Integrated real/virtual data repository 260B can be a suitable type of data storage.
Network controller 240B can be, for example, a type of communications controller suitable for bidirectional aircraft-to-ground communication. In some embodiments, network controller 240B can utilize a data link with suitable latency and/or jitter characteristics to ensure that the dynamically-updating 3D map accurately represents a synchronized map of real and virtual objects. In some embodiments, network controller 240B is software that controls network management, distributed slot allocation, and RF data correlation and prediction.
Network controller 240B can be in communication with airborne training aircraft 260 and can send/receive data to/from airborne training aircraft 260.
Network controller 240B can be in communication with virtual aircraft 270 and can send/receive data to/from virtual aircraft 270.
Imaging construction unit 290B can utilize up-to-date 3D map data from integrated virtual data repository 260B. Imaging construction unit 290B can construct synthetic images using the 3D map data to illustrate one or more of: a) a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and b) a depiction of at least one onboard instrument of the training aircraft, which shows positions of the virtual aircraft relative to the training aircraft.
Imaging construction unit 290B can display the constructed images on display device 280B. Imaging construction unit 290B can continually construct new images based on e.g. updated 3D map information and display them, thereby resulting in a periodically updating display. In some embodiments, the updating is sufficiently frequent so as to provide video illustrating the real and virtual airplane movements.
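As a sketch of one computation that such imaging can rely on, the range and compass bearing from the training aircraft to each virtual aircraft can be derived from their positions in the 3D map. The flat-earth local frame, function name, and units below are illustrative assumptions only:

```python
import math

def relative_range_bearing(own_pos, other_pos):
    """Illustrative sketch: horizontal range (metres) and compass bearing
    (degrees, 0 = north, 90 = east) from the training aircraft at own_pos
    to another object at other_pos. Positions are (east, north, up) metres
    in a shared local frame - a flat-earth approximation for short ranges."""
    d_east = other_pos[0] - own_pos[0]
    d_north = other_pos[1] - own_pos[1]
    rng = math.hypot(d_east, d_north)
    # atan2(east, north) yields a compass-style angle; normalize to [0, 360).
    bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
    return rng, bearing
```

A depiction of an onboard instrument (or an external-view image) could then place each virtual aircraft symbol at the computed range and bearing relative to the training aircraft symbol.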
Fig. 3A illustrates a method of processing incoming sensor data, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210 (e.g. navigation data integration unit 280) can receive 310A sensor data (e.g. from aircraft navigation sensors 215).
Processing circuitry 210 (e.g. navigation data integration unit 280) can then update 320A display system 205 to display the received sensor data.
Processing circuitry 210 (e.g. integrated real/virtual space management unit 250) can then update 330A integrated real/virtual data repository with the received sensor data.
In some embodiments, the method illustrated in Fig. 3A is performed periodically. For example: in some embodiments, navigation data integration unit 280 reads aircraft navigation sensors 215 every e.g. 20 milliseconds (ms) to obtain new sensor data.
Fig. 3B illustrates a method of processing incoming virtual object data, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210 (e.g. navigation data integration unit 280) can receive 310B virtual aircraft or other virtual object data (e.g. from network controller 240, which can receive the data from e.g. virtual aircraft 270). Processing circuitry 210 (e.g. navigation data integration unit 280) can then update 320B display system 205 to display the received virtual aircraft or object data. For example: display system 205 can display the virtual aircraft (in its current virtual location) on an HMD together with velocity and/or other information.
Processing circuitry 210 (e.g. integrated real/virtual space management unit 250) can then update 330B integrated real/virtual data repository with the received virtual object data, thereby updating the dynamically-updating 3D map.
In some embodiments, the method illustrated in Fig. 3B is performed periodically. For example: in some embodiments, navigation data integration unit 280 reads network controller 240 every e.g. 20 milliseconds (ms) to obtain new virtual object data.
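A minimal sketch of such a fixed-period update cycle is given below; the four callables stand in for the reads and updates of Figs. 3A/3B and are hypothetical interfaces, not part of the disclosed system:

```python
import time

def run_update_cycle(read_sensors, read_virtual, update_display, update_repo,
                     period_s=0.020, cycles=5):
    """Sketch of a periodic cycle: each period, read new sensor data and
    virtual-object data, refresh the display, and update the integrated
    real/virtual repository."""
    for _ in range(cycles):
        start = time.monotonic()
        sensor_data = read_sensors()      # e.g. from aircraft navigation sensors 215
        virtual_data = read_virtual()     # e.g. from network controller 240
        update_display(sensor_data, virtual_data)
        update_repo(sensor_data, virtual_data)
        # Sleep only for the remainder of the period, keeping a steady rate.
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, period_s - elapsed))
```

A production system would run this on a real-time scheduler rather than a sleep loop; the sketch only conveys the fixed-cadence read/display/store pattern.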
Fig. 4 illustrates an example method of emulating and/or tracking the progress of an emulated projectile, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210 (e.g. projectile launching data integration unit 290) can receive 410 an event indicating that a projectile launch should take place. By way of nonlimiting example: the pilot can perform an in-cockpit operation to launch a missile (e.g. a heat-seeking missile, or a programmed-path missile). Processing circuitry 210 (e.g. projectile launching data integration unit 290) can then receive a signal indicating the in-cockpit event.
Processing circuitry 210 (e.g. projectile launching data integration unit 290) can then access 420 the dynamically-updated real/virtual space that is e.g. stored in integrated real/virtual data repository 260.
Processing circuitry 210 (e.g. projectile launching data integration unit 290) can then track 430 status and properties (e.g. 3D location, velocity etc.) of the emulated projectile in its emulated flight towards its real or emulated target. Processing circuitry 210 (e.g. projectile launching data integration unit 290) can utilize the dynamically- updated real/virtual space in tracking the status of the projectile. Thus - for example - if the target is a moving virtual aircraft (i.e. an “enemy” aircraft), the course of an emulated heat-seeking missile can change accordingly.
Processing circuitry 210 (e.g. projectile launching data integration unit 290) can then evaluate 440 whether the projectile's course has completed; when it has, processing circuitry 210 (e.g. projectile launching data integration unit 290) can determine 450 the status of the projectile, e.g. whether it hit or missed its target.
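By way of illustration only, the emulated flight of a heat-seeking missile toward a moving target can be sketched as pure pursuit against a continually re-read target position from the dynamically-updated map. All names, rates, and thresholds below are assumptions for illustration, not the disclosed emulation:

```python
import math

def emulate_pursuit(missile_pos, missile_speed, get_target_pos,
                    dt=0.1, hit_radius=50.0, max_steps=1000):
    """Pure-pursuit sketch of projectile emulation: at each time step the
    missile re-reads the target's current 3D position (as the emulation
    would re-read the dynamically-updated real/virtual space) and steers
    straight at it. Returns ("hit", pos) or ("miss", pos)."""
    pos = list(missile_pos)
    for step in range(max_steps):
        target = get_target_pos(step * dt)
        delta = [t - p for t, p in zip(target, pos)]
        dist = math.sqrt(sum(d * d for d in delta))
        if dist <= hit_radius:
            return "hit", tuple(pos)
        travel = missile_speed * dt
        if travel >= dist:
            return "hit", tuple(target)
        # Advance along the current line of sight to the target.
        pos = [p + d / dist * travel for p, d in zip(pos, delta)]
    return "miss", tuple(pos)
```

Because the target position is re-read every step, a maneuver performed by a ground-based simulator pilot changes the emulated missile's course, as described above for the moving virtual "enemy" aircraft.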
Fig. 5 illustrates an example method of monitoring an airborne training aircraft in the real-time training ground station (RTGS), in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 210 (e.g. imaging construction unit 290B) can receive 510 position/bearing data of virtual aircraft from the ground-based simulators.
Processing circuitry 210 (e.g. imaging construction unit 290B) can receive 520 training aircraft position/bearing data.
Processing circuitry 210 (e.g. imaging construction unit 290B) can then update 530 the status/properties of data (e.g. 3D map data) stored in integrated real/virtual data repository 260B.
Processing circuitry 210 (e.g. imaging construction unit 290B) can next construct 540, for a given point of reference, a view of the relative locations of the training aircraft and the one or more virtual aircraft.
By way of non-limiting example, processing circuitry 210 (e.g. imaging construction unit 290B) can construct one or more of: a) a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and b) a depiction of at least one onboard instrument of the training aircraft, wherein the depiction shows positions of the virtual aircraft relative to the training aircraft.
Processing circuitry 210 (e.g. imaging construction unit 290B) can then display 550 the constructed view and/or depiction on e.g. display device 280B.
It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
It will also be understood that the system according to the invention may be, at least partly, implemented on a suitably programmed computer. Likewise, the invention contemplates a computer program being readable by a computer for executing the method of the invention. The invention further contemplates a non-transitory computer-readable memory tangibly embodying a program of instructions executable by the computer for executing the method of the invention.
Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, defined in and by the appended claims.

Claims

1. A system of monitoring an airborne training aircraft, the system comprising a processing circuitry configured to: a) receive, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receive, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) construct, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
2. The system of claim 1, wherein the processing circuitry is additionally configured to: d) display the constructed imaging on a display device.
3. A processing-circuitry -based method of monitoring an airborne training aircraft, the method comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
4. A computer program product comprising a non-transitory computer readable storage medium retaining program instructions, which, when read by a processing circuitry, cause the processing circuitry to perform a computerized method of monitoring an airborne training aircraft, the method comprising: a) receiving, from one or more ground-based aircraft simulators, data indicative of, at least, respective three-dimensional (3D) positions and bearings of one or more virtual aircraft associated with the respective simulators; b) receiving, from the training aircraft, data indicative of, at least, the training aircraft's 3D position and bearing; c) constructing, utilizing, at least, the training aircraft's 3D position and bearing, and the respective three-dimensional (3D) positions and bearings of respective virtual aircraft, at least one of: i) imaging depicting a view, from an external point of reference, of the training aircraft and relative positions of the virtual aircraft, and ii) imaging depicting at least one instrument of the training aircraft, wherein the depicted at least one instrument depicts positions of the one or more virtual aircraft relative to the training aircraft.
PCT/IL2024/050848 2023-08-23 2024-08-22 Pilot training system utilizing integrated real/virtual aircraft and object data Pending WO2025041144A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL305422 2023-08-23
IL305422A IL305422A (en) 2023-08-23 2023-08-23 Pilot training system utilizing integrated real/virtual aircraft and object data

Publications (1)

Publication Number Publication Date
WO2025041144A1 true WO2025041144A1 (en) 2025-02-27

Family

ID=94731460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2024/050848 Pending WO2025041144A1 (en) 2023-08-23 2024-08-22 Pilot training system utilizing integrated real/virtual aircraft and object data

Country Status (2)

Country Link
IL (1) IL305422A (en)
WO (1) WO2025041144A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9262939B2 (en) * 2009-12-01 2016-02-16 The Boeing Company Integrated live and simulation environment system for an aircraft
US20200258414A1 (en) * 2019-02-11 2020-08-13 Sierra Nevada Corporation Live virtual constructive gateway systems and methods
US20210271792A1 (en) * 2017-02-27 2021-09-02 Rockwell Collins, Inc. Systems and methods for autonomous vehicle systems testing
US20210295732A1 (en) * 2018-08-02 2021-09-23 Elbit Systems Ltd. In-flight training simulation displaying a virtual environment


Also Published As

Publication number Publication date
IL305422A (en) 2025-03-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24856018

Country of ref document: EP

Kind code of ref document: A1