WO2020248000A1 - A contained area network and a processor - Google Patents
A contained area network and a processor
- Publication number
- WO2020248000A1 (PCT/AU2019/050607; AU2019050607W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- processor
- area network
- geospatial
- information
- contained area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/26—Government or public services
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/587—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/02—Agriculture; Fishing; Forestry; Mining
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
Definitions
- the software and data may be accessible by only one computer at a time.
- a separate software installation may be required for each additional computer.
- Each additional installation may be associated with the cost of an additional software licence and the cost of another desktop computer. Smaller computing devices, for example tablets, may not, however, have sufficient computing resources for processing the information.
- cloud-hosted software can only be accessed where there is an internet connection. People working at remote locations (such as a prospective mining site) may not have access to the internet. An area that has been hit by a natural disaster may also be left without internet access.
- the contained area network comprises a processor comprising a contained area network interface and processor readable tangible media including program instructions which, when executed by the processor, cause the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface.
- the contained area network comprises a plurality of personal computing devices comprising a plurality of contained area network interfaces and configured to receive from the processor, via the plurality of contained area network interfaces, the processed geospatial information.
- the program instructions comprise visualisation program instructions.
- each of the plurality of personal computing devices is configured to generate zoom level information indicative of a selected magnification of a geospatial image and send the zoom level information to the processor, and the processed geospatial information comprises geospatial image information generated using the zoom level information.
- the geospatial image information comprises overlaid graphic information.
- each of the plurality of personal computing devices comprises personal computing device tangible media including program instructions which, when executed by any one of the plurality of personal computing devices, cause the personal computing device to generate a geospatial image using the geospatial image information.
- the program instructions comprise photogrammetry program instructions.
- processor readable tangible media comprises the geospatial information.
- the geospatial information may comprise geospatial image information.
- An embodiment is configured to receive zoom level information indicative of a selected magnification of a geospatial image, wherein the processed geospatial information comprises geospatial image information generated using the zoom level information.
- the geospatial information may comprise overlaid graphic information.
- the geospatial image information may comprise three geospatial dimensions.
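For illustration only, the exchange summarised above (a device reports a selected zoom level; the processor returns geospatial image information, optionally with overlaid graphics and two or three spatial dimensions) could be modelled with data structures along the following lines. This is a minimal Python sketch; the class and field names are assumptions and do not appear in the specification.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ZoomLevelRequest:
    """Sent by a personal computing device to the processor (illustrative names)."""
    device_id: str
    zoom_level: int                  # selected magnification of the geospatial image
    centre: Tuple[float, float]      # (latitude, longitude) currently being viewed

@dataclass
class GeospatialImageInfo:
    """Returned by the processor as processed geospatial information."""
    zoom_level: int
    image_bytes: bytes                                   # rendered raster or model payload
    overlays: List[dict] = field(default_factory=list)   # optional overlaid graphic information
    dimensions: int = 3                                   # two or three spatial dimensions
```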
- Figure 1 shows a schematic diagram of an embodiment of a contained area network.
- Figure 2 shows a schematic diagram of a processor of figure 1.
- FIG. 1 shows a schematic diagram of an embodiment of a contained area network (“network”) in the form of a local area network, the network being generally indicated by the numeral 10.
- a contained area network is a network or internetwork that does not comprise a network having a spatial scope that is greater than a local area network (LAN).
- Examples of contained area networks include a LAN, a near-me area network (NAN), a personal area network (PAN), a body area network (BAN), and a network comprising any two or more of a LAN, NAN, PAN and BAN.
- a network or internetwork comprising a wide area network (WAN) or network of greater spatial scope is not a contained area network.
- the network comprises a processor 12, a schematic diagram of which is shown in figure 2.
- the processor 12 comprises a contained area network interface 18 in the form of an Ethernet interface and processor readable tangible media 16 including program instructions which, when executed by the processor 12, cause the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface 18.
- the geospatial information may be in the form of images and data from satellites, airborne drones, underwater drones, and terrestrial scanners, in relation to geography including landscapes, buildings and structures.
- the geospatial information may generally have two spatial dimensions (“two dimensional”) or have three spatial dimensions (“three dimensional”).
- the geospatial information may have a time dimension, but it may not.
- the network 10 comprises a plurality of personal computing devices 20, 22, 24.
- the plurality of personal computing devices comprises a plurality of contained area network interfaces 28, 30, 32 configured to receive from the processor 12 the processed geospatial information.
- the network 10 is in the form of a local area network comprising at least one of a wireless network component (e.g. IEEE 802.11 (“Wi-Fi”), as defined by documents available on 1 June 2019 from the Institute of Electrical and Electronics Engineers) and a wired network component (e.g. IEEE 802.3 (“Ethernet”), as defined by documents available on 1 June 2019, FibreChannel, InfiniBand and PCIe networks).
- the network comprises a wired IEEE 802.3 link 46 between a wireless access point (“WAP”) 34 in the form of a Wi-Fi WAP and the processor 12.
- the plurality of personal computing devices are each in wireless communication with the WAP 34 via a wireless link 44, and communicate with the processor 12 via the wireless link 44, the WAP 34 and the wired link 46.
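Because the network is a contained area network, the processor only needs to be reachable from the local segment served by the WAP 34 and the wired link 46. A minimal sketch of one way this could be arranged, assuming a Python runtime on the processor 12 and a hypothetical private LAN address (the specification does not prescribe this mechanism), is to bind an HTTP service to the LAN-facing interface only:

```python
from http.server import ThreadingHTTPServer, BaseHTTPRequestHandler

LAN_ADDRESS = "192.168.1.10"   # hypothetical address of the processor's Ethernet interface 18
PORT = 8080

class GeospatialHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Placeholder: a real handler would look up processed geospatial
        # information (e.g. a tile or model fragment) for the requested path.
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.end_headers()
        self.wfile.write(b"processed geospatial information")

# Binding to the LAN address (rather than all interfaces) keeps the service
# reachable only from the contained area network, with no internet route involved.
if __name__ == "__main__":
    ThreadingHTTPServer((LAN_ADDRESS, PORT), GeospatialHandler).serve_forever()
```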
- Each of the plurality of personal computing devices 20, 22, 24 is configured to generate zoom level information indicative of a selected magnification of a geospatial image, and send the zoom level information to the processor 12.
- the zoom level is selected by a user manipulating a user interface on a computing device 20, 22, 24.
- the processed geospatial information comprises geospatial image information generated using the zoom level information.
- the geospatial image information can comprise, in the present but not all embodiments, overlaid graphic information.
- Each of the personal computing devices 20, 22, 24 comprises browser software (e.g. INTERNET EXPLORER, SAFARI, or CHROME).
- the image information is displayed by the browser when running.
- the browser uses the hypertext transfer protocol (HTTP) to communicate with the processor 12, in addition to the transmission control protocol (TCP) and internet protocol (IP). Any suitable communication protocols may be used.
- Each of the plurality of personal computing devices comprises personal computing device tangible media including program instructions which, when executed by any one of the plurality of personal computing devices, cause the personal computing device to generate a geospatial image using the geospatial image information.
- the geospatial image has three spatial dimensions; however, it may have fewer or more dimensions.
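As a concrete illustration of the HTTP exchange described above, a personal computing device could request geospatial image information for its selected zoom level from the processor 12 over the local network. The endpoint path, query parameters and address below are assumptions for illustration only, not part of the specification:

```python
from urllib.request import urlopen
from urllib.parse import urlencode

PROCESSOR_URL = "http://192.168.1.10:8080"   # hypothetical address of processor 12

def fetch_geospatial_image_info(zoom_level: int, lat: float, lon: float) -> bytes:
    """Request processed geospatial image information for the selected magnification."""
    query = urlencode({"zoom": zoom_level, "lat": lat, "lon": lon})
    with urlopen(f"{PROCESSOR_URL}/geospatial-image?{query}") as response:
        return response.read()   # e.g. tile bytes or serialised model data

# Example: request imagery around a point of interest at zoom level 16.
payload = fetch_geospatial_image_info(16, -31.95, 115.86)
```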
- the IEEE 802.3 standards define the transmission of protocol data units (PDUs) including Ethernet frames and Ethernet packets over a network physical medium in the form of, for example, a network cable, backplane lane, or another suitable network medium that connects two nodes of the network.
- a network cable may be, for example, an optical fibre network cable in the form of single mode or multimode optical fibre.
- a transceiver may comprise a transceiver module in the form of, for example, a pluggable 10 GE Small Form Factor Pluggable transceiver (10 Gb/s SFP+), a XENPAK transceiver, a XFP transceiver, an embedded PHY receiver, or generally any suitable 10 GE or other type of transceiver.
- the transceiver may be received in a transceiver socket, the received transceiver being selected for the selected network physical medium.
- Embodiments may have a 10 GE receive PHY system and a 10 GE transmit PHY system.
- the contained area network 10 may generally enable multiple users to interact with geospatial information (which may be stored on the processor 12) as a virtual reality experience; however, other embodiments do not necessarily have this feature.
- the processor 12 hosts a virtual reality application which uses the geospatial data to generate a virtual reality environment.
- the geospatial data may be data that has been measured from the real world, or it may be data from the real world that has been modified, or it may be data which has been generated entirely artificially.
- the network 10 enables geospatial data to be viewed or processed, within a corporation or with controlled external access, by multiple users without using an internet service, and without sharing the data with a third-party cloud host.
- Scenario 1: natural disaster area. Drone footage of an area can be processed by the processor 12 into 3D visualisations and viewed by multiple local search and rescue workers using personal computing devices 20, 22, 24 without an internet service.
- the contained area network may help to find and rescue victims or repair damage sooner.
- Quickly mapping a disaster area may require a large volume of aerial photos, captured by drones for example, to be processed by the processor 12.
- the contained area network 10 enables multiple drone operators to upload data to the processor 12, which may be at the natural disaster area for rapid access to the required data services.
- Three dimensional inspection models generated by the processor 12 are generally but not necessarily made available to multiple search and rescue personnel via the plurality of personal computing devices 20, 22, 24.
- Scenario 2: virtual training.
- police or military can use the contained area network 10 to visualise an area or a building before arriving at a site with a live situation, or for training, for example.
- Three dimensional models of a site, derived previously from drone or aerial data, could be stored on the server or copied to it prior to deployment to an incident or training event.
- Personnel with authority to access the data can navigate around relatively high resolution models of the site using low powered field computers and tablets. The models may be true to scale and location, with high levels of detail over large areas.
- Data can be captured from at least one drone, for example, and processed by processor 12 into models to add details to a visualisation.
- Scenario 3: secret location.
- the contained area network 10 can be used for analysing a secret area or structure by multiple trusted users of the contained area network 10, without sharing the data with a third-party cloud host.
- the data may be kept off the cloud, which may otherwise be a security or confidentiality concern.
- Data may be kept on the contained area network 10, which may be a private network of, for example, a company.
- the data may be available to authorised users using the plurality of personal computing devices 20, 22, 24.
- Scenario 4: farmers.
- the contained area network 10 can assist farmers to visualise crop data and better manage crops without an internet connection.
- farmers can acquire geospatial information such as satellite imagery and analytics (via couriered storage media or an internet connection when available), for example to assist with advanced crop management such as targeted weed and pest control, waste management and general crop health determination.
- the contained area network 10 at a field or farm may assist the farmer to access data at relatively high levels of detail, and add data to the virtual site using just a tablet or low powered computer.
- Scenario 5: surveyors.
- the contained area network 10 can assist surveyors to acquire, process and access data without an internet connection.
- a surveyor can capture photographs with at least one drone on a remote site and load the images onto a processor, such as a light field computer, which may process the data into three dimensional models.
- the quality of the field data can be validated before returning to an office.
- Data can continue to be processed while in transit, for example back to an office. Once in the office, for example, the data can be sent to the contained area network 10.
- Scenario 6: customised gaming.
- the system enables 3D virtual models of specific areas to be created for a gaming environment. Photographs from a drone can be processed into three dimensional models, or pre-existing models can be loaded onto the processor 12 to be used in a personalised gaming environment. Processor 12 can create the local scene for gaming use either at the scene location or elsewhere.
- each mobile device must have a camera and must transmit the GPS location and orientation of the device to the server so that graphics can be generated for the area being viewed.
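One simple way to decide which area a device's camera is viewing, given the transmitted GPS position and orientation, is to test whether a point of interest falls within the camera's horizontal field of view. The sketch below assumes a compass heading in degrees and a nominal field of view; the specification does not prescribe this method:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device to a point of interest, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def in_view(device_lat, device_lon, heading_deg, poi_lat, poi_lon, fov_deg=60.0):
    """True if the point of interest lies within the camera's horizontal field of view."""
    offset = (bearing_deg(device_lat, device_lon, poi_lat, poi_lon) - heading_deg + 180.0) % 360.0 - 180.0
    return abs(offset) <= fov_deg / 2.0
```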
- geospatial data can be processed by the contained area network.
- the geospatial data is generally but not necessarily tiled, or cut up into detailed views and different zoom levels, enabling relatively high detail to be seen from massive files covering large areas, using a browser application. Multiple data types can be viewed and merged together on a browser application.
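The specification does not name a particular tiling scheme, but the widely used Web Mercator "slippy map" convention illustrates how a coordinate and a zoom level select a tile, so that only the tiles covering the current view need to be sent to a browser:

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int):
    """Map a WGS84 coordinate and zoom level to Web Mercator tile indices (x, y)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom                                   # number of tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: the tile containing a point near Perth (approx. -31.95, 115.86) at zoom level 12.
print(latlon_to_tile(-31.95, 115.86, 12))
```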
- the contained area network 10 enables account administrators to control access of users to data, both internally by creating groups of people in projects, and externally by creating hyperlinks to the project.
- API integration with external database products can add external contextual data to a project or add a project to data in an external database product.
- API connectivity can push data to external processing software and services, and have the resultant data returned to the project.
- API connectivity can be used to access external software running on another processor within the contained area network.
- the contained area network 10 enables users to simultaneously mark and measure points, lines, polygons (areas) and volumes, and to extract the markups (digitize) with real world coordinates.
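Once marked-up vertices have been given real world coordinates, measurements such as area follow from standard planar geometry. The sketch below assumes the vertices have already been projected to a planar coordinate system in metres (the specification does not prescribe a projection) and applies the shoelace formula:

```python
def polygon_area_m2(vertices):
    """Planar area of a simple polygon from projected (easting, northing) vertices in metres."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Example: a 100 m x 50 m rectangle marked up on the geospatial image.
print(polygon_area_m2([(0, 0), (100, 0), (100, 50), (0, 50)]))   # 5000.0
```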
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Remote Sensing (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Tourism & Hospitality (AREA)
- General Health & Medical Sciences (AREA)
- General Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- Data Mining & Analysis (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Radar, Positioning & Navigation (AREA)
- Life Sciences & Earth Sciences (AREA)
- Agronomy & Crop Science (AREA)
- Animal Husbandry (AREA)
- Marine Sciences & Fisheries (AREA)
- Mining & Mineral Resources (AREA)
- Multimedia (AREA)
- Development Economics (AREA)
- Educational Administration (AREA)
- Library & Information Science (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Description
Claims
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2019450140A AU2019450140A1 (en) | 2019-06-13 | 2019-06-13 | A contained area network and a processor |
| PCT/AU2019/050607 WO2020248000A1 (en) | 2019-06-13 | 2019-06-13 | A contained area network and a processor |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/AU2019/050607 WO2020248000A1 (en) | 2019-06-13 | 2019-06-13 | A contained area network and a processor |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020248000A1 (en) | 2020-12-17 |
Family
ID=73780666
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/AU2019/050607 Ceased WO2020248000A1 (en) | 2019-06-13 | 2019-06-13 | A contained area network and a processor |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2019450140A1 (en) |
| WO (1) | WO2020248000A1 (en) |
-
2019
- 2019-06-13 WO PCT/AU2019/050607 patent/WO2020248000A1/en not_active Ceased
- 2019-06-13 AU AU2019450140A patent/AU2019450140A1/en active Pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080250485A1 (en) * | 2004-01-29 | 2008-10-09 | Koninklijke Philips Electronic, N.V. | Guest Dongle and Method of Connecting Guest Apparatuses to Wireless Home Networks |
| US7933929B1 (en) * | 2005-06-27 | 2011-04-26 | Google Inc. | Network link for providing dynamic data layer in a geographic information system |
| US9659406B2 (en) * | 2008-05-07 | 2017-05-23 | Microsoft Technology Licensing, Llc | Procedural authoring |
| US9070216B2 (en) * | 2011-12-14 | 2015-06-30 | The Board Of Trustees Of The University Of Illinois | Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring |
| US20130328862A1 (en) * | 2012-06-06 | 2013-12-12 | Apple Inc. | Geospatial representation of data-less map areas |
| US9609061B2 (en) * | 2013-03-13 | 2017-03-28 | Trolex Aport ApS | Rugged and mobile media server and method for providing media to passengers on a public transport vehicle |
| US10163255B2 (en) * | 2015-01-07 | 2018-12-25 | Geopogo, Inc. | Three-dimensional geospatial visualization |
Also Published As
| Publication number | Publication date |
|---|---|
| AU2019450140A1 (en) | 2021-06-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Kikuchi et al. | Future landscape visualization using a city digital twin: Integration of augmented reality and drones with implementation of 3D model-based occlusion handling | |
| CN109064545B (en) | Method and device for data acquisition and model generation of house | |
| Verhoeven | Taking computer vision aloft–archaeological three‐dimensional reconstructions from aerial photographs with photoscan | |
| US20190306424A1 (en) | Capture, Analysis And Use Of Building Data From Mobile Devices | |
| Wen et al. | Augmented reality and unmanned aerial vehicle assist in construction management | |
| CN112074797A (en) | System and method for anchoring virtual objects to physical locations | |
| AU2020409015B2 (en) | Data hierarchy protocol for data transmission pathway selection | |
| CN106454209A (en) | Unmanned aerial vehicle emergency quick action data link system and unmanned aerial vehicle emergency quick action monitoring method based on spatial-temporal information fusion technology | |
| US20140295891A1 (en) | Method, server and terminal for information interaction | |
| KR101839111B1 (en) | System for providing building information based on BIM | |
| CN109102566A (en) | A kind of indoor outdoor scene method for reconstructing and its device of substation | |
| CN110895833A (en) | Method and device for three-dimensional modeling of indoor scene | |
| CN116129064A (en) | Electronic map generation method, device, equipment and storage medium | |
| KR102022912B1 (en) | System for sharing information using mixed reality | |
| US12315193B2 (en) | Localization and mapping by a group of mobile communications devices | |
| CN113919737A (en) | Data processing method and system for comprehensive protection of multi-functional emergency fire rescue | |
| AU2019101803A4 (en) | A contained area network and a processor | |
| WO2020248000A1 (en) | A contained area network and a processor | |
| CN120107488A (en) | Method, system and equipment for constructing integrated indoor and outdoor three-dimensional model of a building | |
| Yang et al. | A Low‐Cost and Ultralight Unmanned Aerial Vehicle‐Borne Multicamera Imaging System Based on Smartphones | |
| JP5471919B2 (en) | Image processing apparatus, image processing program, image processing method, and moving body | |
| EP3845858B1 (en) | Using three dimensional data for privacy masking of image data | |
| KR20230089362A (en) | System for updating virtual space based on location | |
| CN114241126A (en) | Method for extracting object position information in monocular video based on live-action model | |
| JP7722751B2 (en) | Real-time communication support system and method, mobile terminal, server, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19932541; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019450140; Country of ref document: AU; Date of ref document: 20190613; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19932541; Country of ref document: EP; Kind code of ref document: A1 |