AU2019450140A1 - A contained area network and a processor - Google Patents


Info

Publication number
AU2019450140A1
Authority
AU
Australia
Prior art keywords
processor
area network
geospatial
information
contained area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
AU2019450140A
Inventor
Adam CHABOK
Robert KLAU
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
4d Mapper Pty Ltd
Original Assignee
4d Mapper Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 4d Mapper Pty Ltd filed Critical 4d Mapper Pty Ltd
Publication of AU2019450140A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Tourism & Hospitality (AREA)
  • Remote Sensing (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Computer Graphics (AREA)
  • Data Mining & Analysis (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Library & Information Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Disclosed herein is a contained area network (10). The network comprises a processor (12). The processor (12) comprises a contained area network interface (18) and processor readable tangible media (16) including program instructions which when executed by the processor (12) causes the processor (12) to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface (18). The network (10) comprises a plurality of personal computing devices (20, 22, 24). The plurality of personal computing devices (20, 22, 24) comprises a plurality of contained area network interfaces (28, 30, 32) configured to receive from the processor (12) the processed geospatial information.

Description

A CONTAINED AREA NETWORK AND A PROCESSOR
Technical field
The disclosure herein generally relates to a contained area network and a processor.
Background
The relatively recent availability of massive volumes of information from satellites, aircraft, airborne drones, underwater drones and terrestrial scanners has given rise to a widespread need for handling, visualising, analysing and otherwise processing that information. For example, large structures such as buildings, bridges and towers are commonly imaged with airborne drones to perform routine maintenance inspections or to assess damage in areas that are otherwise difficult to reach.
Landscapes are imaged for many reasons, for example, to survey a landform, measure volumes, digitize features and to inspect crops on a farm.
Photogrammetry is the process of taking measurements from photographs. Modern
photogrammetry involves processing photographs with photogrammetry software and may include 3D reconstruction from multiple 2D images.
Visualisation, analysis and other information processing software may be installed on a personal computing device, for example a desktop computer, or accessed via a cloud-based architecture.
The software and data may be accessible by only one computer at a time. A separate software installation may be required for each additional computer. Each additional installation may be associated with the cost of an additional software licence and the cost of another desktop computer. Smaller computing devices, for example tablets, may not have sufficient computing resources, however, for processing the information.
When using a cloud-based architecture, photogrammetry and visualisation software may be hosted on a remote server and accessed over the internet via a web browser. Browser-enabled computing devices, for example tablets, smart phones, and laptops, can access the software.
However, there are some serious drawbacks with a cloud-based architecture. Firstly, users may be required to upload their images and data to the cloud, which often requires very high bandwidth. Some users may not wish to upload their images and data to the cloud. Example reasons for not wishing to upload images and data to the cloud include confidentiality and security. For example, the data may reveal confidential information such as:
• the location of a prospective mining site (which could be used by competitors)
• details of a military asset (which could be used by enemies)
• details of public asset such as a power station or bridge (which could be used by
terrorists).
Users may also have concerns that the software host may claim ownership rights to the data.
Another drawback with cloud-hosted software is that it can only be accessed where there is an internet connection. People working at remote locations (such as a prospective mining site) may not have access to the internet. Also, an area that has been hit with a natural disaster (e.g. earthquake, flood, cyclone) may be disconnected from the internet. It may be crucial to quickly analyse drone images of the local area. In the absence of an internet connection, the only options are to travel to somewhere that has internet access, or to bring a desktop computer with a local copy of the software.
It may be desirable to overcome at least some of these disadvantages, or to provide consumers with a choice.
Summary
Disclosed herein is a contained area network. The contained area network comprises a processor comprising a contained area network interface and processor readable tangible media including program instructions which when executed by the processor causes the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface. The contained area network comprises a plurality of personal computing devices comprising a plurality of contained area network interfaces and configured to receive from the processor via the plurality of contained area network interfaces the processed geospatial information.
In an embodiment, the program instructions comprise photogrammetry program instructions.
In an embodiment, the program instructions comprise visualisation program instructions.
In an embodiment, the processor readable tangible media includes the geospatial information. In an embodiment, the geospatial information comprises geospatial image information.
In an embodiment, each of the plurality of personal computing devices are configured to generate zoom level information indicative of a selected magnification of a geospatial image, and send the zoom level information to the processor, and the processed geospatial information comprises geospatial image information generated using the zoom level information.
In an embodiment, the geospatial image information comprises overlaid graphic information.
In an embodiment, each of the plurality of personal computing devices comprise personal computing device tangible media including program instructions which when executed by any one of the plurality of personal computing devices causes the personal computing device to generate a geospatial image using the geospatial image information.
In an embodiment, the geospatial image is a three dimensional geospatial image.
An embodiment comprises at least one contained area network interface dongle.
An embodiment comprises at least one of a local area network (LAN) and a personal area network (PAN).
In an embodiment, the processor comprises a ruggedized server.
Disclosed herein is a processor comprising a contained area network interface and processor readable tangible media including program instructions which when executed by the processor causes the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface.
In an embodiment, the program instructions comprise photogrammetry program instructions.
In an embodiment, the program instructions comprise visualisation program instructions.
In an embodiment, processor readable tangible media comprises the geospatial information. The geospatial information may comprise geospatial image information.
An embodiment is configured to receive zoom level information indicative of a selected magnification of a geospatial image, wherein the processed geospatial information comprises geospatial image information generated using the zoom level information. The geospatial image information may comprise overlaid graphic information. The geospatial image information may comprise three geospatial dimensions.
Disclosed herein is non-transitory processor readable tangible media including program instructions which when executed by a processor causes the processor to perform a method disclosed herein.
Disclosed herein is a computer program for instructing a processor, which when executed by the processor causes the processor to perform a method disclosed herein.
Any of the various features of each of the above disclosures, and of the various features of the embodiments described below, can be combined as suitable and desired.
Brief description of the figures
Embodiments will now be described by way of example only with reference to the
accompanying figures in which:
Figure 1 shows a schematic diagram of an embodiment of a contained area network.
Figure 2 shows a schematic diagram of a processor of figure 1.
Description of embodiments
Figure 1 shows a schematic diagram of an embodiment of a contained area network (“network”) in the form of a local area network, the network being generally indicated by the numeral 10. In the context of the present document, a contained area network is a network or internetwork that does not comprise a network having a spatial scope that is greater than a local area network (LAN). Examples of contained area networks include a LAN, a near-me area network (NAN), a personal area network (PAN), a body area network (BAN), and a network comprising any two or more of a LAN, NAN, PAN and BAN. A network or internetwork comprising a wide area network (WAN) or a network of greater spatial scope is not a contained area network. The network comprises a processor 12, a schematic diagram of which is shown in figure 2. The processor 12 comprises a contained area network interface 18 in the form of an Ethernet interface and processor readable tangible media 16 including program instructions which when executed by the processor 12 causes the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface 18. The geospatial information may be in the form of images and data from satellites, airborne drones, underwater drones, and terrestrial scanners, in relation to geography including landscapes, buildings and structures. The geospatial information may generally have two spatial dimensions (“two dimensional”) or three spatial dimensions (“three dimensional”). The geospatial information may, but need not, have a time dimension. The network 10 comprises a plurality of personal computing devices 20, 22, 24. The plurality of personal computing devices comprises a plurality of contained area network interfaces 28, 30, 32 configured to receive from the processor 12 the processed geospatial information.
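The information flow just described (the processor generating processed geospatial information and sending it to each personal computing device over the contained area network) can be sketched as follows; the function names and toy payloads are illustrative assumptions, not part of the disclosure.

```python
def process_geospatial(raw: dict) -> dict:
    """Stand-in for the photogrammetry/visualisation processing performed
    by the processor 12; a real implementation would reconstruct a model
    from the raw images."""
    return {"source": raw["source"], "model": "3d-model-of-" + raw["site"]}

def broadcast(processed: dict, devices: list) -> list:
    """Send the processed geospatial information to every personal
    computing device on the contained area network."""
    return [(device, processed) for device in devices]

# Raw drone imagery of a bridge is processed once, then delivered to all
# three personal computing devices of Figure 1.
processed = process_geospatial({"source": "drone", "site": "bridge"})
deliveries = broadcast(processed, ["tablet-22", "tablet-24", "smartphone-20"])
```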
The processor 12 is in the form of a computer, examples of which include but are not limited to an HP Z2 Mini workstation and an HP VR backpack (a wearable computer), or generally any suitable computer or processor. In the present embodiment, the computer 12 is configured as a local area network server. The plurality of personal computing devices 20, 22, 24 comprise computing devices in the form of tablet computers 22, 24 and a smartphone 20, however they may comprise laptop computers, desktop computers or generally any suitable form of personal computing device. The processor readable tangible media 16 includes the geospatial information, however it may be in another device. For example, the network 10 may comprise network attached storage including the geospatial information, or the geospatial information may be in peripheral storage (for example an external solid state drive). Generally, but not necessarily, the geospatial information comprises geospatial image information, for example digital aerial photographs.
In the present embodiment, the network 10 is in the form of a local area network comprising at least one of a wireless network component (e.g. IEEE 802.11 (“Wi-Fi”), as defined by documents available on 1 June 2019 from the Institute of Electrical and Electronics Engineers) and a wired network component (e.g. IEEE 802.3 (“Ethernet”), as defined by documents available on 1 June 2019, FibreChannel, InfiniBand and PCIe networks). The network comprises a wired IEEE 802.3 link 46 between a wireless access point (“WAP”) 34 in the form of a Wi-Fi WAP and the processor 12. The plurality of personal computing devices are each in wireless communication with the WAP 34, and through a wireless link 44, the WAP 34 and the wired link 46, with the processor 12. Alternatively, the processor may be wirelessly connected to the plurality of personal computing devices 20, 22, 24 via the wireless access point, or the processor and the plurality of personal computing devices may be connected via Ethernet cables and an Ethernet switch or hub, for example. Generally, any suitable network configuration may be used. The network may additionally comprise a personal area network (PAN) in the form of, for example, a Bluetooth network or generally any suitable PAN. Alternatively, the network 10 may be in the form of a Bluetooth network. The network 10 comprises a router 42 comprising the wireless access point 34. The network 10 may be connected to the internet by plugging the router 42 into an internet wall socket, for example, however the network 10 can still operate when not connected to the internet. When connected to the internet, remote personal computing devices 48 can communicate with the processor 12. Embodiments may not be configured to be in communication with the internet, for example may not comprise a suitable router. Generally, any suitable form of contained area network may be used.
Generally, the processor 12 and the plurality of personal computing devices 20, 22, 24 each have an integral contained area network interface, however any one of the processor 12 and the plurality of personal computing devices 20, 22, 24 may have received a contained area network interface dongle, for example a Wi-Fi dongle. The processor 12 may be a server on a private cloud-based architecture in a secure corporate environment.
The program instructions comprise at least one of photogrammetry program instructions and visualisation program instructions. The photogrammetry program instructions can at least one of determine dimensions and measurements from a plurality of images having only two spatial dimensions (e.g. a photograph) stored on the tangible media 16, and reconstruct objects (including geographical objects and landscapes) from the plurality of images stored on the tangible media.
Each of the plurality of personal computing devices 20, 22, 24 are configured to generate zoom level information indicative of a selected magnification of a geospatial image, and send the zoom level information to the processor 12. The zoom level is selected by a user manipulating a user interface of a computing device 20, 22, 24. The processed geospatial information comprises geospatial image information generated using the zoom level information. The geospatial image information can comprise, in the present but not all embodiments, overlaid graphic information. Each of the personal computing devices 20, 22, 24 comprise browser software (e.g. INTERNET EXPLORER, SAFARI, or CHROME). The image information is displayed by the browser when running. The browser uses the hypertext transfer protocol (HTTP) to communicate with the processor 12, in addition to the transmission control protocol (TCP) and internet protocol (IP). Any suitable communication protocols may be used.
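The zoom-level exchange just described may be sketched as follows, assuming a JSON request body and a quadtree-style (zoom, x, y) tile key; the store contents and key layout are illustrative assumptions, not part of the disclosure.

```python
import json

# Hypothetical tile store held by the processor 12: keys are
# (zoom, x, y), values are pre-processed geospatial image tiles.
TILE_STORE = {
    (0, 0, 0): b"whole-site-overview",
    (1, 0, 0): b"north-west-quadrant",
    (1, 1, 1): b"south-east-quadrant",
}

def handle_zoom_request(request_body: str) -> bytes:
    """Parse the zoom level information sent by a personal computing
    device and return the matching pre-processed tile."""
    req = json.loads(request_body)
    key = (req["zoom"], req["x"], req["y"])
    # Fall back to the coarsest tile if the requested one is absent.
    return TILE_STORE.get(key, TILE_STORE[(0, 0, 0)])

# A device selects magnification 1 and the south-east quadrant:
tile = handle_zoom_request('{"zoom": 1, "x": 1, "y": 1}')
```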
Program instructions in the form of visualisation software stored in the processor’s memory create visualisation information for a plurality of visualisations of the parent object (“tiles”) for more than one zoom level and more than one viewing angle. A browser may assess the location and view angle selected by the user and find a visualisation (“tile”) relevant to display the user selected view of the model on the browser. This may enable massive, highly detailed maps and models to be viewed, with a web browser, on the plurality of personal computing devices (which may be low-computational-power computers). The browser (front end) makes calls on the back end for the relevant data. This can also include merging views with multiple data types. Each of the plurality of personal computing devices comprise personal computing device tangible media including program instructions which when executed by any one of the plurality of personal computing devices causes the personal computing device to generate a geospatial image using the geospatial image information. The geospatial image has three spatial dimensions, however it may have fewer or more dimensions.
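One way the tiling just described could be organised is a quadtree pyramid, with pre-rendered tiles keyed by zoom level and the browser picking the nearest pre-rendered viewing angle; this sketch assumes that scheme, which the description does not mandate.

```python
def tile_keys(zoom: int):
    """Return the (col, row) keys of the tiles covering the parent object
    at a given zoom level, assuming each zoom step doubles the grid in
    each axis (a common quadtree tiling scheme)."""
    n = 2 ** zoom  # tiles per axis at this zoom level
    return [(col, row) for row in range(n) for col in range(n)]

def nearest_view(prerendered_angles, requested):
    """Pick the pre-rendered viewing angle (in degrees) closest to the
    angle the user selected in the browser."""
    return min(prerendered_angles, key=lambda a: abs(a - requested))
```

For example, `tile_keys(2)` yields the 16 keys of a 4 x 4 grid, and a user request for a 52 degree view against tiles pre-rendered at 0, 45 and 90 degrees resolves to the 45 degree tile.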
The IEEE 802.3 standards, for example, define the transmission of protocol data units (PDUs) including Ethernet frames and Ethernet packets over a network physical medium in the form of, for example, a network cable, backplane lane, or another suitable network medium that connects two nodes of the network. A network cable may be, for example:
• An electrical network cable in the form of a twinaxial network cable, copper network cable, or twisted pair, for example, or
• an optical fibre network cable in the form of single mode or multimode optical fibre, for example.
A standard, for example as defined by the IEEE 802.3, FibreChannel, InfiniBand and PCIe standards, may define network electrical and/or optical connections and mechanical connections at the physical level. An Ethernet network node or device generally comprises a physical layer system (PHY) compliant with the IEEE 802.3 standard, which is in communication with a media access controller (MAC) of the data link layer, that defines a MAC address for the node or device, and which is responsible for sending a frame of data from one node of the network to another via the physical layer. The frame is a payload of an Ethernet packet defined at the physical layer. Each end of the network physical medium is connected to an interface of a respective node. The interface comprises a medium attachment unit (MAU) in the form of at least one
communications port of a respective node, which may comprise a transceiver, a receiver or a transmitter, and which may provide a mechanical connection and a communication connection between the node and the network medium. A transceiver may comprise a transceiver module in the form of, for example, a pluggable 10 GE Small Form Factor Pluggable transceiver (10 Gb/s SFP+), a XENPAK transceiver, a XFP transceiver, an embedded PHY receiver, or generally any suitable 10 GE or other type of transceiver. The transceiver may be received in a transceiver socket, the received transceiver being selected for the selected network physical medium.
Embodiments may have a 10 GE receive PHY system and a 10 GE transmit PHY system.
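As an illustration of the frame encapsulation described by these standards, a minimal Ethernet II header can be packed as below; this is a sketch only, omitting the frame check sequence and the 64-byte minimum-size padding, and is not a full IEEE 802.3 implementation.

```python
import struct

def ethernet_frame(dst_mac: bytes, src_mac: bytes, payload: bytes,
                   ethertype: int = 0x0800) -> bytes:
    """Prepend a 14-byte Ethernet II header (destination MAC, source MAC,
    EtherType; 0x0800 denotes an IPv4 payload) to a payload such as an IP
    packet carrying a TCP segment."""
    header = struct.pack("!6s6sH", dst_mac, src_mac, ethertype)
    return header + payload

# Broadcast destination, a locally administered source address, and a
# stand-in payload representing the encapsulated TCP segment.
frame = ethernet_frame(b"\xff" * 6, b"\x02" + b"\x00" * 5, b"segment")
```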
The payload of an Ethernet frame may comprise information, for example within a payload of a TCP protocol data unit (“segment”) that is the payload of the Ethernet frame.

Example: private virtual reality
The contained area network 10 may generally enable multiple users to interact with geospatial information (which may be stored on the processor 12) as a virtual reality experience, however other embodiments do not necessarily have this feature. The processor 12 hosts a virtual reality application which uses the geospatial data to generate a virtual reality environment. The geospatial data may be data that has been measured from the real world, or it may be data from the real world that has been modified, or it may be data which has been generated entirely artificially. The network 10 enables geospatial data to be viewed or processed, within a corporation or with controlled external access, by multiple users without using an internet service, and without sharing the data with a third-party cloud host. Several scenarios are now described, illustrating example uses and configurations of the contained area network 10:
Scenario 1: natural disaster area. Drone footage of an area can be processed by the processor 12 into 3D visualisations and viewed by multiple local search and rescue workers using personal computing devices 20, 22, 24 without an internet service. The contained area network may help to find and rescue victims, or repair damage, sooner.
To quickly map a disaster area may require a large volume of aerial photos captured by drones, for example, to be processed by the processor 12.
The contained area network 10 enables multiple drone operators to upload data to the processor 12, which may be at the natural disaster area for rapid access to the required data services. Three dimensional inspection models generated by the processor 12 are generally but not necessarily made available to multiple search and rescue personnel via the plurality of personal computing devices 20, 22, 24.
Scenario 2: virtual training. Police or military can use the contained area network 10 to visualise an area or a building before arriving at a site with a live situation or for training, for example. Three dimensional models of a site, derived previously from drone or aerial data, could be stored on the server or copied to it prior to deployment to an incident or training event. Personnel with authority to access the data can navigate around relatively high resolution models of the site using low powered field computers and tablets. The models may be true to scale and location, with high levels of detail over large areas.
Data can be captured from at least one drone, for example, and processed by the processor 12 into models to add details to a visualisation.

Scenario 3: secret location. The contained area network 10 can be used for analysing a secret area or structure by multiple trusted users of the contained area network 10, without sharing the data with a third-party cloud host. The data may be kept off the cloud, which may otherwise be a security or confidentiality concern.
Data may be kept on the contained area network 10, which may be a private network of, for example, a company. The data may be available to authorised users using the plurality of personal computing devices 20, 22, 24.
Scenario 4: farmers. The contained area network 10 can assist farmers to visualise crop data and better manage crops without an internet connection. Farmers can acquire geospatial information such as satellite imagery and analytics (via couriered storage media or an internet connection when available), for example to assist with advanced crop management such as targeted weed and pest control, waste management and general crop health determination. The contained area network 10 at a field or farm, for example, may assist the farmer to access data at relatively high levels of detail, and add data to the virtual site using just a tablet or low powered computer.
Scenario 5: surveyors. The contained area network 10 can assist surveyors to acquire, process and access data without an internet connection. A surveyor can capture photographs with at least one drone on a remote site and load the images onto a processor, which may process the data into three dimensional models using a lightweight field computer. The quality of the field data can be validated before returning to an office. Data can continue to be processed while in transit, for example back to an office. Once in the office, for example, the data can be sent to the contained area network 10.
Scenario 6: customised gaming. The system enables 3D virtual models of specific areas to be created for a gaming environment. Photographs from a drone can be processed into three dimensional models, or pre-existing models can be loaded onto the processor 12 to be used in a personalised gaming environment. The processor 12 can create the local scene for gaming use either at the scene location or elsewhere.
Application: private augmented reality system
The same system enables multiple users to interact with geospatial data as an augmented reality experience i.e. graphics overlaid on camera vision of the real world. To enable augmented reality, each mobile device must have a camera and must transmit the GPS location and orientation of the device to the server so that graphics can be generated for the area being viewed.
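The pose report and overlay lookup described above might look like the following sketch; the overlay names, coordinates, radius and the flat-earth distance test are illustrative assumptions (a real system would also use the reported orientation to cull features behind the camera).

```python
import math

# Hypothetical overlay database on the server: label -> (lat, lon) of the
# geospatial feature to draw over the device's camera vision.
OVERLAYS = {
    "bridge pylon A": (-34.9285, 138.6007),
    "bridge pylon B": (-34.9290, 138.6012),
    "depot":          (-35.1000, 138.7000),
}

def overlays_in_view(lat: float, lon: float, radius_m: float = 500.0):
    """Return overlay labels within radius_m of the GPS position reported
    by a mobile device, using a flat-earth approximation that is adequate
    over a contained area."""
    metres_per_deg = 111_320.0  # approximate metres per degree of latitude
    hits = []
    for label, (olat, olon) in OVERLAYS.items():
        dy = (olat - lat) * metres_per_deg
        dx = (olon - lon) * metres_per_deg * math.cos(math.radians(lat))
        if math.hypot(dx, dy) <= radius_m:
            hits.append(label)
    return hits
```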
Examples of software functions
Functions that may be performed by example program instructions that can be stored in the memory of the processor 12 include:
• 3D and 4D (time based) multi-source data fusion and visualization. Any suitable type of geospatial data can be processed by the contained area network. The geospatial data is generally but not necessarily tiled, or cut up into detailed views and different zoom levels, enabling relatively high detail of massive files covering large areas to be seen using a browser application. Multiple data types can be viewed and merged together on a browser application.
• User management, authentication, access control and collaboration. The contained area network 10 enables account administrators to control access of users to data, both internally by creating groups of people in projects, and externally by creating hyperlinks to the project.
• Linking to an existing database, or creating a new one. API integration with external database products can add external contextual data to a project or add a project to data in an external database product.
• Connecting to user-specific processing and analytic software. API connectivity can push data to external processing software and services, and have the resultant data returned to the project.
• Connecting to user specific processing hardware or server. API connectivity can be used to access external software running on another processor within the contained area network.
• File and project management. Data can be added to an account, folders created, data moved between folders, and data and folders deleted. Projects can be created with a subset of data and users with permission to act on the data.
• Links to user trusted local or global data storage service. Data can be imported from local or cloud storage providers with API credentials to the external accounts.
• Near real-time field based photogrammetry processing (photo to 3D production and visualization). Resource intensive photogrammetry processing on a high powered field server is possible, which can enable data processing at the point of acquisition.
• 3D and 4D asset condition inspection and tagging. Inspection of assets such as cell towers, bridges and buildings, is possible, with a digital twin virtual reality model linked to high resolution inspection photos.
• 3D measurements and feature extraction. Multiple users can use the contained area network 10 to simultaneously mark and measure points, lines, polygons (areas) and volumes and extract the markups (digitize) with real world coordinates.
• SDK and API for mobile, web and desktop application development. External developers can customize the product for their company’s specific applications.
• Automated software update using internet or self-deployable package. Software can manage and update products on cloud SaaS or deployed local installations.
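The tiling described above, cutting data into detailed views at different zoom levels, is not specified further in the text. As an illustration only, a minimal sketch of the widely used XYZ ("slippy map") tiling convention in Python might look like the following; the function name is hypothetical and the patent does not confirm this particular scheme:

```python
import math

def lonlat_to_tile(lon_deg: float, lat_deg: float, zoom: int) -> tuple:
    """Map a WGS84 longitude/latitude to XYZ (slippy map) tile indices.

    At zoom level z the world is divided into 2**z x 2**z tiles in the
    Web Mercator projection; higher zoom levels give more detailed views.
    """
    n = 2 ** zoom  # number of tiles along each axis at this zoom level
    lat_rad = math.radians(lat_deg)
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    # Clamp to the valid tile range to handle edge coordinates.
    return min(n - 1, max(0, x)), min(n - 1, max(0, y))
```

Under a scheme like this, a field server could pre-cut a massive dataset into tiles so that a browser on a low-powered device requests only the few tiles visible at its current position and zoom level, rather than the whole file.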
Now that embodiments have been described, it will be appreciated that some embodiments have at least some of the following advantages:
• Users may use a range of computing devices such as tablets, even though the data typically requires high computing power to access, visualise and process.
• Users may keep their geospatial data confidential by storing and processing data on their own server, while still enabling sharing and collaboration within the organisation.
• Users may have access to software in places where there is no internet connection, for example natural disaster sites, and consequently data may not need to be uploaded to a cloud service.
• Multiple users may access the data, which may be massive geospatial data, and software, and may use small devices without an internet connection to do so.
• Data access is not restricted to a single machine.
Variations and/or modifications may be made to the embodiments described without departing from the spirit or ambit of the invention. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive. Reference to a feature disclosed herein does not mean that all embodiments must include the feature.
Prior art, if any, described herein is not to be taken as an admission that the prior art forms part of the common general knowledge in any jurisdiction.
In the claims which follow and in the preceding description of the invention, except where the context requires otherwise due to express language or necessary implication, the word “comprise” or variations such as “comprises” or “comprising” is used in an inclusive sense, that is to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.

Claims (20)

1. A contained area network comprising:
a processor comprising a contained area network interface and processor readable tangible media including program instructions which when executed by the processor cause the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface; and
a plurality of personal computing devices comprising a plurality of contained area network interfaces and configured to receive from the processor via the plurality of contained area network interfaces the processed geospatial information.
2. A contained area network defined by claim 1 wherein the program instructions comprise photogrammetry program instructions.
3. A contained area network defined by either one of claim 1 and claim 2 wherein the program instructions comprise visualisation program instructions.
4. A contained area network defined by any one of the preceding claims wherein the processor readable tangible media includes the geospatial information.
5. A contained area network defined by any one of the preceding claims wherein the geospatial information comprises geospatial image information.
6. A contained area network defined by any one of the preceding claims wherein each of the plurality of personal computing devices is configured to generate zoom level information indicative of a selected magnification of a geospatial image, and send the zoom level information to the processor, and the processed geospatial information comprises geospatial image information generated using the zoom level information.
7. A contained area network defined by claim 6 wherein the geospatial image information comprises overlaid graphic information.
8. A contained area network defined by either one of claim 6 and claim 7 wherein each of the plurality of personal computing devices comprises personal computing device tangible media including program instructions which when executed by any one of the plurality of personal computing devices cause the personal computing device to generate a geospatial image using the geospatial image information.
9. A contained area network defined by claim 8 wherein the geospatial image is a three dimensional geospatial image.
10. A contained area network defined by any one of the preceding claims comprising at least one contained area network interface dongle.
11. A contained area network defined by any one of the preceding claims comprising at least one of a local area network (LAN) and a personal area network (PAN).
12. A contained area network defined by any one of the preceding claims wherein the processor comprises a ruggedized server.
13. A processor comprising a contained area network interface and processor readable tangible media including program instructions which when executed by the processor cause the processor to generate processed geospatial information by processing geospatial information and send the processed geospatial information via the contained area network interface.
14. A processor defined by claim 13 wherein the program instructions comprise photogrammetry program instructions.
15. A processor defined by either one of claim 13 and claim 14 wherein the program instructions comprise visualisation program instructions.
16. A processor defined by any one of claims 13 to 15 wherein the processor readable tangible media comprises the geospatial information.
17. A processor defined by any one of claims 13 to 16 wherein the geospatial information comprises geospatial image information.
18. A processor defined by any one of claims 13 to 17 configured to receive zoom level information indicative of a selected magnification of a geospatial image, wherein the processed geospatial information comprises geospatial image information generated using the zoom level information.
19. A processor defined by claim 18 wherein the geospatial image information comprises overlaid graphic information.
20. A processor defined by either one of claim 18 and claim 19 wherein the geospatial image information comprises three geospatial dimensions.
AU2019450140A 2019-06-13 2019-06-13 A contained area network and a processor Pending AU2019450140A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/AU2019/050607 WO2020248000A1 (en) 2019-06-13 2019-06-13 A contained area network and a processor

Publications (1)

Publication Number Publication Date
AU2019450140A1 true AU2019450140A1 (en) 2021-06-17

Family

ID=73780666

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2019450140A Pending AU2019450140A1 (en) 2019-06-13 2019-06-13 A contained area network and a processor

Country Status (2)

Country Link
AU (1) AU2019450140A1 (en)
WO (1) WO2020248000A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1914880A (en) * 2004-01-29 2007-02-14 皇家飞利浦电子股份有限公司 Guest dongle and method of connecting guest apparatuses to wireless home networks
US7933929B1 (en) * 2005-06-27 2011-04-26 Google Inc. Network link for providing dynamic data layer in a geographic information system
US8737721B2 (en) * 2008-05-07 2014-05-27 Microsoft Corporation Procedural authoring
US9070216B2 (en) * 2011-12-14 2015-06-30 The Board Of Trustees Of The University Of Illinois Four-dimensional augmented reality models for interactive visualization and automated construction progress monitoring
US9418466B2 (en) * 2012-06-06 2016-08-16 Apple Inc. Geospatial representation of data-less map areas
DK177618B1 (en) * 2013-03-13 2013-12-09 Aporta Digital Aps Robust mobile media server and method for providing media to passengers in a public transport vehicle
US10163255B2 (en) * 2015-01-07 2018-12-25 Geopogo, Inc. Three-dimensional geospatial visualization

Also Published As

Publication number Publication date
WO2020248000A1 (en) 2020-12-17

Legal Events

Date Code Title Description
DA3 Amendments made section 104

Free format text: THE NATURE OF THE AMENDMENT IS: APPLICATION IS TO PROCEED UNDER THE NUMBER 2019101803