CN111210515A - Airborne synthetic vision system based on terrain real-time rendering - Google Patents


Info

Publication number
CN111210515A
Authority
CN
China
Prior art keywords
terrain
rendering
parcel
airborne
data
Prior art date
Legal status
Pending
Application number
CN201911397507.4A
Other languages
Chinese (zh)
Inventor
江彦
汪坤
肖永红
张松
唐太虎
伍振华
缪国凯
杨阳
唐佳
Current Assignee
Chengdu Hermes Technology Co ltd
Original Assignee
Chengdu Hermes Technology Co ltd
Application filed by Chengdu Hermes Technology Co ltd
Priority to CN201911397507.4A
Publication of CN111210515A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00 - Arrangements or adaptations of instruments
    • B64D43/02 - Arrangements or adaptations of instruments for indicating aircraft speed or stalling conditions
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an airborne synthetic vision system based on terrain real-time rendering, which ensures real-time picture generation and high-precision terrain display, and is suitable for airborne vision synthesis in the integrated display unit of various integrated avionics systems. The system comprises a data interface module, an airborne data module and a terrain rendering module. The data interface module receives navigation indication information, speed indication information and attitude indication information from airborne equipment through the data interface unit. The airborne data module stores airborne data comprising terrain, obstacle and airport information. The terrain rendering module determines the current position of the aircraft from the navigation indication information, reads airborne data within a preset range centred on the current position from the airborne data module, computes and generates 3D graphics and symbols, renders them according to the attitude indication information, and outputs the rendered 3D picture frames for display.

Description

Airborne synthetic vision system based on terrain real-time rendering
Technical Field
The invention relates to the technical field of synthetic vision of avionic equipment, in particular to an airborne synthetic vision system based on terrain real-time rendering.
Background
Modern aircraft usually use a unified processor to handle the information of the various avionics devices on board, combine devices with the same or similar functions into one assembly, display the related parameters comprehensively on a display, and transmit related information among the avionics devices over an onboard data bus, so that the performance of all avionics equipment on the aircraft reaches a higher level. Such a system is called an integrated avionics system.
The integrated avionics system includes a number of functional onboard devices, such as a data interface unit (DIU), an air data computer (ADC), a combined navigation system (INS/GNSS), an integrated radio system (CNS), a head-up display unit (HUD) and a cabin surveillance system. The integrated display control unit serves as the main human-computer interaction device in the avionics system, and integrates an air-data and attitude display subsystem, an engine and fuel display subsystem, a radio navigation subsystem, a radio communication management subsystem, a flight plan and navigation subsystem, a warning and fault display subsystem, an electronic flight checklist subsystem, a synthetic vision subsystem and the like.
Among these, the synthetic vision system (SVS) represents the external field of view through computationally generated 3D images, giving pilots an intuitive way to understand their flight environment. However, the pictures generated by existing synthetic vision systems suffer from large time delays and low precision for the near terrain ahead of the aircraft, so the safety of high-speed flight over complex terrain is low and the incidence of crash accidents is high.
Disclosure of Invention
At least one objective of the present invention is to overcome the above problems in the prior art by providing an airborne synthetic vision system based on terrain real-time rendering, which ensures real-time picture generation and high-precision terrain display, and is suitable for airborne vision synthesis in the integrated display unit of various integrated avionics systems.
In order to achieve the above object, the present invention adopts the following aspects.
An airborne synthetic vision system based on terrain real-time rendering, the airborne synthetic vision system comprising: the system comprises a data interface module, an airborne data module and a terrain rendering module;
the data interface module is used for receiving navigation indication information, speed indication information and attitude indication information from airborne equipment through the data interface unit;
the airborne data module is used for storing airborne data comprising terrain, obstacles and airport information;
and the terrain rendering module is used for determining the current position of the airplane according to the navigation indication information, reading airborne data in a preset range taking the current position as the center from the airborne data module, calculating and generating 3D graphs and symbols of the terrain, obstacles and the airport, rendering according to the attitude indication information, and outputting the 3D picture frame generated by rendering calculation to one or more of the primary flight display PFD, the multifunctional flight display MFD and the head-up display HUD for display.
Preferably, when generating the 3D terrain graphics, the terrain rendering module divides the terrain data into blocks and uses these terrain blocks (plots) as the basic data structure for rendering.
Preferably, the terrain rendering module comprises a terrain grading unit for setting plots to different levels according to plot size and vertex count, and for calculating, from a plot's level, the distance threshold that determines whether the plot is loaded; wherein plots farther from the aircraft have fewer vertices than plots closer to the aircraft; and the centre point of a plot is set to the plot's longitude/latitude centre at an altitude of 0, this centre point being used as the position of the whole plot.
Preferably, when two adjacent plots differ in level, the plot grading unit adds sidebands (skirts) to the four edges of each plot, the sidebands extending from the plot edges a certain distance in the direction of the geocentre.
Preferably, the terrain rendering module comprises a plot management unit for gradually replacing the plots loaded for a location from low-level plots to higher-level plots as the aircraft approaches that location, unloading a low-level plot before its high-level replacements are loaded; as the aircraft flies away from the location, the loaded plots are gradually replaced from high-level plots back to lower-level plots, and at any time only the highest-level plot is kept loaded among all plots of that location whose distance to the aircraft is below the distance threshold.
Preferably, the plot management unit is further configured to maintain in real time the nine level-0 plots forming a 3x3 grid around the current position of the aircraft, dynamically loading and/or unloading level-0 plots as the aircraft moves, so that nine level-0 plots are always kept around the aircraft and serve as the data source of the rendering queue.
Preferably, the terrain rendering module further comprises a plot loading unit for reading terrain elevation information from a terrain data file, converting geographic coordinates into earth coordinates, generating a plot vertex mesh expressed in earth coordinates, generating plot vertex normal vectors, computing illumination from the vertex normals to generate vertex colours, and adding the generated plots to the rendering queue.
As a preferred scheme, the terrain rendering module calls the CPU to compute and generate plot vertices, performing for each plot vertex the conversion from geographic to earth coordinates, the normal-vector calculation and the colour calculation, and calls the GPU to batch-render the data sent by the CPU. When plot vertices are generated, plot visibility is evaluated; when a plot's visibility is below the visibility threshold, the plot is not generated and no further calculation is performed for it. During batch rendering, back-face culling and view-frustum clipping are performed in the primitive-assembly stage of the OpenGL rendering pipeline, rejecting back-facing triangles and primitives outside the view frustum.
As a preferred scheme, the terrain rendering module performs plot visibility determination, including selecting a plot bounding volume and testing it against the view frustum. The selected bounding volume is a spherical-shell segment with eight vertices, whose longitudes and latitudes are those of the plot's four corner points (southwest, southeast, northwest and northeast); the vertex heights take two values, a low value at an altitude of 0 m and a high value of 8000 m, slightly above the aircraft's ceiling.
Preferably, during the view-frustum test, the eight vertices of the plot bounding volume are tested one by one against the six clipping planes; if all vertices lie outside one and the same clipping plane, the bounding volume is invisible and the plot is discarded.
In summary, due to the adoption of the technical scheme, the invention at least has the following beneficial effects:
the current position of the aircraft is determined from the navigation indication information; airborne data within a preset range centred on the current position is read from the airborne data module; 3D graphics and symbols of terrain, obstacles and airports are computed and generated; and rendering is performed according to the attitude indication information. A 3D image of the terrain environment around the aircraft can thus be synthesised and displayed from the aircraft's geographic coordinates (longitude, latitude, altitude) alone; by greatly reducing the rendering workload, real-time picture generation is ensured, high-precision terrain display is achieved, and the adverse factors that low-visibility conditions bring to flight are overcome.
Drawings
Fig. 1 is a schematic structural diagram of an airborne synthetic vision system based on terrain real-time rendering according to an embodiment of the present invention.
Fig. 2 is a schematic view of a plot distribution according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a terrain height warning according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and embodiments, so that the objects, technical solutions and advantages of the present invention will be more clearly understood. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
FIG. 1 illustrates an airborne synthetic vision system based on terrain real-time rendering, in accordance with an embodiment of the present invention. The onboard synthetic vision system of this embodiment includes: the system comprises a data interface module, an airborne data module and a terrain rendering module.
The data interface module is used for receiving, through the data interface unit, navigation indication information (including longitude, latitude, barometric altitude, radio altitude, selected altitude and the like), speed indication information (including airspeed, true airspeed, Mach number, ground speed, selected speed, vertical speed and the like) and attitude indication information (including heading, pitch angle, roll angle, sideslip angle, track, angle of attack and the like) from airborne equipment such as the air data computer, the INS/GNSS combined navigation system and the radio system.
The airborne data module is used for storing airborne data including terrain, obstacle and airport information, e.g. in the system's built-in memory or on an external storage medium such as an SD card accessed via a high-speed data interface, and is arranged to be periodically updated from a data source.
The terrain rendering module is used for determining the current position of the aircraft according to the navigation indication information, reading airborne data within a preset range centred on the current position from the airborne data module, computing and generating 3D graphics and symbols of the terrain, obstacles and airports, rendering them according to the attitude indication information, and outputting the 3D picture frames generated by the rendering calculation to the primary flight display (PFD), the multifunction flight display (MFD) and/or the head-up display (HUD) for display.
When generating the 3D terrain graphics, the terrain data is divided into blocks (Terrain Tiles), one tile (plot) serving as the basic rendering data structure. According to the aircraft's longitude, latitude and altitude and its distance to each plot, the plots around the aircraft are dynamically loaded, which reduces the number of OpenGL draw calls and improves rendering efficiency.
In the airborne synthetic vision system according to the embodiment of the invention, the terrain rendering module comprises a terrain grading unit for setting plots to different levels according to plot size and vertex count, and for calculating, from a plot's level, the distance threshold that determines whether the plot is loaded. Plots farther from the aircraft have fewer vertices than plots closer to the aircraft. Plot levels start at level 0. A level-0 plot is the largest, spanning 1 degree of longitude horizontally and 1 degree of latitude vertically. A level-1 plot has half the span of a level-0 plot, i.e. 0.5 degrees of longitude by 0.5 degrees of latitude, so one level-0 plot covers exactly 4 level-1 plots, and one level-1 plot covers exactly 4 level-2 plots. The higher the level, the smaller the plot and the higher its precision, and so on. As shown in fig. 2, plots nearer the viewpoint comprise higher-level plots, so terrain accuracy increases toward the aircraft. Plots at adjacent levels stand in a parent-child relationship, and the plots of all levels form a quadtree whose root is a level-0 plot. The centre point of a plot is set to the plot's longitude/latitude centre at an altitude of 0; when the plot needs to be treated as a whole, this centre point is used as the position of the entire plot, reducing the amount of calculation.
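The level scheme above (1-degree level-0 plots, each level halving the span, four children per plot forming a quadtree) can be sketched as follows; the (level, col, row) index convention is an illustrative assumption, not stated in the patent:

```python
def plot_span_deg(level: int) -> float:
    """Longitude/latitude span of one plot at the given level, in degrees."""
    return 1.0 / (2 ** level)

def children(level: int, col: int, row: int):
    """The four level-(n+1) plots covered by a level-n plot (quadtree split)."""
    return [(level + 1, 2 * col + dc, 2 * row + dr)
            for dr in (0, 1) for dc in (0, 1)]
```

For example, `plot_span_deg(1)` yields the 0.5-degree span of a level-1 plot, and `children(0, 0, 0)` enumerates the four level-1 plots under one level-0 root.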
When two adjacent plots differ in level, gaps can appear at their boundary because the two plots have different vertex counts. The plot grading unit therefore adds sidebands (skirts) to all four edges of each plot, each sideband extending from the plot edge a certain distance in the direction of the geocentre.
The terrain rendering module also comprises a plot management unit for gradually replacing the plots loaded for a location from low-level plots to higher-level plots as the aircraft approaches that location, unloading a low-level plot before loading its high-level replacements; as the aircraft flies away from the location, the loaded plots are gradually replaced from high-level plots back to lower-level plots. Since plots of multiple levels exist at any location, only the highest-level plot is kept loaded among all plots of that location whose distance to the aircraft is below the distance threshold.
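The replacement rule above amounts to choosing, per location, the finest level whose distance threshold still covers the aircraft's distance. A minimal sketch, with the threshold values purely illustrative (the patent does not give numbers):

```python
def desired_level(distance_m: float, thresholds) -> int:
    """thresholds[n] = load distance for level n, strictly decreasing.
    Returns the finest (highest) level to keep loaded at this distance;
    level 0 is the fallback, matching the always-loaded 3x3 level-0 grid."""
    level = 0
    for lvl, thr in enumerate(thresholds):
        if distance_m < thr:
            level = lvl     # this level's threshold still covers the plot
        else:
            break           # thresholds decrease, so no finer level applies
    return level
```

As the aircraft closes on a location, `desired_level` rises one step at a time, which is exactly the gradual low-to-high replacement described; flying away reverses it.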
The plot management unit maintains in real time the nine level-0 plots forming a 3x3 grid around the current position of the aircraft, dynamically loading and/or unloading level-0 plots as the aircraft moves, so that nine level-0 plots are always kept around the aircraft and serve as the data source of the rendering queue. Nine quadtrees are thus maintained at any time, together representing all plots to be rendered for the current aircraft position. The 3x3 grid (the nine largest, level-0 plots) is dynamically extended and recycled as the aircraft position changes: child nodes of a tree (high-level plots) are created as the aircraft approaches and destroyed as it departs, and this nine-quadtree data structure serves as the data source of the rendering queue.
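The 3x3 level-0 bookkeeping above can be sketched as follows, assuming level-0 plots are indexed by the integer part of their southwest-corner longitude/latitude (an illustrative convention, not stated in the patent):

```python
import math

def level0_index(lon: float, lat: float):
    """Level-0 plot containing a position: each plot spans 1 deg x 1 deg."""
    return (math.floor(lon), math.floor(lat))

def nine_grid(lon: float, lat: float):
    """The 3x3 block of level-0 plots centred on the aircraft's plot."""
    c_lon, c_lat = level0_index(lon, lat)
    return {(c_lon + dx, c_lat + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

def update_loaded(loaded: set, lon: float, lat: float):
    """Plots to load and to unload when the aircraft moves to (lon, lat)."""
    wanted = nine_grid(lon, lat)
    return wanted - loaded, loaded - wanted   # (to_load, to_unload)
```

Crossing one plot boundary swaps a column (or row) of three plots in and three out, which is the dynamic loading/unloading the unit performs.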
The terrain rendering module further comprises a plot loading unit for reading terrain elevation information from a terrain data file, converting geographic coordinates (longitude, latitude, height) into earth coordinates (XYZ), generating a plot vertex mesh expressed in XYZ, generating plot vertex normal vectors, computing illumination from the vertex normals to generate vertex colours (this may also be computed by the GPU), and adding the generated plots to the rendering queue. Specifically, rendering units with the same primitive type and the same OpenGL state can be placed in one rendering queue; each frame renders the objects (e.g. plots) in the queue in order and empties the queue when finished, avoiding the loss of rendering efficiency caused by frequent OpenGL state-machine switches.
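The geographic-to-earth (XYZ) conversion performed by the plot loading unit is a standard geodetic-to-ECEF transform; a sketch assuming the WGS-84 ellipsoid (the patent does not name the datum):

```python
import math

# WGS-84 ellipsoid constants (an assumption; the patent leaves the datum open)
A = 6378137.0            # semi-major axis, metres
E2 = 6.69437999014e-3    # first eccentricity squared

def geodetic_to_ecef(lon_deg: float, lat_deg: float, h: float):
    """Convert geographic coordinates (deg, deg, metres) to earth-fixed XYZ."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime-vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + h) * math.sin(lat)
    return x, y, z
```

Each plot vertex's (longitude, latitude, elevation) would pass through this once on the CPU before being sent to the GPU for batch rendering.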
The onboard synthetic vision system may further include a terrain height early warning module, configured to obtain an altitude of a location a preset distance ahead of the aircraft from the terrain data, compare the altitude with a current altitude of the aircraft, and display the location with a warning color when the altitude of the location is greater than the current altitude of the aircraft, for example, a dark color area at a highest position in fig. 3.
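The height comparison behind the warning display can be sketched as follows; the yellow band and the 150 m margin are illustrative additions, since the patent only specifies a warning colour when the terrain ahead is higher than the aircraft:

```python
def warn_color(terrain_alt_m: float, aircraft_alt_m: float,
               margin_m: float = 150.0) -> str:
    """Colour for a terrain cell ahead of the aircraft (margin is assumed)."""
    if terrain_alt_m > aircraft_alt_m:
        return "warning"   # terrain above current flight altitude (per patent)
    if terrain_alt_m >= aircraft_alt_m - margin_m:
        return "caution"   # terrain within an assumed safety margin
    return "normal"
```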
The terrain rendering module calls the CPU to compute and generate plot vertices, performing for each plot vertex the conversion from geographic to earth coordinates, the normal-vector calculation and the colour calculation, and calls the GPU to batch-render the data sent by the CPU. When plot vertices are generated, plot visibility is evaluated; if a plot's visibility is below the visibility threshold, the plot is not generated and no further calculation is performed for it, greatly reducing unnecessary computation. During batch rendering, back-face culling and view-frustum clipping are performed in the primitive-assembly stage of the OpenGL rendering pipeline; back-facing triangles and primitives outside the view frustum are rejected so that they do not proceed down the pipeline, reducing computation and improving efficiency.
Specifically, plot visibility determination comprises selecting a plot bounding volume and testing it against the view frustum. Whether a 3D mesh is visible in the scene is usually decided from its bounding volume: the bounding volume is a drastic simplification of the 3D mesh model, and the mesh's true geometry is replaced by the bounding volume when computing visibility. The bounding volume may be a box, a sphere or another shape according to actual requirements. For a plot, considering the simplicity of converting between geographic and spherical coordinates, a suitable bounding volume is a spherical-shell segment with eight vertices, whose longitudes and latitudes are those of the plot's four corner points (southwest, southeast, northwest and northeast); the vertex heights take two values, a low value at an altitude of 0 m and a high value of 8000 m, slightly above the aircraft's ceiling.
During the view-frustum test, the eight vertices of the plot bounding volume are tested one by one against the six clipping planes (far, near, left, right, top and bottom). If all vertices lie outside one and the same clipping plane, the bounding volume is invisible and the plot is discarded. Specifically, one byte is assigned to each vertex, with each bit corresponding to one clipping plane; the bit is set to 1 if the vertex is outside that plane and to 0 otherwise. If the bitwise AND of the 8 vertex outcodes is nonzero, the bounding volume is entirely off-screen and the plot is invisible; if the bitwise AND is zero, the volume cannot be proven invisible by this test and the plot is kept. As a further optimisation, clipping of the bounding-volume mesh can be skipped when the bitwise OR of the 8 vertex outcodes is zero (i.e. all 8 outcodes are 0), since all vertices then lie inside the view frustum and the volume is fully visible.
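The outcode test above can be sketched as follows, assuming each clipping plane is given as (nx, ny, nz, d) with a point inside when nx*x + ny*y + nz*z + d >= 0 (a common convention; the patent does not fix the plane representation):

```python
def outcode(vertex, planes):
    """One bit per clip plane; the bit is set when the vertex is outside it."""
    code = 0
    x, y, z = vertex
    for i, (nx, ny, nz, d) in enumerate(planes):
        if nx * x + ny * y + nz * z + d < 0:   # outside this plane
            code |= 1 << i
    return code

def classify(vertices, planes):
    """'out' if all codes share an outside bit (AND != 0), 'in' if every
    code is 0 (OR == 0), otherwise 'partial' (straddles the frustum)."""
    codes = [outcode(v, planes) for v in vertices]
    and_code, or_code = codes[0], 0
    for c in codes:
        and_code &= c
        or_code |= c
    if and_code:
        return "out"
    return "in" if or_code == 0 else "partial"
```

`classify` applied to the eight bounding-volume vertices reproduces the AND/OR logic: 'out' plots are discarded, 'in' plots need no clipping, 'partial' plots proceed to clipping.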
In various embodiments, the terrain data file stored by the airborne data module may use data from NASA's Shuttle Radar Topography Mission (SRTM). Its format: a single file covers one degree of longitude by one degree of latitude at a resolution of 1201 x 1201 points. Plots of different levels have different sizes and vertex counts and do not correspond point-for-point to the SRTM terrain data file, so when reading terrain data to obtain the altitudes of the vertices of plots of varying sizes, an interpolation over several surrounding elevation data points is required. In addition, to reduce the frequency of opening, reading and closing the terrain data file, a cache unit is provided; when a plot needs to sample the terrain data file, the cache unit is first checked for cached terrain data.
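The interpolation over surrounding elevation points can be done bilinearly; a minimal sketch over a row/column-indexed elevation grid (the patent does not specify the interpolation scheme, so bilinear is an assumption):

```python
def bilinear(grid, row_f: float, col_f: float) -> float:
    """Sample an elevation grid (e.g. 1201 x 1201 SRTM) at fractional (row, col)."""
    r0, c0 = int(row_f), int(col_f)
    r1 = min(r0 + 1, len(grid) - 1)        # clamp at the grid edge
    c1 = min(c0 + 1, len(grid[0]) - 1)
    fr, fc = row_f - r0, col_f - c0
    top = grid[r0][c0] * (1 - fc) + grid[r0][c1] * fc
    bot = grid[r1][c0] * (1 - fc) + grid[r1][c1] * fc
    return top * (1 - fr) + bot * fr
```

A plot vertex's longitude/latitude maps to a fractional (row, col) within the one-degree file, and `bilinear` blends the four nearest SRTM posts.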
In the above embodiments provided in the present application, it should be understood that the disclosed software platform may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit may be only one logical function division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units and modules described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all units (for example, each functional unit, processor, memory, and the like) in each embodiment of the present invention may be integrated into one unit, each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those skilled in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
When the integrated unit of the present invention is implemented in the form of a software functional unit and sold or used as a separate product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The foregoing is merely a detailed description of specific embodiments of the invention and is not intended to limit the invention. Various alterations, modifications and improvements will occur to those skilled in the art without departing from the spirit and scope of the invention.

Claims (10)

1. An airborne synthetic vision system based on terrain real-time rendering, the airborne synthetic vision system comprising: the system comprises a data interface module, an airborne data module and a terrain rendering module;
the data interface module is used for receiving navigation indication information, speed indication information and attitude indication information from airborne equipment through the data interface unit;
the airborne data module is used for storing airborne data comprising terrain, obstacles and airport information;
the terrain rendering module is used for determining the current position of the airplane according to the navigation indication information, reading airborne data in a preset range with the current position as the center from the airborne data module, calculating and generating 3D graphs and symbols of the terrain, obstacles and the airport, rendering according to the attitude indication information, and outputting 3D picture frames generated by rendering calculation to one or more of the primary flight display PFD, the multifunctional flight display MFD and the head-up display HUD for display.
2. The system of claim 1, wherein the terrain rendering module divides the terrain data into blocks when generating the 3D terrain graphics, with the blocks (plots) serving as the basic data structure for rendering.
3. The system of claim 1, wherein the terrain rendering module comprises a terrain grading unit configured to set plots to different levels according to plot size and vertex count, and to calculate, from a plot's level, the distance threshold that determines whether the plot is loaded; wherein plots farther from the aircraft have fewer vertices than plots closer to the aircraft; and the centre point of a plot is set to the plot's longitude/latitude centre at an altitude of 0, this centre point being used as the position of the whole plot.
4. The synthetic vision system of claim 3, wherein the parcel classification unit adds sidebands to four edges of each parcel when two adjacent parcels are at different levels, the sidebands extending a distance along the edges of the parcel in the direction of the center of the earth.
5. The on-board synthetic vision system of claim 1, wherein the terrain rendering module comprises a plot management unit for gradually replacing the plots loaded for a location from low-level plots to higher-level plots as the aircraft approaches the location, unloading a low-level plot prior to loading its high-level replacements; as the aircraft flies away from the location, the loaded plots are gradually replaced from high-level plots back to lower-level plots, and at any time only the highest-level plot is kept loaded among all plots of that location whose distance to the aircraft is below the distance threshold.
6. The system of claim 5, wherein the parcel management unit is further configured to maintain in real time nine level-0 parcels arranged in a 3×3 grid around the current position of the aircraft, dynamically loading and/or unloading level-0 parcels as the aircraft moves, so that nine level-0 parcels always surround the aircraft and serve as the data source of the rendering queue.
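Maintaining the 3×3 neighbourhood can be done by diffing the wanted set of tile indices against the currently loaded set whenever the aircraft crosses a tile boundary. A sketch (the index convention is hypothetical; actual load/unload would hand these sets to the parcel loading unit):

```python
def nine_grid(center):
    """Indices of the 3x3 block of level-0 parcels around the aircraft,
    given the (row, col) index of the tile the aircraft is over."""
    r, c = center
    return {(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)}

def update_parcels(loaded, new_center):
    """Diff the loaded level-0 parcels against the wanted nine-grid;
    returns (to_load, to_unload) index sets."""
    wanted = nine_grid(new_center)
    return wanted - loaded, loaded - wanted

loaded = nine_grid((120, 284))
to_load, to_unload = update_parcels(loaded, (120, 285))  # aircraft moved east
print(len(to_load), len(to_unload))   # one column of 3 tiles in, one out
```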
7. The airborne synthetic vision system of claim 1, wherein the terrain rendering module further comprises a parcel loading unit configured to read terrain elevation information from a terrain data file, convert geographic coordinates to earth coordinates, generate the parcel vertex grid expressed in earth coordinates, generate the vertex normal vectors of the parcel, compute illumination from the vertex normals to produce vertex colors, and add the completed parcels to the rendering queue.
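The geographic-to-earth conversion in claim 7 is presumably the standard geodetic-to-ECEF transform. A sketch using WGS-84 ellipsoid parameters (the patent does not name the ellipsoid, so WGS-84 is an assumption):

```python
import math

# WGS-84 constants (assumed; the claim only says "earth coordinates")
WGS84_A = 6378137.0            # semi-major axis, metres
WGS84_E2 = 6.69437999014e-3    # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h_m):
    """Convert geodetic latitude/longitude/height to earth-centered,
    earth-fixed (x, y, z) coordinates in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + h_m) * math.sin(lat)
    return x, y, z

# On the equator at the prime meridian the result is (a, 0, 0).
print(geodetic_to_ecef(0.0, 0.0, 0.0))
```

The vertex normals and Lambert-style illumination of claim 7 would then be computed on this ECEF vertex grid before the parcel is queued.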
8. The airborne synthetic vision system of any one of claims 1 to 7, wherein the terrain rendering module uses the CPU to compute and generate the parcel vertices, performing for each vertex the conversion from geographic coordinates to earth coordinates, the normal-vector computation and the color computation, and uses the GPU to batch-render the data submitted by the CPU; when the parcel vertices are generated, a parcel visibility test is performed, and when a parcel's visibility is below a visibility threshold the parcel is not generated and no further computation is performed for it; during batch rendering, back-face culling and view-frustum clipping are performed in the primitive-assembly stage of the OpenGL rendering pipeline, discarding triangle back faces and primitives outside the view frustum.
9. The system according to claim 8, wherein the parcel visibility test performed by the terrain rendering module comprises selecting a parcel bounding volume and testing it against the view frustum; the selected bounding volume is a spherical-shell segment with eight vertices whose longitudes and latitudes are those of the four corner points (southwest, southeast, northwest and northeast) of the parcel, and whose heights take two values: a low value at altitude 0 m and a high value of 8000 m, slightly above the aircraft's service ceiling.
10. The system of claim 9, wherein, during the view-frustum test, the eight vertices of the parcel bounding volume are tested against each of the six clipping planes in turn; if all eight vertices lie outside any one of the clipping planes, the bounding volume is not visible and the parcel is discarded.
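Claims 9 and 10 describe a conservative culling test: a parcel is rejected only when all eight bounding-volume vertices lie on the outside of a single clipping plane. A sketch, representing each plane as (nx, ny, nz, d) so that a point p is inside when n·p + d ≥ 0 (the plane convention is an assumption, not stated in the patent):

```python
import numpy as np

def parcel_visible(bbox_vertices, frustum_planes):
    """bbox_vertices: (8, 3) corners of the parcel bounding volume.
    frustum_planes: iterable of (nx, ny, nz, d) rows; a point p is
    inside a plane when n . p + d >= 0.  Per claim 10, the parcel is
    culled only when every vertex is outside at least one plane."""
    v = np.asarray(bbox_vertices, dtype=float)
    for nx, ny, nz, d in frustum_planes:
        if np.all(v @ np.array([nx, ny, nz]) + d < 0.0):
            return False   # every corner outside this plane: cull the parcel
    return True

# Toy check: a unit cube tested against the single half-space x >= 2
cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)])
print(parcel_visible(cube, [(1.0, 0.0, 0.0, -2.0)]))   # False: fully outside
```

Note the test can keep parcels that are actually invisible (all vertices outside the frustum but no single plane excludes them all); the pipeline's own frustum clipping of claim 8 removes those remaining primitives.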
CN201911397507.4A 2019-12-30 2019-12-30 Airborne synthetic vision system based on terrain real-time rendering Pending CN111210515A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911397507.4A CN111210515A (en) 2019-12-30 2019-12-30 Airborne synthetic vision system based on terrain real-time rendering

Publications (1)

Publication Number Publication Date
CN111210515A true CN111210515A (en) 2020-05-29

Family

ID=70784178

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911397507.4A Pending CN111210515A (en) 2019-12-30 2019-12-30 Airborne synthetic vision system based on terrain real-time rendering

Country Status (1)

Country Link
CN (1) CN111210515A (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727681A (en) * 2008-10-30 2010-06-09 如临其境创意(上海)有限公司 Pyramid model based grid crack elimination algorithm for drawing massive terrains
CN103093484A (en) * 2013-01-05 2013-05-08 武汉中地数码科技有限公司 Integrated batch drawing method of remote-sensing image and vector data
CN104044745A (en) * 2013-03-12 2014-09-17 霍尼韦尔国际公司 Aircraft flight deck displays and systems and methods for enhanced display of obstacles in a combined vision display
CN104050689A (en) * 2014-06-10 2014-09-17 黄淮学院 Visual object two-dimensional cutting generation method in virtual scene
CN104376590A (en) * 2014-11-18 2015-02-25 武汉海达数云技术有限公司 Mass data circle-based indexing and space displaying method
CN105139451A (en) * 2015-08-10 2015-12-09 中国商用飞机有限责任公司北京民用飞机技术研究中心 HUD (head-up display) based synthetic vision guiding display system
CN105336003A (en) * 2015-09-28 2016-02-17 中国人民解放军空军航空大学 Three-dimensional terrain model real-time smooth drawing method with combination of GPU technology
CN105718481A (en) * 2014-12-05 2016-06-29 星际空间(天津)科技发展有限公司 Massive topographic data organization and release method
CN106373175A (en) * 2016-08-29 2017-02-01 北京像素软件科技股份有限公司 Terrain height graph data loading method
CN106856008A (en) * 2016-12-13 2017-06-16 中国航空工业集团公司洛阳电光设备研究所 A three-dimensional terrain rendering method for airborne synthetic vision
CN107888897A (en) * 2017-11-01 2018-04-06 南京师范大学 A kind of optimization method of video source modeling scene
CN107909538A (en) * 2017-12-07 2018-04-13 北京像素软件科技股份有限公司 Topographic data processing method and device
CN108108246A (en) * 2017-12-25 2018-06-01 中国航空工业集团公司洛阳电光设备研究所 A kind of terrain scheduling method for airborne Synthetic vision
CN108364330A (en) * 2018-02-06 2018-08-03 重庆强阳科技有限公司 A kind of virtual block coordinates mapping method and management method based on plane coordinate system
CN108445822A (en) * 2018-06-05 2018-08-24 成都赫尔墨斯科技股份有限公司 A kind of more member's aircraft avionics systems
CN109242967A (en) * 2018-08-07 2019-01-18 云南电网有限责任公司曲靖供电局 A kind of dimensional topography rendering method and device
CN109492070A (en) * 2018-11-06 2019-03-19 深圳航天智慧城市***技术研究院有限公司 A three-dimensional visualization platform system for city macro-scale scenes
CN109523552A (en) * 2018-10-24 2019-03-26 青岛智能产业技术研究院 Three-dimension object detection method based on cone point cloud
CN109636889A (en) * 2018-11-19 2019-04-16 南京大学 A kind of Large Scale Terrain model rendering method based on dynamic suture zone

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112381935A (en) * 2020-09-29 2021-02-19 西安应用光学研究所 Synthetic vision generation and multi-element fusion device
CN115221263A (en) * 2022-09-15 2022-10-21 西安羚控电子科技有限公司 Terrain preloading method and system based on route
CN115221263B (en) * 2022-09-15 2022-12-30 西安羚控电子科技有限公司 Terrain preloading method and system based on route

Similar Documents

Publication Publication Date Title
US6862501B2 (en) Method for producing 3D perspective view avionics terrain displays
US6782312B2 (en) Situation dependent lateral terrain maps for avionics displays
US9347793B2 (en) Synthetic vision systems and methods for displaying detached objects
US8362925B2 (en) Avionics display system and method for generating flight information pertaining to neighboring aircraft
CN111210516B (en) Software platform for integrated display control of avionics equipment
EP1748392B1 (en) Real-time conformal terrain rendering
KR101405891B1 (en) Reality display system of air inteligence and method thereof
CN110852952B (en) Large-scale terrain real-time drawing method based on GPU
US8059121B2 (en) Method and aircraft display system for generating three dimensional image
US10304242B1 (en) Transparent display terrain representation systems and methods
CN111210515A (en) Airborne synthetic vision system based on terrain real-time rendering
CN106856008B (en) Three-dimensional terrain rendering method for airborne synthetic vision
CN110866964A (en) GPU accelerated ellipsoid clipping map terrain rendering method
CN112330806B (en) Visual synthesis method and system based on low-power-consumption hardware platform
KR102012361B1 (en) Method and apparatus for providing digital moving map service for safe navigation of unmanned aerial vehicle
WO2024087764A1 (en) Evtol navigation synthetic visual method and system
US11112249B1 (en) Systems and methods for four-dimensional routing around concave polygon avoidances
Schafhitzel et al. Increasing situational awareness in DVE with advanced synthetic vision
CN111435358B (en) Design method for reducing terrain display jamming in forward-looking predictive warning
Shen et al. Simulation System of Aircraft Surveillance in Airport Terminal Area
Moller et al. Synthetic vision for improving flight control in night, poor visibility and adverse weather conditions
CN117135334A (en) Combined vision system based on airborne three-dimensional image engine and vision display method
CN114519946B (en) Air guide display method, device, equipment and storage medium
JP3431891B2 (en) Pseudo view generator for aircraft
CN111091617A (en) Aircraft accident prediction and three-dimensional visualization system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200529