CN108845802A - UAV cluster formation interactive simulation verification system and implementation method - Google Patents
- Publication number
- CN108845802A (publication number) · CN201810463947.4A (application number)
- Authority
- CN
- China
- Prior art keywords
- unity
- computer
- simulation
- interface
- driver
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Geometry (AREA)
- Computing Systems (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
The present invention relates to the fields of numerical simulation, virtual reality, brain-computer interfaces, human-computer interaction, network communication, and software development. Its objectives are to construct a lifelike virtual-reality simulation scene, to drive the demonstration with real-time UAV cluster formation simulation data, and to realize human intervention in the simulation demonstration through a variety of human-computer interaction devices, thereby achieving a man-in-the-loop simulation effect. To this end, the invention provides a UAV cluster formation interactive simulation verification system and implementation method comprising a distributed real-time simulator, a host, a scene computer (i.e. the client), display devices, and human-computer interaction devices. The simulator is composed of target computers responsible for running the real-time simulation programs that simulate the UAV cluster formation; the host runs the server software; the scene computer runs scene software developed with Unity; and the human-computer interaction devices realize human-computer interaction through integration with the scene software. The invention is mainly applied to UAV simulation scenarios.
Description
Technical field
The present invention relates to the fields of numerical simulation, virtual reality, brain-computer interfaces, human-computer interaction, network communication, and software development, and more particularly to a method for verifying UAV cluster formation control based on virtual simulation; specifically, a simulation and verification platform for UAV cluster formation control algorithms based on virtual reality (VR) and human-computer interaction technology.
Background art
Simulation technology occupies a critical position in the research field of UAV cluster formation. Research and development of UAV cluster formation is typically characterized by high experimental cost and long research cycles; simulation can be used to verify and evaluate the conclusions of early-stage research, and the system design can be gradually refined according to the simulation results, shortening the development cycle and saving development cost. According to the level of simulation, traditional simulation technology can be divided into full digital simulation and hardware-in-the-loop (HIL) simulation. In full digital simulation, the mathematical model and control algorithm of the research object are written as an executable computer program, and data, curves, and other outputs reflecting the simulation results are computed on a simulation computer. HIL simulation, building on full digital simulation, inserts actual components of the real system into the simulation loop, so the results obtained are closer to reality. However, both traditional simulation technologies have their limitations: digital simulation relies solely on data visualization and lacks intuitive results, while HIL simulation systems are extremely complex to build, with long development cycles and high cost. In today's aerospace research, multi-aircraft formation, human-machine coordination, and intelligence have become trends, placing new development demands on traditional simulation technology.
At the end of the last century, along with breakthroughs in computer graphics theory, virtual simulation technology came into increasingly wide use. Virtual simulation builds on traditional full digital simulation, using 3D graphics technology to simulate the real environment and the simulated objects, presenting the simulation results graphically and freeing the user from tedious data. Compared with full digital simulation, virtual simulation presents the simulation process as a three-dimensional visualization, making the results more intuitive; compared with HIL simulation, it makes full use of today's surplus computing power to model the simulation process virtually, greatly saving cost and shortening the development cycle. In addition, realizing man-in-the-loop simulation on the basis of human-computer interaction technology, so as to simulate human-machine cooperative systems and enhance the effect of simulation or training through improved immersion, is another major feature of virtual simulation. The U.S. Defense Advanced Research Projects Agency (DARPA) has long treated virtual simulation as a key project, successively developing military distributed simulation platforms such as SIMNET and JMASS since the 1990s. Other research institutions have also produced substantial results, such as the aircraft simulation control loop at the Dryden Flight Research Center, which realizes human-machine cooperative simulation, and the virtual Mars exploration system at NASA's Jet Propulsion Laboratory. Today, with network technology maturing and graphics rendering, perception, interaction, and VR/AR technologies increasingly becoming popular areas of the market and of scientific research, this intersection of disciplines is bringing new breakthroughs to virtual simulation.
Virtual reality (VR) technology has been widely applied to virtual simulation in recent years. Based on key technologies such as display, tracking, registration, and interaction, virtual reality immerses the user in a virtual environment. Realizing VR usually requires dedicated display hardware, the most common being the head-mounted display (HMD). In April 2016, the three major HMD manufacturers HTC, Sony, and Oculus put their PC-based HMD devices on open sale almost simultaneously, marking the formal entry of VR technology into everyday life; 2016 is accordingly called the "first year of VR". The rise of VR technology has also driven the development of novel interaction techniques. Built on rich sensing technology and devices, these techniques allow people to interact with computers through natural modalities such as voice, gestures, and body movements. In an immersive VR environment the conventional keyboard-and-mouse interaction becomes unfriendly, and interacting with the virtual scene through gestures and voice becomes what users expect; products such as Leap Motion and Fingo already track multi-degree-of-freedom hand poses well, and products represented by Baidu Speech and iFlytek realize speech recognition quite adequately. VR and interaction devices together provide the user with an immersive virtual simulation environment and realize man-in-the-loop operation, and the maturing of the hardware technology and its falling cost now make them ever easier to incorporate into a virtual simulation platform.
The brain-computer interface (BCI) is a new technical field of recent years. It connects human neural signals with computer programs and is the ultimate interaction mode of "thought control". Current BCI research concentrates on EEG (electroencephalography) implementations: the user's brain waves are acquired, and the computer identifies the information conveyed by the brain through advanced algorithms such as pattern recognition and machine learning. BCI is highly significant in the aerospace field. An ordinary pilot or UAV operator must first "think" with the brain and then pass the intent to the limbs to "act", whereas brain-controlled flight requires only the "thinking" to control the UAV, which in certain situations is an advantage far beyond traditional control. A virtual simulation platform combined with VR can well meet the needs of brain-controlled flight simulation, and the virtual scenes it builds can be used to train operators; integrating BCI into the virtual simulation platform is therefore of great significance.
In conclusion virtual emulation platform building is of great significance for the research of the methods of unmanned plane formation, control.
Visualization of Simulation Results based on VR has intuitive, can be in the feasibility of the abundant proof scheme of conceptual phase;Platform is man-machine
Interactive function allows researcher preferably to observe simulation result, and realizes people's assemblage on-orbit;Brain control is verified by integrated BCI to fly
The feasibility of row method keeps platform function in terms of man-machine coordination method simulating, verifying more perfect.
Summary of the invention
In order to overcome the deficiencies of the prior art, the present invention aims to construct a lifelike virtual-reality simulation scene, to drive the demonstration with real-time UAV cluster formation simulation data, and to realize human intervention in the simulation demonstration through a variety of human-computer interaction devices, achieving a man-in-the-loop simulation effect. To this end, the technical solution adopted by the present invention, a UAV cluster formation interactive simulation verification system, is divided into five parts: a distributed real-time simulator, a host, a scene computer (i.e. the client), display devices, and human-computer interaction devices. The simulator is composed of target computers responsible for running the real-time simulation programs that simulate the UAV cluster formation; the host runs the server software, responsible for managing the download and execution of the simulation programs, online parameter tuning, and the forwarding of status data; the scene computer runs scene software developed with Unity, receives the driving data from the server, and displays the virtual simulation scene in real time on a VR head-mounted display and a large screen; the human-computer interaction devices include a voice input device, a gesture recognition device, and a brain-computer interface device, all connected to the scene computer and realizing human-computer interaction through integration with the scene software. Further, the scene computer is provided with a three-dimensional scene resource module for developing UAV cluster formation task scenes with the Unity engine, specifically including the following submodules:
(1) 3D model library submodule: the 3D resource library is built with external software in combination with Unity. The 3D models required by the UAV scene are built with the 3D Studio Max modeling software, and variants at different LOD (level of detail) levels are generated; in Unity, an LOD Group component is added to implement model LOD. LOD is a technique that controls the level of detail of a model according to its distance from the viewpoint;
(2) real terrain environment submodule: the 3D terrain resources are developed with WM (World Machine) software. Based on the Unity editor extension mechanism, a terrain resource construction submodule is designed and developed that automatically generates block terrain through a graphical interface;
(3) dynamic management submodule: 3D resources, including the 3D models and the terrain environment, are packaged in the Unity editor in prefab form, and this module realizes their calling, loading, and unloading within the global program flow. The core of the module is multithreading; in the Unity development environment, multithreading takes the form of coroutines and asynchronous methods.
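The coroutine-based loading described in (3) can be illustrated with an analogous sketch in Python's asyncio: as with a Unity coroutine yielding back to the render loop each frame, each load operation yields control to the event loop instead of blocking it. The prefab names and the structure of the loader are assumptions for illustration, not Unity API calls.

```python
import asyncio

async def load_prefab(name, loaded):
    # Simulate an I/O-bound asset load that yields control to the event loop,
    # the way a Unity coroutine yields back to the render loop each frame.
    await asyncio.sleep(0)
    loaded.append(name)

async def load_scene(prefabs):
    loaded = []
    # Launch all loads concurrently; the "main loop" never blocks on one asset.
    await asyncio.gather(*(load_prefab(p, loaded) for p in prefabs))
    return loaded

result = asyncio.run(load_scene(["terrain_block_3", "uav_squad", "sky"]))
```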
Human-computer interaction module
This module is implemented to provide human-computer interaction functions for the user. It builds a three-dimensional graphical UI (user interface), integrates the VR head-mounted display and the gesture recognition device into the client's hardware environment, and realizes in Unity the wrapping of the underlying device interfaces and interaction commands, specifically including the following submodules:
(1) three-dimensional UI submodule: Hovercast is a menu-style graphical interface library for virtual-reality environments; it combines well with the VR head-mounted display and the gesture interaction device and supports development under Unity. Unity's built-in UGUI interface library also provides many basic elements. The VR interactive interface is designed as a combination of the two, Unity's built-in UGUI and Hovercast, and the user participates in the virtual simulation process through the VR display and the gesture interaction device;
(2) VR display integration: an HTC Vive virtual-reality head-mounted display is connected to the scene computer over USB for data exchange; on the software side, the SteamVR Unity Plugin, a development plug-in for the Unity environment, is used, building on Unity's good cross-platform characteristics;
(3) gesture interaction submodule: a Leap Motion, a portable gesture recognition device, is used as the user's gesture input device. The Orion SDK provided with Leap Motion supports development under Unity and can recognize a variety of hand movements. This module designs gesture interaction commands according to the functions of the UI, wraps the low-level Orion recognition interface, and forms a reusable secondary development interface that also simplifies calls from the three-dimensional UI submodule;
(4) voice control submodule: voice interaction between the user and the application is realized with the Baidu Speech REST API. Through this module, which integrates online speech recognition, the platform user can simulate voice control and command of the UAVs;
(5) brain-computer interface submodule: the BCI used is of the EEG type. The device acquires EEG signals and delivers them to a Matlab data analysis platform for processing; the processed result is transferred to the scene application over the UDP (User Datagram Protocol) communication protocol to realize brain-controlled flight simulation. The brain-computer interface module implements two paradigms, SSVEP (steady-state visual evoked potential) and motor imagery. The principle of SSVEP is that, under a visual stimulus of a fixed frequency, the visual cortex of the brain produces a continuous response related to the stimulation frequency; by acquiring this response and matching it against the stimulus frequencies, the information conveyed by the brain can be identified. This module builds, within the scene application, an SSVEP stimulation interface containing digits flickering at different rates and maps each digit to a different control command, thereby realizing brain-controlled flight simulation.
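The frequency-matching step of the SSVEP paradigm described in (5) can be sketched as a single-bin DFT power comparison: each on-screen digit flickers at its own frequency, and the digit whose frequency carries the most power in the acquired signal is selected. The sampling rate, the digit-to-frequency mapping, and the synthetic signal below are illustrative assumptions, not the patent's actual parameters.

```python
import math

FS = 250.0                                          # sampling rate in Hz (an assumption)
DIGIT_FREQS = {0: 8.0, 1: 10.0, 2: 12.0, 3: 15.0}   # digit -> flicker frequency in Hz

def band_power(signal, freq):
    """Signal power at one frequency: a single-bin discrete Fourier transform."""
    re = sum(x * math.cos(2 * math.pi * freq * n / FS) for n, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * n / FS) for n, x in enumerate(signal))
    return re * re + im * im

def classify_ssvep(signal):
    """Pick the digit whose flicker frequency dominates the acquired signal."""
    return max(DIGIT_FREQS, key=lambda d: band_power(signal, DIGIT_FREQS[d]))

# Synthetic check: a clean 12 Hz oscillation (2 s at FS) should map to digit 2.
eeg = [math.sin(2 * math.pi * 12.0 * n / FS) for n in range(500)]
```

A real implementation would work on noisy multi-channel EEG and typically use canonical correlation analysis rather than a raw power comparison, but the matching principle is the same.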
The network communication module is based on the transport-layer TCP/IP protocol. Application-layer communication middleware is developed with the Socket class of the .NET Framework in asynchronous mode; the application-layer protocol formats for the two parts above are designed, and the packing and unpacking of data at the client and server ends are realized respectively. The server end exposes an open interface so that secondary development can be carried out on simulation platforms based on the xPC host-target machine framework; for network transmission, the application-layer protocol packets must be serialized and deserialized.
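The packing/unpacking and serialization just described can be sketched as a length-prefixed frame: a fixed header carrying a message type and payload length, followed by a serialized payload. The field layout and JSON payload encoding here are illustrative assumptions, not the patent's actual protocol format (which the text says is customized at the application layer).

```python
import json
import struct

# Fixed application-layer header: message type (uint16) + payload length
# (uint32), big-endian. Hypothetical layout for illustration only.
HEADER = struct.Struct("!HI")

def pack(msg_type, payload):
    body = json.dumps(payload).encode("utf-8")       # serialize the payload
    return HEADER.pack(msg_type, len(body)) + body   # prepend the header

def unpack(data):
    msg_type, length = HEADER.unpack_from(data)      # parse the header
    body = data[HEADER.size:HEADER.size + length]
    return msg_type, json.loads(body)                # deserialize the payload

frame = pack(1, {"uav_id": 3, "pos": [10.0, 5.0, 80.0]})
```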
The UAV cluster formation interactive simulation verification method is realized with a distributed real-time simulator, a host, a scene computer (i.e. the client), display devices, and human-computer interaction devices. The simulator is composed of target computers responsible for running the real-time simulation programs that simulate the UAV cluster formation; the host runs the server software, responsible for managing the download and execution of the simulation programs, online parameter tuning, and the forwarding of status data; the scene computer runs scene software developed with Unity, receives the driving data from the server, and displays the virtual simulation scene in real time on a VR head-mounted display and a large screen; the human-computer interaction devices include a voice input device, a gesture recognition device, and a brain-computer interface device, all connected to the scene computer and realizing human-computer interaction through integration with the scene software. Further, the scene computer is provided with a three-dimensional scene resource module for developing UAV cluster formation task scenes with the Unity engine, specifically as follows:
(1) 3D model library construction: the 3D resource library is built with external software in combination with Unity. The 3D models required by the UAV scene are built with the 3D Studio Max modeling software, and variants at different LOD levels are generated; in Unity, an LOD Group component is added to implement model LOD. LOD is a technique that controls the level of detail of a model according to its distance from the viewpoint;
(2) real terrain environment construction: the 3D terrain resources are developed with WM (World Machine) software. Based on the Unity editor extension mechanism, a terrain resource construction submodule is designed and developed that automatically generates block terrain through a graphical interface;
(3) dynamic management: 3D resources, including the 3D models and the terrain environment, are packaged in the Unity editor in prefab form, and calling, loading, and unloading are realized within the global program flow on the basis of this module. The core of the module is multithreading; in the Unity development environment, multithreading takes the form of coroutines and asynchronous methods.
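The LOD technique named in step (1) above reduces to choosing a model variant from the viewpoint distance, which Unity's LOD Group component does automatically once thresholds are set. A minimal sketch of that selection logic, with hypothetical model names and distance thresholds:

```python
# Distance thresholds and model names are hypothetical illustrations.
LOD_LEVELS = [
    (50.0, "uav_lod0"),    # high-detail mesh up to 50 m from the viewpoint
    (200.0, "uav_lod1"),   # medium detail up to 200 m
    (1000.0, "uav_lod2"),  # low detail up to 1 km
]

def select_lod(distance):
    """Return the model variant to render at a given viewpoint distance."""
    for max_dist, model in LOD_LEVELS:
        if distance <= max_dist:
            return model
    return "culled"  # beyond the last threshold the object is not drawn

level = select_lod(120.0)
```

(Unity's LOD Group actually works on screen-relative size rather than raw distance, but the thresholding idea is the same.)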
For human-computer interaction, a three-dimensional graphical UI must be built, with the VR head-mounted display and the gesture recognition device integrated into the client's hardware environment; the wrapping of the underlying device interfaces and interaction commands is realized in Unity, specifically as follows:
(1) three-dimensional UI submodule: Hovercast is a menu-style graphical interface library for virtual-reality environments; it combines well with the VR head-mounted display and the gesture interaction device and supports development under Unity. Unity's built-in UGUI interface library also provides many basic elements. The VR interactive interface is designed as a combination of the two, Unity's built-in UGUI and Hovercast, and the user participates in the virtual simulation process through the VR display and the gesture interaction device;
(2) VR display integration: an HTC Vive virtual-reality head-mounted display is connected to the scene computer over USB for data exchange; on the software side, the SteamVR Unity Plugin, a development plug-in for the Unity environment, is used, building on Unity's good cross-platform characteristics;
(3) gesture interaction submodule: a Leap Motion, a portable gesture recognition device, is used as the user's gesture input device. The Orion SDK provided with Leap Motion supports development under Unity and can recognize a variety of hand movements. This module designs gesture interaction commands according to the functions of the UI, wraps the low-level Orion recognition interface, and forms a reusable secondary development interface that also simplifies calls from the three-dimensional UI submodule;
(4) voice control submodule: voice interaction between the user and the application is realized with the Baidu Speech REST API. Through this module, which integrates online speech recognition, the platform user can simulate voice control and command of the UAVs;
(5) brain-computer interface submodule: the BCI used is of the EEG type. The device acquires EEG signals and delivers them to a Matlab data analysis platform for processing; the processed result is transferred to the scene application over the UDP communication protocol to realize brain-controlled flight simulation. The brain-computer interface module implements two paradigms, SSVEP and motor imagery. The principle of SSVEP is that, under a visual stimulus of a fixed frequency, the visual cortex of the brain produces a continuous response related to the stimulation frequency; by acquiring this response and matching it against the stimulus frequencies, the information conveyed by the brain can be identified. This module builds, within the scene application, an SSVEP stimulation interface containing digits flickering at different rates and maps each digit to a different control command, thereby realizing brain-controlled flight simulation.
The features and beneficial effects of the present invention are as follows:
The present invention builds an interactive UAV cluster formation virtual simulation platform that, while presenting a lifelike immersive virtual simulation of UAV cluster formation, realizes human participation in the simulation process through interactive control by voice, gesture, and brain-computer interface. It caters to current characteristics of UAV cluster formation such as human-machine coordination and swarm intelligence, and has high application value. Compared with traditional UAV formation visual simulation, the core advantages of the invention are as follows:
The present invention can bring corresponding benefits in multiple fields of application, both civil and military. In the military field, it can be used for human-machine integrated combat simulation of UAV cluster formations, providing a low-cost rapid prototyping system for simulation and verification of UAV cluster cooperative combat tasks and good verification support for the design and development of such systems, thereby accelerating the development of prototype systems. On the civil side, it can, according to the needs of civil departments, undertake tasks such as detection, monitoring, and search over dangerous areas, for example large-area maritime search and rescue. In addition, the invention is of great significance for research and development of UAV cluster formation control methods. The platform provides a real-time simulation environment for UAV cluster formation task allocation, trajectory optimization, and control algorithms, in which the real-time performance, validity, and feasibility of an algorithm can be tested and verified. The simulation results not only provide virtual verification and demonstration support for the early design of UAV cluster formation systems, but also provide a good experimental platform for the development of the theory of UAV cluster formation systems.
Brief description of the drawings:
Fig. 1: Platform architecture.
Fig. 2: Human-computer interaction flow chart.
Fig. 3: Client program logic flow chart.
Fig. 4: Scene object dynamic management logic flow chart.
Fig. 5: Flow chart of the Socket-based network communication module.
Fig. 6: Simulink simulation model block diagram of the UAV formation control method.
Fig. 7: VR interaction effect.
Fig. 8: Voice control effect.
Fig. 9: Brain-computer interface interaction effect.
Fig. 10: UAV formation visual simulation effect.
Specific embodiments
The purpose of the present invention is to construct a lifelike virtual-reality simulation scene, to drive the demonstration with real-time UAV cluster formation simulation data, and to realize human intervention in the simulation demonstration through a variety of human-computer interaction devices, achieving a man-in-the-loop simulation effect. Through the platform, scene demonstration and verification of rapid prototype designs for UAV cluster formation can be realized, and the validity of UAV control algorithms can be tested under concrete task scenes; at the same time, a real-time simulation environment is built for UAV operators and commanders, accelerating the development of UAV control systems as a whole and ultimately saving economic and time costs.
The platform's network system adopts a C/S architecture and realizes data sharing over a local area network. The software architecture is designed on the principles of modularity and reusability: the communication protocol is wrapped at the application layer, with the protocol format and a programming interface provided, so the server end can be deployed in any real-time simulation environment; the client provides a rich virtual scene library, and users can configure demonstration scenes according to their simulation needs. Taking visual simulation application demands as its core, the present invention combines a variety of advanced technologies, including virtual reality, brain-computer interfaces, gesture recognition, speech recognition, and network communication, to realize an interactive UAV cluster formation simulation and verification platform with the following functions and characteristics:
1. Distributed real-time simulation and verification environment
The present invention builds a real-time simulation environment on an xPC host-target distributed architecture and runs the UAV cluster formation real-time simulation programs. A local network is constructed to realize platform data sharing, and network communication middleware with a C/S structure is designed on object-oriented principles. System network I/O resources are invoked through Socket calls, and a language- and platform-independent network data protocol is customized on the Protobuf framework, meeting development demands such as real-time status data transmission and the realization of interactive simulation functions.
2. Large-scale virtual scene construction
The present invention realizes a large-scale virtual scene construction method on the Unity engine, which, combined with the immersive VR experience, makes the virtual environment more lifelike. Three-dimensional models of various UAVs, buildings, natural scenery, and so on are built with 3ds Max, imported into Unity, and edited to form a model library. Terrain such as mountains, rivers, plains, plateaus, and lakes is built with World Machine, forming a variety of terrain environment resource libraries. To realize a wide terrain environment close to reality, a large-scale scene management scheme is designed and implemented that dynamically calls the resources a scene needs, improving the performance of the simulation application. The rich scene library that results allows simulation and verification to be carried out under many scenarios.
3. Multi-view VR visual display
The present invention realizes the visual display with the HTC virtual reality (VR) helmet product Vive. The greatest feature of a VR display is the immersive sense of participation it brings: in the present invention it gives the observer an almost first-hand impression of the UAVs' flight states and the virtual environment. With the development of computer graphics software and hardware, VR devices represented by the HMD (head-mounted display) have gradually matured, and the HTC Vive used in the present invention is a representative example. In addition, in order to demonstrate the simulation process comprehensively, the present invention designs multi-view observation and view-switching functions around the VR helmet.
4. Human-computer interaction based on brain control, gesture, and voice
The present invention designs many UI elements in the virtual scene display, including a dynamic three-dimensional graphical user interface and human-computer interaction functions such as a sand table and view switching, realized on interface libraries such as UGUI and Hovercast. The interaction signal input devices are a BCI (brain-computer interface) and a Leap Motion; together with Baidu Speech, the user completes the interaction with the simulation application through brain control combined with gestures, realizing man-in-the-loop operation.
A BCI device acquires human brain waves and, through analysis and processing of the signals, interprets the message carried by the brain's neural signals; it is the ultimate interaction device, realizing control by "thought". The BCI device in the present invention is a 64-channel wireless digital EEG acquisition system from the Neuracle company, used with a Matlab-based data acquisition and real-time analysis platform. The platform's Matlab communication module is extended to realize an online brain-computer interface system based on the SSVEP and motor imagery principles, thereby realizing brain-controlled human-computer interaction.
The present invention develops the gesture-interaction functions on the Leap Motion gesture-recognition device. Compared with brain-computer interface technology, gesture recognition is more mature. Leap Motion can track and recognize both hands in 26 degrees of freedom, diversifying the interaction gestures available to users of the scene application. The device achieves fully bare-hand interaction and combines well with the VR headset: the platform user needs no peripheral (such as a handheld controller) — wearing the headset and reaching out both hands is enough to roam the virtual world smoothly. In addition, online recognition through the Baidu speech REST API provides a voice-interaction mode, realizing voice control.
The interactive UAV swarm formation simulation and verification platform consists of three parts: a three-dimensional scene resource module, a human-computer interaction module, and a network communication module. The technical solution of each module is as follows:
1. Three-dimensional scene resource module
The present invention develops the UAV swarm formation task scene on the Unity engine. Unity is a 3D game integrated development environment by Unity Technologies; its outstanding features, and the advantages it brings to the development of the present invention, include a powerful extensible editor, top-tier graphics rendering performance, leading multi-platform support, and an efficient development workflow. This module builds the three-dimensional models and terrain resources the scene requires, performs secondary editing and assembly in Unity, and develops a dynamic-management submodule for scene objects.
(1) 3D model library construction: The 3D resource library is built with external software combined with Unity. The 3D models the scene requires, such as the UAVs, are created in the 3D Studio Max modeling software, and models at different LOD (level-of-detail) ranks are generated; adding an LOD Group component in Unity implements model LOD. LOD is a technique that controls a model's level of detail according to viewpoint distance; it effectively reduces the vertices and polygons the GPU must render, improving program performance.
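The distance-based selection that the LOD Group component performs can be sketched as follows. The distance thresholds and model names are illustrative assumptions, not values from the patent:

```python
# Hypothetical sketch of the LOD idea: pick a model variant from the
# camera-to-object distance.  Thresholds and mesh names are assumed.
import math

LOD_LEVELS = [
    (50.0,  "uav_high"),    # within 50 m: full-detail mesh
    (200.0, "uav_medium"),  # within 200 m: reduced mesh
    (math.inf, "uav_low"),  # beyond that: coarse mesh
]

def select_lod(camera_pos, object_pos):
    """Return the mesh whose distance band contains the object."""
    dist = math.dist(camera_pos, object_pos)
    for threshold, model in LOD_LEVELS:
        if dist <= threshold:
            return model
    return LOD_LEVELS[-1][1]

print(select_lod((0, 0, 0), (30, 0, 40)))    # distance 50 -> uav_high
print(select_lod((0, 0, 0), (300, 0, 400)))  # distance 500 -> uav_low
```

Because only the coarse mesh is submitted for distant UAVs, the GPU vertex and polygon load drops as the formation spreads out, which is the performance gain the text describes.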
(2) Realistic terrain environment construction: The 3D terrain resources are developed with World Machine (WM). WM can produce the rich landforms the scene requires — mountains, rivers, plateaus, plains — and provides lifelike natural effects such as erosion and terracing, while supporting import of the terrain resources into Unity. To make the scene more realistic, static scenery such as trees and grass is added with the Unity Terrain Engine component; shader techniques realize the dynamic water effects needed for rivers and lakes; and simulated weather effects such as wind, rain and snow are added to the scene. Because large-scale realistic terrain consumes enormous memory, the terrain resources are built in blocks and managed dynamically at runtime. The block-terrain textures are produced with the Splatmap technique, blending multiple texture colors to realize environment colors such as snow mountain, meadow and rock. For this block-terrain construction process, the Unity editor is extended with a terrain-resource construction submodule that automatically generates block terrain through a graphical interface.
(3) Dynamic-management submodule: As noted above, the scene is composed of objects, and objects depend on resources; this submodule realizes dynamic management of scene objects and their resources. The benefit of dynamic management is that resources are loaded when needed and unloaded from memory when no longer needed, saving memory, improving program performance, and enhancing the adaptability of the platform. The 3D resources above are packaged as prefabs in the Unity editor, and this module calls, loads and unloads them throughout the program flow. The core of the module is multithreading: opening new threads alongside the main thread to execute tasks in parallel, making full use of a multi-core CPU and preventing resource management from blocking the program's main logic. In the Unity development environment, multithreading takes the form of coroutines and asynchronous methods.
2. Human-computer interaction module
The hardware the human-computer interaction module relies on — the voice-input device, gesture-recognition device, brain-computer interface device and VR device — operates attached to the scene computer. The corresponding software tools — the Baidu speech REST API, Orion SDK, EEG data-analysis platform and SteamVR — are integrated into the scene software in different forms, and the functions each submodule requires are developed on them. The Baidu speech REST API, Orion SDK and SteamVR are all integrated into Unity as plug-ins during development and released together with the software; the EEG data-analysis platform software is provided by Neuracle, runs independently on the scene computer, and exchanges data with the scene software at runtime to realize the brain-computer interface module.
This module provides human-computer interaction for the user: it builds a three-dimensional graphical UI, integrates the VR display device and the gesture-recognition device into the client's hardware environment, and encapsulates the underlying device interfaces and interaction commands in Unity.
(1) Three-dimensional UI submodule: Hovercast is a menu-style graphical interface library for virtual-reality environments; it combines well with VR displays and gesture-interaction devices and supports development in Unity. Unity's built-in UGUI interface library also provides many basic elements. The VR interaction interface is designed on the combination of the two, allowing the user to participate in the virtual simulation through the VR display and the gesture-interaction device.
(2) VR display integration: The HTC Vive head-mounted display used in the present invention connects to the scene computer over USB; on the software side, the SteamVR Unity Plugin provides development support in the Unity environment, and with Unity's good cross-platform characteristics the project can easily be output to the Vive display. The device tracks head movement and, through a virtual camera, realizes the user's view transformation in the VR world.
(3) Gesture-interaction submodule: The present invention uses Leap Motion as the user's gesture-input device. Leap Motion is a portable gesture-recognition device that tracks both hands in a total of 26 degrees of freedom; mounted on the front of the headset, it forms a large gesture-tracking region. The Orion SDK provided with Leap Motion supports development in Unity and recognizes various hand motions. This module designs the gesture-interaction commands according to the UI functions and wraps the low-level Orion recognition interface into a reusable secondary-development interface, which also simplifies calls from the three-dimensional UI submodule.
(4) Voice-control submodule: The present invention realizes voice interaction between the user and the application on the Baidu speech REST API. The REST API is the online speech-recognition development framework provided by Baidu Speech; it gives developers a general HTTP interface to the functions Baidu Speech offers. Compared with other speech-control systems, the REST API has the advantages of low development cost and high efficiency: no SDK development kit needs to be loaded and no other files installed. Through this module's integrated online speech recognition, the platform user can simulate voice control and command of the UAVs.
(5) Brain-computer interface submodule: The BCI used in the present invention is of the EEG type. The device acquires EEG signals and delivers them to the Matlab data-analysis platform for processing; the processed result is transferred to the scene application over the UDP protocol, realizing brain-controlled flight simulation. An EEG-type BCI depends on a paradigm — the brain must produce the BCI's input EEG signals in a prescribed way — and the brain-computer interface module implements two paradigms: SSVEP and motor imagery. The principle of SSVEP is that, under a visual stimulus of fixed frequency, the brain's visual cortex produces a continuous response related to the stimulation frequency; by acquiring that response and matching it against the stimulus frequencies, the information the brain transmits is identified. This module builds, in the scene application, an SSVEP stimulus interface containing numbers flickering at different frequencies; each number corresponds to a different control command, realizing brain-controlled flight simulation. Motor imagery needs no stimulus interface like SSVEP's, so the module only needs to implement the different control commands. Realizing motor imagery requires acquiring a large amount of the user's EEG data over many experiments and performing deep learning on the BCI data-analysis platform, finally obtaining a good control-command recognition rate.
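The SSVEP matching principle — pick the stimulus frequency that dominates the cortical response — can be sketched in a few lines. The candidate frequencies, sample rate and command map below are assumptions for illustration; the actual decoding runs on the Matlab analysis platform:

```python
# Illustrative SSVEP decoder: the occipital response oscillates at the
# stimulus frequency, so the command is the candidate frequency with the
# most power.  FS and the COMMANDS map are assumed, not from the patent.
import cmath, math

FS = 250.0                                                # sampling rate, Hz
COMMANDS = {8.0: "climb", 10.0: "hold", 12.0: "descend"}  # assumed mapping

def band_power(signal, freq, fs=FS):
    """Magnitude of `signal` at `freq` via a single-bin Fourier sum."""
    acc = sum(x * cmath.exp(-2j * math.pi * freq * k / fs)
              for k, x in enumerate(signal))
    return abs(acc) / len(signal)

def decode_ssvep(signal, commands=COMMANDS):
    """Return the command whose stimulus frequency dominates the response."""
    best = max(commands, key=lambda f: band_power(signal, f))
    return commands[best]

# Synthetic 1-second response locked to the 10 Hz stimulus, plus an offset.
eeg = [math.sin(2 * math.pi * 10.0 * k / FS) + 0.2 for k in range(int(FS))]
print(decode_ssvep(eeg))  # -> hold
```

A real system would additionally filter, average over channels and apply a confidence threshold before emitting a control command.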
3. Network communication module
The network communication module is based on the transport-layer TCP/IP protocol suite, with the application-layer communication middleware developed on the .NET framework's Socket class in asynchronous mode. An application-layer protocol format is designed for both parts, realizing packing and unpacking of data at the client and server ends. The server end exposes an open interface to support secondary development of the platform on the xPC host-target simulation architecture. Network transmission requires serializing and deserializing the application-layer protocol packets, which this platform realizes with the Protobuf framework developed by Google. Its first benefit is speed: compared with common format languages such as JSON and XML it offers superior real-time performance. Second, it supports many data types, so users can define different packet types for the platform's varying simulation needs. Moreover, Protobuf is independent of the programming language and development environment used, greatly enhancing the applicability of the platform software.
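The packing/unpacking step can be sketched with a minimal application-layer frame: a message-type id and payload length in a fixed header, followed by the serialized payload. The field layout is an assumption for illustration; the platform itself serializes payloads with Protobuf rather than this hand-rolled format:

```python
# Minimal application-layer framing sketch (layout assumed): a uint16
# message type and uint32 payload length prefix the payload bytes.
import struct

HEADER = struct.Struct("!HI")  # network byte order: type, payload length

def pack(msg_type, payload):
    """Prefix `payload` bytes with the message type and its length."""
    return HEADER.pack(msg_type, len(payload)) + payload

def unpack(frame):
    """Split a frame back into (msg_type, payload), checking the length."""
    msg_type, length = HEADER.unpack_from(frame)
    payload = frame[HEADER.size:HEADER.size + length]
    if len(payload) != length:
        raise ValueError("truncated frame")
    return msg_type, payload

frame = pack(3, b"set_formation=wedge")
print(unpack(frame))  # -> (3, b'set_formation=wedge')
```

The explicit length field is what lets the receiver re-segment messages on a TCP byte stream, where packet boundaries are not preserved.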
The invention is further described below in conjunction with the accompanying drawings.
FIG. 1 is a schematic diagram of the platform architecture of the invention. In hardware, the virtual simulation platform divides into five parts: the xPC distributed real-time simulator, the xPC host, the high-performance scene computer (client), the display devices, and the human-computer interaction devices. The simulator, composed of xPC target machines, runs the real-time simulation program that simulates the UAV swarm formation. The xPC host runs the server software and manages the downloading and running of the simulation program, online parameter tuning, and status-data forwarding. The scene computer runs the Unity-based scene software and, driven by data from the server, displays the virtual simulation scene in real time on the VR headset and the large screen. The human-computer interaction devices — the voice-input device, gesture-recognition device and brain-computer interface device — connect to the scene computer and realize human-computer interaction through integration with the scene software.
Fig. 2 shows the platform's human-computer interaction workflow; this part realizes human-in-the-loop simulation through the brain-computer interface, gestures and voice. The client scene software presents the simulation through the virtual scene and graphical interface; the user receives this feedback and interacts with the simulation through the interaction functions the interface provides. The brain-computer interface part realizes brain-machine interaction interfaces based on the SSVEP and motor-imagery paradigms: the user receives the interface's visual stimulus and produces a response signal, which the BCI device acquires and delivers to the Matlab data-analysis platform for processing. The gesture-recognition part obtains real-time hand movements through Leap Motion and derives gesture control commands through the Orion recognition module. The voice-control part acquires the voice signal through input devices compatible with the VR hardware, such as the microphone, and delivers it to the Baidu speech REST platform for analysis. The control commands obtained by these three processes are packaged by the network module and sent to the server end. The server-side network module first parses the packet according to the protocol, then delivers the resulting command to the underlying real-time simulation environment for online parameter tuning (one of the basic functions of the xPC real-time simulation environment), changing the relevant parameters of the simulation model according to the user's command and thereby realizing interactive flight simulation. With careful design of the control commands, the interactive functions of the platform can fully simulate an operator's or commander's control of the simulated UAV swarm formation.
Fig. 3 shows the logic flow of the client (scene) software. The client software divides its tasks into three parallel threads: the main thread, the UI thread and the network thread. The main thread is responsible for all program initialization, the main logic, and state updates; the UI thread handles user input; the network thread exchanges data with the server end. After the program starts, the main thread initializes the resources, scene and threads, then starts the first frame. The workflow of each frame is:
(1) Receive the messages sent by the other threads, including interaction requests from the UI thread and data packets from the network thread.
(2) The program's main logic processes the received messages and returns the proper responses, including outputting results for the UI thread to display and delivering user requests to the network thread, which forwards them to the server end.
(3) Update the state of the simulation objects, including position and attitude; frame rendering is realized by the underlying Unity engine. This ends the frame's work; if the user has not issued an exit command, return to step (1) and continue with the next frame.
Fig. 4 shows the logic flow of scene-object dynamic management — the workflow of the dynamic-management submodule within one main-thread frame. The core of the module is an object pool made of three queues: the Request queue, the Loaded queue and the Remove queue. The module is implemented as a C# script in Unity; each queue uses a Dictionary data structure, whose fast lookup and add/delete operations meet the needs of the dynamic-management module. After the main thread enters the module's working stage, the workflow is as follows:
(1) Compute the objects to be loaded, create them, put them into the Request queue, and start loading their resources asynchronously. Because the loading is asynchronous, the current thread is not blocked: the program goes on to compute which scene objects no longer need to be shown and moves them from the Loaded queue into the Remove queue.
(2) Query the elements of the Request queue in order. If an element's resource load has completed, instantiate the object and move it to the Loaded queue; if not, skip it and query the next, until the whole Request queue has been queried (each element queried once).
(3) Remove all elements of the Remove queue, destroying the objects, unloading their resources and releasing memory. This ends the frame's work; the next frame again starts from (1).
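The three-queue object pool of Fig. 4 can be sketched as below, with each queue held in a dictionary as in the C# implementation. The class and field names, and the stand-in for the asynchronous loader, are illustrative assumptions:

```python
# Sketch of the Fig. 4 object pool: three dictionaries, stepped once per
# main-thread frame.  The "ready" flag stands in for the async loader.
class ScenePool:
    def __init__(self):
        self.request = {}   # id -> object whose resources are loading
        self.loaded = {}    # id -> instantiated scene object
        self.remove = {}    # id -> object queued for destruction

    def frame(self, wanted_ids):
        """One main-thread frame: steps (1)-(3) described above."""
        # (1) queue newly wanted objects; retire no-longer-wanted ones.
        for oid in wanted_ids:
            if oid not in self.loaded and oid not in self.request:
                self.request[oid] = {"id": oid, "ready": False}
        for oid in list(self.loaded):
            if oid not in wanted_ids:
                self.remove[oid] = self.loaded.pop(oid)
        # (2) promote objects whose (simulated) async load has finished.
        for oid, obj in list(self.request.items()):
            obj["ready"] = True          # stand-in for the async loader
            if obj["ready"]:
                self.loaded[oid] = self.request.pop(oid)
        # (3) destroy everything queued for removal, freeing resources.
        self.remove.clear()

pool = ScenePool()
pool.frame({"uav1", "uav2"})   # both load and instantiate
pool.frame({"uav2"})           # uav1 is retired, uav2 stays resident
print(sorted(pool.loaded))     # -> ['uav2']
```

Dictionary lookups keep every per-frame membership test O(1), which is the property the text credits to the Dictionary data structure.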
Fig. 5 shows the flow of the network communication module based on Socket sockets. The module divides into two parts — user-request response and real-time simulation data transfer — each completed jointly by the client and the server end. The flows are as follows:
(1) User-request response: This part is based on the transport-layer TCP protocol; the connection between client and server is established by binding sockets. The client first receives the upper-layer user's message, packs it according to the application-layer protocol and forwards it to the server; the server unpacks and analyzes the packet, fetches the required data, packs it and returns it to the client. This loop repeats until the user produces an exit message and the connection is closed.
(2) Real-time simulation data transfer: Because of its real-time requirement, this part is realized on the transport-layer UDP protocol. The client first sends the server a request to start the simulation; after confirming the request, the server starts sending simulation data packets to the client; the client receives the data, unpacks it and delivers it to the main thread for state updates. This loop repeats until the simulation stops, when the socket is closed and its resources released.
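The UDP simulation-data path can be sketched end to end on one machine: the server packs a UAV state sample into a datagram and the client unpacks it for the main thread. The packet layout (id, position, yaw) and the use of an OS-assigned loopback port are assumptions for illustration only:

```python
# Sketch of the Fig. 5 UDP data path on localhost.  Field layout assumed.
import socket, struct

STATE = struct.Struct("!Hffff")  # uav id, x, y, z, yaw

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.bind(("127.0.0.1", 0))            # OS-assigned port
addr = client.getsockname()

# Server side: send one simulation state packet.
server.sendto(STATE.pack(7, 1.0, 2.0, 3.0, 0.5), addr)

# Client side: receive, unpack, deliver to the main thread for update.
data, _ = client.recvfrom(STATE.size)
uav_id, x, y, z, yaw = STATE.unpack(data)
print(uav_id, x, y, z)  # -> 7 1.0 2.0 3.0

server.close(); client.close()
```

UDP fits here because a stale state packet is worthless once the next one arrives; dropping it is cheaper than TCP's retransmission delay.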
A specific example is given below:
1. System hardware and software configuration
Following the platform architecture of Fig. 1, the hardware configuration of this example is as shown in the table.
The software of this example is implemented as follows: the xPC real-time simulation program is developed with Matlab/Simulink; the server-end software is written in C# with Visual Studio 2010; the client software is developed on the Unity 3D engine with the SteamVR, Orion SDK and Baidu speech REST API plug-ins.
2. UAV swarm formation control method simulation example
A UAV point-mass motion model is established and a formation controller is designed, using an inner/outer-loop control structure: the outer loop derives the desired motion state, and a PID (proportional-integral-derivative) controller computes the desired attitude for the inner loop; the inner loop tracks the desired attitude and computes the final control output. For each UAV in the formation, the Simulink simulation model shown in Fig. 6 is built, compiled into an xPC target-machine real-time simulation program under external mode, and downloaded to the distributed real-time simulation environment to run.
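The PID law named above can be sketched in discrete time. The gains, time step and the toy first-order plant below are illustrative assumptions, not the Simulink model's parameters:

```python
# Minimal discrete PID of the kind the outer loop uses to turn the
# desired-state error into a command.  Gains and plant are assumed.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy plant (altitude rate equals command) to 10 m.
dt = 0.02
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
alt = 0.0
for _ in range(2000):                 # 40 s of simulated time
    alt += pid.step(10.0, alt) * dt
print(round(alt, 1))  # -> 10.0
```

The integral term removes the steady-state error; in the platform this loop closes through the real-time simulation model rather than a toy integrator.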
3. Experimental results
This example carried out UAV formation-control experiments on the platform; Fig. 7 to Fig. 10 show the experimental platform and simulation effects. Fig. 7 shows the client VR and human-computer interaction environment. Fig. 8 shows the UAV voice-control effect. Fig. 9 shows the brain-computer interface controlling the UAV formation pattern. Fig. 10 shows the UAV formation scene simulation effect. The UAV swarm formation control example achieved good interactive simulation results on this platform, demonstrating the feasibility of the invention.
Claims (6)
1. A UAV swarm formation interactive simulation verification system, characterized in that it consists of five parts: a distributed real-time simulator, a host, a scene computer serving as the client, display devices, and human-computer interaction devices; wherein the simulator, composed of target computers, runs the real-time simulation program that simulates the UAV swarm formation; the host runs the server software and manages the downloading and running of the simulation program, online parameter tuning, and status-data forwarding; the scene computer runs the Unity-based scene software and, driven by data from the server, displays the virtual simulation scene in real time on the virtual-reality headset and the large screen; the human-computer interaction devices comprise a voice-input device, a gesture-recognition device and a brain-computer interface device, connected to the scene computer and realizing human-computer interaction through integration with the scene software; further, the scene computer is provided with a three-dimensional scene resource module that develops the UAV swarm formation task scene with the Unity engine and comprises the following submodules.
2. The UAV swarm formation interactive simulation verification system as claimed in claim 1, characterized in that:
(1) a 3D model library submodule: the 3D resource library is built with external software combined with Unity; the 3D models the UAV scene requires are created in the 3D Studio Max modeling software, models at different level-of-detail (LOD) ranks are generated, and an LOD Group component added in Unity implements model LOD, LOD being a technique that controls a model's level of detail according to viewpoint distance;
(2) a realistic terrain environment submodule: the 3D terrain resources are developed with WM (World Machine) software; the Unity editor is extended with a terrain-resource construction submodule that automatically generates block terrain through a graphical interface;
(3) a dynamic-management submodule: the 3D resources, including the 3D models and the terrain environment, are packaged as prefabs in the Unity editor, and this module calls, loads and unloads them throughout the program flow; the core of the module is multithreading, which in the Unity development environment takes the form of coroutines and asynchronous methods.
3. The UAV swarm formation interactive simulation verification system as claimed in claim 2, characterized in that a human-computer interaction module provides human-computer interaction for the user: it builds a three-dimensional graphical user interface (UI), integrates the VR display device and the gesture-recognition device into the client's hardware environment, and encapsulates the underlying device interfaces and interaction commands in Unity, specifically comprising the following submodules:
(1) a three-dimensional UI submodule: Hovercast is a menu-style graphical interface library for virtual-reality environments that combines well with VR displays and gesture-interaction devices and supports development in Unity; Unity's built-in UGUI interface library provides many basic elements; the VR interaction interface is designed on the combination of the UGUI library and Hovercast, allowing the user to participate in the virtual simulation through the VR display and the gesture-interaction device;
(2) VR display integration: the HTC Vive head-mounted display connects to the scene computer over USB; on the software side, the SteamVR Unity Plugin provides development support in the Unity environment, building on Unity's good cross-platform characteristics;
(3) a gesture-interaction submodule: Leap Motion, a portable gesture-recognition device, serves as the user's gesture-input device; the Orion SDK provided with Leap Motion supports development in Unity and recognizes various hand motions; this module designs the gesture-interaction commands according to the UI functions and wraps the low-level Orion recognition interface into a reusable secondary-development interface, which also simplifies calls from the three-dimensional UI submodule;
(4) a voice-control submodule: voice interaction between the user and the application is realized on the Baidu speech REST API; through this module's integrated online speech recognition, the platform user can simulate voice control and command of the UAVs;
(5) a brain-computer interface submodule: the BCI used is of the EEG type; the device acquires EEG signals and delivers them to the Matlab data-analysis platform for processing, and the processed result is transferred to the scene application over the UDP (User Datagram Protocol) communication protocol, realizing brain-controlled flight simulation; the brain-computer interface module implements two paradigms, SSVEP (steady-state visual evoked potential) and motor imagery; the principle of SSVEP is that, under a visual stimulus of fixed frequency, the brain's visual cortex produces a continuous response related to the stimulation frequency, and by acquiring that response and matching it against the stimulus frequencies the information the brain transmits is identified; this module builds, in the scene application, an SSVEP stimulus interface containing numbers flickering at different frequencies, each number corresponding to a different control command, realizing brain-controlled flight simulation.
4. The UAV swarm formation interactive simulation verification system as claimed in claim 2, characterized in that a network communication module is based on the transport-layer TCP/IP protocol suite, with the application-layer communication middleware developed on the .NET framework's Socket class in asynchronous mode; an application-layer protocol format is designed for both parts, realizing packing and unpacking of data at the client and server ends; the server end exposes an open interface to support secondary development of the platform on the xPC host-target simulation architecture; network transmission serializes and deserializes the application-layer protocol packets.
5. A UAV swarm formation interactive simulation verification method, characterized in that it is realized with a distributed real-time simulator, a host, a scene computer serving as the client, display devices, and human-computer interaction devices; wherein the simulator, composed of target computers, runs the real-time simulation program that simulates the UAV swarm formation; the host runs the server software and manages the downloading and running of the simulation program, online parameter tuning, and status-data forwarding; the scene computer runs the Unity-based scene software and, driven by data from the server, displays the virtual simulation scene in real time on the virtual-reality headset and the large screen; the human-computer interaction devices comprise a voice-input device, a gesture-recognition device and a brain-computer interface device, connected to the scene computer and realizing human-computer interaction through integration with the scene software; further, the scene computer is provided with a three-dimensional scene resource module that develops the UAV swarm formation task scene with the Unity engine, specifically as follows:
(1) 3D model library construction: the 3D resource library is built with external software combined with Unity; the 3D models the UAV scene requires are created in the 3D Studio Max modeling software, models at different LOD ranks are generated, and an LOD Group component added in Unity implements model LOD, LOD being a technique that controls a model's level of detail according to viewpoint distance;
(2) realistic terrain environment construction: the 3D terrain resources are developed with WM (World Machine) software; the Unity editor is extended with a terrain-resource construction submodule that automatically generates block terrain through a graphical interface;
(3) dynamic management: the 3D resources, including the 3D models and the terrain environment, are packaged as prefabs in the Unity editor, and this module calls, loads and unloads them throughout the program flow; the core of the module is multithreading, which in the Unity development environment takes the form of coroutines and asynchronous methods.
6. unmanned plane cluster formation interactive simulation verification method as claimed in claim 5, characterized in that human-computer interaction needs structure
VR aobvious equipment and gesture identification equipment are integrated under the hardware environment of client by the interface UI for building 3-D graphic,
The encapsulation to underlying device interface and interactive instruction is realized in Untiy, it is specific as follows:
(1) three-dimensional UI:The Hovercast of use is the menu mode graphical interfaces library under a reality environment, it can be with
VR aobvious, gesture interaction equipment combines well, and supports to develop under Unity environment, Unity included UGUI interface database
Many infrastructure elements are provided, based on the combination design VR interaction of both Unity included UGUI interface database, Hovercast
Interface shows that user by VR and participates in virtual emulation process with gesture interaction equipment;
(2) VR show integrated:Equipment is shown using HTC Vive virtual reality head and realizes by USB mode number with what comes into a driver's computer
According to connection, software aspects provide the exploitation plug-in unit SteamVR Unity Plugin under Unity environment, based on Unity across
Platform identity;
(3) Gesture interaction: a Leap Motion serves as the user's gesture input device; the Leap Motion is a portable gesture recognition device, and its Orion SDK supports development in the Unity environment and can recognize a variety of hand movements; this module designs gesture-interaction instructions according to the UI functions and encapsulates the low-level Orion recognition interface into a reusable secondary development interface, which also facilitates calls from the three-dimensional UI submodule;
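The "reusable secondary development interface" over a low-level recognizer amounts to a binding table from recognized gestures to simulation instructions. A minimal sketch follows; the gesture names, command strings, and class name are illustrative assumptions, not Orion SDK identifiers.

```python
class GestureCommandInterface:
    """Hypothetical wrapper mapping recognized gestures to simulation
    instructions, decoupling UI code from the underlying recognizer."""

    DEFAULT_BINDINGS = {
        "swipe_left": "FORMATION_TURN_LEFT",
        "swipe_right": "FORMATION_TURN_RIGHT",
        "pinch": "SELECT_UAV",
        "open_palm": "HOVER",
    }

    def __init__(self, bindings=None):
        # Copy so callers can customize without mutating the defaults.
        self.bindings = dict(bindings or self.DEFAULT_BINDINGS)

    def bind(self, gesture, command):
        """Register or override a gesture-to-command binding."""
        self.bindings[gesture] = command

    def translate(self, gesture):
        """Return the instruction for a recognized gesture, or None."""
        return self.bindings.get(gesture)

iface = GestureCommandInterface()
iface.bind("fist", "RETURN_TO_BASE")
print(iface.translate("pinch"))  # SELECT_UAV
```

Keeping the table data-driven is what makes the interface reusable: the three-dimensional UI submodule can query or rebind gestures without touching recognizer code.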
(4) Voice control: voice interaction between the user and the application is realized based on the Baidu Speech REST API; through the online speech recognition integrated by this module, platform users can simulate voice control and command of the UAVs;
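Once the online recognizer returns a transcript, the module still has to map it to a command. The sketch below shows only that mapping step (the REST call itself is omitted rather than guessed at); the keyword table and command names are illustrative assumptions.

```python
# Hypothetical keyword-to-command table; the speech recognition call
# that produces `transcript` is out of scope here.
COMMAND_KEYWORDS = {
    "take off": "TAKEOFF",
    "land": "LAND",
    "form up": "ASSUME_FORMATION",
    "return": "RETURN_TO_BASE",
}

def transcript_to_command(transcript):
    """Match a recognized transcript against known command keywords,
    returning the first matching instruction or None."""
    text = transcript.lower()
    for keyword, command in COMMAND_KEYWORDS.items():
        if keyword in text:
            return command
    return None

print(transcript_to_command("All units take off now"))  # TAKEOFF
```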
(5) Brain-computer interface: the BCI used is of the EEG type; the device acquires EEG signals and delivers them to a MATLAB data analysis platform for processing, and the processed results are transferred to the visualization application via the UDP communication protocol to realize brain-controlled flight simulation; the brain-computer interface module implements two paradigms, SSVEP and motor imagery; the principle of SSVEP is that, under a visual stimulus of fixed frequency, the human visual cortex produces a continuous response related to the stimulation frequency, and the information transmitted by the brain is identified by acquiring this response and matching it against the corresponding stimulation signal; this module builds, in the visualization application, an SSVEP stimulation interface containing digits flashing at different frequencies, with each digit corresponding to a different control instruction, thereby realizing brain-controlled flight simulation.
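The SSVEP matching idea can be sketched without the MATLAB pipeline the patent delegates to: correlate the EEG response against reference sinusoids at each stimulus frequency and pick the dominant one. The sampling rate, stimulus frequencies, and command table below are illustrative assumptions.

```python
import math

FS = 250.0  # assumed EEG sampling rate, Hz
# Hypothetical stimulus-frequency-to-command table (one per flashing digit).
STIMULUS_COMMANDS = {8.0: "SELECT_UAV_1", 10.0: "SELECT_UAV_2", 12.0: "SELECT_UAV_3"}

def correlation_power(signal, freq, fs=FS):
    """Power of the signal projected onto sine/cosine references at `freq`."""
    s = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (s * s + c * c) / len(signal)

def classify_ssvep(signal, commands=STIMULUS_COMMANDS):
    """Return the command whose stimulus frequency dominates the response."""
    best = max(commands, key=lambda f: correlation_power(signal, f))
    return commands[best]

# Synthetic 2-second response phase-locked to the 10 Hz stimulus:
eeg = [math.sin(2 * math.pi * 10.0 * i / FS) for i in range(500)]
print(classify_ssvep(eeg))  # SELECT_UAV_2
```

Real SSVEP decoders use more robust methods (e.g. canonical correlation analysis over multiple channels and harmonics); this single-channel projection only demonstrates the frequency-matching principle the claim describes.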
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810463947.4A CN108845802A (en) | 2018-05-15 | 2018-05-15 | Unmanned plane cluster formation interactive simulation verifies system and implementation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810463947.4A CN108845802A (en) | 2018-05-15 | 2018-05-15 | Unmanned plane cluster formation interactive simulation verifies system and implementation method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108845802A true CN108845802A (en) | 2018-11-20 |
Family
ID=64213088
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810463947.4A Pending CN108845802A (en) | 2018-05-15 | 2018-05-15 | Unmanned plane cluster formation interactive simulation verifies system and implementation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108845802A (en) |
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109669477A (en) * | 2019-01-29 | 2019-04-23 | 华南理工大学 | A kind of cooperative control system and control method towards unmanned plane cluster |
CN109710229A (en) * | 2018-12-11 | 2019-05-03 | 中国航空工业集团公司西安航空计算技术研究所 | One kind is towards GPU graphics chip pipeline unit framework verification method and platform |
CN109739353A (en) * | 2018-12-27 | 2019-05-10 | 重庆上丞科技有限公司 | A kind of virtual reality interactive system identified based on gesture, voice, Eye-controlling focus |
CN109876453A (en) * | 2019-01-14 | 2019-06-14 | 珠海金山网络游戏科技有限公司 | A kind of default body optimization method and device |
CN110377049A (en) * | 2019-06-29 | 2019-10-25 | 天津大学 | Unmanned plane cluster flight pattern reconfigurable control method based on brain-computer interface |
CN110531786A (en) * | 2019-09-10 | 2019-12-03 | 西北工业大学 | UAV Maneuver strategy based on DQN is autonomously generated method |
CN110705021A (en) * | 2019-08-29 | 2020-01-17 | 北京神舟航天软件技术有限公司 | Data-driven test driving method |
CN110764433A (en) * | 2019-10-16 | 2020-02-07 | 中山大学 | V-REP platform-based cluster unmanned aerial vehicle system parallel simulation method |
CN110865627A (en) * | 2019-08-29 | 2020-03-06 | 北京神舟航天软件技术有限公司 | Intelligent unmanned cluster system test bed framework |
CN111258554A (en) * | 2020-01-13 | 2020-06-09 | 中船第九设计研究院工程有限公司 | Virtual reality development system for pipeline production process |
CN111291357A (en) * | 2020-03-07 | 2020-06-16 | 王春花 | Terminal access verification method and device and computer equipment |
CN111414688A (en) * | 2020-03-18 | 2020-07-14 | 上海机器人产业技术研究院有限公司 | Mobile robot simulation system and method based on UNITY engine |
CN111638724A (en) * | 2020-05-07 | 2020-09-08 | 西北工业大学 | Novel cooperative intelligent control method for unmanned aerial vehicle group computer |
CN111723473A (en) * | 2020-05-30 | 2020-09-29 | 同济大学 | Three-dimensional visual collaborative simulation system |
CN112148362A (en) * | 2020-09-22 | 2020-12-29 | 深圳顺势为快科技有限公司 | VR multi-platform adaptation system and adaptation method based on Unity engineering |
CN112214209A (en) * | 2020-10-23 | 2021-01-12 | 北航(四川)西部国际创新港科技有限公司 | Modeling method for interaction information and task time sequence in unmanned aerial vehicle operation scene |
CN112631173A (en) * | 2020-12-11 | 2021-04-09 | 中国人民解放军国防科技大学 | Brain-controlled unmanned platform cooperative control system |
CN113467275A (en) * | 2021-08-16 | 2021-10-01 | 北京航空航天大学 | Unmanned aerial vehicle cluster flight simulation system based on real object airborne equipment |
CN113589706A (en) * | 2021-08-02 | 2021-11-02 | 天津大学 | Helicopter trailing edge flap control virtual simulation method |
CN113722912A (en) * | 2021-08-31 | 2021-11-30 | 中国电子科技集团公司第五十四研究所 | Virtual-real fused unmanned cluster collaborative verification system |
CN114020041A (en) * | 2021-12-14 | 2022-02-08 | 云南民族大学 | Multi-unmanned aerial vehicle multithreading two-dimensional exploration simulation method and system |
CN114019828A (en) * | 2021-11-29 | 2022-02-08 | 中国人民解放军国防科技大学 | Multi-mode virtual-real interaction simulation system and method for unmanned aerial vehicle cluster |
CN114063474A (en) * | 2021-12-03 | 2022-02-18 | 北京航空航天大学 | Semi-physical simulation system of unmanned aerial vehicle cluster |
CN114218702A (en) * | 2021-12-10 | 2022-03-22 | 哈尔滨工业大学(深圳) | Virtual visual simulation system for space on-orbit control |
CN114490498A (en) * | 2022-01-20 | 2022-05-13 | 山东大学 | Simulation software simulation heterogeneous system based on VR technology and working method thereof |
CN114912259A (en) * | 2022-04-29 | 2022-08-16 | 中国航空无线电电子研究所 | Task-oriented virtual aircraft cabin modeling simulation verification system and method |
CN115951598A (en) * | 2023-01-16 | 2023-04-11 | 中国人民解放军国防科技大学 | Virtual-real combined simulation method, device and system for multiple unmanned aerial vehicles |
CN116068990A (en) * | 2022-12-16 | 2023-05-05 | 天津大学 | Star group intelligent fault diagnosis interactive virtual simulation platform verification method |
CN116540568A (en) * | 2023-07-05 | 2023-08-04 | 中南大学 | Large-scale distributed unmanned aerial vehicle cluster simulation system |
CN116842758A (en) * | 2023-08-28 | 2023-10-03 | 中国民航管理干部学院 | Simulation platform and method for civil unmanned aerial vehicle air traffic service algorithm verification |
CN117313439A (en) * | 2023-11-30 | 2023-12-29 | 西安辰航卓越科技有限公司 | Multi-scene multi-machine type unmanned aerial vehicle simulation system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110208506A1 (en) * | 2010-02-24 | 2011-08-25 | Sling Media Inc. | Systems and methods for emulating network-enabled media components |
CN104111861A (en) * | 2014-07-07 | 2014-10-22 | 中国人民解放军军械工程学院 | Unmanned aerial vehicle simulation training system and control method thereof |
CN105739525A (en) * | 2016-02-14 | 2016-07-06 | 普宙飞行器科技(深圳)有限公司 | System of matching somatosensory operation to realize virtual flight |
CN107643695A (en) * | 2017-09-07 | 2018-01-30 | 天津大学 | Someone/unmanned plane cluster formation VR emulation modes and system based on brain electricity |
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109710229A (en) * | 2018-12-11 | 2019-05-03 | 中国航空工业集团公司西安航空计算技术研究所 | One kind is towards GPU graphics chip pipeline unit framework verification method and platform |
CN109739353A (en) * | 2018-12-27 | 2019-05-10 | 重庆上丞科技有限公司 | A kind of virtual reality interactive system identified based on gesture, voice, Eye-controlling focus |
CN109876453A (en) * | 2019-01-14 | 2019-06-14 | 珠海金山网络游戏科技有限公司 | A kind of default body optimization method and device |
CN109669477A (en) * | 2019-01-29 | 2019-04-23 | 华南理工大学 | A kind of cooperative control system and control method towards unmanned plane cluster |
CN110377049B (en) * | 2019-06-29 | 2022-05-17 | 天津大学 | Brain-computer interface-based unmanned aerial vehicle cluster formation reconfiguration control method |
CN110377049A (en) * | 2019-06-29 | 2019-10-25 | 天津大学 | Unmanned plane cluster flight pattern reconfigurable control method based on brain-computer interface |
CN110865627A (en) * | 2019-08-29 | 2020-03-06 | 北京神舟航天软件技术有限公司 | Intelligent unmanned cluster system test bed framework |
CN110705021A (en) * | 2019-08-29 | 2020-01-17 | 北京神舟航天软件技术有限公司 | Data-driven test driving method |
CN110705021B (en) * | 2019-08-29 | 2023-06-02 | 北京神舟航天软件技术有限公司 | Data-driven test driving method |
CN110531786B (en) * | 2019-09-10 | 2022-07-22 | 西北工业大学 | Unmanned aerial vehicle maneuvering strategy autonomous generation method based on DQN |
CN110531786A (en) * | 2019-09-10 | 2019-12-03 | 西北工业大学 | UAV Maneuver strategy based on DQN is autonomously generated method |
CN110764433A (en) * | 2019-10-16 | 2020-02-07 | 中山大学 | V-REP platform-based cluster unmanned aerial vehicle system parallel simulation method |
CN111258554B (en) * | 2020-01-13 | 2024-04-09 | 中船第九设计研究院工程有限公司 | Pipeline production flow virtual reality development system |
CN111258554A (en) * | 2020-01-13 | 2020-06-09 | 中船第九设计研究院工程有限公司 | Virtual reality development system for pipeline production process |
CN111291357A (en) * | 2020-03-07 | 2020-06-16 | 王春花 | Terminal access verification method and device and computer equipment |
CN111414688A (en) * | 2020-03-18 | 2020-07-14 | 上海机器人产业技术研究院有限公司 | Mobile robot simulation system and method based on UNITY engine |
CN111638724A (en) * | 2020-05-07 | 2020-09-08 | 西北工业大学 | Novel cooperative intelligent control method for unmanned aerial vehicle group computer |
CN111723473A (en) * | 2020-05-30 | 2020-09-29 | 同济大学 | Three-dimensional visual collaborative simulation system |
CN112148362A (en) * | 2020-09-22 | 2020-12-29 | 深圳顺势为快科技有限公司 | VR multi-platform adaptation system and adaptation method based on Unity engineering |
CN112214209A (en) * | 2020-10-23 | 2021-01-12 | 北航(四川)西部国际创新港科技有限公司 | Modeling method for interaction information and task time sequence in unmanned aerial vehicle operation scene |
CN112214209B (en) * | 2020-10-23 | 2024-02-13 | 北航(四川)西部国际创新港科技有限公司 | Modeling method for interaction information and task time sequence in unmanned aerial vehicle operation scene |
CN112631173A (en) * | 2020-12-11 | 2021-04-09 | 中国人民解放军国防科技大学 | Brain-controlled unmanned platform cooperative control system |
CN113589706A (en) * | 2021-08-02 | 2021-11-02 | 天津大学 | Helicopter trailing edge flap control virtual simulation method |
CN113467275A (en) * | 2021-08-16 | 2021-10-01 | 北京航空航天大学 | Unmanned aerial vehicle cluster flight simulation system based on real object airborne equipment |
CN113722912A (en) * | 2021-08-31 | 2021-11-30 | 中国电子科技集团公司第五十四研究所 | Virtual-real fused unmanned cluster collaborative verification system |
CN113722912B (en) * | 2021-08-31 | 2022-12-09 | 中国电子科技集团公司第五十四研究所 | Virtual-real fused unmanned cluster collaborative verification system |
CN114019828A (en) * | 2021-11-29 | 2022-02-08 | 中国人民解放军国防科技大学 | Multi-mode virtual-real interaction simulation system and method for unmanned aerial vehicle cluster |
CN114063474A (en) * | 2021-12-03 | 2022-02-18 | 北京航空航天大学 | Semi-physical simulation system of unmanned aerial vehicle cluster |
CN114063474B (en) * | 2021-12-03 | 2023-06-06 | 北京航空航天大学 | Simulation method of semi-physical simulation system based on unmanned aerial vehicle cluster |
CN114218702A (en) * | 2021-12-10 | 2022-03-22 | 哈尔滨工业大学(深圳) | Virtual visual simulation system for space on-orbit control |
CN114020041A (en) * | 2021-12-14 | 2022-02-08 | 云南民族大学 | Multi-unmanned aerial vehicle multithreading two-dimensional exploration simulation method and system |
CN114020041B (en) * | 2021-12-14 | 2024-02-20 | 云南民族大学 | Multi-unmanned aerial vehicle multi-thread two-dimensional exploration simulation method and system |
CN114490498B (en) * | 2022-01-20 | 2023-12-19 | 山东大学 | Simulation software simulation heterogeneous system based on VR technology and working method thereof |
CN114490498A (en) * | 2022-01-20 | 2022-05-13 | 山东大学 | Simulation software simulation heterogeneous system based on VR technology and working method thereof |
CN114912259A (en) * | 2022-04-29 | 2022-08-16 | 中国航空无线电电子研究所 | Task-oriented virtual aircraft cabin modeling simulation verification system and method |
CN114912259B (en) * | 2022-04-29 | 2024-05-03 | 中国航空无线电电子研究所 | Task-oriented virtual aircraft cabin modeling simulation verification system and method |
CN116068990A (en) * | 2022-12-16 | 2023-05-05 | 天津大学 | Star group intelligent fault diagnosis interactive virtual simulation platform verification method |
CN116068990B (en) * | 2022-12-16 | 2023-11-10 | 天津大学 | Star group intelligent fault diagnosis interactive virtual simulation platform verification method |
CN115951598B (en) * | 2023-01-16 | 2023-12-01 | 中国人民解放军国防科技大学 | Virtual-real combination simulation method, device and system for multiple unmanned aerial vehicles |
CN115951598A (en) * | 2023-01-16 | 2023-04-11 | 中国人民解放军国防科技大学 | Virtual-real combined simulation method, device and system for multiple unmanned aerial vehicles |
CN116540568A (en) * | 2023-07-05 | 2023-08-04 | 中南大学 | Large-scale distributed unmanned aerial vehicle cluster simulation system |
CN116540568B (en) * | 2023-07-05 | 2023-09-22 | 中南大学 | Large-scale distributed unmanned aerial vehicle cluster simulation system |
CN116842758B (en) * | 2023-08-28 | 2024-03-19 | 中国民航管理干部学院 | Simulation platform and method for civil unmanned aerial vehicle air traffic service algorithm verification |
CN116842758A (en) * | 2023-08-28 | 2023-10-03 | 中国民航管理干部学院 | Simulation platform and method for civil unmanned aerial vehicle air traffic service algorithm verification |
CN117313439A (en) * | 2023-11-30 | 2023-12-29 | 西安辰航卓越科技有限公司 | Multi-scene multi-machine type unmanned aerial vehicle simulation system |
CN117313439B (en) * | 2023-11-30 | 2024-03-01 | 西安辰航卓越科技有限公司 | Multi-scene multi-machine type unmanned aerial vehicle simulation system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108845802A (en) | Unmanned plane cluster formation interactive simulation verifies system and implementation method | |
CN107643695B (en) | Human/unmanned aerial vehicle cluster formation VR simulation method and system based on electroencephalogram | |
CN102508439A (en) | HLA (High Level Architecture)-based multi-unmanned aerial vehicle distributed simulation method | |
CN106845032B (en) | The construction method of multimode navigation three-dimensional dynamic visual simulation platform | |
CN109858111A (en) | RLV virtual emulation Platform Designing and implementation method | |
CN109740283A (en) | Autonomous multiple agent confronting simulation method and system | |
CN115641375B (en) | Method, device, equipment and storage medium for processing hair of virtual object | |
US20150022516A1 (en) | Flexible 3-d character rigging blocks with interface obligations | |
CN107703775A (en) | Hard and soft liquid coupling Complex Spacecraft analogue system and method | |
CN116704103A (en) | Image rendering method, device, equipment, storage medium and program product | |
CN112843704A (en) | Animation model processing method, device, equipment and storage medium | |
CN103208130B (en) | Large-scale group performance animation synthetic method and equipment | |
CN106910233B (en) | Motion simulation method of virtual insect animation role | |
Hu et al. | A rapid development method of virtual assembly experiments based on 3d game engine | |
Tan | Animation Image Art Design Mode Using 3D Modeling Technology | |
CN110705021B (en) | Data-driven test driving method | |
Häfner | PolyVR-a virtual reality authoring framework for engineering applications | |
CN109671144A (en) | Characteristics of bamboo simulation method and system | |
CN104766358A (en) | Optimization method of flying insect moving model parameters based on statistic estimation | |
CN109933810A (en) | A kind of three-dimensional maintenance simulation model construction method based on operating unit | |
CN109509253A (en) | Electric system three-dimensional artificial visual experience VR design method | |
Sang | [Retracted] Interactive Innovation Research on Film Animation Based on Maya‐Unity Animation Simulation of Visual Sensor | |
Temizer | The state of the art and the future of modeling and simulation systems | |
Bekaroo et al. | Ai-assisted extended reality toward the 6g era: challenges and prospective solutions | |
Hu et al. | UTSE: A Game Engine-Based Simulation Environment for Agent |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20181120 ||