CN116011101A - Simulation method, simulation device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116011101A
CN116011101A (application CN202211701713.1A)
Authority
CN
China
Prior art keywords
simulation
environment
target
model
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211701713.1A
Other languages
Chinese (zh)
Inventor
胡明寅
吴伟
谷靖
Current Assignee
Guangdong Huitian Aerospace Technology Co Ltd
Original Assignee
Guangdong Huitian Aerospace Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Huitian Aerospace Technology Co Ltd
Priority to CN202211701713.1A
Publication of CN116011101A
Legal status: Pending

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 90/00 — Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation

Abstract

The application discloses a simulation method, a simulation device, an electronic device and a storage medium. The method comprises the following steps: obtaining a target simulation environment, wherein the target simulation environment comprises at least one of a first simulation sub-environment and a second simulation sub-environment; loading an aircraft model in the target simulation environment; and controlling the aircraft model to fly in the target simulation environment. According to the technical scheme provided by the embodiments of the application, a simulation environment is first obtained, the simulation environment comprising static target models respectively corresponding to static targets in the real world and dynamic target models respectively corresponding to dynamic targets, and the aircraft model is then controlled to fly in the simulation environment. Because the simulation environment is obtained by modeling objects in the real world, it is more realistic, and the simulation effect is accordingly better.

Description

Simulation method, simulation device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of simulation technologies, and in particular, to a simulation method, a simulation device, an electronic device, and a storage medium.
Background
Simulation technology is an effective research means. After the development of an aircraft is completed and before the aircraft is put into use, its functions and performance need to be verified and debugged through simulation.
In the related art, the quality of the simulation of an aircraft depends on how the simulation environment is established. Simulation of an aircraft is generally based on a standard scene, that is, a standard scene working condition, such as a specific airflow condition, is set, and the simulation is then performed under that condition.
Because the standard scene working condition differs greatly from actual working conditions, the simulation effect is poor.
Disclosure of Invention
The application provides a simulation method, a simulation device, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present application provides a simulation method, including: obtaining a target simulation environment, wherein the target simulation environment comprises at least one of the following: the system comprises a first simulation sub-environment and a second simulation sub-environment, wherein the first simulation sub-environment comprises static target models respectively corresponding to at least one static target in the real world, and the second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real world; loading an aircraft model in a target simulation environment; the aircraft model is controlled to fly in a target simulation environment.
In a second aspect, an embodiment of the present application provides an emulation apparatus, including: the environment acquisition module is used for acquiring a target simulation environment, and the target simulation environment comprises at least one of the following: the system comprises a first simulation sub-environment and a second simulation sub-environment, wherein the first simulation sub-environment comprises static target models respectively corresponding to at least one static target in the real world, and the second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real world; the model loading module is used for loading the aircraft model in the target simulation environment; and the simulation module is used for controlling the aircraft model to fly in the target simulation environment.
In a third aspect, embodiments of the present application provide an electronic device, including: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the simulation method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored therein computer program instructions that are callable by a processor to perform a simulation method as in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product for implementing the simulation method according to the first aspect when the computer program product is executed.
Compared with the prior art, in the simulation method provided by the embodiments of the application, a simulation environment is first acquired, the simulation environment comprising static target models respectively corresponding to static targets in the real world and dynamic target models respectively corresponding to dynamic targets, and the aircraft model is then controlled to fly in the simulation environment. Because the simulation environment is obtained by modeling objects in the real world, it is more realistic, and the simulation effect is accordingly better.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of a simulation system provided in an embodiment of the present application.
Fig. 2 is a block diagram of a simulation system provided in an embodiment of the present application.
Fig. 3 is a flowchart of a simulation method according to an embodiment of the present application.
Fig. 4 is a flowchart of another simulation method according to another embodiment of the present application.
Fig. 5 is a block diagram of a simulation apparatus according to an embodiment of the present application.
Fig. 6 is a block diagram of an electronic device according to an embodiment of the present application.
Fig. 7 is a block diagram of a computer storage medium according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In order to better understand the solution of the present application, the following description will make clear and complete descriptions of the technical solution of the embodiment of the present application with reference to the accompanying drawings in the embodiment of the present application. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Referring to Fig. 1, a schematic diagram of an implementation environment provided in one embodiment of the present application is shown. The implementation environment includes a simulation system 100, which is used to simulate the flight process of an aircraft model. Simulation is a technology that uses a computer to build, verify and run a model of an actual system so as to obtain the behavioral characteristics of the model, thereby analyzing and studying the actual system. The simulation system 100 includes a simulation environment and an aircraft model 110.
In the embodiments of the application, the simulation environment is obtained by modeling the real flight environment of the aircraft based on digital twin technology. Digital twin technology makes full use of data such as physical models, sensor updates and operation history, integrates multi-disciplinary, multi-physical-quantity, multi-scale and multi-probability simulation processes, and completes the mapping in a virtual space, thereby reflecting the full life cycle of the corresponding physical equipment. In some embodiments, the simulation environment includes a first simulation sub-environment and a second simulation sub-environment. The first simulation sub-environment includes static target models respectively corresponding to at least one static target; static targets include, but are not limited to: mountains, trees, buildings, signal towers, street lamps, poles, etc. The second simulation sub-environment includes dynamic target models respectively corresponding to at least one dynamic target; dynamic targets include, but are not limited to: people and animals around the aircraft at the flight start point, other aircraft and birds around the aircraft during flight, people and animals around the aircraft at the flight end point, etc. A more realistic simulation environment yields a better simulation effect for the aircraft.
In some embodiments, the simulation system 100 includes a simulation application with a modeling function, which can generate the simulation environment based on scene images of the real flight environment, the spatial position information and morphological information of static targets, and the motion parameter information and morphological information of dynamic targets, and then run the simulation process in the generated simulation environment. In other embodiments, the simulation system 100 also includes a modeling application through which the simulation environment is built, after which the simulation application loads the simulation environment to run the simulation process. In still other embodiments, the simulation system 100 obtains a simulation environment sent by another device.
Referring to Fig. 2, a block diagram of a simulation system is shown. The simulation system 200 comprises a real image database 210, a flight scene library 220, a static twin scene building module 230 and a dynamic twin scene building module 240. The static twin scene building module 230 builds a static twin scene (i.e. the first simulation sub-environment) based on the data in the real image database 210, and the dynamic twin scene building module 240 builds a dynamic twin scene (i.e. the second simulation sub-environment) based on the data in the flight scene library 220.
The aircraft model 110 is a virtual model built based on the size, functions and behavioral logic of the physical aircraft. The aircraft model may be obtained with the modeling function of the simulation application, built in a professional modeling application, or imported from an external source; this is not limited in the embodiments of the present application.
In some embodiments, referring again to Fig. 2, the simulation system further includes a sensor model 250. The sensor model 250 is also a mathematical model built based on digital twin technology, which converts real-world objects into sensor signals. The sensor models include, but are not limited to: camera models, lidar models, millimeter wave radar models, ultrasonic radar models, inertial measurement unit (IMU) models, speed sensor models, acceleration sensor models, and the like.
In some embodiments, referring again to fig. 2, the simulation system further includes a perception algorithm model 260, the perception algorithm model 260 being configured to determine flight data (e.g., current spatial position, current velocity, current acceleration, current attitude, current direction of flight, etc.) and obstacle data (e.g., spatial position information, morphology information, distance from the aircraft model, angle with respect to the aircraft model, velocity, etc.) of the aircraft model based on the data collected by the sensor model 250.
In some embodiments, referring again to fig. 2, the simulation system further includes a planning control model 270, where the planning control model 270 is configured to determine flight parameters (e.g., flight path, flight speed, flight acceleration, flight attitude, flight direction, etc.) of the aircraft model based on the flight data and the obstacle data of the aircraft model output by the perception algorithm model 260.
In some embodiments, referring again to Fig. 2, the simulation system also includes an actuator model 280, which is configured to receive and execute control instructions. In some embodiments, referring again to Fig. 2, the simulation system further includes an aerodynamic model 290, which indicates the effect of airflow changes on the rotors and attitude of the aircraft model, so that the behavior of the aircraft model under different airflows can be simulated.
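The chain of models in Fig. 2 (sensor, perception, planning control, actuator, plus an aerodynamic disturbance) can be sketched as a closed simulation loop. The Python sketch below is purely illustrative: all class and method names are assumptions rather than the patent's implementation, and the one-dimensional dynamics stand in for real flight dynamics.

```python
class SensorModel:
    def sample(self, world, state):
        # A real sensor model would render camera images, ray-cast lidar, etc.
        return {"position": state["position"], "obstacles": world["obstacles"]}

class PerceptionModel:
    def perceive(self, raw):
        # Turn raw sensor output into flight data and obstacle data.
        return {"pose": raw["position"], "obstacles": raw["obstacles"]}

class PlanningControlModel:
    def plan(self, perception, goal):
        # Move a bounded step toward the goal; a stand-in for RRT / A* / potential fields.
        pos, step = perception["pose"], 1.0
        dx = max(-step, min(step, goal - pos))
        return {"velocity": dx}

class ActuatorModel:
    def apply(self, state, command, gust=0.0):
        # Execute the control command; `gust` models an aerodynamic disturbance.
        state["position"] += command["velocity"] + gust
        return state

def simulate(world, state, goal, steps=10):
    sensors, perc = SensorModel(), PerceptionModel()
    planner, act = PlanningControlModel(), ActuatorModel()
    for _ in range(steps):
        raw = sensors.sample(world, state)
        p = perc.perceive(raw)
        cmd = planner.plan(p, goal)
        state = act.apply(state, cmd)
    return state
```

With a zero gust and a reachable goal, the loop converges on the goal position, mirroring how the models in Fig. 2 feed one another each simulation tick.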
Referring to FIG. 3, a flow chart of a simulation method according to one embodiment of the present application is shown. The method comprises the following steps S301-S303.
Step S301, a target simulation environment is acquired.
The target simulation environment includes at least one of: a first simulation sub-environment and a second simulation sub-environment. The first simulation sub-environment comprises static target models respectively corresponding to at least one static target in the real world. Such static targets include, but are not limited to: mountain, tree, building, signal tower, street lamp, pole, etc. The second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real world. Dynamic targets include, but are not limited to: people and animals on the periphery of the aircraft at the start point of the flight, other aircraft on the periphery of the aircraft during the flight, birds, people and animals on the periphery of the aircraft at the end point of the flight, and the like.
In some embodiments, the target simulation environment is pre-generated and stored in the simulation device, at which point the simulation device may read the target simulation environment directly from the local.
In one possible implementation, the simulation device displays an environment loading control. After receiving a trigger signal for the environment loading control, the simulation device displays a simulation environment list containing unique identifiers of pre-generated simulation environments; after receiving a selection signal for a target simulation environment, the simulation device reads that target simulation environment from local storage.
In another possible implementation, the simulation device displays an environment loading control. After receiving a trigger signal for the environment loading control, the simulation device displays a static twin scene list and a dynamic twin scene list, which contain unique identifiers of pre-generated static twin scenes and dynamic twin scenes respectively. A technician can select a first simulation sub-environment from the static twin scene list and a second simulation sub-environment from the dynamic twin scene list, and the two simulation sub-environments are combined to obtain the target simulation environment.
The former implementation makes selection of the target simulation environment more efficient, while the latter can combine a greater variety of target simulation environments. A technician can choose either way of obtaining the target simulation environment according to the simulation requirements; this is not limited in the embodiments of the present application.
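The two loading paths above — selecting a pre-generated environment by its unique identifier, or freely combining a static and a dynamic twin scene — can be sketched roughly as follows. All identifiers, scene names and dictionary layouts are hypothetical, not taken from the patent.

```python
# Hypothetical pre-generated scene stores keyed by unique identifier.
static_scenes = {"city_a_static": {"targets": ["building", "signal_tower"]}}
dynamic_scenes = {"rush_hour_dynamic": {"targets": ["bird_flock", "other_aircraft"]}}
prebuilt = {"env_01": {"static": static_scenes["city_a_static"],
                       "dynamic": dynamic_scenes["rush_hour_dynamic"]}}

def load_environment(env_id=None, static_id=None, dynamic_id=None):
    if env_id is not None:
        # First path: efficient selection of a whole pre-generated environment.
        return prebuilt[env_id]
    # Second path: free combination of a static and a dynamic twin scene.
    return {"static": static_scenes[static_id],
            "dynamic": dynamic_scenes[dynamic_id]}
```

Either call returns the same environment structure, so the rest of the simulation pipeline need not care which path produced it.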
In other embodiments, the simulation environment is generated in real time by the simulation device, and the specific generation process will be described in the following embodiments.
Step S302, loading an aircraft model in a target simulation environment.
The aircraft model is a virtual model built based on the size, function and behavior logic of the aircraft entity. In some embodiments, a developer may model the aircraft entity based on its size, functionality, and behavioral logic, resulting in a modeling file, after which the simulation device may load the modeling file and display the visualized aircraft model at a simulation interface. In other embodiments, the simulation interface includes a model building control, upon receipt of a trigger signal for the model building control, the model building interface is displayed, the model building interface including one or more geometric elements (e.g., circles, squares, rectangles, etc.), based on which a technician may build an aircraft model.
In some embodiments, the simulation device may display the aircraft model and simultaneously display an editable list corresponding to each model parameter of the aircraft model, so that a technician may directly adjust each model parameter of the aircraft model through the editable list, thereby realizing custom adjustment of the aircraft model.
In step S303, the aircraft model is controlled to fly in the target simulation environment.
In some embodiments, the simulation device may display a dynamic picture of the aircraft flying in the simulation environment for viewing by a technician. In other embodiments, the simulation device may collect state parameters of the aircraft model flying in the simulation environment and then generate a simulation log based on the state parameters for viewing by a technician. The above state parameters include the current spatial position, speed, acceleration and flight attitude of the aircraft model, etc.
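A minimal sketch of the second option — collecting state parameters and generating a simulation log — might look like this; the field names merely paraphrase the state parameters listed in the text and are illustrative assumptions.

```python
import json

def record_state(log, t, position, speed, acceleration, attitude):
    # Append one snapshot of the state parameters named in the text:
    # spatial position, speed, acceleration, flight attitude, stamped with time t.
    log.append({"t": t, "position": list(position), "speed": speed,
                "acceleration": acceleration, "attitude": list(attitude)})

def dump_simulation_log(log):
    # Serialize the collected states as a human-readable simulation log.
    return json.dumps(log, indent=2)
```

The technician can then inspect the JSON log offline instead of watching the live picture.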
In summary, according to the technical solution provided in the embodiment of the present application, by first obtaining a simulation environment, where the simulation environment includes a static target model corresponding to static targets in the real world and a dynamic target model corresponding to dynamic targets, and then controlling an aircraft model to fly in the simulation environment, since the simulation environment is obtained based on object modeling in the real world, the real degree of the simulation environment is higher, and the aircraft model operates in the simulation environment with higher real degree, so that the simulation effect is better.
Referring to FIG. 4, a flow chart of a simulation method according to one embodiment of the present application is shown. The method comprises the following steps S401-S406.
Step S401, a target simulation environment is acquired.
The target simulation environment includes at least one of: a first simulation sub-environment and a second simulation sub-environment. The first simulation sub-environment comprises at least one static target model corresponding to the static targets respectively. The second simulation sub-environment comprises at least one dynamic target model corresponding to the dynamic targets respectively.
Step S402, loading an aircraft model in a target simulation environment.
Step S403, obtaining perception data of the aircraft model in the flight process through the sensor model.
Sensor models include, but are not limited to: camera models, lidar models, millimeter wave radar models, ultrasonic radar models, inertial sensor models, speed sensor models, acceleration sensor models, and the like.
The perception data collected by the camera model includes the current environment image of the aircraft model. The perception data collected by the lidar model includes a first point cloud data set, which contains the three-dimensional coordinate information, color information, reflection intensity information, echo count information and the like of the three-dimensional points detected by the lidar model in the simulation environment. The perception data collected by the millimeter wave radar model includes a second point cloud data set, which contains the three-dimensional coordinate information, reflectivity and radial relative speed of the three-dimensional points detected by the millimeter wave radar model in the simulation environment. The perception data collected by the ultrasonic radar model includes the distance of the obstacle from the aircraft model and the position of the obstacle relative to the aircraft model. The perception data collected by the inertial sensor model includes the pitch and roll angular rates and the longitudinal, lateral and vertical accelerations of the aircraft model. The perception data collected by the speed sensor model includes the speed of the aircraft model, and the perception data collected by the acceleration sensor model includes the acceleration of the aircraft model.
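The per-sensor records described above could be represented, for example, with simple data classes. The type and field names below paraphrase the text and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LidarPoint:
    # One point of the first point cloud data set (lidar model).
    xyz: Tuple[float, float, float]
    color: Tuple[int, int, int]
    reflection_intensity: float
    echo_count: int

@dataclass
class RadarPoint:
    # One point of the second point cloud data set (millimeter wave radar model).
    xyz: Tuple[float, float, float]
    reflectivity: float
    radial_velocity: float

@dataclass
class ImuSample:
    # Pitch/roll angular rates and body-axis accelerations.
    pitch_rate: float
    roll_rate: float
    accel: Tuple[float, float, float]  # longitudinal, lateral, vertical
```

Keeping each sensor's output in its own typed record makes the downstream perception algorithm model's inputs explicit.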
Step S404, determining first pose information of the aircraft model and second pose information of the obstacle based on the perception data.
The first pose information of the aircraft model includes, but is not limited to: the spatial position information of the aircraft model in the target simulation environment, the attitude information of the aircraft model in the target simulation environment, and the flight direction, speed, acceleration and the like of the aircraft model. The spatial position information of the aircraft model in the target simulation environment can be obtained directly from a positioning model; alternatively, the lidar model can detect the distance between a marker and the aircraft model and the angle of the marker relative to the aircraft model, and the position information of the aircraft model is then calculated based on the spatial position information of the marker and the detection data. A marker is usually a static object placed on the ground whose spatial position information is known, such as a designated building. The attitude information of the aircraft model in the target simulation environment can be obtained directly from the measurement data of the inertial sensor model, the speed of the aircraft model from the measurement data of the speed sensor model, and the acceleration of the aircraft model from the measurement data of the acceleration sensor model.
The second pose information of the obstacle includes, but is not limited to: the spatial position information of the obstacle, the distance between the obstacle and the aircraft model, the angle of the obstacle relative to the aircraft model, the speed of the obstacle, etc. The distance between the obstacle and the aircraft model and the angle of the obstacle relative to the aircraft model can be obtained directly from the detection data of the lidar sensor model. The spatial position information of the obstacle can then be calculated from that distance, that angle and the spatial position information of the aircraft model.
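Both position computations described above — recovering the aircraft position from a known marker, and recovering an obstacle's global position from the aircraft's — reduce to the same polar-to-Cartesian conversion. A 2-D sketch follows; the patent works with full 3-D spatial information and gives no formulas, so this is an illustrative simplification.

```python
import math

def aircraft_position_from_marker(marker_xy, distance, bearing_rad):
    # Invert the lidar measurement: the aircraft lies `distance` away from the
    # known marker, opposite the bearing measured from aircraft to marker.
    mx, my = marker_xy
    return (mx - distance * math.cos(bearing_rad),
            my - distance * math.sin(bearing_rad))

def obstacle_position(aircraft_xy, distance, angle_rad):
    # Project the measured distance/angle from the aircraft into the world frame.
    ax, ay = aircraft_xy
    return (ax + distance * math.cos(angle_rad),
            ay + distance * math.sin(angle_rad))
```

The two functions are inverses of each other, which is why the same lidar measurement model serves both self-localization and obstacle localization.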
Step S405, determining a flight parameter of the aircraft model based on the first pose information and the second pose information.
The flight parameters of the aircraft model include the motion trajectory of the aircraft model and the speed, acceleration, steering angle and the like at sampling points on the motion trajectory. In some embodiments, the electronic device determines the flight parameters of the aircraft model through the planning control model. Path planning algorithms employed by the planning control model include, but are not limited to: the rapidly-exploring random tree (RRT) algorithm, the artificial potential field algorithm, the A* algorithm, the D* algorithm, etc.
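As one example of the listed planners, a minimal 2-D rapidly-exploring random tree can be sketched as below. This is a generic textbook RRT, not the patent's planner; the sampling bounds, step size and goal bias are arbitrary assumptions.

```python
import math
import random

def rrt_plan(start, goal, is_free, step=1.0, max_iters=2000, goal_tol=1.0, seed=0):
    # Minimal 2-D rapidly-exploring random tree; illustrative only.
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        # Bias 10% of samples toward the goal to speed up convergence.
        sample = goal if rng.random() < 0.1 else (rng.uniform(-10, 10), rng.uniform(-10, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        d = math.dist(nodes[i], sample)
        if d == 0:
            continue
        nx, ny = nodes[i]
        # Extend the nearest node one step toward the sample.
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) <= goal_tol:
            # Walk the parent pointers back to the root to recover the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None
```

In an obstacle-free space the tree reaches the goal quickly; in the simulation system, `is_free` would query the obstacle data produced by the perception algorithm model.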
In some embodiments, the simulation device may also obtain control instructions for the aircraft model. The control instructions may be triggered by a technician and include, but are not limited to: landing instructions, turning instructions, terrain following instructions, speed control instructions, etc. In some embodiments, the simulation device displays an instruction input page and then receives control instructions entered by the technician on that page. When a control instruction is received, step S405 may be implemented as: determining a first flight parameter based on the first pose information, the second pose information and the control instruction. The method for determining the first flight parameter is the same as the method for determining the flight parameters described above and is not repeated here.
In some embodiments, the simulation device may also obtain a simulated airflow model. The simulated airflow model is used to simulate airflow conditions in the real world and includes a strong convection model, an updraft model, a downdraft model, and the like. When the simulated airflow model is acquired, step S405 may be implemented as: determining a second flight parameter based on the first pose information, the second pose information and the simulated airflow model. The method for determining the second flight parameter is the same as the method for determining the flight parameters described above and is not repeated here.
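At its simplest, a simulated airflow model can be reduced to a gust velocity superimposed on the aircraft model's velocity. The gust vectors below are hypothetical examples, not values from the patent.

```python
def apply_airflow(velocity, airflow):
    # Superimpose a simulated airflow vector on the aircraft velocity; a
    # stand-in for the strong-convection / updraft / downdraft models.
    return tuple(v + g for v, g in zip(velocity, airflow))

# Hypothetical gust vectors in m/s, as (x, y, z) with z pointing up.
UPDRAFT = (0.0, 0.0, 2.0)
DOWNDRAFT = (0.0, 0.0, -2.0)
STRONG_CONVECTION = (3.0, 0.0, 1.5)
```

The planning control model can then be exercised against each airflow condition to see how the planned flight parameters change.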
In step S406, the aircraft model is controlled to fly in the target simulation environment according to the flight parameters.
In the case that the first flight parameter is determined, step S406 is implemented as: and executing the control instruction in the process of controlling the aircraft model to fly in the target simulation environment according to the first flight parameter.
In the case that the second flight parameter is determined, step S406 is implemented as: and controlling the aircraft model to fly in the target simulation environment according to the second flight parameters.
In summary, according to the technical solution provided in the embodiment of the present application, by first obtaining a simulation environment, where the simulation environment includes a static target model corresponding to static targets in the real world and a dynamic target model corresponding to dynamic targets, and then controlling an aircraft model to fly in the simulation environment, since the simulation environment is obtained based on object modeling in the real world, the real degree of the simulation environment is higher, and the aircraft model operates in the simulation environment with higher real degree, so that the simulation effect is better.
The first simulation sub-environment may be generated in real time by the simulation device, or may be generated in advance by the simulation device or another device. In the embodiments of the present application, only the case where the simulation device generates the first simulation sub-environment in real time is described. The construction process of the first simulation sub-environment is explained below; it specifically includes the following steps S501-S504.
In step S501, a scene image is acquired.
The scene images are obtained by image acquisition of the real world. In some embodiments, the scene images are acquired by professional surveying personnel operating suitable cameras. In other embodiments, the scene images are acquired by an image acquisition device mounted on the physical aircraft while it performs a flight mission. The number of scene images can be determined according to the distance from the flight start point to the flight end point and the accuracy requirement of the simulation environment. The accuracy requirement of the simulation environment can be understood as the degree of fit between the simulation environment and the real world. For a given accuracy requirement, the greater the distance between the flight start point and the flight end point, the larger the number of scene images; for a given distance between the flight start point and the flight end point, the higher the accuracy requirement, the larger the number of scene images.
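The two monotone relationships above (more images for longer routes, more images for higher accuracy) could be captured by a simple scaling rule such as the following. The linear form and the per-kilometre base rate are assumptions, since the text gives no formula.

```python
import math

def required_image_count(route_length_m, base_images_per_km, accuracy_factor=1.0):
    # Illustrative monotone rule only: count grows with route length and
    # with the accuracy requirement (accuracy_factor >= 1 for higher fidelity).
    return math.ceil(route_length_m / 1000.0 * base_images_per_km * accuracy_factor)
```

Any rule with the same monotonicity would satisfy the text; the point is that both inputs scale the acquisition effort.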
In some embodiments, the simulation device provides a static twin scene building control. After receiving a first trigger signal for this control, the simulation device displays a scene image library, after which it reads scene images from the library according to the technician's selection signals.
Step S502, at least one static target is determined based on the scene image.
The simulation device performs image recognition on the scene image through an image recognition algorithm to determine at least one static target. Image recognition algorithms include, but are not limited to: image recognition algorithms based on machine learning, image recognition algorithms based on template matching, algorithms based on support vector machines, deep learning algorithms, and the like.
Step S503, acquiring spatial location information and morphology information of at least one static target.
The spatial position information and the morphological information of the at least one static target may be detected by the aircraft when the aircraft executes the target flight task, or may be obtained from a background server of the map application program, which is not limited in the embodiment of the present application.
Step S504, a first simulation sub-environment is generated based on the spatial location information and the morphological information of at least one static target.
The first simulation sub-environment comprises static target models respectively corresponding to the at least one static target. In some embodiments, the simulation device may generate a static target model for each static target through three-dimensional modeling, automatic model generation from oblique photography, a machine learning model, or the like; the static target models corresponding to the respective static targets form the first simulation sub-environment. In other possible implementations, the simulation device may also build a static scene similar to the real scene using an existing model library, thereby increasing scene diversity.
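One minimal way to represent the result of step S504 — assuming nothing about the actual modeling engine — is a container of static target models built from the spatial-position and morphology information of step S503; the field names below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class StaticTargetModel:
    name: str
    position: tuple    # spatial location information, e.g. (x, y, z) in metres
    dimensions: tuple  # morphology information, e.g. (width, depth, height)

@dataclass
class FirstSimSubEnvironment:
    models: list = field(default_factory=list)

def build_first_sub_environment(static_targets):
    """Assemble the first simulation sub-environment (step S504) from the
    per-target spatial and morphology information (step S503)."""
    env = FirstSimSubEnvironment()
    for t in static_targets:
        env.models.append(
            StaticTargetModel(t["name"], t["position"], t["dimensions"]))
    return env
```

A rendering or physics engine would then instantiate each `StaticTargetModel` as actual scene geometry.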
In summary, according to the technical solution provided in the embodiments of the present application, the scene image is obtained by image acquisition of the real world, and the first simulation sub-environment containing the static target models is then obtained by modeling based on the scene image, so that the first simulation sub-environment achieves a high degree of realism.
The second simulation sub-environment may be generated in real time by the simulation device, or may be pre-generated by another device or by the simulation device itself. The embodiments of the present application describe only the case where the second simulation sub-environment is generated in real time by the simulation device. The construction process of the second simulation sub-environment, which specifically includes the following steps S505 to S508, is explained below.
In step S505, probe data acquired by the aircraft during execution of the target flight mission is acquired.
The target flight task may be an actual flight task, a simulated flight task, or a flight task derived from actual flight tasks through deep-learning-based augmentation. The detection data include the motion trajectory of a dynamic target and the speed, acceleration, motion attitude, and the like at a plurality of sampling points on that trajectory.
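The detection data described here can be sketched as a simple record type — one sampled trajectory per dynamic target, with the per-sample kinematic fields listed above; the exact field set is an assumption:

```python
from dataclasses import dataclass

@dataclass
class TrajectorySample:
    t: float            # sampling time, seconds
    position: tuple     # point on the motion trajectory
    velocity: tuple     # speed at this sampling point
    acceleration: tuple
    attitude: tuple     # motion attitude, e.g. (roll, pitch, yaw)

@dataclass
class ProbeRecord:
    target_id: str      # one dynamic target detected during the flight task
    samples: list       # ordered TrajectorySample entries
```

Steps S506 and S507 then amount to grouping the detection data into such records and reading the motion-parameter fields out of them.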
Step S506, determining at least one dynamic target based on the probe data.
The simulation device may determine, directly from the above detection data, the dynamic targets detected by the aircraft during execution of the target flight task.
Step S507, determining motion parameter information and morphology information of at least one dynamic object based on the detection data.
Similarly, the simulation device may determine the motion parameter information and the morphological information of each dynamic target directly from the detection data.
Step S508, generating a second simulation sub-environment based on the motion parameter information and the morphological information of the at least one dynamic target.
The second simulation sub-environment comprises dynamic target models respectively corresponding to the at least one dynamic target. In some embodiments, the simulation device may generate a dynamic target model for each dynamic target using a virtual engine technology; the dynamic target models corresponding to the respective dynamic targets form the second simulation sub-environment.
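A dynamic target model ultimately has to replay the recorded motion inside the simulation. Under the (assumed) simplification of piecewise-linear motion between sampling points, position playback can be sketched as:

```python
import bisect

def dynamic_target_position(times, positions, t):
    """Return the interpolated position of a dynamic target at time t,
    given the sampled trajectory from the detection data. Clamps to the
    first/last sample outside the recorded interval."""
    if t <= times[0]:
        return positions[0]
    if t >= times[-1]:
        return positions[-1]
    i = bisect.bisect_right(times, t)
    t0, t1 = times[i - 1], times[i]
    a = (t - t0) / (t1 - t0)
    p0, p1 = positions[i - 1], positions[i]
    return tuple(p0[k] + a * (p1[k] - p0[k]) for k in range(3))
```

A virtual-engine implementation would do the same lookup each frame and also interpolate attitude and morphology state.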
In summary, according to the technical solution provided in the embodiments of the present application, the second simulation sub-environment containing the dynamic target models is obtained by modeling the detection data collected by the aircraft while executing the target flight task, so that the second simulation sub-environment achieves a high degree of realism.
Referring to fig. 5, a block diagram of a simulation apparatus according to an embodiment of the present application is shown, where the apparatus includes: an environment acquisition module 510, a model loading module 520, and a simulation module 530.
An environment acquisition module 510, configured to acquire a simulation environment; wherein the simulation environment comprises at least one of: the system comprises a first simulation sub-environment and a second simulation sub-environment, wherein the first simulation sub-environment comprises static target models respectively corresponding to at least one static target in a real flight environment of an aircraft, and the second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real flight environment.
A model loading module 520 for loading an aircraft model in a simulation environment;
a simulation module 530 for controlling the flight of the aircraft model in a simulation environment.
In some embodiments, the simulation module 530 includes: a sensor model, a perception unit, a planning control unit, and an execution unit (not shown in the figure).
The sensor model is used to acquire perception data of the aircraft model during flight. The perception unit is used to determine first pose information of the aircraft model and second pose information of an obstacle based on the perception data. The planning control unit is used to determine flight parameters of the aircraft model based on the first pose information and the second pose information. The execution unit is used to control the aircraft model to fly in the simulation environment according to the flight parameters.
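The sensor → perception → planning → execution pipeline described above can be sketched as one simulation tick. The 1-D toy world, the 5 m stopping margin, and the unit time step below are all illustrative assumptions:

```python
def simulation_step(sensor_model, perceive, plan, execute, state, env):
    """One tick: sense the environment, estimate poses, plan flight
    parameters, then let the execution unit advance the aircraft model."""
    data = sensor_model(state, env)                  # sensor model
    own_pose, obstacle_poses = perceive(data)        # perception unit
    flight_params = plan(own_pose, obstacle_poses)   # planning control unit
    return execute(state, flight_params)             # execution unit

# Toy 1-D demo: fly toward a fixed obstacle and hold short of it.
def sensor(state, env):
    return {"own": state["x"], "obstacle": env["obstacle_x"]}

def perceive(data):
    return data["own"], [data["obstacle"]]

def plan(own, obstacles):
    gap = min(o - own for o in obstacles)
    return {"vx": 1.0 if gap > 5.0 else 0.0}  # stop within 5 m of an obstacle

def execute(state, params):
    return {"x": state["x"] + params["vx"]}   # integrate over a 1 s step

state, env = {"x": 0.0}, {"obstacle_x": 8.0}
for _ in range(10):
    state = simulation_step(sensor, perceive, plan, execute, state, env)
# the aircraft model advances, then holds 5 m short of the obstacle
```

Swapping in a real sensor model, pose estimator, planner, and dynamics model — while keeping this loop shape — yields the closed-loop simulation the text describes.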
In some embodiments, the simulation module 530 is further to: control instructions for an aircraft model are acquired. The planning control unit is specifically used for determining a first flight parameter based on the first pose information, the second pose information and the control instruction. The execution unit is specifically used for controlling the aircraft model to execute the control instruction in the process of flying in the simulation environment according to the first flight parameter.
In some embodiments, the simulation module 530 is further to: and obtaining a simulated airflow model. The planning control unit is specifically used for determining a second flight parameter based on the first pose information, the second pose information and the simulated airflow model. And the execution unit is specifically used for controlling the aircraft model to fly in the simulation environment according to the second flight parameter.
In some embodiments, the apparatus further comprises: a first environment generation module (not shown). The first environment generation module is used for acquiring a scene image, wherein the scene image is obtained by acquiring an image of the real flight environment of the aircraft; determining at least one static target based on the scene image; acquiring spatial position information and form information of at least one static target; and generating a first simulation sub-environment based on the spatial position information and the morphological information of the at least one static target, wherein the first simulation sub-environment comprises at least one static target model respectively corresponding to the at least one static target.
In some embodiments, the apparatus further comprises: a second environment generation module (not shown). The second environment generation module is used for acquiring detection data acquired by the aircraft in the process of executing the target flight task; determining at least one dynamic target based on the probe data; determining motion parameter information and morphological information of at least one dynamic target based on the detection data; and generating a second simulation sub-environment based on the motion parameter information and the morphological information of the at least one dynamic target, wherein the second simulation sub-environment comprises at least one dynamic target model respectively corresponding to the dynamic target.
In summary, according to the technical solution provided in the embodiments of the present application, a simulation environment is first obtained, which includes static target models corresponding to static targets in the real world and dynamic target models corresponding to dynamic targets, and an aircraft model is then controlled to fly in that simulation environment. Because the simulation environment is obtained by modeling objects in the real world, it achieves a high degree of realism, and running the aircraft model in such a realistic simulation environment yields a better simulation effect.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and modules described above may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
In the several embodiments provided herein, the couplings between modules may be electrical, mechanical, or of other forms.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
Referring to fig. 6, an embodiment of the present application further provides an electronic device 600, where the electronic device 600 includes: one or more processors 610, memory 620, and one or more applications. Wherein one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more application programs configured to perform the methods described in the above embodiments.
Processor 610 may include one or more processing cores. Using various interfaces and lines, the processor 610 connects the various parts of the electronic device 600, and performs its functions and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 620 and by invoking data stored in the memory 620. Alternatively, the processor 610 may be implemented in hardware in at least one of digital signal processing (Digital Signal Processing, DSP), field programmable gate array (Field-Programmable Gate Array, FPGA), and programmable logic array (Programmable Logic Array, PLA). The processor 610 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It will be appreciated that the modem may also not be integrated into the processor 610 and may instead be implemented by a separate communication chip.
The memory 620 may include random access memory (Random Access Memory, RAM) or read-only memory (Read-Only Memory, ROM). The memory 620 may be used to store instructions, programs, code sets, or instruction sets. The memory 620 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (e.g., a touch function, a sound playing function, an image playing function), instructions for implementing the various method embodiments described above, and the like. The data storage area may store data created by the electronic device in use (e.g., phonebook, audiovisual data, chat log data), and the like.
Referring to fig. 7, there is shown a computer readable storage medium 700, where the computer readable storage medium 700 stores computer program instructions 710, the computer program instructions 710 may be invoked by a processor to perform the method described in the above embodiments.
The computer readable storage medium 700 may be, for example, a flash memory, an electrically erasable programmable read-only memory (Electrically Erasable Programmable Read-Only Memory, EEPROM), an erasable programmable read-only memory (Erasable Programmable Read-Only Memory, EPROM), a hard disk, or a read-only memory (Read-Only Memory, ROM). Optionally, the computer readable storage medium comprises a non-transitory computer-readable storage medium (Non-transitory Computer-readable Storage Medium). The computer readable storage medium 700 has storage space for computer program instructions 710 that perform any of the method steps described above. The computer program instructions 710 may be read from, or written into, one or more computer program products.
The foregoing description is merely of preferred embodiments of the present application and is not intended to limit its scope of protection. Any modification, equivalent replacement, or adaptation made in accordance with the principles of the present application shall fall within the scope of protection of the present application.

Claims (10)

1. A simulation method, the method comprising:
obtaining a target simulation environment, wherein the target simulation environment comprises at least one of the following: the system comprises a first simulation sub-environment and a second simulation sub-environment, wherein the first simulation sub-environment comprises static target models respectively corresponding to at least one static target in the real world, and the second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real world;
loading an aircraft model in the target simulation environment;
controlling the aircraft model to fly in the target simulation environment.
2. The method of claim 1, wherein the controlling the aircraft model to fly in the target simulation environment comprises:
acquiring perception data of the aircraft model in the flight process through a sensor model;
determining first pose information of the aircraft model and second pose information of an obstacle based on the perception data;
determining flight parameters of the aircraft model based on the first pose information and the second pose information;
and controlling the aircraft model to fly in the target simulation environment according to the flight parameters.
3. The method of claim 2, wherein the flight parameters include first flight parameters, the method further comprising:
acquiring a control instruction for the aircraft model;
the determining, based on the first pose information and the second pose information, a flight parameter of the aircraft model includes: determining the first flight parameter based on the first pose information, the second pose information, and the control instruction;
the controlling the aircraft model to fly in the target simulation environment according to the flight parameters comprises the following steps:
and controlling the aircraft model to execute the control instruction in the process of flying in the target simulation environment according to the first flight parameter.
4. The method of claim 2, wherein the flight parameters include a second flight parameter, the method further comprising:
obtaining a simulated airflow model;
the determining, based on the first pose information and the second pose information, a flight parameter of the aircraft model includes: determining the second flight parameter based on the first pose information, the second pose information, and the simulated airflow model;
the controlling the aircraft model to fly in the target simulation environment according to the flight parameters comprises the following steps: and controlling the aircraft model to fly in the target simulation environment according to the second flight parameter.
5. The method according to any one of claims 1 to 4, wherein the generation of the first simulation sub-environment comprises the steps of:
acquiring a scene image, wherein the scene image is obtained by acquiring an image of the real flight environment of an aircraft;
determining at least one of the static targets based on the scene image;
acquiring spatial position information and morphological information of at least one static target;
generating the first simulation sub-environment based on the spatial position information and the morphological information of at least one static target, wherein the first simulation sub-environment comprises at least one static target model respectively corresponding to the static targets.
6. The method according to any one of claims 1 to 4, wherein the generating of the second simulation sub-environment comprises the steps of:
acquiring detection data acquired by an aircraft in the process of executing a target flight task;
determining at least one of the dynamic targets based on the probe data;
determining motion parameter information and morphological information of at least one dynamic target based on the detection data;
generating the second simulation sub-environment based on the motion parameter information and the morphological information of at least one dynamic target, wherein the second simulation sub-environment comprises at least one dynamic target model respectively corresponding to the dynamic targets.
7. A simulation apparatus, the apparatus comprising:
the environment acquisition module is used for acquiring a target simulation environment, and the target simulation environment comprises at least one of the following: the system comprises a first simulation sub-environment and a second simulation sub-environment, wherein the first simulation sub-environment comprises static target models respectively corresponding to at least one static target in the real world, and the second simulation sub-environment comprises dynamic target models respectively corresponding to at least one dynamic target in the real world;
the model loading module is used for loading an aircraft model in the target simulation environment;
and the simulation module is used for controlling the aircraft model to fly in the target simulation environment.
8. The apparatus of claim 7, wherein the simulation module comprises:
the sensor model is used for acquiring perception data of the aircraft model in the flight process;
a perception unit for determining first pose information of the aircraft model and second pose information of an obstacle based on the perception data;
a planning control unit for determining flight parameters of the aircraft model based on the first pose information and the second pose information;
and the execution unit is used for controlling the aircraft model to fly in the target simulation environment according to the flight parameters.
9. An electronic device, comprising:
one or more processors;
a memory;
one or more applications, wherein one or more of the applications are stored in the memory and configured to be executed by one or more of the processors, the one or more applications configured to perform the simulation method of any of claims 1-6.
10. A computer readable storage medium having stored therein computer program instructions that are callable by a processor to perform the simulation method according to any one of claims 1-6.
CN202211701713.1A 2022-12-28 2022-12-28 Simulation method, simulation device, electronic equipment and storage medium Pending CN116011101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211701713.1A CN116011101A (en) 2022-12-28 2022-12-28 Simulation method, simulation device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116011101A true CN116011101A (en) 2023-04-25

Family

ID=86018947



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination