CN116070816A - Flight simulation training management method and system based on Internet of things - Google Patents

Flight simulation training management method and system based on Internet of things Download PDF

Info

Publication number
CN116070816A
CN116070816A (application CN202310049511.1A)
Authority
CN
China
Prior art keywords
training
result
action
evaluation
flight simulation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310049511.1A
Other languages
Chinese (zh)
Other versions
CN116070816B (en)
Inventor
茅卫平
杨晶鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Hi Elec Equipment Co ltd
Original Assignee
Suzhou Hi Elec Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Hi Elec Equipment Co ltd filed Critical Suzhou Hi Elec Equipment Co ltd
Priority to CN202310049511.1A priority Critical patent/CN116070816B/en
Publication of CN116070816A publication Critical patent/CN116070816A/en
Application granted granted Critical
Publication of CN116070816B publication Critical patent/CN116070816B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • Educational Administration (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Technology (AREA)
  • Computing Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Quality & Reliability (AREA)
  • Primary Health Care (AREA)
  • Operations Research (AREA)
  • Artificial Intelligence (AREA)
  • Development Economics (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure provides a flight simulation training management method and system based on the Internet of things, relating to the technical field of electric digital data processing. The method comprises: reading basic training information and training scene information of a training user; reading interaction control data, performing training control evaluation, and generating a training control evaluation result; constructing a training reference image set; calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database; generating auxiliary training evaluation data based on the action feature matching result, and generating a comprehensive training evaluation result from the training control evaluation result and the auxiliary training evaluation data; and carrying out flight simulation training management on the training user. This solves the prior-art technical problems of low flight simulation training management efficiency and low training quality caused by insufficiently detailed analysis of training data.

Description

Flight simulation training management method and system based on Internet of things
Technical Field
The disclosure relates to the technical field of electric digital data processing, in particular to a flight simulation training management method and system based on the Internet of things.
Background
With the rapid development of science and technology, the way pilots are trained has changed greatly, and flight simulation training has become the main training mode. Flight simulation training refers to training in which flight personnel use a flight simulator on the ground to master flight skills; it helps guarantee flight safety, reduces environmental pollution, improves training efficiency, and saves costs.
At present, however, the analysis of training data in the prior art is not sufficiently detailed, which results in low management efficiency and low training quality in flight simulation training.
Disclosure of Invention
The disclosure provides a flight simulation training management method and system based on the Internet of things, which are used for solving the prior-art technical problems of low flight simulation training management efficiency and low training quality caused by insufficiently detailed analysis of training data.
According to a first aspect of the present disclosure, there is provided a flight simulation training management method based on the internet of things, including: connecting the flight simulation training management system, and reading basic training information and training scene information of a training user; reading interactive control data through the data interaction device, and performing training control evaluation according to the interactive control data and the training scene information to generate a training control evaluation result; invoking the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set; calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database; generating auxiliary training evaluation data based on the action characteristic matching result, and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data; and carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
According to a second aspect of the present disclosure, there is provided a flight simulation training management system based on the internet of things, comprising: the information reading module is used for connecting the flight simulation training management system and reading basic training information and training scene information of a training user; the training control evaluation module is used for reading the interactive control data through the data interaction device, performing training control evaluation according to the interactive control data and the training scene information, and generating a training control evaluation result; the training image acquisition module is used for calling the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set; the action feature matching module is used for calling a training action database based on the training scene information and carrying out action feature matching on the reference image set through the training action database; the comprehensive training evaluation module is used for generating auxiliary training evaluation data based on the action characteristic matching result and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data; and the flight simulation training management module is used for carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
According to a third aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect.
According to the flight simulation training management method based on the Internet of things, the flight simulation training management system is connected, and basic training information and training scene information of a training user are read; reading interactive control data through the data interaction device, and performing training control evaluation according to the interactive control data and the training scene information to generate a training control evaluation result; invoking the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set; calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database; generating auxiliary training evaluation data based on the action characteristic matching result, and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data; and carrying out flight simulation training management on the training user through the comprehensive training evaluation result. According to the method and the system, training control evaluation is conducted according to interaction control data and training scene information, then training process images of a training user are collected, action feature matching is conducted on image collection results based on a training action database, auxiliary training evaluation data are generated according to the action feature matching results, and comprehensive training evaluation results are generated by combining the training control evaluation results and the auxiliary training evaluation data, so that flight simulation training management is conducted, the technical effects of pertinently conducting flight simulation training management and improving flight simulation training management efficiency and training quality are achieved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
For a clearer description of the present disclosure or of the prior art, the drawings that are required to be used in the description of the embodiments or of the prior art will be briefly described, it being apparent that the drawings in the description below are merely illustrative and that other drawings may be obtained, without inventive effort, by a person skilled in the art from the drawings provided.
Fig. 1 is a schematic flow chart of a flight simulation training management method based on the internet of things, which is provided by the embodiment of the invention;
FIG. 2 is a schematic flow chart of flight simulation training management based on training user habit characteristics and correction identification results in an embodiment of the invention;
FIG. 3 is a schematic flow chart of training teaching for training a user in an embodiment of the invention;
fig. 4 is a schematic structural diagram of a flight simulation training management system based on the internet of things according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Reference numerals illustrate: the system comprises an information reading module 11, a training control evaluation module 12, a training image acquisition module 13, an action characteristic matching module 14, a comprehensive training evaluation module 15, a flight simulation training management module 16, an electronic device 800, a processor 801, a memory 802 and a bus 803.
Description of the embodiments
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In order to solve the technical problems of low flight simulation training management efficiency and low training quality caused by insufficiently detailed analysis of training data in the prior art, the inventors of the present disclosure arrived at the flight simulation training management method and system based on the Internet of things described below.
Example 1
Fig. 1 shows a flight simulation training management method based on the Internet of things. The method is applied to a flight simulation training management system, and the flight simulation training management system is in communication connection with a data interaction device and an image acquisition device. As shown in fig. 1, the method includes:
step S100: connecting the flight simulation training management system, and reading basic training information and training scene information of a training user;
specifically, the embodiment of the application provides a flight simulation training management method based on the Internet of things, the method is applied to a flight simulation training management system, the flight simulation training management system is a system platform for carrying out aircraft simulation training management through analysis processing on training information of an aircraft, therefore, the flight simulation training management system is connected, basic training information and training scene information of a training user can be read, the training user refers to a pilot, the basic training information refers to a training project for carrying out flight simulation training, and the training scene information refers to a flight scene for the pilot to carry out simulation training.
Step S200: reading interactive control data through the data interaction device, and performing training control evaluation according to the interactive control data and the training scene information to generate a training control evaluation result;
specifically, the data interaction device is in communication connection with the flight simulation training management system, can read data in the flight simulation training management system, can also transmit the data to the flight simulation training management system, realizes interactive transmission of the data, and the interactive control data refer to interactive data for controlling a machine for simulation flight when a pilot simulates the flight, and the training scene information refers to a flight scene to be completed.
Step S300: invoking the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set;
specifically, the image acquisition device comprises a camera, a camera and other devices, and is used for carrying out image acquisition on the operation of a pilot in the flight training process, the image acquisition device is in communication connection with the flight simulation training management system, the image acquisition device can upload acquired image data to the flight simulation training management system, the image acquisition device is used for carrying out the training process image acquisition of a training user, a plurality of image acquisition devices can be called in the acquisition process, the image acquisition of different angles is carried out, the acquired image data is generally a section of continuous image video, and the training reference image set is the image data of the flight training process acquired by the image acquisition device.
Step S400: calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database;
specifically, the training action database comprises a plurality of flight training actions, standard training actions and abnormal training actions exist in the flight training actions, the training action database is called according to training scene information, action feature matching is carried out on the reference image set according to the training action database, in short, the training actions in the training action database are matched with the actions in the reference image set, namely, the matching is successful, the training actions in the training action database are consistent with the actions in the reference image set, the data in the reference image set are divided into two parts, one part is a key frame associated image, the other part is a common associated image, the key frame associated image is a training image under a certain time node, the time node is obtained according to interaction control data, namely, at the time point when a certain operation is carried out, the common associated image is other training images of which do not comprise the key frame associated image, and the action feature matching is carried out on the key frame associated image and the common associated image according to the training action database.
Step S500: generating auxiliary training evaluation data based on the action characteristic matching result, and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data;
specifically, according to a training action database, action characteristic matching is carried out on a key frame associated image and a common associated image respectively, action characteristic matching results are obtained, the action characteristic matching results comprise standard action matching results and abnormal action matching results, auxiliary training evaluation data are generated based on the action characteristic matching results, the auxiliary training evaluation data are evaluation results of flight simulation training conditions of a training user according to the action characteristic matching results, comprehensive training evaluation results are generated through training control evaluation results and the auxiliary training evaluation data, the comprehensive training evaluation results are comprehensive evaluation of the flight simulation training conditions of the training user by combining the training control evaluation results and the auxiliary training evaluation data, and the comprehensive training evaluation results are bases for carrying out flight simulation training management on the training user.
Step S600: and carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
Specifically, the comprehensive training evaluation result is the comprehensive evaluation of the training user's flight simulation training obtained by combining the training control evaluation result and the auxiliary training evaluation data. The flight simulation training of the training user is adjusted according to this result, thereby performing flight simulation training management and improving management efficiency and training quality.
Based on the analysis, the present disclosure provides a flight simulation training management method based on the internet of things, in this embodiment, through performing training control evaluation according to interaction control data and training scene information, then performing image acquisition on a training process of a training user, performing action feature matching on an image acquisition result based on a training action database, generating auxiliary training evaluation data according to the action feature matching result, and generating a comprehensive training evaluation result by combining the training control evaluation result and the auxiliary training evaluation data, thereby performing flight simulation training management, achieving the technical effects of performing flight simulation training management pertinently, and improving the efficiency and training quality of the flight simulation training management.
The step S500 in this embodiment of the present application further includes:
step S510: reading an interaction control time node based on the interaction control data, wherein the interaction control time node comprises an interaction control characteristic identifier;
step S520: extracting the image frames of the training reference image set through the interactive control time node to obtain a key frame image;
step S530: obtaining a standard execution action database in the training action database;
step S540: and performing action matching recognition on the key frame images through the standard execution action database, and obtaining the auxiliary training evaluation data according to a matching recognition result.
Specifically, the interaction control data refers to the inputs with which the pilot controls the simulated aircraft, such as the control stick; the interaction control time node refers to the time point at which a control action is performed, such as the moment the control stick is moved; and the interaction control feature refers to the action performed on the simulated aircraft, such as deflecting the stick. The image frames corresponding to the interaction control time nodes are extracted from the training reference image set, and the extracted images are the key frame images. The standard execution action database contains all the correct execution actions in the training action database. Action matching recognition of the key frame images is performed through this database: the pilot's manipulation actions in the key frame images are matched against the standard execution action database, i.e. the actions identical to the pilot's manipulation actions are screened out of the database, yielding a matching recognition result. Based on this result, the auxiliary training evaluation data is obtained, providing data support for subsequent flight simulation training management.
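The sketch below illustrates, under stated assumptions, how key frame images could be selected at the interaction control time nodes and matched against a standard execution action database; it assumes each frame has already been reduced to a numeric feature vector and uses cosine similarity as the matching measure, neither of which is mandated by the disclosure.

```python
import numpy as np

def extract_key_frames(frames, frame_times, control_times, tolerance=0.05):
    """Pick, for each interaction control time node, the closest captured frame.

    frames        : list of per-frame feature vectors (np.ndarray)
    frame_times   : capture timestamps, one per frame
    control_times : time points at which a control action was performed
    """
    frame_times = np.asarray(frame_times)
    key_frames = []
    for t in control_times:
        idx = int(np.argmin(np.abs(frame_times - t)))
        if abs(frame_times[idx] - t) <= tolerance:
            key_frames.append((t, frames[idx]))
    return key_frames

def match_standard_actions(key_frames, standard_actions, threshold=0.8):
    """Match each key frame against the standard execution action database
    (name -> reference feature vector) using cosine similarity."""
    results = {}
    for t, feat in key_frames:
        best_name, best_sim = None, threshold
        for name, ref in standard_actions.items():
            sim = float(np.dot(feat, ref) /
                        (np.linalg.norm(feat) * np.linalg.norm(ref) + 1e-9))
            if sim > best_sim:
                best_name, best_sim = name, sim
        results[t] = best_name  # None: no standard action matched at this node
    return results
```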
Step S550 in this embodiment of the present application further includes:
step S551: acquiring non-key frame images in the training reference image set, and performing image aggregation on the non-key frame images based on the interactive control time node to acquire an image aggregation result with interval identification;
step S552: obtaining an abnormal action recognition database in the training action database;
step S553: performing action feature calling on the abnormal action recognition database based on the interval identifier, and performing corresponding abnormal feature recognition on the image aggregation result by calling action features;
step S554: and obtaining the auxiliary training evaluation data according to the abnormal characteristic recognition result and the matching recognition result.
Specifically, the images in the training reference image set other than the key frame images are the non-key frame images, and they are aggregated according to the interaction control time nodes. In short, an interaction control time node is a time point at which an operation is performed, and the training reference image set contains several such nodes; the images between two consecutive interaction control time nodes are aggregated together, the pair of consecutive nodes defines the interval corresponding to that image aggregation result, and the interval is identified, so that each interval corresponds to one image aggregation result. For example, if a button is operated after the rocker is operated, the non-key frame images in the period from the rocker operation to the button operation are aggregated together, and that period is the identified interval corresponding to the aggregation result. The abnormal action recognition database contains the erroneous operations in the training action database. When recognizing the image aggregation results of the different intervals, the action features registered for each interval are called from the abnormal action recognition database according to the interval identifier, and the corresponding abnormal feature recognition is performed on the image aggregation result using the called action features; the recognition result indicates whether those abnormal action features appear in the interval. The auxiliary training evaluation data is then obtained from the abnormal feature recognition result together with the matching recognition result, so that the training user's flight simulation training can be evaluated accurately and training quality improved.
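One possible reading of this aggregation and recognition step is sketched below: frames between consecutive interaction control time nodes are grouped into identified intervals, and the abnormal action features registered for each interval are then checked against the frames of that interval. The interval identifier format and the is_abnormal predicate are assumptions introduced for the sketch.

```python
def aggregate_non_key_frames(frame_times, control_times):
    """Group non-key-frame indices into intervals bounded by consecutive
    interaction control time nodes; each interval receives an identifier."""
    nodes = sorted(control_times)
    intervals = {}
    for i in range(len(nodes) - 1):
        start, end = nodes[i], nodes[i + 1]
        interval_id = f"interval_{i}"  # hypothetical interval identifier
        intervals[interval_id] = [idx for idx, t in enumerate(frame_times)
                                  if start < t < end]
    return intervals

def recognise_abnormal_features(intervals, frames, abnormal_db, is_abnormal):
    """Flag frames in each interval that match the abnormal action features
    called up for that interval.

    abnormal_db : dict interval_id -> list of abnormal feature templates
    is_abnormal : callable(frame_feature, template) -> bool (assumed matcher)
    """
    findings = {}
    for interval_id, frame_indices in intervals.items():
        templates = abnormal_db.get(interval_id, [])
        findings[interval_id] = [idx for idx in frame_indices
                                 if any(is_abnormal(frames[idx], tpl)
                                        for tpl in templates)]
    return findings
```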
As shown in fig. 2, step S700 in the embodiment of the present application further includes:
step S710: performing action correction identification through the matching identification result to generate a correction identification result;
step S720: extracting the common characteristic of the correction identification result to generate a common characteristic extraction result;
step S730: judging whether the common characteristic value of the common characteristic extraction result meets a preset characteristic threshold value or not;
step S740: when the common feature extraction result can meet the preset feature threshold, generating training user habit features according to the common feature extraction result;
step S750: and performing flight simulation training management based on the training user habit characteristics and the correction identification result.
Specifically, action correction identification is performed on the matching recognition result to generate a correction identification result: where a manipulation action in the matching recognition result is wrong, the error is often caused by an erroneous action feature, and the correction identification result represents the correction of that erroneous feature. Commonality feature extraction is then performed on the correction identification result. A commonality feature is an action feature that repeatedly causes manipulation errors; for example, if several manipulation errors are all caused by the pilot applying an upward force while executing an operation, the upward force is a commonality feature. The commonality feature value refers to the frequency ratio of the commonality feature when the pilot executes similar actions; for example, if the pilot applies an upward force while pushing the rocker from right to left, the proportion of other similar right-to-left operations in which the upward force also occurs is the commonality feature value. It is then judged whether the commonality feature value of the commonality feature extraction result meets a preset feature threshold; the preset feature threshold is set by the user and serves as the comparison value for the commonality feature value. When the commonality feature extraction result meets the preset feature threshold, i.e. the commonality feature value falls within the threshold range, the commonality feature extraction result is taken as a habit feature of the training user. Flight simulation training management is then carried out according to the training user habit features and the correction identification result, for example by increasing the amount of training on the related actions, so that training management is performed according to the action features and training efficiency is improved.
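The following sketch shows one way the commonality feature value and the habit features could be derived from correction identification records; the record layout (action_class, error_features), the feature names in the example, and the threshold semantics are illustrative assumptions, not elements specified by the disclosure.

```python
def commonality_value(correction_records, feature):
    """Frequency ratio of an erroneous action feature (e.g. an unintended
    upward force) across manoeuvres of the same class."""
    similar = [r for r in correction_records
               if r["action_class"] == feature["action_class"]]
    if not similar:
        return 0.0
    hits = sum(1 for r in similar if feature["name"] in r["error_features"])
    return hits / len(similar)

def habit_features(correction_records, candidate_features, threshold=0.5):
    """Keep candidates whose commonality feature value meets the preset
    feature threshold; these become training user habit features."""
    habits = []
    for feature in candidate_features:
        value = commonality_value(correction_records, feature)
        if value >= threshold:
            habits.append({**feature, "commonality": value})
    return habits

# Example: an upward force observed in 3 of 4 right-to-left manoeuvres.
records = [{"action_class": "right_to_left", "error_features": {"upward_force"}}
           for _ in range(3)]
records.append({"action_class": "right_to_left", "error_features": set()})
print(habit_features(records, [{"name": "upward_force",
                                "action_class": "right_to_left"}]))
```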
As shown in fig. 3, step S800 in the embodiment of the present application further includes:
step S810: performing operation anomaly statistics on the simulation training operation to obtain an anomaly statistics result;
step S820: accumulating the statistic frequency of the abnormal statistic result to obtain a statistic frequency mark;
step S830: performing abnormal hazard evaluation on the abnormal statistical result to generate a hazard base value;
step S840: generating an abnormal operation characteristic value based on the statistical frequency identification and the hazard basic value;
step S850: and training the training teaching of the user through the abnormal operation characteristic value and the simulated training operation.
Specifically, all abnormal operations during the simulated training operation are counted to obtain the abnormal statistical result, and the accumulated number of occurrences of each abnormal operation is counted and identified to obtain the statistical frequency identifier. Abnormal hazard evaluation is performed on each abnormal operation in the abnormal statistical result, i.e. the severity of the consequences each abnormal operation may cause is evaluated, and a hazard base value is generated on this basis: the hazard base value is the degree of harm produced by each abnormal operation. An abnormal operation characteristic value is then generated from the statistical frequency identifier and the hazard base value; in simple terms, the hazard base value is superposed according to the occurrence frequency of the abnormal operation to obtain the abnormal operation characteristic value. The abnormal operations are then explained and analyzed for the pilot according to the abnormal operation characteristic values and the simulated training operation, thereby carrying out training teaching of the training user, achieving the technical effects of teaching according to the abnormal operations and improving teaching efficiency.
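A minimal sketch of the superposition described above, under the assumption that the statistical frequency identifier reduces to an occurrence count per abnormal operation and that the hazard evaluation yields one base value per operation; the simple product used here is only one possible weighting rule, and the operation names in the example are invented for illustration.

```python
def abnormal_operation_characteristic(frequency_counts, hazard_base_values):
    """Abnormal operation characteristic value per operation: the hazard base
    value superposed over the accumulated number of occurrences.

    frequency_counts   : dict operation -> accumulated occurrence count
    hazard_base_values : dict operation -> hazard base value from the evaluation
    """
    return {op: count * hazard_base_values.get(op, 0.0)
            for op, count in frequency_counts.items()}

# Example: a rare but highly hazardous error can outrank a frequent, mild one.
print(abnormal_operation_characteristic(
    {"late_flap_retraction": 5, "gear_up_landing_attempt": 1},
    {"late_flap_retraction": 0.2, "gear_up_landing_attempt": 2.0}))
```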
In this embodiment, step S850 further includes:
step S851: performing the simulated training operation to perform operation sequence sorting based on the abnormal operation characteristic value to obtain a sequence sorting result;
step S852: setting teaching example distribution weights, and generating teaching case constraint information based on the teaching example distribution weights and the sequence ordering result;
step S853: and constructing a teaching sample through the teaching case constraint information and the simulated training operation, and carrying out training teaching of the training user through the teaching sample.
Specifically, the simulated training operations are sorted according to the abnormal operation characteristic values; in short, the operations corresponding to the abnormal operation characteristic values are ordered by the magnitude of those values. Teaching example distribution weights are then set; they can be set according to experience with simulated training, for example a relatively difficult operation that easily produces errors can be given a relatively large weight. Teaching case constraint information is generated from the teaching example distribution weights and the sequence ordering result; it constrains the range of teaching examples and the number of cases, that is, the number of teaching cases and the proportion of each type of case are jointly determined by the distribution weights and the ordering result. A teaching sample is then constructed from the teaching case constraint information and the simulated training operations, as illustrated in the sketch below; the teaching sample includes the number of teaching cases and the proportion of each type of case. Training teaching of the training user is carried out with this teaching sample, achieving the technical effect of setting targeted teaching samples according to the abnormal operations and improving teaching efficiency.
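The sketch assumes that case counts are allocated in proportion to weight times characteristic value, which is one possible reading of how the teaching example distribution weights and the sequence ordering result jointly constrain the teaching sample; the total case count and the proportional rule are assumptions for illustration.

```python
def build_teaching_sample(characteristic_values, example_weights, total_cases=20):
    """Rank abnormal operations by characteristic value and allocate teaching
    cases per operation from the distribution weights.

    characteristic_values : dict operation -> abnormal operation characteristic value
    example_weights       : dict operation -> manually set distribution weight
    """
    # Operation sequence ordering: largest characteristic value first.
    ordering = sorted(characteristic_values,
                      key=characteristic_values.get, reverse=True)

    # Teaching case constraint: case count proportional to weight x value
    # (illustrative combination rule only).
    raw = {op: example_weights.get(op, 1.0) * characteristic_values[op]
           for op in ordering}
    total = sum(raw.values()) or 1.0
    allocation = {op: max(1, round(total_cases * raw[op] / total))
                  for op in ordering}
    return {"ordering": ordering, "case_allocation": allocation}
```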
The step S900 in this embodiment of the present application further includes:
step S910: continuously training and monitoring the training user to generate a continuous training and monitoring result;
step S920: performing skill mastering evaluation of the training user based on the continuous training monitoring result, and generating a skill report of the training user;
step S930: and performing flight simulation training management on the training user through the skill report.
Specifically, continuous training monitoring is performed on the training user, i.e. the pilot's training process is continuously monitored, generating a continuous training monitoring result. The pilot's mastery of each skill is evaluated according to this result, since the mastery of different skills may differ, and on this basis a skill report of the training user is generated; the skill report records the pilot's mastery of each skill. Flight simulation training management of the training user is then performed through the skill report, for example by giving training guidance or extending training time for poorly mastered skills, thereby achieving targeted training according to the training user's skill mastery and improving the efficiency of simulation training management.
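To make the skill report concrete, the sketch below aggregates continuous training monitoring records into per-skill statistics; the record format (skill name, session score) and the proficiency threshold are assumptions introduced for illustration.

```python
from collections import defaultdict

def build_skill_report(monitoring_records, proficiency_threshold=0.8):
    """Summarise continuous training monitoring into a per-skill report.

    monitoring_records : iterable of (skill_name, session_score) pairs,
                         scores in [0, 1], in chronological order
    """
    per_skill = defaultdict(list)
    for skill, score in monitoring_records:
        per_skill[skill].append(score)

    report = {}
    for skill, scores in per_skill.items():
        average = sum(scores) / len(scores)
        report[skill] = {
            "sessions": len(scores),
            "average_score": round(average, 3),
            "trend": round(scores[-1] - scores[0], 3),
            "status": ("mastered" if average >= proficiency_threshold
                       else "needs more training"),
        }
    return report
```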
Example 2
Based on the same inventive concept as the flight simulation training management method based on the internet of things in the foregoing embodiment, as shown in fig. 4, the present application further provides a flight simulation training management system based on the internet of things, where the system is in communication connection with a data interaction device and an image acquisition device, and the system includes:
the information reading module 11 is used for connecting the flight simulation training management system, and reading basic training information and training scene information of a training user;
the training control evaluation module 12 is configured to read interactive control data through the data interaction device, perform training control evaluation according to the interactive control data and the training scene information, and generate a training control evaluation result;
the training image acquisition module 13 is used for calling the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set;
the action feature matching module 14 is used for calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database;
the comprehensive training evaluation module 15 is used for generating auxiliary training evaluation data based on the action characteristic matching result, and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data;
and the flight simulation training management module 16 is used for carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
Further, the system further comprises:
the interactive control time node reading module is used for reading the interactive control time node based on the interactive control data, wherein the interactive control time node comprises an interactive control feature identifier;
the image frame extraction module is used for extracting the image frames of the training reference image set through the interactive control time node to obtain a key frame image;
the standard execution action database acquisition module is used for acquiring a standard execution action database in the training action database;
and the key frame image action matching module is used for carrying out action matching identification of the key frame image through the standard execution action database and obtaining the auxiliary training evaluation data according to a matching identification result.
Further, the system further comprises:
the non-key frame image aggregation module is used for obtaining non-key frame images in the training reference image set, and image aggregation is carried out on the non-key frame images based on the interaction control time node to obtain an image aggregation result with interval identification;
the abnormal action recognition database acquisition module is used for acquiring an abnormal action recognition database in the training action database;
the abnormal feature recognition module is used for calling action features of the abnormal action recognition database based on the interval identification and carrying out corresponding abnormal feature recognition on the image aggregation result by calling the action features;
the auxiliary training evaluation data generation module is used for obtaining the auxiliary training evaluation data according to the abnormal characteristic recognition result and the matching recognition result.
Further, the system further comprises:
the action correction identification module is used for carrying out action correction identification through the matching identification result and generating a correction identification result;
the common feature extraction module is used for extracting common features of the correction identification result and generating a common feature extraction result;
the commonality characteristic value judging module is used for judging whether the commonality characteristic value of the commonality characteristic extraction result meets a preset characteristic threshold value or not;
the training user habit feature generation module is used for generating training user habit features according to the common feature extraction result when the common feature extraction result can meet the preset feature threshold;
and the second flight simulation training management module is used for carrying out flight simulation training management based on the training user habit characteristics and the correction identification result.
Further, the system further comprises:
the operation abnormality statistical module is used for carrying out operation abnormality statistics on the simulation training operation to obtain an abnormality statistical result;
the statistical frequency accumulation module is used for accumulating the statistical frequency of the abnormal statistical result to obtain a statistical frequency identifier;
the abnormal hazard evaluation module is used for carrying out abnormal hazard evaluation on the abnormal statistical result to generate a hazard basic value;
the abnormal operation characteristic value generation module is used for generating an abnormal operation characteristic value based on the statistical frequency identification and the hazard basic value;
and the training teaching module is used for training the training teaching of the user through the abnormal operation characteristic value and the simulated training operation.
Further, the system further comprises:
the operation sequence ordering module is used for performing the simulated training operation to perform operation sequence ordering based on the abnormal operation characteristic value to obtain a sequence ordering result;
the teaching case constraint information generation module is used for setting teaching example distribution weights and generating teaching case constraint information based on the teaching example distribution weights and the sequence ordering result;
the teaching sample construction module is used for constructing a teaching sample through the teaching case constraint information and the simulated training operation, and carrying out training teaching of the training user through the teaching sample.
Further, the system further comprises:
the continuous training monitoring module is used for carrying out continuous training monitoring on the training user and generating a continuous training monitoring result;
the skill mastering and evaluating module is used for carrying out skill mastering and evaluating on the training user based on the continuous training monitoring result and generating a skill report of the training user;
and the third flight simulation training management module is used for carrying out flight simulation training management on the training user through the skill report.
The specific example of the flight simulation training management method based on the internet of things in the foregoing embodiment is also applicable to the flight simulation training management system based on the internet of things in the present embodiment, and by the foregoing detailed description of the flight simulation training management method based on the internet of things, those skilled in the art can clearly know the flight simulation training management system based on the internet of things in the present embodiment, so that the detailed description thereof will not be repeated herein for the sake of brevity. For the device disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is relatively simple, and the relevant points refer to the description of the method section.
Example 3
Fig. 5 is a schematic diagram according to a third embodiment of the present disclosure, as shown in fig. 5, an electronic device 800 in the present disclosure may include: a processor 801 and a memory 802.
A memory 802 for storing a program. The memory 802 may include volatile memory, such as random-access memory (RAM), for example static random-access memory (SRAM) or double data rate synchronous dynamic random-access memory (DDR SDRAM); the memory may also include non-volatile memory, such as flash memory. The memory 802 is used to store computer programs (e.g., application programs and functional modules implementing the methods described above), computer instructions, and the like, which may be stored in one or more of the memories 802 in a partitioned manner and may be called upon by the processor 801.
A processor 801 for executing a computer program stored in a memory 802 to realize the steps in the method according to the above embodiment.
Reference may be made in particular to the description of the embodiments of the method described above.
The processor 801 and the memory 802 may be separate structures or may be integrated structures integrated together. When the processor 801 and the memory 802 are separate structures, the memory 802 and the processor 801 may be coupled by a bus 803.
The electronic device in this embodiment may execute the technical scheme in the above method, and the specific implementation process and the technical principle are the same, which are not described herein again.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
According to an embodiment of the present disclosure, the present disclosure also provides a computer program product comprising: a computer program stored in a readable storage medium, from which at least one processor of an electronic device can read, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any one of the embodiments described above.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order; the present disclosure is not limited in this respect so long as the desired results of the disclosed technical solutions can be achieved.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (9)

1. A flight simulation training management method based on the Internet of things, wherein the method is applied to a flight simulation training management system, the flight simulation training management system is in communication connection with a data interaction device and an image acquisition device, and the method comprises:
connecting the flight simulation training management system, and reading basic training information and training scene information of a training user;
reading interactive control data through the data interaction device, and performing training control evaluation according to the interactive control data and the training scene information to generate a training control evaluation result;
invoking the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set;
calling a training action database based on the training scene information, and performing action feature matching on the reference image set through the training action database;
generating auxiliary training evaluation data based on the action characteristic matching result, and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data;
and carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
2. The method of claim 1, wherein the method comprises:
reading an interaction control time node based on the interaction control data, wherein the interaction control time node comprises an interaction control characteristic identifier;
extracting the image frames of the training reference image set through the interactive control time node to obtain a key frame image;
obtaining a standard execution action database in the training action database;
and performing action matching recognition on the key frame images through the standard execution action database, and obtaining the auxiliary training evaluation data according to a matching recognition result.
3. The method of claim 2, wherein the method further comprises:
acquiring non-key frame images in the training reference image set, and performing image aggregation on the non-key frame images based on the interactive control time node to acquire an image aggregation result with interval identification;
obtaining an abnormal action recognition database in the training action database;
performing action feature calling on the abnormal action recognition database based on the interval identifier, and performing corresponding abnormal feature recognition on the image aggregation result by calling action features;
and obtaining the auxiliary training evaluation data according to the abnormal characteristic recognition result and the matching recognition result.
4. The method according to claim 2, wherein the method comprises:
performing action correction identification through the matching identification result to generate a correction identification result;
extracting the common characteristic of the correction identification result to generate a common characteristic extraction result;
judging whether the common characteristic value of the common characteristic extraction result meets a preset characteristic threshold value or not;
when the common feature extraction result can meet the preset feature threshold, generating training user habit features according to the common feature extraction result;
and performing flight simulation training management based on the training user habit characteristics and the correction identification result.
5. The method of claim 1, wherein the method comprises:
performing operation anomaly statistics on the simulation training operation to obtain an anomaly statistics result;
accumulating the statistic frequency of the abnormal statistic result to obtain a statistic frequency mark;
performing abnormal hazard evaluation on the abnormal statistical result to generate a hazard base value;
generating an abnormal operation characteristic value based on the statistical frequency identification and the hazard basic value;
and training the training teaching of the user through the abnormal operation characteristic value and the simulated training operation.
6. The method of claim 5, wherein the method comprises:
performing the simulated training operation to perform operation sequence sorting based on the abnormal operation characteristic value to obtain a sequence sorting result;
setting teaching example distribution weights, and generating teaching case constraint information based on the teaching example distribution weights and the sequence ordering result;
and constructing a teaching sample through the teaching case constraint information and the simulated training operation, and carrying out training teaching of the training user through the teaching sample.
7. The method of claim 1, wherein the method comprises:
continuously training and monitoring the training user to generate a continuous training and monitoring result;
performing skill mastering evaluation of the training user based on the continuous training monitoring result, and generating a skill report of the training user;
and performing flight simulation training management on the training user through the skill report.
8. A flight simulation training management system based on the Internet of things, wherein the system is in communication connection with a data interaction device and an image acquisition device, and the system comprises:
the information reading module is used for connecting the flight simulation training management system and reading basic training information and training scene information of a training user;
the training control evaluation module is used for reading the interactive control data through the data interaction device, performing training control evaluation according to the interactive control data and the training scene information, and generating a training control evaluation result;
the training image acquisition module is used for calling the image acquisition device based on the training scene information, acquiring a training process image of the training user through the image acquisition device, and constructing a training reference image set;
the action feature matching module is used for calling a training action database based on the training scene information and carrying out action feature matching on the reference image set through the training action database;
the comprehensive training evaluation module is used for generating auxiliary training evaluation data based on the action characteristic matching result and generating a comprehensive training evaluation result through the training control evaluation result and the auxiliary training evaluation data;
and the flight simulation training management module is used for carrying out flight simulation training management on the training user through the comprehensive training evaluation result.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
CN202310049511.1A 2023-02-01 2023-02-01 Flight simulation training management method and system based on Internet of things Active CN116070816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310049511.1A CN116070816B (en) 2023-02-01 2023-02-01 Flight simulation training management method and system based on Internet of things

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310049511.1A CN116070816B (en) 2023-02-01 2023-02-01 Flight simulation training management method and system based on Internet of things

Publications (2)

Publication Number Publication Date
CN116070816A (en) 2023-05-05
CN116070816B (en) 2023-06-02

Family

ID=86183343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310049511.1A Active CN116070816B (en) 2023-02-01 2023-02-01 Flight simulation training management method and system based on Internet of things

Country Status (1)

Country Link
CN (1) CN116070816B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018095069A1 (en) * 2016-11-24 2018-05-31 同方威视技术股份有限公司 Simulated training apparatus, method and system for security check
CN106530894A (en) * 2017-01-10 2017-03-22 北京捷安申谋军工科技有限公司 Flight trainer virtual head-up display method through augmented reality technology and flight trainer virtual head-up display system thereof
US20200035116A1 (en) * 2018-07-26 2020-01-30 Honeywell International Inc. System and method for cloud-based flight management system familiarization training
DE102021105245A1 (en) * 2020-04-01 2021-10-07 Nvidia Corporation USING IMAGE AUGMENTATION WITH SIMULATED OBJECTS TO TRAIN MACHINE LEARNING MODELS IN AUTONOMOUS DRIVING APPLICATIONS
WO2022053080A2 (en) * 2020-09-10 2022-03-17 成都拟合未来科技有限公司 Training method and system based on multi-dimensional movement ability recognition, terminal, and medium
CN112785267A (en) * 2021-01-23 2021-05-11 南京利特嘉软件科技有限公司 Flight information management method and system based on MVC framework technology
CN115482397A (en) * 2021-06-15 2022-12-16 咪付(广西)网络技术有限公司 Action scoring system
CN114333485A (en) * 2021-09-18 2022-04-12 能科科技股份有限公司 Equipment online simulation debugging system based on Internet of things
CN114372941A (en) * 2021-12-16 2022-04-19 佳源科技股份有限公司 Low-illumination image enhancement method, device, equipment and medium
CN114618142A (en) * 2022-04-22 2022-06-14 吉林体育学院 Auxiliary training system and method for table tennis sports
CN115049839A (en) * 2022-08-15 2022-09-13 珠海翔翼航空技术有限公司 Quality detection method for objective quality test of flight simulation training equipment
CN115063274A (en) * 2022-08-18 2022-09-16 珠海翔翼航空技术有限公司 Virtual reality flight training scheme generation method based on object technology capability
CN115116296A (en) * 2022-08-25 2022-09-27 中国电子科技集团公司第十五研究所 Tower flight command simulation method and system based on digital twinning
CN115393818A (en) * 2022-09-06 2022-11-25 一汽解放汽车有限公司 Driving scene recognition method and device, computer equipment and storage medium
CN115269932A (en) * 2022-09-29 2022-11-01 江西联创精密机电有限公司 Training scoring method and device for simulation training equipment, storage medium and equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
薛焱 et al.: "Conception of a UAV *** Simulation Training Framework", 航空电子技术 (Avionics Technology), no. 01, pages 20-25 *
高煊 et al.: "Design and Implementation of a Scoring *** for Helicopter Training Simulators", 现代电子技术 (Modern Electronics Technique), no. 24, pages 64-66 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116503182A (en) * 2023-06-25 2023-07-28 凯泰铭科技(北京)有限公司 Method and device for dynamically collecting vehicle insurance person injury data based on rule engine
CN116503182B (en) * 2023-06-25 2023-09-01 凯泰铭科技(北京)有限公司 Method and device for dynamically collecting vehicle insurance person injury data based on rule engine

Also Published As

Publication number Publication date
CN116070816B (en) 2023-06-02

Similar Documents

Publication Publication Date Title
CN110896381B (en) Deep neural network-based traffic classification method and system and electronic equipment
US11605226B2 (en) Video data processing method and apparatus, and readable storage medium
CN104778173B (en) Target user determination method, device and equipment
WO2018059402A1 (en) Method and apparatus for determining fault type
CN105447147B (en) A kind of data processing method and device
CN116070816B (en) Flight simulation training management method and system based on Internet of things
CN110336838B (en) Account abnormity detection method, device, terminal and storage medium
CN111310057B (en) Online learning mining method and device, online learning system and server
CN107992937B (en) Unstructured data judgment method and device based on deep learning
US8204889B2 (en) System, method, and computer-readable medium for seeking representative images in image set
CN110909868A (en) Node representation method and device based on graph neural network model
CN115795329B (en) Power utilization abnormal behavior analysis method and device based on big data grid
CN115329204A (en) Cloud business service pushing method and pushing processing system based on big data mining
CN113268403A (en) Time series analysis and prediction method, device, equipment and storage medium
CN110852224B (en) Expression recognition method and related device
CN103473308A (en) High-dimensional multimedia data classifying method based on maximum margin tensor study
CN108073582B (en) Computing framework selection method and device
CN111382305B (en) Video deduplication method, video deduplication device, computer equipment and storage medium
CN112465565A (en) User portrait prediction method and device based on machine learning
CN113627464B (en) Image processing method, device, equipment and storage medium
CN111597444B (en) Searching method, searching device, server and storage medium
CN114449342A (en) Video recommendation method and device, computer readable storage medium and computer equipment
CN114510615A (en) Fine-grained encrypted website fingerprint classification method and device based on graph attention pooling network
CN115705706A (en) Video processing method, video processing device, computer equipment and storage medium
CN112508518A (en) RPA flow generation method combining RPA and AI, corresponding device and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant