CN114299519A - Auxiliary flight method based on XML format electronic flight manual - Google Patents

Auxiliary flight method based on XML format electronic flight manual

Info

Publication number
CN114299519A
CN114299519A (application CN202111622420.XA)
Authority
CN
China
Prior art keywords
glasses
manual
flight
xml
electronic flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111622420.XA
Other languages
Chinese (zh)
Inventor
李渊恒
温丽华
张丛远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Aircraft Customer Service Co ltd
Commercial Aircraft Corp of China Ltd
Original Assignee
Shanghai Aircraft Customer Service Co ltd
Commercial Aircraft Corp of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Aircraft Customer Service Co ltd, Commercial Aircraft Corp of China Ltd filed Critical Shanghai Aircraft Customer Service Co ltd
Priority to CN202111622420.XA
Publication of CN114299519A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an auxiliary flight method based on an XML-format electronic flight manual. The method comprises: structuring the original flight manual according to industry specifications to generate an XML-format electronic flight manual readable by AR glasses, and forming the contents that carry the same tag into a workflow according to the order of their steps in the manual; scanning the aircraft cockpit with a 3D scanner to generate three-dimensional models that form a material library, and matching the three-dimensional models with the tags in the XML-format electronic flight manual; the user selects content of the XML-format electronic flight manual through the AR glasses, thereby triggering the workflow related to the selected content, and the three-dimensional models are called in sequence according to the triggered workflow and compared with objects in the sight of the AR glasses so as to identify the operation object; once the operation object is successfully identified, the AR glasses display the operation guidance and warning information corresponding to the operation object to assist the user's operation and thereby assist the flight.

Description

Auxiliary flight method based on XML format electronic flight manual
Technical Field
The invention relates to the field of auxiliary flight of aircrafts, in particular to an auxiliary flight method based on an XML format electronic flight manual.
Background
The Extensible Markup Language (XML) is a general-purpose, internationally standardized markup language. As shown in fig. 1 and fig. 2, in the civil aviation field flight manuals are written from the manual manuscripts in XML according to the S1000D industry specification, and the completed XML file is published, according to customized publishing rules, into various forms such as PDF and web pages for users to view and to assist flight operations. Meanwhile, AR glasses can read the XML-format electronic flight manual and, through their display function, provide the flight operator with text knowledge or instructive instructions for the operating equipment.
However, in the current civil aviation field, flight manual XML data is only used to generate PDF-like or web-page-like text for the user to read, providing operation guidance only passively. A method for assisting flight operation based on Augmented Reality (AR) has been initially explored in China: after AR glasses read the XML-format electronic flight manual, corresponding instructions can be displayed automatically once the operating equipment has been scanned and identified. However, the user is still required to manually trigger the recognition function of the AR glasses, and the corresponding description pops up only after recognition succeeds.
The prior art therefore lacks an auxiliary flight method that can quickly, intuitively and actively display the operation information corresponding to the operating equipment, and consequently cannot reduce the manual operation error rate, shorten the fault-handling time, or effectively safeguard the safe flight of the aircraft.
Disclosure of Invention
Therefore, the technical problem to be solved by the invention is to overcome the inability of the prior art to quickly, intuitively and actively provide the user with the operation information corresponding to the operating equipment so as to assist the flight, and to provide a new auxiliary flight method based on the XML-format electronic flight manual. With this method, the equipment to be operated can be actively identified to help the user quickly locate the operation object, and the corresponding operation and warning information is actively displayed after the operating equipment is identified, so that the operation error rate can be reduced, the fault-handling time can be shortened, and the safe flight of the aircraft can be ensured.
Specifically, the present invention solves the above technical problems by the following technical solutions:
the invention provides an auxiliary flight method based on an XML format electronic flight manual, which is characterized by comprising the following steps:
structuring an original flight manual according to industry specifications to generate an XML-format electronic flight manual file which can be read by AR glasses, wherein each piece of content in the XML-format electronic flight manual file is given an independent tag and an assignable attribute, and the content carrying an operation-class tag is extracted to form a workflow according to the step sequence described in the electronic flight manual file;
scanning various devices and switches in an airplane cockpit by using a 3D scanner to generate a three-dimensional model material library, and matching three-dimensional models of the various devices and switches with tags in an XML format electronic flight manual file;
the method comprises the steps that a user selects the content of an electronic flight manual file in an XML format through AR glasses, so that a workflow related to the selected content is triggered, and a three-dimensional model is sequentially called according to the sequence in the triggered workflow to be compared with objects in the field of view of the AR glasses;
when the called three-dimensional model is consistent with the object in the visual field of the AR glasses, indicating that the operation object is successfully identified, the AR glasses display operation guide and warning information related to the operation object to assist the user in operation so as to assist the flight.
The auxiliary flight method based on the electronic flight manual in the XML format can generate text contents read by a user, such as PDF texts and webpage texts, can also be used as the input of an operation step program, and can directly convert XML tags into program codes and automatically call background matched three-dimensional models, so that the work of programming operation steps is avoided. In addition, the AR glasses can actively search the operating equipment according to the workflow, mark and display the corresponding operating equipment and the corresponding operating information in the sight of the user, and provide operating guidance for the user, so that the error rate is reduced, the query time is shortened, and the flight safety is effectively ensured.
According to an embodiment of the invention, the auxiliary flight method further comprises: associating a trigger event with the operation-class tags and the alarm-class tags of the XML-format electronic flight manual file, so that the trigger event fires after the user has successfully identified an operation object through the AR glasses and the content associated with the identified operation object is called; and reading the called content in the AR glasses. Through the trigger event, the information related to the operation object is read automatically, without the user having to call it manually, which greatly simplifies the process of obtaining the information.
According to another embodiment of the invention, the auxiliary flight method further comprises: the user manually retrieves the XML-format electronic flight manual file chapter by chapter by clicking a read-file button in the display interface of the AR glasses and reads the XML-format electronic flight manual.
According to another embodiment of the invention, the auxiliary flight method further comprises a verification step of verifying, in the AR glasses, whether the contents of the XML-format electronic flight manual meet the requirements: a verifier reads, through the AR glasses, all published XML-format electronic flight manual files awaiting confirmation, selects a specified manual file, and executes all the workflows according to the operation sequence recorded in that file so as to confirm whether all the instructions meet the standard requirements.
According to another embodiment of the invention, the AR glasses are configured with a processor that can be connected to a computer via a data line; the auxiliary flight method further comprises: the processor of the AR glasses receives the XML-format electronic flight manual file edited and updated on the computer and generates a new workflow.
According to another embodiment of the invention, the AR glasses are configured to be able to read the content in the XML-format electronic flight manual file corresponding to the identification class tag through the identification class tag.
According to another embodiment of the invention, the AR glasses are provided with a camera and configured to continuously scan a picture in the line of sight after triggering a workflow and display the scanned picture on the AR glasses; and simultaneously, comparing the scanned object with the called three-dimensional model, and directly marking the operation object in a picture displayed on the AR glasses after the operation object is successfully identified.
According to another embodiment of the present invention, the AR glasses are provided with a display screen for displaying operation guidance and warning information associated with an operation object to a user.
According to another embodiment of the invention, the AR glasses can send out operation guide and warning information associated with the operation object to the user in a voice mode.
According to another embodiment of the present invention, the AR glasses include a memory in which a three-dimensional model material library and an electronic flight manual document in XML format are stored.
According to another embodiment of the invention, the electronic flight manual file in the XML format comprises characters, pictures, audio, video and other multimedia materials inserted in the form of tags. The inserted picture can be called through the label of the electronic flight manual file in the XML format, the called picture is compared with the three-dimensional model, and when the object in the picture is consistent with the three-dimensional model, the electronic flight manual in the XML format is matched with the three-dimensional model.
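As an illustrative sketch only: the fragment below shows how a picture inserted through a tag might be referenced from the manual and called for comparison with the paired three-dimensional model, as described above. The "graphic" tag name, the "href" attribute and the file path are assumptions for demonstration and are not specified by the invention.

```python
# Minimal sketch (assumed tag names) of calling an inserted picture through a
# tag of the XML-format electronic flight manual for comparison with a 3D model.
import xml.etree.ElementTree as ET

STEP = """
<action actionid="A-05">
  <para>Landing gear lever - DOWN</para>
  <graphic href="media/landing_gear_lever.png"/>
</action>
"""

root = ET.fromstring(STEP)
picture = root.find("graphic").get("href")   # picture called through its tag
print("Compare", picture, "with the paired 3D model of the landing gear lever")
```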
On the basis of common knowledge in the field, the above preferred conditions can be combined arbitrarily to obtain preferred embodiments of the invention.
The auxiliary flight method based on the XML-format electronic flight manual according to the embodiments of the invention has the following beneficial technical effects and advantages:
According to the auxiliary flight method based on the XML-format electronic flight manual, the original flight manual is structured to generate an XML-format electronic flight manual readable by AR glasses, and the contents of the manual are then formed into a workflow according to the tags and the step sequence in the XML-format electronic flight manual, so that the XML tags can be converted directly into program code and the work of programming the operation steps is avoided. Moreover, by matching the XML tags with the three-dimensional models, the matched three-dimensional model in the background can be called automatically when execution of the workflow is triggered. In addition, the AR glasses can actively search for the operation object according to the workflow, recognize and mark it in the user's sight, and intuitively and actively provide operation guidance for the user, thereby reducing the error rate and shortening the query time. Therefore, the AR glasses can quickly, intuitively and actively display the operation information corresponding to the operating equipment, realizing auxiliary flight of the aircraft and effectively guaranteeing flight safety.
Drawings
Fig. 1 is a schematic frame diagram of generating PDF or web page display using XML in the prior art.
Fig. 2 is a schematic diagram of generating a PDF document using an airline manual XML file in the prior art.
Fig. 3 is a flowchart of an auxiliary flight method based on an electronic flight manual in XML format according to a preferred embodiment of the present invention.
Fig. 4 is a schematic diagram illustrating a step of generating an electronic flight manual in XML format according to the auxiliary flight method shown in fig. 3.
Fig. 5 is a schematic diagram of display information of the AR glasses after the operational object is successfully identified according to the auxiliary flight method shown in fig. 3.
Fig. 6 is a frame diagram of a flight assistance system employing the method of assisting flight according to the preferred embodiment of the present invention.
FIG. 7 is a flowchart illustrating user operation steps for the AR electronic flight manual application service system of the flight assistance system of FIG. 6.
Detailed Description
The following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings, is intended to be illustrative, and not restrictive, and it is intended that all such modifications and equivalents be included within the scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
The terms "comprising," "including," "having," and the like in the description and claims of this application and in the description of the above figures are open-ended terms. Thus, a method or apparatus that "comprises," "has," such as one or more steps or elements, has one or more steps or elements, but is not limited to having only those one or more elements. In the following detailed description, numerical terms, such as "a," "an," "the," etc., are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any particular order among or between indicated features, and are used for exemplary purposes and not by way of limitation.
XML is a general-purpose, internationally standardized markup language. In the aviation field, flight manuals are written in XML according to the S1000D industry standard, and the completed XML file is published into various forms such as PDF and web pages according to customized publishing rules. The structure, format, content screening and other characteristics of the whole flight manual can therefore be changed simply by modifying the publishing rules in the background. In this way, the same set of flight manual XML source data can, by changing the publishing rules, meet the customization requirements of different airlines and of different workers such as users, maintenance personnel and performance-calculation personnel. In addition, through programming, various kinds of hardware can run the XML-format electronic flight manual, opening up rich application methods.
However, in the current civil aviation field, flight manual XML data is only used to generate PDF or web-page texts for users to read, providing operation instructions passively, and setting up the operation steps requires manual programming. AR-based auxiliary equipment for flight operation, which has been initially explored in China, can provide the user through AR glasses with text knowledge or instructive instructions for identified objects, but it still requires the user to trigger the recognition function manually. In addition, on the basis of the existing equipment, no academic papers or patents have been published concerning material collection, database construction or verification for such an auxiliary flight method.
To address these problems, the invention provides a novel auxiliary flight method that is based on an XML-format electronic flight manual and combines it with AR technology: a workflow is generated automatically from the XML-format electronic flight manual, and the AR glasses actively call the three-dimensional models according to the workflow, search for and identify the operation object, and automatically display the operation guidance and warning information related to the operation object once identification succeeds. The contents of the flight operation manual are converted into the XML-format electronic flight manual, and AR flight-control intelligent auxiliary equipment realizes theoretical knowledge query and interactive program execution for the user, thereby assisting flight.
The auxiliary flight method based on the electronic flight manual with the XML format disclosed by the invention not only can directly display and mark the operation object in the sight of people through the AR glasses, but also can actively provide related operation guide and warning information after identifying the operation object, so that a user does not need to read and understand related description and then operate. In addition, the method can automatically identify alarm words when the airplane gives a fault alarm and give corresponding operation programs and background knowledge, so that a user is assisted to quickly position an operation object without looking up a manual.
Therefore, the auxiliary flight method disclosed by the invention can quickly, visually and actively display the operation information corresponding to the operation equipment, reduce the manual operation error rate and shorten the fault handling time, realize a stronger auxiliary function and obtain higher safety. Hereinafter, the auxiliary flight method based on the electronic flight manual in XML format according to the present invention will be described in detail with reference to the accompanying drawings.
As shown in fig. 3, the auxiliary flight method based on the electronic flight manual in XML format provided by the present invention substantially includes the following steps:
s1: generating an electronic flight manual in an XML format;
s2: forming a workflow;
s3: generating a three-dimensional model of the cockpit;
s4: matching the three-dimensional model with the XML tag;
s5: selecting specified content and triggering a workflow;
s6: identifying an operation object;
s7: and displaying operation guide and warning information.
Specifically, the original flight manual is firstly structured according to the industry standard to generate an XML format electronic flight manual capable of being read by AR glasses, and then the contents with the same label in the XML format electronic flight manual are formed into a workflow according to the step sequence of the contents in the manual. And then scanning the aircraft cockpit by using a 3D scanner to generate a three-dimensional model so as to form a material library, and matching the three-dimensional model with a tag (hereinafter referred to as an XML tag) in an XML format electronic flight manual. The user then selects the contents of the electronic flight manual in XML format via AR glasses, thereby triggering a workflow related to the selected contents. Then the AR glasses call the three-dimensional models in sequence according to the sequence in the triggered workflow to compare with the objects in the sight of the AR glasses, and accordingly the operation objects are identified. And finally, when the called three-dimensional model is consistent with the object in the sight of the AR glasses, the operation object is successfully identified, and the AR glasses display operation guide and warning information corresponding to the operation object to assist the user in operating to assist the flight.
The XML-format electronic flight manual can be produced by manual-authoring personnel according to the manual requirements; each piece of content is given an independent tag and an assignable attribute, and the tags can be read by the AR glasses. Illustratively, the flight manual manuscript in Word or PDF format is structured and transformed into an XML-format electronic flight manual according to the S1000D industry standard, and the generated XML-format electronic flight manual defines various tags for its content in accordance with that standard.
As shown in fig. 4, in the generated XML-format electronic flight manual the content in the "title" tag is a title, the content in the "para" tag is text, and the "listItem" tag is a definable text list; each piece of content in the XML-format electronic flight manual can be programmed, retrieved and called through its XML tag. By compiling the running software, the AR glasses can read the contents of the XML-format electronic flight manual.
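A minimal sketch of this tag structure, using only the tag names quoted above ("title", "para", "listItem") and the Python standard library; the root element and the step wording are assumptions made for illustration, not content of the actual manual.

```python
# Illustrative only: parse a small S1000D-style fragment and read its tags.
import xml.etree.ElementTree as ET

SAMPLE = """
<procedure>
  <title>Engine Fire On Ground</title>
  <para>Confirm the affected engine before shutting it down.</para>
  <listItem>Thrust lever (affected side) - IDLE</listItem>
  <listItem>Fuel control switch (affected side) - CUTOFF</listItem>
</procedure>
"""

root = ET.fromstring(SAMPLE)
print("Title:", root.findtext("title"))   # content of the "title" tag
for item in root.iter("listItem"):        # items of the definable text list
    print("Step text:", item.text)
```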
Preferably, the AR glasses run configured software that follows the various tags in the manual (similar to a driver on a computer). Illustratively, the content in the "description" tag is displayed in the prompt bar of the AR glasses via their built-in display. Preferably, the AR glasses further include a camera and a built-in display for identifying objects, and can be connected by wire to a host device in which a processor, a memory and a battery are integrated.
Besides characters, multimedia materials such as pictures, audio and video can be inserted into the electronic flight manual in the XML format in the form of tags. When the AR glasses run the program formed by the XML tags, the multimedia materials can be called in the database according to the tags. Illustratively, the XML-format electronic flight manual and the database package are stored in a memory chip of the AR glasses.
Further, the AR glasses can form the content with the same tag in the XML format electronic flight manual into a workflow according to the step sequence in the manual through configured software. Preferably, the AR glasses learn the operation objects and the operation sequences corresponding to the operation objects according to the "action" tags in the XML-format electronic flight manual, and extract the contents in each "action" tag to form the workflow.
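The following sketch illustrates how glasses-side software might extract every "action"-tagged step in document order to form such a workflow. Only the "action" tag name comes from the manual description; the "actionid" attribute value format and the step wording are assumptions.

```python
# Minimal sketch: build a workflow from the "action" tags, in document order.
import xml.etree.ElementTree as ET

MANUAL = """
<task>
  <action actionid="A-01">Battery switch - ON</action>
  <action actionid="A-02">APU master switch - ON</action>
  <action actionid="A-03">APU start switch - PRESS</action>
</task>
"""

def build_workflow(xml_text):
    root = ET.fromstring(xml_text)
    # Steps keep the order in which they appear in the manual file.
    return [(a.get("actionid"), a.text.strip()) for a in root.iter("action")]

for step_id, instruction in build_workflow(MANUAL):
    print(step_id, "->", instruction)
```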
Moreover, the AR electronic flight manual exists as structured data, which can be modified with XML file editing software on the computer side or in the AR editor. An XML-format electronic flight manual modified on the computer side can be copied through a data line into the memory of the AR glasses, and the software running on the processor of the AR glasses reads the XML file and then generates a new workflow. Therefore, when the operation content is updated, it only needs to be modified on the basis of the previous version rather than created from scratch.
The AR glasses can then call the paired three-dimensional models in the order of the workflow and compare them with objects in the line of sight of the AR glasses. The three-dimensional models are obtained by scanning the various devices and switches in the aircraft cockpit with the 3D scanner, collecting the materials required for AR identification and storing them individually as three-dimensional models to form a material library. The different three-dimensional models are paired individually with the tags in the XML-format electronic flight manual. Illustratively, a three-dimensional model is paired with the "actionid" tag used to describe an operation step in the XML-format electronic flight manual, and the pairing succeeds when the three-dimensional model is consistent with the picture in that "actionid" tag.
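A hedged sketch of the pairing idea: each manual step id is mapped to an entry of the scanned model material library. The file names, keys and the dictionary-based pairing are assumptions for demonstration; the patent only states that each tag is matched with a model in the library.

```python
# Illustrative only: pair manual "actionid" values with 3D models in the library.
MODEL_LIBRARY = {
    "battery_switch": "models/battery_switch.obj",
    "apu_master_switch": "models/apu_master_switch.obj",
}

# Assumed mapping from manual step id to the model key recognised by the glasses.
TAG_TO_MODEL = {
    "A-01": "battery_switch",
    "A-02": "apu_master_switch",
}

def model_for_step(action_id):
    """Return the 3D-model file paired with a manual step, or None if unpaired."""
    key = TAG_TO_MODEL.get(action_id)
    return MODEL_LIBRARY.get(key) if key else None

print(model_for_step("A-01"))  # models/battery_switch.obj
```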
Then, the user selects the content of the electronic flight manual in the XML format through the AR glasses, so that a workflow related to the selected content is triggered, the three-dimensional models are sequentially called according to the sequence in the triggered workflow to be compared with the objects in the sight line of the AR glasses, and therefore the operation objects are identified according to the operation sequence written in the electronic flight manual in the XML format. Illustratively, the user can manually retrieve and read by chapter as required by himself through a "read file" module configured on the AR glasses, thereby presenting a complete manual page.
Specifically, when a workflow related to the selected content is triggered, the camera of the AR glasses continuously scans the pictures in the line of sight in the cockpit and compares them with the called three-dimensional models one by one. When the called three-dimensional model is consistent with an object in the sight of the AR glasses, the operation object has been identified successfully and the AR glasses have located the equipment that needs to be operated; the AR glasses then display the operation guidance and warning information corresponding to the operation object to assist the user's operation and thereby assist the flight.
Furthermore, through the running software configured on the AR glasses, a "trigger event" can be associated with a tag of the XML-format electronic flight manual; the trigger event is started after the AR glasses succeed in identification, so that text is called from the XML-format electronic flight manual or other multimedia files are called from the database. Thus, once an operation object on the aircraft has been successfully identified by the AR glasses, the AR glasses can display the corresponding text and picture descriptions.
Illustratively, each "action" tag or "EICAS" tag of the XML-format electronic flight manual is associated with a "trigger event". When the AR glasses succeed in identification, the trigger event fires and the AR glasses call, from the XML-format electronic flight manual or the database, the text or pictures of the operation guidance or warning information about the operation object, and then display this operation guidance and warning information to assist the user's operation and thereby assist the flight.
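A minimal sketch of the trigger-event idea described above: a callback registered for a step fires once the paired object has been recognised and returns the guidance/warning text for display. The registration mechanism, function names and message text are assumptions, not part of the patent.

```python
# Illustrative only: bind a "trigger event" to a manual step and fire it
# when recognition of the paired operation object succeeds.
TRIGGERS = {}

def on_recognised(action_id):
    """Register a trigger event for a manual step id (assumed mechanism)."""
    def register(callback):
        TRIGGERS[action_id] = callback
        return callback
    return register

@on_recognised("A-02")
def show_apu_guidance():
    return "Guidance: APU master switch - ON. Warning: check battery state first."

def handle_recognition(action_id):
    callback = TRIGGERS.get(action_id)
    if callback:
        print(callback())   # shown in the AR glasses prompt bar or read by voice

handle_recognition("A-02")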
Fig. 5 shows an actual screen seen in the AR glasses: after the operation object in the line of sight has been recognized, a "recognition success" box is displayed and disappears after one second. After successful recognition, the operation-object calibration frame and the operation-guidance-and-warning prompt bar remain displayed until the operation is completed. Illustratively, after the operation is completed, the AR glasses automatically recognize the state of the operation object and proceed to the next step of the workflow. In another example, the user can enter the next step of the workflow by pressing a button on the AR glasses. Preferably, the "recognition success" box is cyan while the operation-target calibration box and the "operation guidance and warning information" prompt box are white, so as to visually distinguish the contents represented by the different boxes.
In addition, a verifier can verify through the AR glasses whether the generated XML-format electronic flight manual meets the requirements. After the verifier puts on the AR glasses and logs in, all published XML-format electronic flight manuals awaiting confirmation can be seen. Upon selection of the specified content, the AR glasses begin executing the workflow formed by the XML tags in the order of the steps written in the manual. At each step of the workflow, once the operation object is successfully identified, the AR glasses present the corresponding operation guidance and warning information to assist the user's operation.
The verifier can then confirm whether all indication items and information comply with the specifications by viewing the operation guidance and warning information displayed by the AR glasses. If an indication item or a piece of information has a problem, it is returned to the AR editor or the computer-side XML file editing software for re-modification and updating, after which verification is repeated until the XML-format electronic flight manual is confirmed to meet the requirements. In addition, the software running on the processor of the AR glasses can read the new workflow generated from the modified XML-format electronic flight manual, so when the operation content is updated it only needs to be modified on the basis of the previous version rather than created from scratch.
Fig. 6 shows a flight assistance system that can adopt the auxiliary flight method in the preferred embodiment of the present invention, which includes an XML manual material unit, an identification material collection unit, a manual editing module, a management server unit, and an AR glasses terminal. The manual editing module can receive data from the XML manual material unit to generate an XML format electronic flight manual, can receive the three-dimensional model from the identification material acquisition unit to complete the matching of the manual label and the three-dimensional model, and uploads the manual and the three-dimensional model to the AR glasses terminal through the management server. The AR glasses terminal generates a workflow by analyzing the XML format electronic flight manual, and calls the paired three-dimensional model to identify the operation object when the workflow is executed, so that operation guide and warning information about the operation object are displayed when the identification is successful.
The XML manual material unit comprises a traditional XML electronic manual, an XML analysis module, manual structured data and manual multimedia contents. The XML analysis module converts the input XML code of the traditional XML electronic manual into tag-form manual structured data that can be read by a compiler, and the manual structured data and manual multimedia contents are then transmitted to the manual editing module through a data interface to generate the XML-format electronic flight manual.
The manual editing module comprises an editing interface, a manual multimedia material library and a tool for matching identification materials with manual contents, and can receive manual structured data and manual multimedia contents from XML manual material units and three-dimensional models from identification material acquisition units. And after receiving the manual structured data and the manual multimedia contents, the manual editing module can perform operations such as selecting, sequencing and the like on the manual XML structured data through an editing interface, and insert the manual multimedia contents to form the XML-format electronic flight manual.
And then, the manual editing module matches the tags in the generated XML format electronic flight manual with the corresponding three-dimensional models through a content matching tool in the identification material and manual to obtain the matched three-dimensional models. Subsequently, the manual editing module transmits the generated XML-format electronic flight manual and the paired three-dimensional model to the AR glasses terminal through the management server. The management server determines which version of the manual and the three-dimensional model are transmitted to the AR glasses terminal through the manual version changing management and distribution module.
The AR glasses terminal comprises a manual analysis module, an identification material library module, a workflow management and operation module, an object identification module, a human-computer interaction module, a visual display module and a video recording module. The manual analysis module can be used for decoding the tags in the XML file, generating a workflow executed according to the operation steps of the manual, and managing the operation of the glasses through the workflow management and operation module.
In the process of executing the workflow, the AR glasses terminal scans through the object identification module when identifying an object, compares the scanning result with the three-dimensional models stored in the identification material library, and outputs the identification result. When identification succeeds, the workflow management and operation module calls the corresponding manual page, multimedia materials and handling procedure according to the identification result, so that the operation guidance and warning information about the operation object can be displayed to assist flight.
In the process of executing the workflow, the AR glasses terminal can be further used for receiving and feeding back various operations of the user on the glasses through the man-machine interaction module and the visual display module, and then the workflow management and operation module calls corresponding manual pages, multimedia materials and treatment programs according to the operations of the user, so that operation guidance and warning information related to an operation object can be displayed to assist flight.
In addition, the AR glasses terminal can record and store video of the usage process through the video recording module and upload it to the management server. The video recording storage module of the management server stores each uploaded video, and a user can call up the stored video for playback, improving flight control capability by summarizing experience.
Preferably, the manual application on the AR glasses terminal is the execution platform for manual operation and display. To facilitate updating and expanding the manual content, the manual-content production module and the AR execution application are loosely coupled, i.e.: the content of the manual is often modified according to the requirements of the airline, and new tags or attributes are added to the XML file, but these modifications do not affect the operation of the AR software. The AR glasses terminal and the manual editing module are constrained by a standard protocol specification and interface through the manual analysis module (similar to a decoder used for internet access).
That is, the XML-format electronic flight manual itself forms all of the manual's execution logic flow, and the running software of the AR glasses is not responsible for managing the logic flow of any single manual (as with playing a movie on a computer: the text and pictures in the XML file correspond to the movie's video file, the program formed by the various tags in the XML-format electronic flight manual corresponds to the video-playing software, and the running software of the AR glasses corresponds to the video driver installed on the computer). This architecture lets a manual content package be used as an independent content plug-in, which greatly improves the extensibility of the framework logic (just as one computer can be installed with several pieces of software, each providing different functions).
Illustratively, the flight auxiliary system further comprises a flight manual content management platform, which can complete the process work orders of verification, review and the like of manual distribution according to the manual state, can be distributed to a person in charge of a specific flight program to execute corresponding manual proofreading and review tasks, and can be used by a user only after the manual is finalized after the review is completed.
In specific use, flight personnel can, after logging in, retrieve all the manuals distributed to them and execute a flight procedure by clicking it. Reading an "action" tag during program execution indicates that a switch needs to be operated, and the system triggers the object-identification function. The camera of the AR glasses continuously scans the pictures in the line of sight and compares them with the pre-paired three-dimensional models to realize object recognition; successful recognition indicates that the switch to be operated has been located, and the AR glasses then prompt the operation action in text or voice form.
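A heavily hedged sketch of that loop: while an "action" step is active, incoming frames are compared against the pre-paired model until a match is found, and the operation prompt is then issued. The frame source and the matching test below are placeholders standing in for the camera stream and the real model-comparison algorithm; none of these names come from the patent.

```python
# Illustrative only: locate the paired object in successive frames, then prompt.
def frames_from_camera():
    # Placeholder generator standing in for the AR-glasses camera stream.
    yield from ["frame_with_other_panel", "frame_with_apu_master_switch"]

def matches(frame, model_key):
    # Placeholder for 3D-model comparison against the scanned material library.
    return model_key in frame

def locate_and_prompt(model_key, prompt):
    for frame in frames_from_camera():
        if matches(frame, model_key):
            print("Recognition success - target marked in view")
            print(prompt)            # text prompt; could equally be spoken
            return True
    return False

locate_and_prompt("apu_master_switch", "Operate: APU master switch - ON")
```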
FIG. 7 illustrates a flowchart of the specific operation steps for flight-operation training using the auxiliary flight method of FIG. 3 and the flight assistance system of FIG. 6. As shown in fig. 7, the user first puts on the AR glasses and opens the application, then logs in to an account, loads the original flight manual, selects information such as the designated aircraft model and airframe number, and imports the task content package to start the training user's flight operation.
Then, according to the imported task content package, the user selects and loads the service matching library, loads the XML-format flight manual content package and initializes the manual content. After the manual is initialized, the user selects and loads the object-identification matching library, the UI instructs an identification scan of the flight simulator, and the AR glasses begin to identify and position the flight simulator. When matching of the identified positioning coordinates is complete, the system enters the event-monitoring state.
When the system receives a designated operation input by the user via cursor, gesture, key or remote controller, it can either browse the corresponding material content or proceed directly to the next operation step. If an interactive logic judgment is encountered, the user is asked to make the current interactive selection, and the system then jumps to the content of the corresponding operation object according to the user's choice.
Illustratively, the content of an interactive logic judgment is written in a "choose" tag in the XML file. In the fuel-leak workflow, for example, the user is required to judge whether the remaining fuel quantity is enough to fly to the destination airport; if yes, the program enters the branch for continuing the flight and eliminating the fault, and if not, it enters the branch for an emergency landing.
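The sketch below follows that fuel-leak example. Only the "choose" tag name comes from the text; the internal "branch" elements, attributes and wording are assumptions used to show how such a judgment could drive the jump to the next content.

```python
# Illustrative only: branch the workflow according to the user's answer
# to the judgment carried by a "choose" tag.
import xml.etree.ElementTree as ET

CHOOSE = """
<choose question="Is the remaining fuel enough to reach the destination airport?">
  <branch answer="yes">Continue the flight and run the leak-isolation procedure.</branch>
  <branch answer="no">Divert and prepare for an emergency landing.</branch>
</choose>
"""

def next_step(xml_text, user_answer):
    root = ET.fromstring(xml_text)
    print(root.get("question"))
    for branch in root.findall("branch"):
        if branch.get("answer") == user_answer:
            return branch.text
    return None

print(next_step(CHOOSE, "yes"))
```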
And finally, after the training task is completed, the user can upload operation state statistics and upload video monitoring of the operation process through the system so as to further improve the flight operation level.
The flight manual adopted by the auxiliary flight method disclosed in the embodiments of the application has a complete XML file data package, which can be used to generate text content (PDF, web pages) read by the user and can also serve as the input of an operation-step program: the XML tags are converted directly into program code and the matched background three-dimensional models are called automatically, so the work of programming operation steps is avoided. Operation guidance and warnings can also be actively provided to the user, namely: the method actively searches for the operation target object according to the logical sequence of the flight operation, marks it with a red frame in the user's sight, and provides operation guidance, thereby reducing the error rate and shortening the query time.
The beneficial technical effects of the above embodiment of the invention are as follows:
the workflow of the operation steps can be automatically generated according to the operation step sequence of the flight manual, additional programming is not needed, and the operation flow is simple and convenient.
The AR glasses can directly mark the operation object in the user's sight; unlike a book or a reader, the user does not have to read and understand the related description before determining the operation object, so the action to be performed is more intuitive.
The method can quickly give relevant knowledge when a user operates a certain switch, automatically recognize alarm words when the airplane gives a fault alarm, and give corresponding operation programs and background knowledge without manually reviewing a manual, thereby assisting the user in quickly positioning a switch to be operated next.
The auxiliary flight method can rapidly, visually and actively display the operation information corresponding to the operation equipment, so that the manual operation error rate can be reduced, the fault handling time can be shortened, and a stronger auxiliary function and higher flight safety can be realized.
While specific embodiments of the invention have been described above, it will be appreciated by those skilled in the art that these are by way of example only, and that the scope of the invention is defined by the appended claims. Various changes and modifications to these embodiments may be made by those skilled in the art without departing from the spirit and scope of the invention, and these changes and modifications are within the scope of the invention.

Claims (11)

1. An auxiliary flight method based on an electronic flight manual file with an XML format is characterized by comprising the following steps:
structuring an original flight manual according to industry specifications to generate an XML-format electronic flight manual file which can be read by AR glasses, wherein each piece of content in the XML-format electronic flight manual file is endowed with an independent tag and an attribute which can be assigned, extracting the content with an operation class tag and forming a workflow according to the sequence of steps described in the electronic flight manual file;
scanning various devices and switches in an airplane cockpit by using a 3D scanner to generate a three-dimensional model material library, wherein three-dimensional models of the various devices and switches are matched with tags in the XML format electronic flight manual file;
the user selects the content of the XML format electronic flight manual file through the AR glasses, so that a workflow related to the selected content is triggered, and the three-dimensional model is sequentially called to be compared with the objects in the field of view of the AR glasses according to the sequence in the triggered workflow;
when the called three-dimensional model is consistent with the object in the visual field of the AR glasses, indicating that the operation object is successfully identified, the AR glasses display operation guide and warning information related to the operation object to assist the user operation to assist the flight.
2. The auxiliary flight method as claimed in claim 1, further comprising: associating a trigger event with an operation-class tag and an alarm-class tag of the XML-format electronic flight manual file, so that the trigger event is triggered after the user successfully identifies the operation object through the AR glasses and the content associated with the operation object is called; and reading the called content in the AR glasses.
3. The auxiliary flight method as claimed in claim 1, further comprising: manually retrieving the XML-format electronic flight manual file chapter by chapter by the user clicking a read-file button in the display interface of the AR glasses, and reading the XML-format electronic flight manual.
4. The auxiliary flight method as claimed in claim 1, further comprising: a verification step of verifying in the AR glasses whether the contents of the XML-format electronic flight manual meet the requirements, wherein a verifier reads, through the AR glasses, all published XML-format electronic flight manual files to be confirmed and, after selecting a specified manual file, executes all workflows according to the operation sequence recorded in the specified manual file so as to confirm whether all instructions meet the standard requirements.
5. The auxiliary flight method of claim 1, wherein the AR glasses are configured with a processor configured to be connectable to a computer via a data line; the auxiliary flight method further comprises: the processor of the AR glasses receiving the XML-format electronic flight manual file edited and updated by the computer and generating a new workflow.
6. The auxiliary flight method of claim 1, wherein the AR glasses are configured to be able to read, through an identification-class tag, the content in the XML-format electronic flight manual file corresponding to that identification-class tag.
7. The auxiliary flight method according to claim 1, wherein the AR glasses are provided with a camera and configured to continuously scan pictures in the line of sight after triggering the workflow and display the scanned pictures on the AR glasses; and simultaneously to compare the scanned object with the called three-dimensional model, the operation object being marked directly in the picture displayed on the AR glasses after it is successfully identified.
8. The auxiliary flight method according to claim 1, wherein the AR glasses are provided with a display screen for displaying the operation guidance and warning information associated with the operation object to the user.
9. The auxiliary flight method according to claim 1, wherein the AR glasses are capable of issuing the operation guidance and warning information associated with the operation object to the user by voice.
10. The auxiliary flight method of claim 1, wherein the AR glasses include a memory in which the three-dimensional model material library and the XML-format electronic flight manual file are saved.
11. The auxiliary flight method according to claim 1, wherein the XML-format electronic flight manual file contains text and multimedia materials such as pictures, audio and video inserted in the form of tags.
CN202111622420.XA 2021-12-28 2021-12-28 Auxiliary flight method based on XML format electronic flight manual Pending CN114299519A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111622420.XA CN114299519A (en) 2021-12-28 2021-12-28 Auxiliary flight method based on XML format electronic flight manual

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111622420.XA CN114299519A (en) 2021-12-28 2021-12-28 Auxiliary flight method based on XML format electronic flight manual

Publications (1)

Publication Number Publication Date
CN114299519A true CN114299519A (en) 2022-04-08

Family

ID=80969587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111622420.XA Pending CN114299519A (en) 2021-12-28 2021-12-28 Auxiliary flight method based on XML format electronic flight manual

Country Status (1)

Country Link
CN (1) CN114299519A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117291399A (en) * 2023-11-24 2023-12-26 中航材利顿航空科技股份有限公司 Visual chemical card data processing method and system based on XML
CN117291399B (en) * 2023-11-24 2024-01-26 中航材利顿航空科技股份有限公司 Visual chemical card data processing method and system based on XML

Similar Documents

Publication Publication Date Title
US10810365B2 (en) Workflow system and method for creating, distributing and publishing content
US10127215B2 (en) Systems and methods for displaying contextual revision history in an electronic document
US20070098263A1 (en) Data entry apparatus and program therefor
US20130262968A1 (en) Apparatus and method for efficiently reviewing patent documents
US20130179761A1 (en) Systems and methods for creating, editing and publishing cross-platform interactive electronic works
US11175934B2 (en) Method of defining and performing dynamic user-computer interaction, computer guided navigation, and application integration for any procedure, instructions, instructional manual, or fillable form
US20170371855A1 (en) Collecting and auditing structured data layered on unstructured objects
US20070234201A1 (en) Information Management Device
CN113391871A (en) RPA element intelligent fusion picking method and system
CN101490668A (en) Reuse of available source data and localizations
EP3103002B1 (en) Batch generation of links to documents based on document name and page content matching
CN111259202A (en) Document structured data embedding method and system
US20210271886A1 (en) System and method for capturing, indexing and extracting digital workflow from videos using artificial intelligence
CN114254158A (en) Video generation method and device, and neural network training method and device
CN114299519A (en) Auxiliary flight method based on XML format electronic flight manual
US11104454B2 (en) System and method for converting technical manuals for augmented reality
US20070220439A1 (en) Information Management Device
NL2025739B1 (en) Artificial intelligence and augmented reality system and method
JP2006276912A (en) Device, method, and program for editing document
CN106489110A (en) Graphic user interface for non-hierarchical file system
JP6577392B2 (en) Program development support device, program development support method, and computer program executable by program development support device
US20230392935A1 (en) Method for controlling dissemination of instructional content to operators performing procedures within a facility
CN117829128A (en) Intelligent manufacturing standard extraction system
Verhagen et al. The CLAMS Platform at Work: Processing Audiovisual Data from the American Archive of Public Broadcasting
AU2022254081A1 (en) Processing video for enhanced, interactive end user experience

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination