CN112734945B - Assembly guiding method, system and application based on augmented reality - Google Patents


Info

Publication number
CN112734945B
CN112734945B (application CN202110337423.2A)
Authority
CN
China
Prior art keywords: assembly, information, assembled, parts, dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110337423.2A
Other languages
Chinese (zh)
Other versions
CN112734945A (en)
Inventor
莫威
朱建
吴京京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Smartstate Technology Co ltd
Original Assignee
Shanghai Smartstate Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Smartstate Technology Co ltd
Priority to CN202110337423.2A
Publication of CN112734945A
Application granted
Publication of CN112734945B
Legal status: Active
Anticipated expiration

Classifications

    • G06T19/003 Navigation within 3D models or images
    • G06F30/17 Mechanical parametric or variational design
    • G06F9/44526 Plug-ins; Add-ons
    • G06Q50/04 Manufacturing
    • G06T19/006 Mixed reality
    • G06T7/13 Edge detection
    • G06T7/33 Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/30216 Redeye defect
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Computer Graphics (AREA)
  • Remote Sensing (AREA)
  • Economics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Analysis (AREA)
  • Manufacturing & Machinery (AREA)
  • Computational Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an assembly guiding method, system and application based on augmented reality. The assembly guiding method comprises the following steps: parsing the contour information and assembly information of the part models into structured data; identifying and capturing the pose information of all parts, providing the AR scene with a three-dimensional tracking registration matrix of the parts, and dynamically tracking and positioning the parts in the real scene to build the augmented reality scene; extracting the three-dimensional edge-contour feature information of the parts in real time and distinguishing, in the current state, the basic part of the body to be assembled from the remaining parts; highlighting guide information at the position on the basic part where assembly is to occur; marking the part to be assembled with a highlighted envelope body; assembling while dynamically and visually judging the assembly state, thereby fusing the virtual and the real; and repeating these steps until all parts are installed. The invention improves the usability of the guiding process and the positioning accuracy of assembly, avoids wrong and missed assembly, and is applicable to assembly guidance of bodies to be assembled with either simple or complex structures.

Description

Assembly guiding method, system and application based on augmented reality
Technical Field
The invention relates to the technical field of industrial assembly, and in particular to an assembly guiding method, system and application based on augmented reality.
Background
Assembly refers to the process of joining several parts into a component, or joining several parts and components into a product, according to technical requirements, and then debugging, inspecting and testing the result until it is a qualified product.
In the traditional assembly process, a designer hands a paper assembly drawing and the assembly requirements to a field operator. The operator needs time to understand the drawing, and both the speed and accuracy of that understanding depend on experience. Supervision is lacking during assembly, so the risk of assembly errors is difficult to control, and communication between operator and designer is limited. A digital assembly guiding device and method are therefore urgently needed to convey the design-end information accurately and intuitively, especially for products with complex structures, high assembly difficulty and strict assembly requirements, so as to solve the problems of low assembly efficiency and unstable assembly quality.
Meanwhile, although many assembly methods based on augmented reality technology already exist, most of them suffer from high cost and low assembly-detection accuracy.
Therefore, an assembly guiding method needs to be designed to solve the problems of low assembly efficiency and unstable assembly quality.
Disclosure of Invention
The invention aims to provide an assembly guiding method based on augmented reality with fast capture, high matching accuracy, high detection precision and easy adoption, enabling efficient, high-quality assembly guidance of a body to be assembled. An augmented-reality-based system is also provided, which is suitable for assembly guidance of bodies to be assembled with simple and, in particular, complex structures.
The technical scheme for realizing the purpose of the invention is as follows:
the invention provides an assembly guiding method based on augmented reality, which comprises the following steps:
s1, acquiring and storing the part model outline information and the assembly information of the to-be-assembled body, and analyzing the part model outline information and the assembly information into structured data; the method for acquiring the structured data comprises the following steps: designing and developing an assembly process analysis plug-in, reading an assembly process file and assembly information of a to-be-assembled body, obtaining part model outline information and assembly information, and analyzing the part model outline information and the assembly information into structured data; the assembly information comprises assembly sequence, assembly process requirements, assembly guide information and assembly marking information;
s2, identifying and capturing pose information of all parts to be assembled on the operating platform according to the camera, providing a three-dimensional tracking registration matrix of the parts for an AR scene, and dynamically tracking and positioning the parts in a real scene to build an augmented reality scene, wherein the pose information comprises part model contour information and position information;
s3, acquiring the current position image of the part in real time according to the camera, extracting the three-dimensional edge contour feature information of the part, matching the three-dimensional edge contour feature information with the part contour information stored in S1, and distinguishing the basic part and the other parts of the to-be-assembled body in the current state;
s4, matching the contour information with data and assembly information, combining the three-dimensional tracking registration matrix and the current position image, and highlighting assembly guide information at the position to be assembled of the basic part;
s5, carrying out envelope body highlighting marking on parts to be assembled in the rest parts on the operating table, and guiding an operator to grab an object for assembly, wherein the envelope body is generated by utilizing a voxelization mode based on three-dimensional edge contour characteristic information so as to establish a collision body of the parts to be assembled for highlighting display and assembly detection;
s6, in the assembling process of the step S5, the assembling condition is judged through dynamic visual recognition, and virtual-real fusion is achieved;
and S7, repeating the steps S3 to S6, and guiding and assembling all parts until the to-be-assembled body is assembled.
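The steps S1 to S7 can be sketched as a minimal runnable loop. Every function name and data shape below is an assumption for illustration only; the patent does not prescribe an implementation, and the camera, AR scene and verification stages are stubbed out.

```python
def parse_process_file(process_file):
    """S1: parse part contours and assembly info into structured data,
    ordered by the assembly sequence."""
    return sorted(process_file["parts"], key=lambda p: p["order"])

def identify_rest(parts, assembled):
    """S3: the remaining (unassembled) parts in assembly order."""
    return [p for p in parts if p["name"] not in assembled]

def guide_and_assemble(part, assembled):
    """S4-S6: highlight guidance, assemble, verify. In this stub the
    assembly always succeeds; a real system would check the camera image."""
    assembled.add(part["name"])
    return True

def run_assembly_guidance(process_file):
    parts = parse_process_file(process_file)   # S1
    assembled = set()                          # S2: scene setup omitted
    for part in parts:                         # S7: repeat until complete
        if not identify_rest(parts, assembled):
            break
        guide_and_assemble(part, assembled)    # S4-S6
    return assembled

process = {"parts": [{"name": "B", "order": 2}, {"name": "A", "order": 1},
                     {"name": "C", "order": 3}]}
result = run_assembly_guidance(process)
```

Note that the loop consumes parts strictly in the stored assembly sequence, which is what lets the method rule out missed or out-of-order assembly.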
The principle of the assembly guiding method is as follows: first, the assembly sequence and assembly marking information are read from the design end, the plug-in is designed, and the structured data are obtained; then, the structured data are bound to the three-dimensional model information of the body to be assembled to establish the virtual model data; finally, a camera fixed above the operating table dynamically provides calibration data for the basic part and the remaining parts, the basic part in its current state is dynamically captured, identified and compared, three-dimensional calibration information is supplied to the AR scene, and the virtual assembly information is fused and superimposed onto the scene showing the body to be assembled, thereby guiding the assembly process.
Further, in step S1, the structured data are obtained as follows: an assembly-process parsing plug-in is developed which, through a standard data-reading interface and a standard parsing template, reads the assembly process file and assembly information of the three-dimensional model of the body to be assembled in the CAD software, obtains the part-model contour information and assembly information, and parses them into structured data. The assembly information comprises the assembly sequence and the assembly process requirements; assembling all parts of the body strictly in the assembly sequence avoids missed parts. The assembly process requirements accurately provide the size, contour, assembly position and similar information for every part of the body to be assembled, avoiding wrong assembly and ensuring assembly accuracy.
Further, in step S3, the current position image of the part is recognized and captured using a deep-learning-based contour recognition method.
Preferably, in step S6, whether assembly is complete, whether a wrong part is present, and whether a part is missing are judged from the overlap ratio between the collision body and the part to be assembled.
Preferably, in step S6, whether the part to be assembled deviates in assembly pose, and whether it has been assembled in reverse, are judged from the overlap ratio between the collision body and the part to be assembled together with the position information.
The invention also provides an assembly guiding system based on augmented reality, which uses the above assembly guiding method to guide the assembly of a body to be assembled; the assembly guiding system comprises a vision module, a calculation module and a playing module.
The vision module acquires real-time images of the basic part and the remaining parts during the current assembly process and obtains the feature-point information of those images.
The calculation module uses a vision algorithm to acquire the envelope-body information and assembly-detection information of the parts to be assembled, and processes the video frames of the real-time images together with preset 3D model data of the assembly parts to obtain the three-dimensional registration matrix and the current assembly state of the parts to be assembled.
The playing module hosts the graphics rendering engine and outputs assembly image information in real time using the data from the calculation module.
Furthermore, the image information output by the playing module can be presented through three types of equipment: head-mounted, handheld and spatial display devices.
Further, the vision module is a camera.
Furthermore, the playing module supports multiple presentation devices, which include three types: head-mounted, handheld and spatial-projection devices.
Further, for the image information output by the playing module, the world-coordinate origin of the three-dimensional engine is placed at the image center of the basic part; the contour features and assembly information of the parts are read from the database, and the three-dimensional model of the part to be assembled is generated by combining the three-dimensional registration matrix; combining the assembly process file with the assembly-process judgment, the envelope body of the part to be assembled is highlighted, the three-dimensional model of the part to be assembled is highlighted, and the assembly process requirements are displayed.
The invention also provides an application of the augmented-reality-based assembly guiding system: the assembly guiding method and system are applied to assembling workpieces with complex structures, including spacecraft.
Compared with the prior art, the invention has the following beneficial effects: the augmented-reality-based assembly guiding method provides more intuitive assembly guidance, reducing training costs; it provides behavior constraints based on the assembly process, shortening the development cycle and cost; it provides a more extensible augmented-reality display; and it can check the actual assembly result, improving assembly quality.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below. The drawings illustrate only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without inventive work.
FIG. 1 is a flow chart of an augmented reality based assembly guidance method of the present invention;
FIG. 2 is an architectural diagram of an assembly guide in an embodiment of the present invention;
fig. 3 is a flowchart of assembly guidance in the embodiment.
Detailed Description
The invention will be further described with reference to specific embodiments, and its advantages and features will become apparent from the description. These examples are merely illustrative and do not limit the scope of the invention in any way. Those skilled in the art will understand that various changes in form and detail may be made without departing from the spirit and scope of the invention.
In the description of the present embodiments, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings and are used only for convenience and simplicity of description; they do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore are not to be understood as limiting the present invention.
Furthermore, the terms "first", "second", "third" and the like are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first", "second", etc. may explicitly or implicitly include one or more of that feature. In the description of the invention, "a plurality" means two or more unless otherwise specified.
Example 1:
This embodiment provides an assembly guiding system based on augmented reality, which uses the assembly guiding method designed by the invention to guide the assembly of a body to be assembled; the assembly guiding system comprises a vision module, a calculation module and a playing module.
The vision module acquires real-time images of the basic part and the remaining parts during the current assembly process and obtains the feature-point information of those images.
Specifically, the vision module is a camera; the camera performs spatial-coordinate calculation to identify the current state of the basic part, including its pose information and the contour features of the installation position of the part to be assembled.
The calculation module uses a vision algorithm to acquire the envelope-body information and assembly-detection information of the parts to be assembled, and processes the video frames of the real-time images together with preset 3D model data of the assembly parts to obtain the three-dimensional registration matrix and the current assembly state of the parts to be assembled.
The playing module hosts the graphics rendering engine and outputs assembly image information in real time using the data from the calculation module. Specifically, the output image information can be presented through three types of equipment: head-mounted, handheld and spatial display devices.
Specifically, for the image information output by the playing module, the world-coordinate origin of the three-dimensional engine is placed at the image center of the basic part; the contour features and assembly information of the parts are read from the database, and the three-dimensional model of the part to be assembled is generated by combining the three-dimensional registration matrix; combining the assembly process file with the assembly-process judgment, the envelope body of the part to be assembled is highlighted, the three-dimensional model of the part to be assembled is highlighted, and the assembly process requirements are displayed.
Example 2:
This embodiment provides the assembly guiding method based on augmented reality; fig. 1 is its flowchart. The method includes the following steps:
S1, acquiring and storing the part-model contour information and assembly information of the body to be assembled, and parsing them into structured data.
Specifically, the structured data are acquired as follows: an assembly-process parsing plug-in is developed which, through a standard data-reading interface and a standard parsing template, reads the assembly process file and assembly information of the three-dimensional model of the body to be assembled in the CAD software, obtains the part-model contour information and assembly information, and parses them into structured data. The assembly information comprises the assembly sequence, assembly process requirements, assembly guide information and assembly marking information; assembling all parts of the body strictly in the assembly sequence avoids missed parts, and the assembly process requirements accurately provide the size, contour, assembly position and similar information for every part, avoiding wrong assembly and ensuring the assembly accuracy of the body to be assembled.
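As an illustration of what the parsed structured data might look like, consider the small JSON document below. The field names (`id`, `contour`, `order`, `process`, `guide`) are assumptions; the patent only requires that the assembly sequence, process requirements, guide information and marking information be captured in structured form.

```python
import json

# A hypothetical structured-data layout produced by the parsing plug-in.
raw = """
{
  "assembly": "demo-body",
  "parts": [
    {"id": "A", "contour": [[0, 0], [4, 0], [4, 2], [0, 2]],
     "order": 1, "process": "torque 8 Nm", "guide": "place flat side down"},
    {"id": "B", "contour": [[0, 0], [2, 0], [2, 2], [0, 2]],
     "order": 2, "process": "press fit", "guide": "align with slot on A"}
  ]
}
"""

structured = json.loads(raw)
# The assembly sequence is recovered by sorting on the stored order field.
sequence = [p["id"] for p in sorted(structured["parts"],
                                    key=lambda p: p["order"])]
```

Storing the sequence explicitly, rather than relying on file order, is what allows the guiding loop to detect out-of-order assembly later.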
S2, identifying and capturing, via the camera, the pose information of all parts to be assembled on the operating table, providing the AR scene with a three-dimensional tracking registration matrix of the parts, and dynamically tracking and positioning the parts in the real scene to build the augmented reality scene. The pose information comprises the contour information and position information of the part models.
Specifically, step S2 performs the three-dimensional registration of the operating table, which belongs to the assembly preparation stage and is intended to improve the reusability of the assembly guidance; everything is tied to the actual assembly process requirements through part identification. Contour comparison is performed between the contour information of all parts recorded in step S1 and the contours of all parts (the basic part and the remaining parts) placed on the operating table, realizing the registration of the acquired images; using the contour information as the base coordinates, the three-dimensional registration matrix is provided for all assembly parts, while the contours of all remaining parts are extracted and compared so as to identify and locate each part.
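The three-dimensional registration matrix mentioned here can be understood as a 4x4 homogeneous transform combining a part's rotation and translation relative to the operating-table origin. The sketch below builds one from a yaw angle and a translation; in practice the rotation and translation would be estimated from the camera image (for example with a PnP solver), so the concrete values here are illustrative assumptions.

```python
import math

def registration_matrix(yaw_rad, tx, ty, tz):
    """4x4 homogeneous transform: rotation about z plus translation."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c,  -s,  0.0, tx],
            [s,   c,  0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def apply(matrix, point):
    """Map a model-space point into the operating-table (world) frame."""
    x, y, z = point
    v = (x, y, z, 1.0)
    return tuple(sum(matrix[r][c] * v[c] for c in range(4)) for r in range(3))

# A part rotated 90 degrees and shifted 10 units along x:
M = registration_matrix(math.pi / 2, 10.0, 0.0, 0.0)
p = apply(M, (1.0, 0.0, 0.0))   # a point on the part model
```

With such a matrix per part, the AR engine can overlay the virtual model on the real part in a shared coordinate frame.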
S3, acquiring the current position image of the parts in real time via the camera, extracting the three-dimensional edge-contour feature information of the parts, matching it against the part contour information stored in S1, and distinguishing, in the current state, the basic part from the remaining parts to be assembled.
It should be noted that the basic part refers to the semi-finished assembly produced by the preceding assembly steps, while the remaining parts are all parts of the assembly body that are still unassembled.
The current position image of the part is recognized and captured using a deep-learning-based contour recognition method.
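The patent specifies a deep-learning contour recognizer. As a simplified stand-in for illustration (an assumption, not the patent's actual recognizer), the sketch below matches an extracted contour against stored part contours using a scale-invariant centroid-distance signature:

```python
def signature(contour):
    """Normalized centroid-to-vertex distances: invariant to translation
    and uniform scaling of the contour."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    d = [((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in contour]
    m = max(d) or 1.0
    return [v / m for v in d]

def match_part(extracted, library):
    """Return the library part whose contour signature is closest."""
    sig = signature(extracted)
    def dist(item):
        s = signature(item[1])
        n = min(len(s), len(sig))
        return sum((a - b) ** 2 for a, b in zip(sig[:n], s[:n])) / n
    return min(library.items(), key=dist)[0]

library = {
    "A": [(0, 0), (4, 0), (4, 2), (0, 2)],   # rectangle
    "B": [(0, 0), (2, 0), (1, 2)],           # triangle
}
seen = [(10, 10), (18, 10), (18, 14), (10, 14)]  # translated, scaled rectangle
best = match_part(seen, library)
```

A learned recognizer replaces this hand-crafted signature with features that are robust to occlusion, lighting and perspective, which is why the patent favors it.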
S4, matching the contour information against the structured data and assembly information and, combining the three-dimensional tracking registration matrix with the current position image, highlighting the assembly guide information at the position on the basic part where assembly is to occur.
In this step, the position to be assembled on the basic part is highlighted in the three-dimensional dynamic model, marking the exact position where the part to be assembled must be installed; combining the contour of that position with the contour of the part to be assembled allows rapid assembly.
S5, marking the part to be assembled among the remaining parts on the operating table with a highlighted envelope body, guiding the operator to grasp the object for assembly.
Specifically, in this step a collision body of the part to be assembled is established from the three-dimensional edge-contour feature information by generating a voxelized envelope body, which is used for highlighting and for assembly detection.
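A minimal sketch of the voxelized envelope body: the part's edge-contour feature points are binned into a coarse voxel grid, and the occupied voxels form the collision body used for highlighting and assembly detection. The grid resolution and point data below are illustrative assumptions.

```python
def voxelize(points, voxel_size):
    """Return the set of occupied voxel indices for a 3D point cloud."""
    return {(int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
            for x, y, z in points}

# Edge-contour feature points of a toy part to be assembled:
contour_points = [(0.1, 0.1, 0.0), (1.9, 0.2, 0.0),
                  (1.8, 0.9, 0.5), (0.2, 0.8, 0.5)]
collision_body = voxelize(contour_points, voxel_size=1.0)
```

Working on voxel sets rather than raw meshes makes the later overlap test a cheap set intersection instead of an exact geometric collision query.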
S6, during the assembly of step S5, the assembly state is judged through dynamic visual recognition, achieving virtual-real fusion.
Preferably, in step S6, whether assembly is complete, whether a wrong part is present, and whether a part is missing are judged from the overlap ratio between the collision body and the part to be assembled.
Preferably, in step S6, whether the part to be assembled deviates in assembly pose, and whether it has been assembled in reverse, are judged from the overlap ratio between the collision body and the part to be assembled together with the position information.
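The overlap-ratio judgment can be sketched as follows, comparing the voxelized collision body with the voxels the assembled part actually occupies. The thresholds are illustrative assumptions; the patent only states that completion, wrong parts, missing parts and pose deviations are judged from the overlap ratio.

```python
def overlap_ratio(collision_body, part_voxels):
    """Fraction of the collision body's voxels covered by the part."""
    if not collision_body:
        return 0.0
    return len(collision_body & part_voxels) / len(collision_body)

def judge(collision_body, part_voxels, done=0.9, wrong=0.3):
    r = overlap_ratio(collision_body, part_voxels)
    if r >= done:
        return "assembled"
    if r >= wrong:
        return "pose difference"     # partial overlap: mis-posed or reversed
    return "wrong or missing part"   # little or no overlap

body = {(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)}
state_ok = judge(body, body)                       # full overlap
state_off = judge(body, {(0, 0, 0), (1, 0, 0)})    # half overlap
```

Combining the ratio with position information, as the text describes, lets the system separate a merely shifted part from one that was assembled in reverse.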
S7, repeating steps S3 to S6 to guide and assemble all parts until the body to be assembled is complete.
The principle of the assembly guiding method is as follows: first, the assembly sequence and assembly marking information are read from the design end, the plug-in is designed, and the structured data are obtained; then, the structured data are bound to the three-dimensional model information of the body to be assembled to establish the virtual model data; finally, a camera fixed above the operating table dynamically provides calibration data for the basic part and the remaining parts, the basic part in its current state is dynamically captured, identified and compared, three-dimensional calibration information is supplied to the AR scene, and the virtual assembly information is fused and superimposed onto the scene showing the body to be assembled, thereby guiding the assembly process.
The assembly guidance architecture is shown in fig. 2. First, the assembly sequence and assembly marking information are read at the design end and input into the constructed three-dimensional model and the assembled virtual model; then, during assembly, the physical information identified by the camera is fused with the virtual information of the virtual model, assembly guidance is output, three-dimensional labels based on the assembly sequence are displayed, the assembly body (the installation position on the basic part and the part to be assembled) is highlighted, and the assembly result is inspected.
Example 3:
This embodiment provides an application of assembly guidance suitable for bodies to be assembled with simple structures and, in particular, with complex structures, assembled using the assembly guiding system of embodiment 1 and the assembly guiding method of embodiment 2.
In particular, the body to be assembled comprises parts A, B, C, D, E, F, G, H, ….
S101, when assembly begins, the assembly program is started, and the camera extracts and identifies the type and pose of the part at the center of the operating table;
S102, the assembly-process parsing plug-in reads the three-dimensional model data and assembly process data from the CAD, parses them into structured process-flow data, and stores them in a database; these data are the assembly sequence and assembly process requirements used throughout the guiding process;
S103, the camera identifies the contour information (contour, size and similar parameters) of parts A, B, C, D, E, F, G, H, …, compares it with the information in the database, judges whether each part meets the assembly process requirements of the body to be assembled, and acquires the assembly information (installation position, sequence, etc.) of each part; in this and the following steps a binocular camera is used, although a 3D camera or another camera may also be chosen.
S104, judging the actual assembly progress according to the completion condition according to the assembly sequence read in the S102, highlighting the parts to be assembled, highlighting assembly guide information at the assembly position of the basic parts, and guiding the assembly process;
and S105, assembling the part B to the part A according to the assembly sequence read in the S102 and the assembly guiding method and the assembly process requirements to form a basic part, wherein all the rest parts are collectively called as rest parts.
And S106, dynamically identifying the assembling process of assembling the part B on the part A, correcting the error in real time, and judging the assembling result.
S107, repeating the steps S104 to S106, and sequentially assembling the parts C to be assembled in the rest parts to the basic part (A + B), the parts D to be assembled in the rest parts to the basic part (A + B + C), the parts E to be assembled in the rest parts to the basic parts (A + B + C + D) and … … until all the parts of the parts to be assembled are assembled; specifically, the assembly method of the component C, D, E, F, G, H … … is the same as that of the component B, and the assembly of each component needs to be guided by the assembly guiding method of embodiment 2 and the assembly result is tested.
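Steps S104 to S107 form an iterative loop in which the basic part grows by one part per cycle. A minimal sketch of that loop, with a hypothetical `guide_one` callback standing in for the highlight, guide, and verify cycle of S104 to S106:

```python
def assemble_all(parts, guide_one) -> str:
    """Iteratively assemble parts onto a growing base, mirroring S104-S107:
    start from part A, mount B, then C onto (A+B), D onto (A+B+C), and so on."""
    base = [parts[0]]                # part A is the initial base part
    for part in parts[1:]:           # remaining parts, in assembly order
        ok = guide_one(base, part)   # highlight, guide, verify (S104-S106)
        if not ok:
            raise RuntimeError(f"assembly of {part} failed verification")
        base.append(part)            # the basic part now includes the new part
    return "+".join(base)            # e.g. "A+B+C+D"

# Callback always reports success here; a real one would run the AR checks.
result = assemble_all(["A", "B", "C", "D"], lambda base, part: True)
```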
The assembly guiding method and system are described below using the assembly of a spacecraft with a complex structure as an example, as shown in fig. 3:
assembly is started;
grabbing an image of the center of the working head (taking a picture with the binocular camera);
calling data in the workpiece database and identifying the type of the workpiece (namely, the assembly body to be assembled);
designing a standardized plug-in according to the process file, analyzing the process file with the standardized plug-in, and reading the assembly process;
identifying and judging whether the process requirements are met;
1. if the parts meet the process requirements (Y), highlighting the parts in sequence to prompt the assembly process, dynamically identifying the positions of the parts, and judging whether the parts are assembled in place;
1.1 if assembled in place (Y), judging whether the assembly process flow is finished; if not, continuing to assemble the next part; if so, finishing the assembly;
1.2 if not assembled in place (N), returning to the previous step to dynamically identify the position of the part again and reassembling until it is in place;
2. if the parts do not meet the process requirements (N), ending the program, prompting replacement of the parts, and performing guided assembly again.
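The decision flow of fig. 3 can be sketched as follows. The callbacks `meets_process` and `assemble_in_place` are hypothetical stand-ins for the camera-based process check and the dynamic position identification; the retry cap is an assumption of this sketch, not part of the method:

```python
def guide_workpiece(parts, meets_process, assemble_in_place) -> str:
    """Decision loop from fig. 3: check process requirements, then for each
    part repeat dynamic position identification until assembled in place."""
    for part in parts:
        if not meets_process(part):          # N branch: end, prompt replacement
            return f"replace {part} and restart guidance"
        attempts = 0
        while not assemble_in_place(part):   # re-identify position, reassemble
            attempts += 1
            if attempts > 10:                # hypothetical retry cap
                return f"{part} could not be assembled in place"
    return "assembly complete"               # process flow finished

# Both checks succeed immediately in this toy run.
outcome = guide_workpiece(["A", "B"], lambda p: True, lambda p: True)
```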
The above description covers only preferred embodiments of the present invention and is not intended to limit the invention; any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention shall be included within its scope of protection.
Furthermore, it should be understood that although the description is organized by embodiments, not every embodiment contains only a single technical solution; this manner of description is adopted for clarity only. Those skilled in the art should take the description as a whole, and the embodiments may be combined as appropriate to form other embodiments understandable to those skilled in the art.

Claims (7)

1. An assembly guiding method based on augmented reality is characterized in that: the method comprises the following steps:
s1, acquiring and storing the part model outline information and the assembly information of the to-be-assembled body, and analyzing the part model outline information and the assembly information into structured data, wherein the acquisition method of the structured data comprises the following steps: designing and developing an assembly process analysis plug-in, reading an assembly process file and assembly information of a to-be-assembled body, obtaining part model outline information and assembly information, and analyzing the part model outline information and the assembly information into structured data; the assembly information comprises assembly sequence, assembly process requirements, assembly guide information and assembly marking information;
s2, identifying and capturing pose information of all parts to be assembled on the operating platform according to the camera, providing a three-dimensional tracking registration matrix of the parts for an AR scene, and dynamically tracking and positioning the parts in a real scene to build an augmented reality scene, wherein the pose information comprises part model contour information and position information;
s3, acquiring the current position image of the part in real time according to the camera, extracting the three-dimensional edge contour feature information of the part, matching the three-dimensional edge contour feature information with the part contour information stored in S1, and distinguishing the basic part and the other parts of the to-be-assembled body in the current state;
s4, matching the contour information with data and assembly information, combining the three-dimensional tracking registration matrix and the current position image, and highlighting assembly guide information at the position to be assembled of the basic part;
s5, carrying out envelope body highlighting identification on parts to be assembled in the rest parts on the operating table, and guiding an operator to grab an object for assembly, wherein the envelope body is generated by a voxelization mode based on three-dimensional edge contour characteristic information so as to establish a collision body of the parts to be assembled;
s6, in the assembling process of the step S5, the assembling condition is judged through dynamic visual recognition, and virtual-real fusion is achieved; the method for judging the assembly condition by dynamic visual recognition comprises the following steps: judging whether assembly is finished or not, whether wrong parts exist in the parts to be assembled or not and whether missing parts exist in the parts to be assembled or not according to the proportion of the contact ratio of the collision body and the parts to be assembled; or judging whether the part to be assembled has assembly pose difference or not according to the contact ratio and the position information of the collision body and the part to be assembled, and judging whether the part to be assembled is reversely assembled or not;
and S7, repeating the steps S3 to S6, and guiding and assembling all parts until the to-be-assembled body is assembled.
2. The assembly guiding method according to claim 1, characterized in that: in step S1, the method for acquiring the structured data comprises: designing and developing an assembly process analysis plug-in, and using it to read, through a standard data reading interface and a standard analysis template, the assembly process file and assembly information of the three-dimensional model of the body to be assembled in CAD software, obtain the part model contour information and assembly information, and parse them into structured data.
3. The assembly guiding method according to claim 1, characterized in that: in step S3, the recognition and capture of the current position image of the part are obtained based on a deep-learning contour recognition method.
4. An augmented reality-based assembly guidance system, characterized by: the assembly guiding method of any one of claims 1 to 3 is adopted to guide the assembly to be assembled, and the assembly guiding system comprises a vision module, a calculation module and a playing module;
the visual module is used for acquiring real-time images of the basic part and other parts in the current assembly process and acquiring characteristic point information of the real-time images;
the calculation module is used for acquiring enveloping body information and assembly detection information of the part to be assembled by adopting a visual algorithm, and calculating a video frame in a real-time image by combining preset 3D model data of the part to be assembled to obtain a three-dimensional registration matrix and a current assembly state of the part to be assembled;
the playing module is used for a graphic rendering engine and outputs assembled image information in real time by combining the data of the calculation module.
5. The assembly guide system of claim 4, wherein: the image information output by the playing module can be presented through three types of equipment, namely head-mounted equipment, handheld equipment and space display equipment.
6. The assembly guide system of claim 4, wherein: the image information output by the playing module is arranged in the image center of the basic part by the world coordinate origin of the three-dimensional engine, the contour characteristics and the assembly information of the part in the database are read, and a three-dimensional model of the part is generated by combining a three-dimensional registration matrix; and combining the assembly process file with the assembly process judgment, highlighting the enveloping body of the part to be assembled, highlighting the three-dimensional model of the part to be assembled, and displaying the assembly process requirement.
7. An application of an augmented reality based assembly guidance system, characterized by: assembling a complex structure workpiece comprising a spacecraft by applying the assembly guiding method of any one of claims 1 to 3 and the assembly guiding system of any one of claims 4 to 6.
CN202110337423.2A 2021-03-30 2021-03-30 Assembly guiding method, system and application based on augmented reality Active CN112734945B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110337423.2A CN112734945B (en) 2021-03-30 2021-03-30 Assembly guiding method, system and application based on augmented reality


Publications (2)

Publication Number Publication Date
CN112734945A CN112734945A (en) 2021-04-30
CN112734945B true CN112734945B (en) 2021-08-17

Family

ID=75596020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110337423.2A Active CN112734945B (en) 2021-03-30 2021-03-30 Assembly guiding method, system and application based on augmented reality

Country Status (1)

Country Link
CN (1) CN112734945B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113343660B (en) * 2021-06-17 2023-05-12 东风柳州汽车有限公司 Assembly file creation system and method
CN113673894B (en) * 2021-08-27 2024-02-02 东华大学 Multi-person cooperation AR assembly method and system based on digital twinning
CN114323000B (en) * 2021-12-17 2023-06-09 中国电子科技集团公司第三十八研究所 Cable AR guide assembly system and method
CN114197884B (en) * 2021-12-27 2022-07-08 广东景龙建设集团有限公司 Assembling guiding method and system for customized decorative wallboard
CN114723817A (en) * 2022-03-24 2022-07-08 华中科技大学 Auxiliary assembly method and system based on aviation knowledge graph
CN115170776A (en) * 2022-07-27 2022-10-11 浙江大学 Handicraft path guiding learning system based on mixed reality technology
CN116300773B (en) * 2023-05-19 2023-08-01 深圳市致尚科技股份有限公司 Flexible control method, device and storage medium for fully automatic assembly of electronic products

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484523A (en) * 2014-12-12 2015-04-01 西安交通大学 Equipment and method for realizing augmented reality induced maintenance system
CN108664722A (en) * 2018-05-04 2018-10-16 北京卫星环境工程研究所 Satellite cable based on augmented reality is laid with guidance system and guidance method
CN109164777A (en) * 2018-10-15 2019-01-08 上海交大智邦科技有限公司 From the flexible manufacturing system and method for moving production
CN109491497A (en) * 2018-10-19 2019-03-19 华中科技大学 A kind of human assistance assembly application system based on augmented reality
CN109636854A (en) * 2018-12-18 2019-04-16 重庆邮电大学 A kind of augmented reality three-dimensional Tracing Registration method based on LINE-MOD template matching
CN110147162A (en) * 2019-04-17 2019-08-20 江苏大学 A kind of reinforced assembly teaching system and its control method based on fingertip characteristic
CN110390137A (en) * 2019-06-24 2019-10-29 浙江大学 A kind of chain feature extraction matching process for the registration of machine components three-dimensional
CN110555271A (en) * 2019-09-04 2019-12-10 兰州理工大学 Auxiliary assembly method and device
CN110744549A (en) * 2019-11-11 2020-02-04 电子科技大学 Intelligent assembly process based on man-machine cooperation
CN110866332A (en) * 2019-10-29 2020-03-06 中国电子科技集团公司第三十八研究所 Complex cable assembly assembling method and system
CN111968228A (en) * 2020-06-28 2020-11-20 成都飞机工业(集团)有限责任公司 Augmented reality self-positioning method based on aviation assembly

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10429923B1 (en) * 2015-02-13 2019-10-01 Ultrahaptics IP Two Limited Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US20200259896A1 (en) * 2019-02-13 2020-08-13 Telefonaktiebolaget Lm Ericsson (Publ) Industrial Automation with 5G and Beyond
KR102174035B1 (en) * 2019-04-15 2020-11-04 선문대학교 산학협력단 Object inspection method using an augmented-reality
CN110928418A (en) * 2019-12-11 2020-03-27 北京航空航天大学 Aviation cable auxiliary assembly method and system based on MR


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AR辅助装配中基体及零件位姿组合估计方法;刘然等;《机械设计与研究》;20181231;第34卷(第6期);第119-125、137页 *
Augmented reality in a serious game for manual assembly processes;Robert Woll等;《IEEE》;20111201;第37-39页 *
航天产品装配作业增强现实引导训练***及应用;尹旭悦等;《航空制造技术》;20181231;第61卷(第1/2期);第48-53页 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant