CN116310231A - Engineering equipment real-time interaction system and motion simulation method based on mixed reality


Info

Publication number
CN116310231A
Authority
CN
China
Prior art keywords
rigid
rigid body
constraint
real
motion
Legal status
Pending
Application number
CN202211100913.1A
Other languages
Chinese (zh)
Inventor
代明远
路易霖
王玮
Current Assignee
Jiangsu Xugong Construction Machinery Research Institute Co ltd
Yanshan University
Original Assignee
Jiangsu Xugong Construction Machinery Research Institute Co ltd
Yanshan University
Priority date
Filing date
Publication date
Application filed by Jiangsu Xugong Construction Machinery Research Institute Co ltd, Yanshan University filed Critical Jiangsu Xugong Construction Machinery Research Institute Co ltd
Priority to CN202211100913.1A priority Critical patent/CN116310231A/en
Publication of CN116310231A publication Critical patent/CN116310231A/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/15Vehicle, aircraft or watercraft design
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education
    • G06Q50/205Education administration or guidance
    • G06Q50/2057Career enhancement or continuing education service
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00Details relating to CAD techniques
    • G06F2111/04Constraint-based CAD
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2111/00Details relating to CAD techniques
    • G06F2111/10Numerical modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/14Force analysis or force optimisation, e.g. static or dynamic forces
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Strategic Management (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Graphics (AREA)
  • Tourism & Hospitality (AREA)
  • Evolutionary Computation (AREA)
  • Marketing (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention belongs to the technical field of man-machine interaction and discloses a real-time interaction system and a motion simulation method for engineering equipment based on mixed reality. The real-time interaction system comprises mixed reality hardware, a manipulation control handle, a simulation control module, a space mapping and collision detection module and an information storage module. By establishing the complex mechanical constraint relationships involved in simulating engineering equipment operation and introducing simple constraints in a physics engine to simulate the motion of each component of the engineering equipment, the method balances the real-time performance and the fidelity of the motion simulation. An operator can virtually operate the engineering equipment in the scene within a mixed virtual environment offering genuine immersion and interactivity and rehearse the construction process, so a platform-independent virtual interactive simulation method satisfying both interactivity and real-time requirements is provided.

Description

Engineering equipment real-time interaction system and motion simulation method based on mixed reality
Technical Field
The invention belongs to the technical field of man-machine interaction, and particularly relates to a real-time interaction system and a motion simulation method of engineering equipment based on mixed reality.
Background
Engineering equipment such as aerial work platforms, pump trucks and drilling jumbos has a complex structure, is difficult to control, carries a high risk factor and places high demands on the technical level of operators. To reduce construction risk, improve work quality and efficiency and ensure construction safety, the motion behaviour produced when an operator controls engineering equipment needs to be simulated; however, existing simulated-control training systems for construction machinery cannot satisfy the required interactivity. Mixed reality is a computer virtualization technology that displays the real world and virtual objects in the same visual space and allows them to interact, creating an entirely new visual environment, and it is being rapidly developed and applied in fields such as industry, education and training, entertainment and medical treatment.
During engineering equipment development, motion simulation is mostly carried out in professional software environments such as PRO/E and ADAMS, so the simulation depends heavily on the development platform, offers little interactivity and cannot be used for on-site training or verification. To make the development process independent of a software platform, the traditional method calculates the position coordinates of the motions of the different parts of the working device through transformation matrices and then displays them; the results are relatively accurate, but the computation is complex and struggles to meet real-time requirements. In the prior art, the motion of emergency construction machinery has been simulated based on the Virtual Reality Modeling Language, where motion control can be simplified by setting DOF nodes, but the development of the corresponding platform control interface is not yet mature. In short, in the field of virtual simulated control of engineering equipment, no existing technology simultaneously supports real-time verification and training of the construction workflow on site, real-time operation simulation of the equipment motion, and visual display in a virtual, visualized form.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a real-time interaction system and a motion simulation method of engineering equipment based on mixed reality.
In order to achieve the above purpose, the present invention provides the following technical solutions:
in a first aspect, the invention provides a real-time interaction system of engineering equipment based on mixed reality, which comprises mixed reality hardware, a manipulation control handle, a simulation control module, a space mapping and collision detection module and an information storage module;
the mixed reality hardware is in communication connection with the simulation control module and with the space mapping and collision detection module, and is used for acquiring and displaying the three-dimensional model of the engineering equipment, identifying and acquiring construction site environment information, and acquiring operator position information;
the control handle is in communication connection with the simulation control module and is used for transmitting operating digital signals to the simulation control module through the handle driver;
the simulation control module is in communication connection with the mixed reality hardware, the control handle, the space mapping and collision detection module and the information storage module respectively, and is used for receiving the virtual interaction information, parsing the virtual interaction information into instruction information corresponding to the motion of the virtual three-dimensional model, and simulating the motion of the virtual three-dimensional model in real time according to the instruction; the virtual interaction information comprises the handle operation digital signals from the control handle, the interaction instruction information of handle operation, operator gestures and operator voice from the mixed reality hardware, the position information of the operator wearing the mixed reality hardware, and the real construction site environment information;
the space mapping and collision detection module is in communication connection with the mixed reality hardware and the simulation control module respectively, and is used for displaying the construction site environment information acquired by the mixed reality hardware together with the virtual three-dimensional model in the simulation control module through the display screen of the mixed reality hardware, presenting the model in the real construction site environment in a virtual-real fusion manner, and simultaneously carrying out accurate collision detection and real-time motion interference simulation between the virtual object and objects in the real scene;
the information storage module is in communication connection with the simulation control module and the space mapping and collision detection module respectively, and is used for storing the motion state information and collision detection records generated during virtual training and virtual verification, so that training and verification results can be reproduced.
With reference to the first aspect, the real-time interactive system of the present invention further includes a server, where the server is a hardware carrier of the simulation control module, the space mapping and collision detection module, and the information storage module.
With reference to the first aspect, further, the simulation control module includes an interactive instruction acquisition module, a dynamics simulation calculation module, and a scene management and graphic image real-time rendering module;
The interactive instruction acquisition module is used for receiving operation information of an operator wearing the mixed reality hardware, wherein the operation information is virtual interactive information;
the dynamic simulation calculation module is a core control module of the whole simulation system and is used for carrying out real-time dynamic simulation calculation according to the operation information of operators and outputting simulation calculation results to the scene management and graphic image real-time rendering module;
the scene management and graphic image real-time rendering module is used for updating the motion state of the three-dimensional model of the engineering equipment in the virtual scene in real time according to the simulation calculation result of the dynamics simulation calculation module, and outputting the visualized operation result to the mixed reality hardware using the graphic rendering engine for display to the operator. The simulation control module parses the received virtual interaction information into instruction information corresponding to the motion of the virtual three-dimensional model through the dynamics simulation calculation module, the virtual three-dimensional model simulates the motion in real time according to the specific instruction, and the image information of the virtual three-dimensional model is synchronously displayed on the display screen of the mixed reality hardware through the scene management and graphic image real-time rendering module and the space mapping and collision detection module.
With reference to the first aspect, further, the manipulation control handle includes a handle driver, and the manipulation control handle transmits operating digital signals to the simulation control module through the handle driver; the handle driver parses the manipulation control flow input from the manipulation control handle into operating digital signals and transmits them to the simulation control module, specifically to the interaction instruction acquisition module in the simulation control module.
With reference to the first aspect, further, the handle driver is connected to the server where the simulation control module is located via Bluetooth or USB.
In a second aspect, the invention provides a motion simulation method for engineering equipment based on mixed reality. Based on the real-time interaction system described above, the dynamics simulation calculation module establishes the corresponding kinematic constraints between the rigid members of the engineering equipment using a physics engine, through multi-rigid-body dynamics modeling of the engineering equipment; the dynamics simulation calculation module comprises a multi-rigid-body dynamics system and a motion constraint system.
With reference to the second aspect, further, the motion simulation method includes the following steps:
step one: configuring the mixed reality hardware, the manipulation control handle and the server; i.e. the mixed reality hardware is in communication connection with the server, and the manipulation control handle is in communication connection with the server.
Step two: loading a virtual model to the mixed reality hardware; the server transmits the stored virtual scene data to the mixed reality hardware, and the mixed reality hardware fuses the virtual scene and the real scene after identifying the actual environment information, so as to present a virtual-real fusion scene.
Step three: the user performs operation input by manipulating the control handle; if the user operation input is wrong, carrying out error information prompt; if the operation input of the user is correct, performing dynamic simulation calculation, and outputting a simulation calculation result to a scene management and graphic image real-time rendering module, wherein the scene management and graphic image real-time rendering module updates the motion state of the three-dimensional model of the engineering equipment in the virtual scene in real time according to the simulation calculation result of the dynamic simulation calculation module, and outputs the operation result to the mixed reality hardware to be displayed to an operator;
step four: displaying the engineering equipment three-dimensional model in the real-time updated virtual scene in the step three in a real construction site environment in a virtual-real fusion mode, and simultaneously performing collision detection and real-time motion interference simulation between the virtual object and the real scene object; if interference with the real environment is detected, sending an interference information prompt to an operator wearing the mixed reality hardware, and if interference with the real environment is detected, continuing to execute the next verification and training task;
Step five: writing the operation process into an information storage module, recording and storing the operation process information in the process, and completing the task.
With reference to the second aspect, further, the multi-rigid-body dynamics system includes a multi-rigid-body system physical model and a multi-rigid-body system mathematical model; the multi-rigid-body system physical model is the physical model expressing the mechanical characteristics of the system, formed by physically modeling the geometric model of the engineering equipment, and the physical modeling comprises the following steps:
step S1: the geometric model of the engineering equipment is assembled according to the kinematic constraints and the initial position conditions of the virtual three-dimensional model, and the parent-child nesting relationships between the components of the virtual three-dimensional model are set by analyzing the motion dependency relationships between them, finally completing the construction of the model tree of the whole engineering equipment;
step S2: each geometric model member is regarded as a rigid member, and its Cartesian generalized coordinate vector is set as

$q_l = (r_l^T\ \ \gamma_l^T)^T = (x\ \ y\ \ z\ \ \psi\ \ \theta\ \ \phi)_l^T$

where l = 1, 2, ..., n and n denotes the number of units of the multi-rigid-body system (any pair of adjacent rigid bodies connected by a hinge is regarded as one unit); $q_l$ is the Cartesian generalized coordinate vector which, taking a single rigid body as reference, describes the position of another rigid body relative to it in generalized coordinates, so the system can be described by the Lagrangian coordinate array q; $r_l = (x, y, z)_l$ is the position vector of the centroid of each rigid member in the absolute coordinate system, x, y and z denoting the x-, y- and z-axes of the absolute coordinate system; and $\gamma_l = (\psi, \theta, \phi)_l$ are the three Euler angles of the rigid member relative to the coordinate base, ψ being the precession angle, θ the nutation angle and φ the spin angle. The position vector matrix of the whole multi-rigid-body dynamics system is then

$q = (q_1\ \ q_2\ \ \cdots\ \ q_n)^T$    (1)
the kinematic constraint equation set of the whole multi-rigid-body dynamics system is expressed as

$\Gamma^V(q) = \left(\Gamma_1^V(q)\ \ \Gamma_2^V(q)\ \ \cdots\ \ \Gamma_m^V(q)\right)^T = 0$    (2)

In formula (2), m is the number of constraint pairs; $\Gamma^V$ denotes the kinematic constraint equation set of the whole multi-rigid-body dynamics system taken as a whole, and $\Gamma_w^V(q)$ denotes the kinematic constraint equation of the corresponding individual member, where w = 1, 2, 3, ..., m; the superscript V is a label chosen arbitrarily to distinguish these equations from the driving constraint equations;
when the total degree of freedom of the whole multi-rigid-body dynamics system is zero, its motion is determined, so the driving constraint equations required by the multi-rigid-body dynamics system are expressed as

$\Gamma^H(q, t) = 0$    (3)

where $\Gamma^H(q, t) = 0$ is the vector form of the driving constraint equations, the driving constraints being functions of time and of the generalized coordinates; H is a label chosen arbitrarily to distinguish these equations from the kinematic constraint equations; and t denotes the motion time of the multi-rigid-body system. For a multi-rigid-body system with n coordinates $q = (q_1\ \ q_2\ \ \cdots\ \ q_n)^T$ and m constraint pairs, only n - m of the coordinates are independent, i.e. n - m is the number of degrees of freedom of the multi-rigid-body system; from the kinematic point of view, for example for a planar linkage system, the system has a determined motion only when its total degree of freedom is zero, so the number of driving constraints that must be specified is n - m, and the driving constraint equations take the form $\Gamma^H(q, t) = 0$.
All the constraints imposed on the multi-rigid-body dynamics system are obtained by combining the kinematic constraints of formula (2), the driving constraints of formula (3) and the Euler parameter constraints:

$\Gamma(q, t) = \left(\Gamma^V(q)^T\ \ \Gamma^H(q, t)^T\ \ \Gamma^E(q)^T\right)^T = 0$    (4)

Formula (4) constitutes the set of n nonlinear position equations of the whole multi-rigid-body dynamics system in generalized coordinates; $\Gamma(q, t) = 0$ is the constraint equation set of the whole system, in which the driving constraints are functions of time and of the generalized coordinates, and $\Gamma^E(q) = 0$ is the Euler parameter constraint equation expression.
Step S3: the geometric model of the engineering equipment is analyzed, and according to the type of constraint equation between the geometric model members, the corresponding constraint type is established for the members using the physics engine, completing the physical modeling from the geometric model of the engineering equipment to the physical model of the multi-rigid-body system.
With reference to the second aspect, further, the mathematical model of the multi-rigid-body system is the mathematical modeling performed after the physical model of the multi-rigid-body system has been obtained, so as to obtain mathematical models such as the velocity and acceleration of the multi-rigid-body system, and further obtain the dynamics model of the multi-rigid-body system.
With reference to the second aspect, further, the mathematical model of the multi-rigid-body system includes a mathematical model of rigid-body member motion and a mathematical model of rigid-body member rotation.
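Before the motion and rotation models are derived, the following sketch illustrates the position-level formulation of equation (4): the kinematic, driving and Euler-parameter constraints are stacked into one constraint vector and the remaining degrees of freedom are counted. The constraint callables and the closing example are placeholders for illustration only, not part of the patent.

```python
import numpy as np

def stack_constraints(kinematic, driving, euler, q, t):
    """Evaluate the stacked constraint vector of equation (4) at (q, t)."""
    gamma_v = np.concatenate([g(q) for g in kinematic])   # kinematic constraints, eq. (2)
    gamma_h = np.concatenate([g(q, t) for g in driving])  # driving constraints, eq. (3)
    gamma_e = np.concatenate([g(q) for g in euler])       # Euler parameter constraints
    return np.concatenate([gamma_v, gamma_h, gamma_e])

def degrees_of_freedom(n_coordinates, m_constraints):
    """n generalized coordinates under m independent constraints leave n - m DOF."""
    return n_coordinates - m_constraints

# Example: one spatial body attached to the ground by a hinge; the hinge removes
# five of the six spatial coordinates, leaving a single rotational DOF to drive.
assert degrees_of_freedom(6, 5) == 1
```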
With reference to the second aspect, further, the modeling process of the rigid body member motion mathematical model includes the following steps:
step A: differentiating equation (4) once with respect to time gives the velocity constraint equation of the multi-rigid-body system:

$\Gamma_q \dot{q} = -\Gamma_t$    (5)

The whole of formula (5) is the result of differentiating formula (4), where $\Gamma_q = \partial\Gamma/\partial q$ is the Jacobian matrix of the system; $\Gamma_t = \partial\Gamma/\partial t$ is the partial derivative of the constraint equations with respect to time; and $\dot{q}$ is the generalized velocity of the system;
step B: differentiating equation (4) twice with respect to time gives the acceleration constraint equation of the multi-rigid-body system:

$\Gamma_q \ddot{q} = -(\Gamma_q \dot{q})_q \dot{q} - 2\Gamma_{qt}\dot{q} - \Gamma_{tt}$    (6)

where $\Gamma_q$ is the Jacobian matrix; $\Gamma_{qt} = \partial\Gamma_q/\partial t$ is the partial derivative of the Jacobian matrix with respect to time; $\Gamma_{tt} = \partial^2\Gamma/\partial t^2$ is the second partial derivative of the constraint equations with respect to time; and $\ddot{q}$ is the generalized acceleration of the system;
step C: from the position constraint equation (4), the velocity constraint equation (5) and the acceleration constraint equation (6) of the multi-rigid-body system, the first-kind Lagrange multiplier form of the equations of motion of the multi-rigid-body system is obtained:

$\begin{cases} \dot{q} = v, \quad \dot{v} = \eta \\ M(q, t)\,\eta + \Gamma_q^T \lambda = f(q, v, t) \\ \Gamma(q, t) = 0 \end{cases}$    (7)

where $q, v, \eta \in R^n$ are respectively the generalized coordinate, velocity and acceleration vectors of the system; $\lambda \in R^m$ is the column vector of Lagrange multipliers, i.e. the internal forces and internal moments between rigid members connected by kinematic pairs; $t \in R$ is the time; $M(q, t) \in R^{n \times n}$ is the mass matrix of the system; $f(q, v, t) \in R^n$ is the generalized external force column vector, containing the external forces and external moments; $R^n$ is the n-dimensional real vector space, where n coincides with the number of units of the multi-rigid-body system; and $R^m$ is the m-dimensional real vector space, where m coincides with the number of constraint pairs.
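One common way to evaluate the first-kind Lagrange multiplier equations (7) numerically is to assemble an augmented linear system at each time step and solve simultaneously for the generalized accelerations and the multipliers. The sketch below assumes the mass matrix, constraint Jacobian, generalized force vector and acceleration right-hand side are supplied by the dynamics simulation calculation module; it illustrates the formulation, not the patent's implementation.

```python
import numpy as np

def solve_accelerations(M, jac, f, gamma):
    """Solve [[M, Gq^T], [Gq, 0]] [eta; lam] = [f; gamma] for accelerations and multipliers."""
    n = M.shape[0]    # number of generalized coordinates
    m = jac.shape[0]  # number of constraint equations
    lhs = np.block([[M, jac.T],
                    [jac, np.zeros((m, m))]])
    rhs = np.concatenate([f, gamma])
    sol = np.linalg.solve(lhs, rhs)
    return sol[:n], sol[n:]  # eta (generalized accelerations), lam (Lagrange multipliers)

# Toy usage: a unit mass constrained to the line x = y under gravity along -y.
M = np.eye(2)
jac = np.array([[1.0, -1.0]])   # constraint x - y = 0, so the Jacobian is [1, -1]
f = np.array([0.0, -9.81])      # generalized external force
gamma = np.array([0.0])         # stationary constraint, zero acceleration right-hand side
eta, lam = solve_accelerations(M, jac, f, gamma)   # eta is approximately [-4.905, -4.905]
```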
With reference to the second aspect, further, modeling the mathematical model of rigid member rotation requires, for each mechanism of the engineering equipment in three-dimensional space, determining the motion of the multi-rigid-body system using a body-fixed coordinate system attached to the rigid member; specifically, the rigid body is described by three attitude variables, namely the direction cosine matrix, the Euler angles and the Euler quaternion, to construct the fixed-point rotation;
The Euler angles are: $\gamma_l = (\psi, \theta, \phi)_l$    (8)
The direction cosine matrix, written compactly as the product of elementary rotations through the precession, nutation and spin angles, is:

$A_l = A_z(\psi)\,A_x(\theta)\,A_z(\phi)$    (9)

The Euler quaternion is:

$\Lambda_l = (a_1\ \ a_2\ \ a_3\ \ a_4)_l^T$    (10)

The direction cosine matrix expressed in terms of the Euler quaternion is:

$A_l = \begin{pmatrix} 2(a_1^2 + a_4^2) - 1 & 2(a_1 a_2 - a_3 a_4) & 2(a_1 a_3 + a_2 a_4) \\ 2(a_1 a_2 + a_3 a_4) & 2(a_2^2 + a_4^2) - 1 & 2(a_2 a_3 - a_1 a_4) \\ 2(a_1 a_3 - a_2 a_4) & 2(a_2 a_3 + a_1 a_4) & 2(a_3^2 + a_4^2) - 1 \end{pmatrix}$    (11)

The Euler parameter variables in the direction cosine matrix and in the Euler quaternion must satisfy the constraint equation:

$\Gamma^E_l \equiv a_1^2 + a_2^2 + a_3^2 + a_4^2 - 1 = 0$    (12)

where $a_1, a_2, a_3$ form the vector part of the Euler quaternion, $a_4$ is the scalar of the Euler quaternion, and $\Gamma^E_l$ is the constraint formed by the Euler parameters of the rigid member.
With reference to the second aspect, further, the orientation parameters of the body-fixed coordinate system relative to the global coordinate system in the generalized coordinates of a rigid member may be represented by the direction cosine matrix and the Euler angles, and may also be represented by the Euler parameters.
With reference to the second aspect, further, the mathematical model of the multi-rigid-body system may be obtained by mathematically modeling the physical model of the multi-rigid-body system using a Cartesian-coordinate or Lagrangian-coordinate modeling method.
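The following sketch illustrates the Euler-parameter attitude description used above: a direction cosine matrix is built from the Euler parameters and the unit-norm constraint of equation (12) is checked. The entry layout follows a common scalar-last convention and is assumed here; it may differ in arrangement from equation (11).

```python
import numpy as np

def dcm_from_euler_parameters(a1, a2, a3, a4):
    """Direction cosine matrix from Euler parameters (vector a1..a3, scalar a4)."""
    return np.array([
        [1 - 2*(a2**2 + a3**2), 2*(a1*a2 - a3*a4),     2*(a1*a3 + a2*a4)],
        [2*(a1*a2 + a3*a4),     1 - 2*(a1**2 + a3**2), 2*(a2*a3 - a1*a4)],
        [2*(a1*a3 - a2*a4),     2*(a2*a3 + a1*a4),     1 - 2*(a1**2 + a2**2)],
    ])

def euler_parameter_residual(a1, a2, a3, a4):
    """Residual of equation (12): a1^2 + a2^2 + a3^2 + a4^2 - 1 must vanish."""
    return a1**2 + a2**2 + a3**2 + a4**2 - 1.0

# 90-degree rotation about the z-axis: vector part sin(45 deg)*(0, 0, 1), scalar cos(45 deg).
s = np.sin(np.pi / 4)
A = dcm_from_euler_parameters(0.0, 0.0, s, s)      # approx [[0,-1,0],[1,0,0],[0,0,1]]
assert abs(euler_parameter_residual(0.0, 0.0, s, s)) < 1e-12
```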
With reference to the second aspect, further, the motion constraint system is constructed on the basis of multi-rigid-body system dynamics theory: according to the constraint type, the Jacobian matrix of the two connected rigid member objects is calculated, the moments of inertia are calculated from the shapes of the rigid members, and the positions and velocities of the interconnected rigid member objects are updated, so that the force interaction of the connected objects is simulated; the constraint types between rigid members include hinge constraints and slide-bar constraints, i.e. the motion constraints of the motion constraint system include hinge (Hinge) constraints and slide-bar (Slide) constraints.
The motion constraint system connects the rigid members using the corresponding constraints in the physics engine once the constraint type between them has been determined, and calculates motion, rotation and collision responses by assigning real physical properties to the rigid body objects.
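As an illustration of assigning real physical properties to a rigid body object, the sketch below approximates a member as a solid box and computes the mass and centroidal inertia tensor that the physics engine would use; the box approximation, the density value and the field names are assumptions made purely for illustration.

```python
import numpy as np

def box_rigid_body(width, height, depth, density):
    """Mass and centroidal inertia tensor of a solid rectangular box member."""
    mass = density * width * height * depth
    inertia = (mass / 12.0) * np.diag([height**2 + depth**2,   # about x
                                       width**2 + depth**2,    # about y
                                       width**2 + height**2])  # about z
    return {"mass": mass, "inertia": inertia}

# Example: a 2 m x 0.3 m x 0.3 m steel boom section (density roughly 7850 kg/m^3).
boom_section = box_rigid_body(2.0, 0.3, 0.3, 7850.0)
```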
With reference to the second aspect, further, establishing the motion constraints of the multi-rigid-body system dynamics model of the engineering equipment specifically includes the following steps:
step 1: the multi-rigid-body system model of the actual system (namely the virtual three-dimensional model, referred to in the dynamic motion constraints as the multi-rigid-body system model) is simplified;
through theoretical abstraction, the actual structure of the complex engineering equipment is simplified into bodies, hinges and the like. Body: a member of the multi-rigid-body system. Hinge: a massless motion constraint between rigid bodies; the interaction between the rigid bodies is limited by the motion constraints;
step 2: a physical model with constraint connection relationships is constructed;
the geometric model of the engineering equipment is assembled according to the kinematic constraints and the initial position conditions of the virtual three-dimensional model, and the corresponding constraint connections between the rigid members are established according to the corresponding force analysis.
Hinge (Hinge) constraint: a rotation axis is established at a node so that the two rigid members can only rotate about that axis, the hinge axis; for example, a door or a wheel rotates about only one axis, and the user can set a rotation angle limit for the hinge axis.
Slide bar (slide) constraint: refers to a constraint relationship in which two rigid members can move only along a certain axis.
With reference to the second aspect, further, the implementation steps of the hinge constraint are:
step I: obtain the centroids of the first rigid member and the second rigid member from the centroid formula of a rigid member:

$r_c = x_c\,i + y_c\,j + z_c\,k$    (13)

where $r_c$ is the centroid position vector of the rigid member; $x_c$, $y_c$ and $z_c$ are the x-, y- and z-coordinates of the rigid body's centroid; and $i$, $j$ and $k$ are the base vectors;
Step II: calculating the position of the anchor point relative to the first rigid body component, and acquiring the hinge axis of the first rigid body component;
step III: calculating the position of the anchor point relative to the second rigid body component, and acquiring the hinge axis of the second rigid body component;
step IV: hinge constraint is performed on the two rigid body members.
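A minimal sketch of steps I-IV follows: the shared anchor point and the hinge axis are expressed in the local frame of each of the two rigid members before the hinge is registered. The body dictionaries and the returned constraint record are placeholders and do not correspond to any specific physics engine API.

```python
import numpy as np

def to_local(point_world, centroid_world, rotation_world):
    """Express a world-space point relative to a member's centroid frame."""
    return rotation_world.T @ (point_world - centroid_world)

def build_hinge(body_a, body_b, anchor_world, axis_world):
    # Step I: centroids of the two rigid members (equation (13))
    r_ca, r_cb = body_a["centroid"], body_b["centroid"]
    # Steps II-III: anchor position and hinge axis relative to each member
    anchor_in_a = to_local(anchor_world, r_ca, body_a["rotation"])
    anchor_in_b = to_local(anchor_world, r_cb, body_b["rotation"])
    axis_in_a = body_a["rotation"].T @ axis_world
    axis_in_b = body_b["rotation"].T @ axis_world
    # Step IV: hand the local anchors and axes to the physics engine's hinge constraint
    return {"type": "hinge",
            "bodies": (body_a, body_b),
            "anchors": (anchor_in_a, anchor_in_b),
            "axes": (axis_in_a, axis_in_b)}

# Example: turntable hinged to the chassis about the vertical axis.
chassis = {"centroid": np.array([0.0, 0.0, 0.5]), "rotation": np.eye(3)}
turntable = {"centroid": np.array([0.0, 0.0, 1.0]), "rotation": np.eye(3)}
hinge = build_hinge(chassis, turntable,
                    anchor_world=np.array([0.0, 0.0, 0.9]),
                    axis_world=np.array([0.0, 0.0, 1.0]))
```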
With reference to the second aspect, further, the implementation steps of the sliding rod constraint are:
step a: acquiring centroids of the first and second rigid body members using equation (13);
step b: calculating the position of the second rigid body component relative to the first rigid body component, and acquiring a sliding shaft of the first rigid body component;
step c: calculating the position of the first rigid body component relative to the second rigid body component, and acquiring a sliding shaft of the second rigid body component;
step d: slide bar restraining is performed on the two rigid body members.
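A companion sketch for steps a-d, reusing the to_local helper and the body dictionaries from the hinge sketch above: a slide-bar (prismatic) constraint only needs the relative offset and the sliding axis expressed in each member's local frame; the record returned is again a placeholder, not an engine API.

```python
def build_slider(body_a, body_b, axis_world):
    # Step a: the centroids from equation (13) serve as each member's frame origin
    # Steps b-c: relative position and sliding axis in each member's local frame
    offset_in_a = to_local(body_b["centroid"], body_a["centroid"], body_a["rotation"])
    offset_in_b = to_local(body_a["centroid"], body_b["centroid"], body_b["rotation"])
    axis_in_a = body_a["rotation"].T @ axis_world
    axis_in_b = body_b["rotation"].T @ axis_world
    # Step d: register the prismatic (slide-bar) constraint with the physics engine
    return {"type": "slider",
            "bodies": (body_a, body_b),
            "offsets": (offset_in_a, offset_in_b),
            "axes": (axis_in_a, axis_in_b)}
```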
With reference to the second aspect, further, for the motion constraint system established with the physics engine, the spatial pose or constraint axis of a rigid member needs to be adjusted before the corresponding function in the physics engine can be used successfully. The adjustment method is as follows: using the Rodrigues rotation formula, the rotation matrix for a rotation by the angle β about the unit vector μ = (x, y, z) is

$R = \cos\beta\,I + (1 - \cos\beta)\,\mu\mu^T + \sin\beta\,[\mu]_\times$    (14)

where I is the 3×3 identity matrix and $[\mu]_\times$ is the skew-symmetric cross-product matrix of μ.
After the spatial pose of the rigid member is acquired, the rigid member is adjusted by equation (14).
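A small sketch of equation (14) as it would be used to adjust a constraint axis; numpy is used only for the matrix algebra, and the example values are arbitrary.

```python
import numpy as np

def rodrigues_rotation(mu, beta):
    """Rotation matrix for a rotation of beta radians about the unit vector mu (eq. 14)."""
    mu = np.asarray(mu, dtype=float)
    mu = mu / np.linalg.norm(mu)                 # keep the axis unit length
    skew = np.array([[0.0, -mu[2], mu[1]],
                     [mu[2], 0.0, -mu[0]],
                     [-mu[1], mu[0], 0.0]])      # cross-product matrix of mu
    return (np.cos(beta) * np.eye(3)
            + (1.0 - np.cos(beta)) * np.outer(mu, mu)
            + np.sin(beta) * skew)

# Adjusting a member's constraint axis: rotate the x-axis by 90 degrees about z to obtain y.
axis = rodrigues_rotation([0.0, 0.0, 1.0], np.pi / 2) @ np.array([1.0, 0.0, 0.0])
assert np.allclose(axis, [0.0, 1.0, 0.0])
```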
With reference to the second aspect, further, the physics engine includes a gravity engine, so that gravity can be tested as in reality; if an incorrect operation makes the center of gravity unstable, a rollover occurs.
Compared with the prior art, the mixed-reality-based engineering equipment real-time interaction system and motion simulation method provided by the invention have the following beneficial effects:
(1) The real-time interaction system can effectively overcome the high safety risk and high training cost of training traditional engineering equipment through real-machine operation, as well as the shortcomings of virtual reality training, which is isolated from the physical environment and therefore loses the sense of reality, cannot interact with the actual construction scene, offers only a single fixed virtual environment, and is unsuitable for verifying workflows against the actual construction environment. It provides a more advanced technical means for engineering equipment operation training and for real-time confirmation of on-site process methods, so that operators can be trained, or the rationality of a process method verified, and timely adjustments made to avoid construction risks before the real vehicle operates.
(2) The motion simulation method establishes the complex mechanical constraint relationships involved in simulating engineering equipment operation and introduces simple constraints in the physics engine to simulate the motion of each component of the engineering equipment, thereby balancing the real-time performance and the fidelity of the motion simulation. An operator can virtually operate the engineering equipment in the scene within a mixed virtual environment offering genuine immersion and interactivity and rehearse the construction process, which yields a platform-independent virtual interactive simulation method that satisfies both interactivity and real-time requirements.
(3) The motion simulation method uses mixed reality technology for construction simulation, performs accurate spatial mapping of the complex construction site and interference detection between virtual objects and the actual environment during motion, and realizes real-time confirmation of the construction workflow and process methods, so that construction verification and training tasks are completed more efficiently.
Drawings
FIG. 1 is a schematic diagram of a real-time interactive system according to the present invention;
FIG. 2 is a flow chart of dynamic modeling of the motion simulation method of the present invention;
FIG. 3 is a flow chart of an implementation of the motion simulation method of the present invention;
fig. 4 is a schematic diagram of related keys for boom system control of a control handle of a real-time interactive system according to an embodiment of the present invention.
The meaning of the reference numerals in the figures is:
1-mixed reality hardware; 2-manipulation control handle; 21-handle driver; 3-simulation control module; 31-interaction instruction acquisition module; 32-dynamics simulation calculation module; 33-scene management and graphic image real-time rendering module; 4-space mapping and collision detection module; 5-information storage module; 6-server.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description. Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but should be considered part of the specification where appropriate. In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may also include different values. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
In the description of the present application, it should be understood that the terms "center," "longitudinal," "transverse," "upper," "lower," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, merely to facilitate description of the present invention and simplify the description, and do not indicate or imply that the device or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the protection of the present invention.
As shown in fig. 1, the real-time interaction system of the invention comprises mixed reality hardware 1, a manipulation control handle 2, a simulation control module 3, a space mapping and collision detection module 4, an information storage module 5 and a server 6. In this embodiment the engineering equipment is an aerial work platform by way of example (the method and principle are the same for other engineering equipment). The mixed reality hardware 1 uses Microsoft HoloLens 2, HoloLens 1 or Meta 2 glasses and is in communication connection with the simulation control module 3 and the space mapping and collision detection module 4, realizing the functions of displaying the three-dimensional model of the aerial work platform, identifying the construction site environment information and acquiring the operator position information. The network communication connection follows the TCP/IP protocol; the running server 6 automatically opens network listening, monitors the network in real time and waits for the client on the mixed reality hardware 1 to pair and connect with it. An operator wearing HoloLens 2 glasses enters the actual work site, and the operation information for the virtual aerial work platform is transmitted to the server 6 through a socket connection. The manipulation control handle 2 may be a game handle; its handle driver 21 is connected by USB to the server 6 where the simulation control module 3 is located, and the handle driver 21 parses the manipulation control flow input from the handle and transmits it to the interaction instruction acquisition module 31 of the simulation control module 3. The buttons on the manipulation control handle 2 are used to control the movement of the aerial work platform: the platform is driven to its destination with the up/down/left/right buttons, and the keys of the manipulation control handle 2 corresponding to boom system and turntable operations are shown in fig. 4. Boom system operation is started by pressing the "PgDn" key shown in fig. 4; keys 1-6 respectively select the lower main arm, the lower telescopic arm, the upper main arm, the two upper telescopic arms and the working platform, the "=" and "-" keys then move the selected member up and down, and the "K" and "D" keys rotate the turntable.
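The following is a minimal sketch of the server-side socket listener described above: the server opens a TCP port, waits for the client on the mixed reality hardware to connect, and forwards each received operation message to the simulation control module. The listening port and the message format are assumptions made purely for illustration; the patent only specifies that the connection follows the TCP/IP protocol.

```python
import socket
import threading

HOST, PORT = "0.0.0.0", 9000   # assumed listening address and port

def handle_client(conn, addr, on_message):
    with conn:
        while True:
            data = conn.recv(1024)            # one operation message from the headset
            if not data:
                break
            on_message(data.decode("utf-8"))  # forward to the simulation control module

def run_server(on_message):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen()
        while True:                           # wait for the mixed reality client to connect
            conn, addr = srv.accept()
            threading.Thread(target=handle_client,
                             args=(conn, addr, on_message),
                             daemon=True).start()
```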
The space mapping and collision detection module 4 is in communication connection with the mixed reality hardware 1 and the simulation control module 3 respectively. Its function is to display, through the display screen of the HoloLens 2 glasses of the mixed reality hardware 1, the construction scene information acquired by the glasses together with the virtual three-dimensional model of the aerial work platform in the simulation control module 3, presenting the model in the real construction site environment in a virtual-real fusion manner, while performing accurate collision detection and real-time motion interference simulation between moving parts such as the boom system of the virtual aerial work platform in its operating state and objects in the real construction site environment; this serves to train operators or verify the rationality of a process method, allowing timely adjustment before the real vehicle operates and avoiding construction risks. The information storage module 5 is in communication connection with the simulation control module 3 and the space mapping and collision detection module 4 respectively; its function is to store the motion state information and the collision detection records of the aerial work platform during virtual training and virtual verification, so that training and verification results can be replayed. The simulation control module 3 is in communication connection with the mixed reality hardware 1, the manipulation control handle 2, the space mapping and collision detection module 4 and the information storage module 5 respectively; its functions are to receive the interaction instruction information of handle operation, gestures and voice, the operator position information and the real construction site environment information from the manipulation control handle 2 and the mixed reality hardware 1, to parse the received virtual interaction information through the dynamics simulation calculation module 32 into instruction information corresponding to the motion of the three-dimensional model of the virtual aerial work platform, to simulate the motion of the virtual model in real time according to the specific instruction, and to synchronize the image information of the virtual model to the display screen of the mixed reality hardware 1 through the scene management and graphic image real-time rendering module 33 and the space mapping and collision detection module 4. The server 6 is the hardware carrier of the simulation control module 3, the space mapping and collision detection module 4 and the information storage module 5.
In a specific implementation manner of this embodiment, the simulation control module 3 includes an interaction instruction acquisition module 31, a dynamics simulation calculation module 32, and a scene management and graphic image real-time rendering module 33; the interaction instruction acquisition module 31 is for receiving operation information of an operator wearing the mixed reality hardware 1, the operation information including actions and voices in virtual interaction operation instructions of the mixed reality hardware 1 and operation digital signals in the handle driver 21; the dynamics simulation calculation module 32 is a core control module of the whole simulation system, and has the functions of performing real-time dynamics simulation calculation according to the operation information of an operator, and outputting simulation calculation results to the scene management and graphic image real-time rendering module 33; the scene management and graphic image real-time rendering module 33 has the functions of updating the motion state of the aerial work platform in the virtual scene in real time according to the simulation calculation result, and outputting the operation result to the mixed reality hardware 1 in a visualized manner by utilizing the graphic rendering engine for displaying to an operator.
As shown in fig. 2, in order to realize virtual motion simulation control of a complex mechanical system such as an aerial work platform, the most fundamental problem of the engineering equipment motion simulation method is to build the dynamics model of each component of the aerial work platform. Starting from the initial geometric model, a physical model is established, and the dynamics model is finally obtained through numerical solution of the mathematical model. The dynamics simulation calculation module 32 comprises a multi-rigid-body dynamics system and a motion constraint system, and the multi-rigid-body dynamics system comprises a multi-rigid-body system physical model and a multi-rigid-body system mathematical model. The multi-rigid-body system physical model is the physical model of the aerial work platform expressing the mechanical characteristics of the system, obtained by physically modeling the geometric model of the aerial work platform. The motion constraint system comprises hinge (Hinge) constraints and slide-bar (Slide) constraints; through multi-rigid-body dynamics modeling of the aerial work platform, the corresponding motion constraints are established between the rigid members of the aerial work platform using the physics engine.
In order to form a physical model of the aerial work platform for expressing the mechanical characteristics of the system, the physical modeling of the geometric model of the aerial work platform is required. In the physical modeling process, the geometric model of the aerial working platform needs to be assembled according to kinematic constraint and initial position conditions, and the physical modeling comprises the following steps:
each geometric model member is regarded as a rigid member, and its Cartesian generalized coordinate vector is set as

$q_l = (r_l^T\ \ \gamma_l^T)^T = (x\ \ y\ \ z\ \ \psi\ \ \theta\ \ \phi)_l^T$

where l = 1, 2, ..., n and n denotes the number of units of the multi-rigid-body system (any pair of adjacent rigid bodies connected by a hinge is regarded as one unit); $q_l$ is the Cartesian generalized coordinate vector which, taking a single rigid body as reference, describes the position of another rigid body relative to it in generalized coordinates, so the system can be described by the Lagrangian coordinate array q; $r_l = (x, y, z)_l$ is the position vector of the centroid of each rigid member in the absolute coordinate system, x, y and z denoting the x-, y- and z-axes of the absolute coordinate system; and $\gamma_l = (\psi, \theta, \phi)_l$ are the three Euler angles of the rigid member relative to the coordinate base, ψ being the precession angle, θ the nutation angle and φ the spin angle. The position vector matrix of the whole multi-rigid-body dynamics system is then

$q = (q_1\ \ q_2\ \ \cdots\ \ q_n)^T$    (15)

The kinematic constraint equation set of the whole multi-rigid-body dynamics system is expressed as

$\Gamma^V(q) = \left(\Gamma_1^V(q)\ \ \Gamma_2^V(q)\ \ \cdots\ \ \Gamma_m^V(q)\right)^T = 0$    (16)

In formula (16), m is the number of constraint pairs; $\Gamma^V$ denotes the kinematic constraint equation set of the whole multi-rigid-body dynamics system taken as a whole, and $\Gamma_w^V(q)$ denotes the kinematic constraint equation of the corresponding individual member, where w = 1, 2, 3, ..., m; the superscript V is a label chosen arbitrarily to distinguish these equations from the driving constraint equations;
when the total degree of freedom of the whole multi-rigid-body dynamics system is zero, its motion is determined, so the driving constraint equations required by the multi-rigid-body dynamics system are expressed as

$\Gamma^H(q, t) = 0$    (17)

where $\Gamma^H(q, t) = 0$ is the vector form of the driving constraint equations, the driving constraints being functions of time and of the generalized coordinates; H is a label chosen arbitrarily to distinguish these equations from the kinematic constraint equations; and t denotes the motion time of the multi-rigid-body system. For a multi-rigid-body system with n coordinates $q = (q_1\ \ q_2\ \ \cdots\ \ q_n)^T$ and m constraint pairs, only n - m of the coordinates are independent, i.e. n - m is the number of degrees of freedom of the multi-rigid-body system; from the kinematic point of view, for example for a planar linkage system, the system has a determined motion only when its total degree of freedom is zero, so the number of driving constraints that must be specified is n - m, and the driving constraint equations take the form $\Gamma^H(q, t) = 0$.
All the constraints imposed on the multi-rigid-body dynamics system are obtained by combining the kinematic constraints of formula (16), the driving constraints of formula (17) and the Euler parameter constraints:

$\Gamma(q, t) = \left(\Gamma^V(q)^T\ \ \Gamma^H(q, t)^T\ \ \Gamma^E(q)^T\right)^T = 0$    (18)

Formula (18) constitutes the set of n nonlinear position equations of the whole multi-rigid-body dynamics system in generalized coordinates; $\Gamma(q, t) = 0$ is the constraint equation set of the whole system, in which the driving constraints are functions of time and of the generalized coordinates, and $\Gamma^E(q) = 0$ is the Euler parameter constraint equation expression.
Step S3: the geometric model of the engineering equipment is analyzed, and according to the type of constraint equation between the geometric model members, the corresponding constraint type is established for the members using the physics engine, completing the physical modeling from the geometric model of the engineering equipment to the physical model of the multi-rigid-body system.
After the geometric model of the aerial work platform has been analyzed, the corresponding constraint types are established for the members using the physics engine according to the types of constraint equations between the members, so the physical modeling from the geometric model to the physical model is completed successfully, solving the constraint equations is avoided, and real-time control of the aerial work platform is made convenient.
The mathematical model of the multi-rigid-body system is the mathematical modeling performed after the physical model of the multi-rigid-body system has been obtained; using a Cartesian-coordinate or Lagrangian-coordinate modeling method, mathematical models such as the velocity and acceleration of the multi-rigid-body system are obtained, and finally the dynamics model of the multi-rigid-body system is obtained. The mathematical model of the multi-rigid-body system comprises the mathematical model of rigid member motion of the aerial work platform and the mathematical model of rigid member rotation.
The modeling process of the rigid body member motion mathematical model specifically comprises the following steps:
step A: differentiating equation (18) once with respect to time gives the velocity constraint equation of the multi-rigid-body system:

$\Gamma_q \dot{q} = -\Gamma_t$    (19)

The whole of formula (19) is the result of differentiating formula (18), where $\Gamma_q = \partial\Gamma/\partial q$ is the Jacobian matrix of the system; $\Gamma_t = \partial\Gamma/\partial t$ is the partial derivative of the constraint equations with respect to time; and $\dot{q}$ is the generalized velocity of the system;
step B: differentiating equation (18) twice with respect to time gives the acceleration constraint equation of the multi-rigid-body system:

$\Gamma_q \ddot{q} = -(\Gamma_q \dot{q})_q \dot{q} - 2\Gamma_{qt}\dot{q} - \Gamma_{tt}$    (20)

where $\Gamma_q$ is the Jacobian matrix; $\Gamma_{qt} = \partial\Gamma_q/\partial t$ is the partial derivative of the Jacobian matrix with respect to time; $\Gamma_{tt} = \partial^2\Gamma/\partial t^2$ is the second partial derivative of the constraint equations with respect to time; and $\ddot{q}$ is the generalized acceleration of the system;
step C: from the position constraint equation (18), the velocity constraint equation (19) and the acceleration constraint equation (20) of the multi-rigid-body system, the first-kind Lagrange multiplier form of the equations of motion of the multi-rigid-body system is obtained:

$\begin{cases} \dot{q} = v, \quad \dot{v} = \eta \\ M(q, t)\,\eta + \Gamma_q^T \lambda = f(q, v, t) \\ \Gamma(q, t) = 0 \end{cases}$    (21)

where $q, v, \eta \in R^n$ are respectively the generalized coordinate, velocity and acceleration vectors of the system; $\lambda \in R^m$ is the column vector of Lagrange multipliers, i.e. the internal forces and internal moments between rigid members connected by kinematic pairs; $t \in R$ is the time; $M(q, t) \in R^{n \times n}$ is the mass matrix of the system; $f(q, v, t) \in R^n$ is the generalized external force column vector, containing the external forces and external moments; $R^n$ is the n-dimensional real vector space, where n coincides with the number of units of the multi-rigid-body system; and $R^m$ is the m-dimensional real vector space, where m coincides with the number of constraint pairs.
Modeling the mathematical model of rigid member rotation requires, for each mechanism of the aerial work platform in three-dimensional space, determining the motion of the multi-rigid-body system using a body-fixed coordinate system attached to the rigid member. The generalized coordinates of a member consist of two parts: the origin coordinates of the body-fixed coordinate system, and the orientation parameters of the body-fixed coordinate system relative to the global coordinate system. Specifically, the rigid body is described by three attitude variables, namely the direction cosine matrix, the Euler angles and the Euler quaternion, to construct the fixed-point rotation; in addition, the orientation parameters of the body-fixed coordinate system relative to the global coordinate system in the generalized coordinates of a rigid member can be represented by the direction cosine matrix and the Euler angles, and can also be represented by the Euler parameters.
The Euler angles are: $\gamma_l = (\psi, \theta, \phi)_l$    (22)
The direction cosine matrix, written compactly as the product of elementary rotations through the precession, nutation and spin angles, is:

$A_l = A_z(\psi)\,A_x(\theta)\,A_z(\phi)$    (23)

The Euler quaternion is:

$\Lambda_l = (a_1\ \ a_2\ \ a_3\ \ a_4)_l^T$    (24)

The direction cosine matrix expressed in terms of the Euler quaternion is:

$A_l = \begin{pmatrix} 2(a_1^2 + a_4^2) - 1 & 2(a_1 a_2 - a_3 a_4) & 2(a_1 a_3 + a_2 a_4) \\ 2(a_1 a_2 + a_3 a_4) & 2(a_2^2 + a_4^2) - 1 & 2(a_2 a_3 - a_1 a_4) \\ 2(a_1 a_3 - a_2 a_4) & 2(a_2 a_3 + a_1 a_4) & 2(a_3^2 + a_4^2) - 1 \end{pmatrix}$    (25)

The Euler parameter variables in the direction cosine matrix and in the Euler quaternion must satisfy the constraint equation:

$\Gamma^E_l \equiv a_1^2 + a_2^2 + a_3^2 + a_4^2 - 1 = 0$    (26)

where $a_1, a_2, a_3$ form the vector part of the Euler quaternion, $a_4$ is the scalar of the Euler quaternion, and $\Gamma^E_l$ is the constraint formed by the Euler parameters of the rigid member.
Through the physical modeling and the mathematical modeling of the multi-rigid-body system, the complex mechanical system is simplified and the dynamics model of the multi-rigid-body system is obtained, which makes it convenient to simulate the motion of the mechanical system of the aerial work platform.
The most critical problem of virtual control of the aerial work platform is motion control, including motion constraint between rigid body members of the multi-rigid body system and motion control of the whole multi-rigid body system.
The motion constraint system is constructed on the basis of multi-body system dynamics theory: the Jacobian matrix of two connected rigid body member objects is calculated according to the constraint type, the moment of inertia is calculated from the shape of the rigid body member, and the positions and velocities of the interconnected rigid body member objects are updated, so that the force effect between the connected objects is simulated. Establishing the motion constraints of the multi-rigid-body system model of the aerial work platform specifically comprises the following steps:
step 1: simplify the actual system into a multi-rigid-body system model: the working motion of the aerial work platform is simplified into the motion of a system composed of a finite number of rigid bodies that are connected to one another by constraints of certain forms, such as bodies and hinges; according to multi-rigid-body theory, the aerial work platform is simplified into several parts such as the self-propelled chassis, the turntable and the boom system. The self-propelled chassis provides traction and power supply for the aerial work platform; the turntable is connected with the self-propelled chassis to realize 360-degree rotation while ensuring the stability of the aerial work vehicle; and the boom system is connected with the turntable to control the working state of the work platform;
Step 2: construct a physical model with constraint connection relations: the geometric model of the aerial work platform is assembled according to the kinematic constraints and the initial position conditions of the model, and corresponding constraint connections are established between rigid body members according to the corresponding force analysis. For simplicity of calculation, once the constraint types among the rigid body members are determined, the rigid body members are connected with the corresponding constraints in a physical engine. The physical engine is an integrated solution for simulating a physical environment in real time; based on rigid body mechanics, it calculates motion, rotation and collision responses by giving real physical properties to rigid body objects. It is finally confirmed that the constraints to be established for the aerial work platform are 21 hinge constraints and 9 slide-bar constraints, 30 constraints in total.
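As a bookkeeping sketch for Step 2, the constraint connections can be collected in a simple table before they are handed to the physical engine. The body names below are hypothetical placeholders; only the expected totals (21 hinge and 9 slide-bar constraints) come from the description above.

```python
from collections import Counter

# (body_a, body_b, constraint_type) -- hypothetical example entries; the full
# model of the aerial work platform lists all 30 constraint connections.
constraint_table = [
    ("chassis",   "turntable", "hinge"),   # slewing joint, 360-degree rotation
    ("turntable", "boom_1",    "hinge"),   # boom luffing joint
    ("boom_1",    "boom_2",    "slide"),   # telescopic boom section
    # ... remaining connections omitted in this sketch
]

def constraint_counts(table):
    """Tally the constraint types before creating them in the physical engine;
    for the complete model the expected tally is 21 hinge and 9 slide constraints."""
    return Counter(kind for _, _, kind in table)

print(constraint_counts(constraint_table))   # Counter({'hinge': 2, 'slide': 1})
```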
The constraint types among the rigid body components are specifically divided into:
hinge (Hinge) constraint: a rotation axis is established at a certain node so that the two rigid body members can only rotate about that axis, which is the hinge axis; for example, the turntable and the wheels of the aerial work platform each rotate about only one axis, and the user can set a rotation angle limit for the hinge axis.
Slide bar (slide) constraint: a relationship restricting the two rigid body members to translate only along a certain axis, such as the telescopic cylinder and the telescopic arm of the aerial work platform.
The implementation steps of the Hinge (Hinge) constraint are as follows:
step I: obtain the centroids of the first rigid body member and the second rigid body member; the centroid formula of a rigid body member is:

$$r_c=x_c\,i+y_c\,j+z_c\,k \qquad (27)$$

wherein $r_c$ represents the centroid position vector of the rigid body member; $x_c$, $y_c$, $z_c$ respectively represent the x, y and z coordinates of the rigid body centroid; and $i$, $j$, $k$ represent the vector basis.
Step II: calculating the position of the anchor point relative to the first rigid body component, and acquiring the hinge axis of the first rigid body component;
step III: calculating the position of the anchor point relative to the second rigid body component, and acquiring the hinge axis of the second rigid body component;
step IV: hinge constraint is performed on the two rigid body members.
The implementation steps of the constraint of the sliding rod (slide) are as follows:
step a: obtaining centroids of the first and second rigid body members using equation (27);
step b: calculating the position of the second rigid body component relative to the first rigid body component, and acquiring a sliding shaft of the first rigid body component;
step c: calculating the position of the first rigid body component relative to the second rigid body component, and acquiring a sliding shaft of the second rigid body component;
step d: slide bar restraining is performed on the two rigid body members.
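The hinge steps I–IV and the slide-bar steps a–d above can be sketched as follows, assuming a generic constraint interface: the `RigidBody` class, the `make_hinge`/`make_slider` helpers and their argument names are assumptions of this sketch; a real physical engine (e.g. Bullet or PhysX) exposes equivalent but differently named calls.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class RigidBody:
    """Minimal rigid body member: centroid position (formula (27)) and orientation."""
    centroid: np.ndarray
    rotation: np.ndarray = field(default_factory=lambda: np.eye(3))

    def world_to_local(self, p_world):
        """Express a world-frame point in this body's satellite coordinate system."""
        return self.rotation.T @ (np.asarray(p_world) - self.centroid)

def make_hinge(body_a, body_b, anchor_world, axis_world):
    """Steps I-IV: anchor position and hinge axis relative to each rigid body member."""
    return {
        "pivot_a": body_a.world_to_local(anchor_world),       # step II
        "pivot_b": body_b.world_to_local(anchor_world),       # step III
        "axis_a": body_a.rotation.T @ np.asarray(axis_world),
        "axis_b": body_b.rotation.T @ np.asarray(axis_world),
        "type": "hinge",                                       # step IV: register with the engine
    }

def make_slider(body_a, body_b, axis_world):
    """Steps a-d: relative positions and sliding axis of each rigid body member."""
    return {
        "offset_ab": body_a.world_to_local(body_b.centroid),  # step b
        "offset_ba": body_b.world_to_local(body_a.centroid),  # step c
        "axis_a": body_a.rotation.T @ np.asarray(axis_world),
        "axis_b": body_b.rotation.T @ np.asarray(axis_world),
        "type": "slide",                                       # step d: register with the engine
    }

# Example: the turntable hinged to the chassis about the vertical axis (illustrative values).
chassis = RigidBody(centroid=np.array([0.0, 0.0, 0.5]))
turntable = RigidBody(centroid=np.array([0.0, 0.0, 1.2]))
slew_joint = make_hinge(chassis, turntable, anchor_world=[0.0, 0.0, 1.0], axis_world=[0.0, 0.0, 1.0])
```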
Because most constraints in the physical engine are defined with respect to a local coordinate system, the motion constraint system established with the physical engine needs to adjust the spatial attitude or constraint axis of a rigid body member before the corresponding function of the physical engine can be used. The specific method is as follows: using the Rodrigues rotation formula, the rotation matrix for a rotation of β about the unit vector μ(x, y, z) is:

$$R(\mu,\beta)=\cos\beta\,I+(1-\cos\beta)\,\mu\mu^{\mathrm T}+\sin\beta\,[\mu]_{\times} \qquad (28)$$

where $[\mu]_{\times}$ denotes the skew-symmetric matrix of μ.
After the spatial pose of the rigid member is acquired, the rigid member is adjusted by equation (28).
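The attitude adjustment of formula (28) can be sketched numerically as follows: `rodrigues` builds the rotation matrix for a rotation of β about the unit vector μ, and `adjust_pose` re-expresses a member's orientation by left-multiplying its current rotation matrix; the function names and the composition order are assumptions of this sketch.

```python
import numpy as np

def rodrigues(mu, beta):
    """Rodrigues rotation formula (28): rotation by angle beta about unit vector mu."""
    mu = np.asarray(mu, dtype=float)
    mu = mu / np.linalg.norm(mu)            # ensure a unit rotation axis
    K = np.array([[0.0, -mu[2], mu[1]],     # skew-symmetric matrix [mu]_x
                  [mu[2], 0.0, -mu[0]],
                  [-mu[1], mu[0], 0.0]])
    return np.eye(3) + np.sin(beta) * K + (1.0 - np.cos(beta)) * (K @ K)

def adjust_pose(rotation, mu, beta):
    """Adjust a rigid member's spatial attitude / constraint axis via formula (28)."""
    return rodrigues(mu, beta) @ rotation

# Example: align a member's x axis with the world y axis by a 90-degree turn about z.
R = adjust_pose(np.eye(3), mu=[0, 0, 1], beta=np.pi / 2)
assert np.allclose(R @ np.array([1.0, 0.0, 0.0]), [0.0, 1.0, 0.0])
```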
In a specific implementation of this embodiment, the physical engine includes a gravity engine, so that gravity can be tested as in reality; if incorrect operation makes the center of gravity unstable, a rollover phenomenon occurs.
As shown in fig. 3, the motion simulation method of the present invention includes the steps of:
step one: configure the mixed reality hardware, the control handle and the server; i.e. the mixed reality hardware is in communication connection with the server, and the control handle is in communication connection with the server.
Step two: loading a virtual model to the mixed reality hardware; the server transmits the stored virtual scene data to the mixed reality hardware, and the mixed reality hardware fuses the virtual scene and the real scene after identifying the actual environment information, so as to present a virtual-real fusion scene.
Step three: the user performs operation input by manipulating the control handle; if the user operation input is wrong, carrying out error information prompt; if the operation input of the user is correct, performing dynamic simulation calculation, outputting a simulation calculation result to a scene management and graphic image real-time rendering module, and updating the motion state of the engineering equipment three-dimensional model in the virtual scene in real time by the scene management and graphic image real-time rendering module according to the simulation calculation result of the dynamic simulation calculation module, and outputting the operation result to the mixed reality hardware to be displayed to an operator;
Step four: displaying the engineering equipment three-dimensional model in the virtual scene updated in real time in step three in the real construction site environment in a virtual-real fusion mode, and simultaneously performing collision detection and real-time motion interference simulation between the virtual object and the real scene object; if interference with the real environment is detected, sending an interference information prompt to the operator wearing the mixed reality hardware, and if no interference with the real environment is detected, continuing to execute the next verification and training task;
step five: writing the operation process into an information storage module, recording and storing the operation process information in the process, and completing the task.
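Steps one to five can be summarised as a control loop on the server side. The sketch below is only an outline under assumed module interfaces; `read_handle`, `is_valid`, `simulate`, `render`, `check_interference` and `log` are placeholder names introduced for this sketch, not interfaces disclosed in the patent.

```python
def simulation_loop(hardware, handle, scene, dynamics, spatial_map, store):
    """Outline of steps one to five of the motion simulation method."""
    scene.load_virtual_model(hardware)                 # step two
    while not scene.task_finished():
        cmd = handle.read_handle()                     # step three: operator input
        if not dynamics.is_valid(cmd):
            hardware.show_prompt("error: invalid operation input")
            continue
        state = dynamics.simulate(cmd)                 # dynamics simulation calculation
        scene.render(state, hardware)                  # scene management / real-time rendering
        # Step four: virtual-real fusion display plus collision / interference check.
        if spatial_map.check_interference(state):
            hardware.show_prompt("warning: interference with the real environment")
        else:
            scene.advance_task()                       # continue verification / training task
        store.log(cmd, state)                          # step five: record the operation process
```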
According to the actual demands of the aerial work platform, virtual control comprises three major parts: azimuth operation of the aerial work platform, operation of the boom system, and rotation of the turntable. The system simulates the control process of the aerial work platform: an operator controls the aerial work platform and its boom system with the handle, drives the aerial work platform to the target site, then operates the boom system and the turntable, and adjusts and levels the work platform to a suitable working position. In actual operation, the operation of the aerial work platform is determined by the position of the target working point: the operator can adjust the horizontal angle by rotating the turntable clockwise or anticlockwise, and can adjust the height or the vertical angle through the corresponding operation of the boom system. A gravity engine is further introduced into the system, so that the aerial work platform can realize the same gravity test as in reality; if incorrect operation makes the center of gravity unstable, rollover will occur.
According to the invention, the complex mechanical constraint relations in the operation simulation of the aerial work platform are established, and simple constraints in the physical engine are introduced to simulate the motion process of each part of the aerial work platform, which solves the problem of balancing the real-time performance and the simulation effect of motion simulation. An operator can virtually operate the aerial work platform in a mixed virtual environment with realistic immersion and interactivity and simulate the construction process, thereby providing a platform-independent virtual interactive simulation method that satisfies both interactivity and real-time performance. By performing construction simulation with the mixed reality technology, accurate spatial positioning and mapping and interference detection with the actual environment during the motion of the virtual aerial work platform are carried out on a complex construction site, so that the construction flow and the process construction method are confirmed in real time, and the construction verification and training tasks of the aerial work platform are completed more efficiently.
It is noted that in this application relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (15)

1. Real-time interactive system of engineering equipment based on mixed reality, its characterized in that: the system comprises mixed reality hardware, a control handle, a simulation control module, a space mapping and collision detection module and an information storage module;
the mixed reality hardware is respectively in communication connection with the simulation control module, the space mapping and collision detection module and is used for acquiring and displaying a three-dimensional model of engineering equipment, identifying and acquiring construction site environment information and acquiring operator position information;
the control handle is in communication connection with the simulation control module and is used for transmitting the digital signal of operation to the simulation control module;
the simulation control module is respectively in communication connection with the mixed reality hardware, the control handle, the space mapping and collision detection module and the information storage module, and is used for receiving virtual interaction information, analyzing the virtual interaction information into instruction information corresponding to the movement of the virtual three-dimensional model, and simulating the movement of the virtual three-dimensional model in real time according to the instruction;
The space mapping and collision detection module is respectively in communication connection with the mixed reality hardware and the simulation control module, and is used for displaying the construction site environment information acquired by the mixed reality hardware and the virtual three-dimensional model in the simulation control module in a real construction site environment through a display screen of the mixed reality hardware in a virtual-real fusion mode, and simultaneously carrying out collision detection and real-time motion interference simulation between a virtual object and a real scene object;
the information storage module is respectively in communication connection with the simulation control module, the space mapping and the collision detection module and is used for storing motion state information and collision detection information records during virtual training and virtual verification so as to realize the reproduction of training and verification results.
2. The mixed reality based engineering equipment real-time interaction system according to claim 1, wherein: the system also comprises a server, wherein the server is a hardware carrier of the simulation control module, the space mapping and collision detection module and the information storage module.
3. The mixed reality based engineering equipment real-time interaction system according to claim 1, wherein: the simulation control module comprises an interaction instruction acquisition module, a dynamics simulation calculation module and a scene management and graphic image real-time rendering module;
The interactive instruction acquisition module is used for receiving operation information of an operator wearing the mixed reality hardware, wherein the operation information is virtual interactive information, and the virtual interactive information comprises a handle operation digital signal from a control handle, handle operation of the mixed reality hardware, operator gestures, interactive instruction information of operator voices, operator position information of the operator wearing the mixed reality hardware and real construction site environment information;
the dynamics simulation calculation module is used for carrying out real-time dynamics simulation calculation according to the operation information of an operator and outputting a simulation calculation result to the scene management and graphic image real-time rendering module;
the scene management and graphic image real-time rendering module is used for updating the motion state of the three-dimensional model of the engineering equipment in the virtual scene in real time according to the simulation calculation result of the dynamic simulation calculation module, and outputting the operation result to the mixed reality hardware to be displayed to an operator.
4. The mixed reality based engineering equipment real-time interaction system according to claim 1, wherein: the control handle comprises a handle drive, and the control handle transmits an operating digital signal to the simulation control module through the handle drive; the handle drive parses the operation control flow input through the control handle into an operation digital signal and transmits the operation digital signal to the simulation control module.
5. The motion simulation method of the engineering equipment based on mixed reality is characterized by comprising the following steps of: based on the real-time interaction system of any one of claims 1 to 4, the dynamics simulation calculation module establishes corresponding kinematic constraints among rigid body components of the engineering equipment by carrying out multi-rigid-body dynamics modeling on the engineering equipment, wherein the kinematic constraints comprise a multi-rigid-body dynamics system and a motion constraint system.
6. The mixed reality-based engineering equipment motion simulation method according to claim 5, wherein the method comprises the following steps of: the motion simulation method comprises the following steps:
step one: configuring the mixed reality hardware, the control handle and the server;
step two: loading a virtual model to the mixed reality hardware;
step three: the user performs operation input by manipulating the control handle; if the user operation input is wrong, carrying out error information prompt; if the operation input of the user is correct, performing dynamic simulation calculation, and outputting a simulation calculation result to a scene management and graphic image real-time rendering module, wherein the scene management and graphic image real-time rendering module updates the motion state of the three-dimensional model of the engineering equipment in the virtual scene in real time according to the simulation calculation result of the dynamic simulation calculation module, and outputs the operation result to the mixed reality hardware to be displayed to an operator;
Step four: displaying the engineering equipment three-dimensional model in the virtual scene updated in real time in step three in the real construction site environment in a virtual-real fusion mode, and simultaneously performing collision detection and real-time motion interference simulation between the virtual object and the real scene object; if interference with the real environment is detected, sending an interference information prompt to the operator wearing the mixed reality hardware, and if no interference with the real environment is detected, continuing to execute the next verification and training task;
step five: writing the operation process into an information storage module, recording and storing the operation process information in the process, and completing the task.
7. The mixed reality-based engineering equipment motion simulation method according to claim 5, wherein the method comprises the following steps of: the multi-rigid-body dynamic system comprises a multi-rigid-body system physical model and a multi-rigid-body system mathematical model; the multi-rigid body system physical model is a physical model for forming the mechanical characteristics of the expression system by carrying out physical modeling on engineering equipment geometric models, and the physical modeling comprises the following steps:
step S1: assembling the engineering equipment geometric model according to the kinematic constraint and the initial position condition of the virtual three-dimensional model, and setting a father-son nesting relationship between the virtual three-dimensional model components by analyzing the motion dependency relationship between the virtual three-dimensional model components to finally finish the construction of the whole engineering equipment model tree;
Step S2: each geometric model member is regarded as a rigid body member, and its Cartesian generalized coordinate vector is set as

$$q_l=(r_l^{\mathrm T},\ \gamma_l^{\mathrm T})^{\mathrm T}=(x,\ y,\ z,\ \psi,\ \theta,\ \phi)_l^{\mathrm T}$$

where l = 1, 2, …, n, and n represents that the multi-rigid-body system has n points; any adjacent rigid bodies connected by hinges are regarded as one unit, i.e. n represents the number of units of the multi-rigid-body system; $q_l$ describes, taking a single rigid body as the reference object, the position of another rigid body relative to that rigid body in generalized coordinates; let $r_l(x, y, z)$ be the vector of the centroid of each rigid body member in the absolute coordinate system, where x represents the x axis of the absolute coordinate system, y represents the y axis of the absolute coordinate system and z represents the z axis of the absolute coordinate system; $\gamma_l=(\psi,\theta,\phi)_l$ are the three Euler angles of the rigid body member relative to the coordinate basis, where ψ represents the precession angle, θ represents the nutation angle and φ represents the spin angle; the position vector matrix of the whole multi-rigid-body dynamics system is then formed:
$$q=(q_1\ \ q_2\ \ \cdots\ \ q_n)^{\mathrm T} \qquad (1)$$
the expression of the kinematic constraint equation set of the whole multi-rigid-body dynamics system is

$$\Gamma^{V}(q,t)=\big(\Gamma^{V}_{1}(q,t),\ \Gamma^{V}_{2}(q,t),\ \cdots,\ \Gamma^{V}_{m}(q,t)\big)^{\mathrm T}=0 \qquad (2)$$

in formula (2), m is the number of constraint pairs, $\Gamma^{V}$ represents the kinematic constraint equation set of the whole multi-rigid-body dynamics system as a whole, and $\Gamma^{V}_{w}(q,t)=0$ represents the kinematic constraint equation of the corresponding member alone, where w = 1, 2, 3, …, m and V is a label taken arbitrarily to distinguish it from the driving constraint equation;
When the total degree of freedom of the whole multi-rigid-body dynamics system is zero, its motion is determined; the driving constraint equations required by the multi-rigid-body dynamics system are therefore expressed as:

$$\Gamma^{H}(q,t)=0 \qquad (3)$$

wherein the driving constraints are functions of time with respect to the generalized coordinates; $\Gamma^{H}(q,t)=0$ is the vector form of the driving constraint equations; H is a label taken arbitrarily to distinguish it from the kinematic constraint equation; and t represents the motion time of the multi-rigid-body system.
All constraints imposed on the multi-rigid-body dynamics system are combined from the kinematic constraints of formula (2), the driving constraints of formula (3) and the Euler parameter constraints:

$$\Gamma(q,t)=\begin{pmatrix}\Gamma^{V}(q,t)\\ \Gamma^{H}(q,t)\\ \Gamma^{E}(q)\end{pmatrix}=0 \qquad (4)$$

formula (4) forms the set of n nonlinear position equations of the whole multi-rigid-body dynamics system in generalized coordinates; $\Gamma(q,t)=0$ is the whole-system constraint equation, in which the driving constraints are functions of time with respect to the generalized coordinates; and $\Gamma^{E}(q)=0$ is the Euler parameter constraint equation.
Step S3: and analyzing the engineering equipment geometric model, and establishing a corresponding constraint type for the geometric model component by using a physical engine according to the constraint equation type among the geometric model components to finish the physical modeling from the engineering equipment geometric model to the multi-rigid-body system physical model.
8. The mixed reality-based engineering equipment motion simulation method according to claim 7, wherein the method comprises the following steps: the mathematical model of the multi-rigid body system is mathematical modeling which is carried out after the physical model of the multi-rigid body system is obtained, so that the mathematical model of the speed and the acceleration of the multi-rigid body system is obtained, and further, the dynamics model of the multi-rigid body system is obtained.
9. The mixed reality-based engineering equipment motion simulation method according to claim 8, wherein the method comprises the following steps of: the mathematical model of the multi-rigid-body system comprises a mathematical model of rigid-body component motion and a mathematical model of rigid-body component rotation.
10. The mixed reality-based engineering equipment motion simulation method according to claim 9, characterized by comprising the following steps: the modeling process of the rigid body member motion mathematical model comprises the following steps:
step A: differentiating formula (4) with respect to time gives the velocity constraint equation of the multi-rigid-body system:

$$\Gamma_q\dot q+\Gamma_t=0 \qquad (5)$$

the whole formula (5) is obtained by differentiating formula (4), wherein $\Gamma_q$ is the Jacobian matrix of the system, i.e. $\Gamma_q=\partial\Gamma/\partial q$; $\Gamma_t$ is the partial derivative of the constraint equation with respect to time, i.e. $\Gamma_t=\partial\Gamma/\partial t$; and $\dot q$ is the generalized velocity of the system;
step B: differentiating formula (4) a second time with respect to time gives the acceleration constraint equation of the multi-rigid-body system:

$$\Gamma_q\ddot q+(\Gamma_q\dot q)_q\dot q+2\Gamma_{qt}\dot q+\Gamma_{tt}=0 \qquad (6)$$

wherein $\Gamma_q$ is the Jacobian matrix; $\Gamma_{qt}$ is the derivative of the Jacobian matrix with respect to time, i.e. $\Gamma_{qt}=\partial\Gamma_q/\partial t$; $\Gamma_{tt}$ is the second partial derivative of the constraint equation with respect to time, i.e. $\Gamma_{tt}=\partial^2\Gamma/\partial t^2$; and $\ddot q$ is the generalized acceleration of the system;
step C: from the multi-rigid-body system position constraint equation (4), the velocity constraint equation (5) and the acceleration constraint equation (6), the equation of motion of the multi-rigid-body system is obtained in the form of Lagrange equations of the first kind with multipliers:

$$M(q,t)\,\eta+\Gamma_q^{\mathrm T}\lambda=f(q,v,t),\qquad \Gamma(q,t)=0 \qquad (7)$$

wherein $q,v,\eta\in R^n$ are respectively the generalized coordinate, velocity and acceleration vectors of the system, with $v=\dot q$ and $\eta=\dot v$; $\lambda\in R^m$ is the column vector of Lagrange multipliers, representing the internal forces and internal moments between rigid body members connected by kinematic pairs; $t\in R$ is time; $M(q,t)\in R^{n\times n}$ is the mass matrix of the system; $f(q,v,t)\in R^n$ is the generalized external force column vector, comprising external forces and external moments; $R^n$ is the n-dimensional real vector space, where n is consistent with the number of units of the multi-rigid-body system; and $R^m$ is the m-dimensional real vector space, where m is consistent with the number of constraint pairs.
11. The mixed reality-based engineering equipment motion simulation method according to claim 9 or 10, characterized by comprising the following steps of: modeling of the rigid body member rotation mathematical model requires, for each mechanism of the engineering equipment in three-dimensional space, a satellite coordinate system fixedly connected to the rigid body member in order to determine the motion of the multi-rigid-body system, and specifically comprises: describing the fixed-point rotation of the rigid body with three kinds of attitude variables, namely the direction cosine matrix, the Euler angles and the Euler quaternion;
The Euler angles are: $\gamma_l=(\psi,\theta,\phi)_l$ (8)
The direction cosine matrix is:

$$A_l=\begin{bmatrix}\cos\psi\cos\phi-\sin\psi\cos\theta\sin\phi & -\cos\psi\sin\phi-\sin\psi\cos\theta\cos\phi & \sin\psi\sin\theta\\ \sin\psi\cos\phi+\cos\psi\cos\theta\sin\phi & -\sin\psi\sin\phi+\cos\psi\cos\theta\cos\phi & -\cos\psi\sin\theta\\ \sin\theta\sin\phi & \sin\theta\cos\phi & \cos\theta\end{bmatrix} \qquad (9)$$
the Euler quaternion is:

$$\Lambda_l=(a_1,\,a_2,\,a_3,\,a_4)_l^{\mathrm T} \qquad (10)$$
the direction cosine matrix expressed with the Euler quaternion is:

$$A_l=\begin{bmatrix}1-2(a_2^2+a_3^2) & 2(a_1a_2-a_3a_4) & 2(a_1a_3+a_2a_4)\\ 2(a_1a_2+a_3a_4) & 1-2(a_1^2+a_3^2) & 2(a_2a_3-a_1a_4)\\ 2(a_1a_3-a_2a_4) & 2(a_2a_3+a_1a_4) & 1-2(a_1^2+a_2^2)\end{bmatrix} \qquad (11)$$
the Euler parameter variables in the direction cosine matrix and the Euler quaternion must satisfy the constraint equation:

$$\Gamma^{E}=a_1^2+a_2^2+a_3^2+a_4^2-1=0 \qquad (12)$$

wherein $a_1,a_2,a_3$ form the vector part of the Euler quaternion, $a_4$ is the scalar part of the Euler quaternion, and $\Gamma^{E}$ is the constraint vector constituted by the Euler parameters of the rigid body member.
12. The mixed reality-based engineering equipment motion simulation method according to claim 5, wherein the method comprises the following steps of: the motion constraint system calculates Jacobian matrixes of two connected rigid body member objects according to constraint types, calculates moment of inertia in cooperation with the shape of the rigid body member, and updates the positions and the speeds of the interconnected rigid body member objects so as to simulate the stress effect of the connected objects; types of constraints between the rigid body members include hinge constraints and sliding rod constraints.
13. The mixed reality-based engineering equipment motion simulation method of claim 12, wherein the method comprises the following steps of: the implementation steps of the hinge constraint are as follows:
step I: obtaining the centroids of the first rigid body member and the second rigid body member; the centroid formula of a rigid body member is:

$$r_c=x_c\,i+y_c\,j+z_c\,k \qquad (13)$$

wherein $r_c$ represents the centroid position vector of the rigid body member; $x_c$, $y_c$, $z_c$ respectively represent the x, y and z coordinates of the rigid body centroid; and $i$, $j$, $k$ represent the vector basis.
Step II: calculating the position of the anchor point relative to the first rigid body component, and acquiring the hinge axis of the first rigid body component;
step III: calculating the position of the anchor point relative to the second rigid body component, and acquiring the hinge axis of the second rigid body component;
step IV: hinge constraint is performed on the two rigid body members.
14. The mixed reality-based engineering equipment motion simulation method of claim 13, wherein the method comprises the following steps of: the implementation steps of the sliding rod constraint are as follows:
step a: acquiring centroids of the first and second rigid body members using equation (13);
step b: calculating the position of the second rigid body component relative to the first rigid body component, and acquiring a sliding shaft of the first rigid body component;
step c: calculating the position of the first rigid body component relative to the second rigid body component, and acquiring a sliding shaft of the second rigid body component;
step d: slide bar restraining is performed on the two rigid body members.
15. The mixed reality-based engineering equipment motion simulation method according to claim 7, wherein the method comprises the following steps: the motion constraint system established with the physical engine can successfully use the functions in the physical engine only after the spatial attitude or constraint axis of the rigid body member is adjusted, and the adjusting method is as follows: using the Rodrigues rotation formula, the rotation matrix for a rotation of β about the unit vector μ(x, y, z) is:

$$R(\mu,\beta)=\cos\beta\,I+(1-\cos\beta)\,\mu\mu^{\mathrm T}+\sin\beta\,[\mu]_{\times} \qquad (14)$$

where $[\mu]_{\times}$ denotes the skew-symmetric matrix of μ; after the spatial attitude of the rigid body member is acquired, the rigid body member is adjusted by formula (14).
CN202211100913.1A 2022-09-09 2022-09-09 Engineering equipment real-time interaction system and motion simulation method based on mixed reality Pending CN116310231A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211100913.1A CN116310231A (en) 2022-09-09 2022-09-09 Engineering equipment real-time interaction system and motion simulation method based on mixed reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211100913.1A CN116310231A (en) 2022-09-09 2022-09-09 Engineering equipment real-time interaction system and motion simulation method based on mixed reality

Publications (1)

Publication Number Publication Date
CN116310231A true CN116310231A (en) 2023-06-23

Family

ID=86820935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211100913.1A Pending CN116310231A (en) 2022-09-09 2022-09-09 Engineering equipment real-time interaction system and motion simulation method based on mixed reality

Country Status (1)

Country Link
CN (1) CN116310231A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117173240A (en) * 2023-11-03 2023-12-05 天津信天电子科技有限公司 AR auxiliary assembly method, device, equipment and medium for servo control driver
CN117173240B (en) * 2023-11-03 2024-02-06 天津信天电子科技有限公司 AR auxiliary assembly method, device, equipment and medium for servo control driver
CN117688706A (en) * 2024-01-31 2024-03-12 湘潭大学 Wiring design method and system based on visual guidance
CN117688706B (en) * 2024-01-31 2024-05-10 湘潭大学 Wiring design method and system based on visual guidance

Similar Documents

Publication Publication Date Title
CN116310231A (en) Engineering equipment real-time interaction system and motion simulation method based on mixed reality
Schaal The SL simulation and real-time control software package
USRE37374E1 (en) Gyro-stabilized platforms for force-feedback applications
Harris et al. Survey of popular robotics simulators, frameworks, and toolkits
USRE39906E1 (en) Gyro-stabilized platforms for force-feedback applications
CN107256284A (en) A kind of many gait dynamic modeling methods of real-time interactive quadruped robot and system
Rossmann et al. Virtual robotic testbeds: A foundation for e-robotics in space, in industry-and in the woods
CN111251305B (en) Robot force control method, device, system, robot and storage medium
CN107703775B (en) Rigid-flexible-liquid coupling complex spacecraft simulation system and method
RU2308764C2 (en) Method for moving a virtual jointed object in virtual space with prevention of collisions of jointed object with elements of environment
CN112497208A (en) Mobile operation robot general control method based on full-state impedance controller
Sharifi et al. Modelling and simulation of a non-holonomic omnidirectional mobile robot for offline programming and system performance analysis
Dobrokvashina et al. How to Create a New Model of a Mobile Robot in ROS/Gazebo Environment: An Extended Tutorial
Rossmann erobotics: The symbiosis of advanced robotics and virtual reality technologies
JP3247832B2 (en) Kinematics arithmetic unit
Yıldırım et al. ODE (Open Dynamics Engine) based stability control algorithm for six legged robot
CN113119102A (en) Humanoid robot modeling method and device based on floating base flywheel inverted pendulum
Jaramillo-Botero et al. Robomosp
Zheng et al. Research on virtual driving system of a forestry logging harvester
KR20200097896A (en) Apparatus and method for generating manipulator URDF file
Mikhalevich et al. Developing of KUKA youBot software for education process
US20220402126A1 (en) Systems, computer program products, and methods for building simulated worlds
Gaut et al. A Jupyter notebook environment for multibody dynamics
Harris Design and implementation of an autonomous robotics simulator
Roßmann From space to the forest and to construction sites: virtual testbeds pave the way for new technologies

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination