CN115390677B - Assembly simulation man-machine work efficiency evaluation system and method based on virtual reality - Google Patents


Info

Publication number
CN115390677B
Authority
CN
China
Prior art keywords
assembly
calculating
human
operator
real
Prior art date
Legal status
Active
Application number
CN202211324748.8A
Other languages
Chinese (zh)
Other versions
CN115390677A
Inventor
曲涛
景宁
王世龙
王绪海
Current Assignee
Jiangsu CRRC Digital Technology Co Ltd
Original Assignee
Jiangsu CRRC Digital Technology Co Ltd
Application filed by Jiangsu CRRC Digital Technology Co Ltd filed Critical Jiangsu CRRC Digital Technology Co Ltd
Priority to CN202211324748.8A
Publication of CN115390677A
Application granted
Publication of CN115390677B
Legal status: Active

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
                • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00: Arrangements for image or video recognition or understanding
                • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
                • G06V 10/82: Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
                • G06V 20/00: Scenes; scene-specific elements
                • G06V 20/40: Scenes; scene-specific elements in video content
                • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
                • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
                • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
                • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a virtual-reality-based assembly simulation human-machine work efficiency evaluation system, which comprises a virtual assembly simulation subsystem and a human-machine work efficiency monitoring and evaluation subsystem. The virtual assembly simulation subsystem comprises an assembly unit construction module, a process guidance module and an interaction processing module; the human-machine work efficiency monitoring and evaluation subsystem comprises a human body posture capture module and a calculation and analysis module. Combining standard structured process information, the invention acquires, analyzes and evaluates the working postures and actions of assembly operators in real time and, according to the feedback results, guides process designers to optimize the station layout and the assembly process, so that operators form standard operating habits and perform standardized operations in a more reasonable and healthy environment, improving manufacturing productivity in a people-oriented way.

Description

Assembly simulation man-machine work efficiency evaluation system and method based on virtual reality
Technical Field
The invention belongs to the technical field of assembly simulation, and particularly relates to an assembly simulation man-machine work efficiency evaluation system and method based on virtual reality.
Background
With social progress and technological change, virtual manufacturing has emerged as a new manufacturing concept and theory, and this brand-new manufacturing system and mode has become an inevitable trend in the development of modern manufacturing technology. Virtual assembly simulation is an important research branch within the field of virtual manufacturing. In virtual assembly simulation, the three-dimensional model of a designed product is pre-assembled in a virtual environment using virtual reality technology, so that the product design and assembly structure can be improved rapidly, shortening the product design optimization cycle and reducing cost.
In an actual assembly scenario, the human factor cannot be neglected. Unreasonable assembly process design, unreasonable layout of assembly units (stations), non-standard operator actions and similar factors all affect workers' physical health and production efficiency. In addition, traditional ergonomic evaluation is carried out only after workers have already suffered discomfort or occupational injury, whereas an ideal ergonomic evaluation method analyzes and predicts possible injury before it occurs. There is therefore a need for a system and method that replaces the traditional physical prototype with a virtual product, performs human-machine work efficiency analysis in a virtual environment, monitors and evaluates the operator's work efficiency from the perspective of personnel health, and analyzes working comfort and adaptability to the workload, so as to eliminate ineffective work in time, reduce fatigue, make rational use of manpower and equipment, and improve work efficiency.
At present, with the growing demand for personalized product customization, assembly process plans and station layouts change and are adjusted frequently. Constrained by site, cost, time and other factors, the actual trial assembly process needs to be replaced by virtual assembly simulation, so that assembly process verification and human-machine work efficiency analysis and evaluation can be carried out rapidly and process design and human-factor problems in the assembly process can be found in advance. A real-time assembly simulation human-machine work efficiency evaluation system built on virtual reality and computer vision technology can quickly construct a virtual assembly environment consistent with the actual scene, carry out assembly simulation and work efficiency evaluation, and reduce verification cost and cycle time.
Traditional virtual assembly simulation and ergonomics evaluation methods are usually based on a virtual human and do not fully reflect the real participation and interactivity of a person during assembly. The inventions with patent numbers CN113633281A and CN113331825A respectively provide a posture acquisition and evaluation method for the assembly process and a virtual-reality-based real-time RULA evaluation method; both focus on operator health risk factors and provide posture scoring rules. However, these evaluation methods neglect the effective use of process information (for example, the influence of part and tool size and weight on the biomechanical load of the human body during assembly is not considered) and therefore lack accuracy. They also rely on wearable devices for human motion capture, which require a dedicated test site and equipment, impose high requirements on test evaluation, are costly, and are difficult to deploy and apply flexibly. In addition, they do not consider operating stress. The invention with patent number CN110163425A combines operating force and human posture and analyzes personnel safety during operation more scientifically by means of torque, but it performs analysis, calculation and evaluation by constructing a virtual human body model, provides no scheme for real-time acquisition of real human posture data, and its calculation process is overly complex and resource-intensive, making it hard to popularize.
Disclosure of Invention
The technical problem to be solved is as follows: to address the lack of accuracy, the demanding evaluation environment, the high cost and the absence of a concrete real-time human posture data acquisition scheme in the prior art, the invention provides a virtual-reality-based assembly simulation human-machine work efficiency evaluation system and method built on virtual reality, computer vision, deep learning and related technologies. It verifies operator comfort, operation reachability and the rationality of the assembly unit's spatial layout during assembly. Human action feature points are identified in a virtual environment mapped 1:1 to the actual assembly operation unit; combined with standard structured process information, the working postures and actions of assembly operators are acquired, analyzed and evaluated in real time, and process designers are guided to optimize the station layout and assembly process according to the feedback results, so that operators form standard operating habits and perform standardized operations in a more reasonable and healthy environment, improving manufacturing productivity in a people-oriented way.
The technical scheme is as follows:
an assembly simulation human-machine work efficiency evaluation system based on virtual reality comprises a virtual assembly simulation subsystem and a human-machine work efficiency monitoring and evaluation subsystem;
the virtual assembly simulation subsystem comprises an assembly unit construction module, a process guide module and an interaction processing module;
the assembly unit construction module reads the hierarchical structure and spatial layout information of the assembly units, loads the three-dimensional models of the assembly operation environment and the station facilities, and generates a basic operation scene consisting of non-assembly objects; it reads the structured process information, loads the three-dimensional models of the parts, assemblies, tools and clamps for all assembly procedures of the assembly unit, and generates a virtual assembly operation scene consistent with the real assembly environment; the process guidance module sequentially reads from the assembly process route the assembly operation guidance instructions, the information of the parts to be assembled and the related tool clamp information for the current assembly procedure, guides the operator to select the corresponding parts and tool clamps in the virtual assembly operation scene, and places the parts to be assembled in their final assembly positions according to the assembly constraints and assembly path required by the process; the interaction processing module identifies collision interference during assembly simulation, captures the operator's pick-up, movement, placement and fixing interactions with parts and tool clamps, combines manufacturing BOM data to generate in real time assembly simulation operation information including the weight, size, initial position and current position of the picked-up parts and tools, and sends this information to the human-machine work efficiency monitoring and evaluation subsystem for analysis and calculation;
the human-machine work efficiency monitoring and evaluating subsystem comprises a human body posture capturing module and a calculating and analyzing module;
the human body posture capture module uses a monocular camera to capture video images of the operator's assembly working posture in real time; the calculation and analysis module processes the captured video images to obtain 3D coordinate information of the operator's key body points; based on this 3D key point information it calculates the angles of key body parts according to RULA and LUBA and derives first human-machine work efficiency evaluation data based on human posture; based on the 3D key point information and the assembly simulation operation information sent by the interaction processing module it calculates the external moments borne by key human joints and limbs, calculates a load index according to the NIOSH evaluation method, and derives second human-machine work efficiency evaluation data based on the external forces applied to the human body; the first and second human-machine work efficiency evaluation data are then integrated to obtain real-time comprehensive human-machine work efficiency data, which are recorded to a human-machine work efficiency scoring log.
Further, the human-machine work efficiency monitoring and evaluating subsystem comprises an early warning module;
after parameter configuration and initial calibration, the early warning module displays real-time evaluation results and issues risk alarms to the operator during the virtual assembly process; the interaction processing module receives the risk alarms sent by the early warning module in real time and prompts the operator, in real time, to be aware of assembly operations that may cause fatigue and musculoskeletal injury.
Further, the calculation and analysis module comprises a 3D pose estimation fully convolutional neural network obtained by transfer learning from a ResNet-50 network, and an ergonomics analysis module constructed on the basis of RULA and LUBA;
whenever the monocular camera is repositioned, the human body posture capture module captures, at timed intervals, posture images of the operator standing relaxed in a normal, comfortable posture at the center of the work area, sends the acquisition results to the 3D pose estimation fully convolutional neural network for a basic calibration operation, records several groups of valid 3D key point coordinates and computes their average as the baseline comparison information for human posture estimation in the subsequent virtual reality assembly simulation;
after the virtual assembly operation starts, the human body posture capture module captures video images of the operator's assembly working posture in real time; these images are fed into the 3D pose estimation fully convolutional neural network, which computes the operator's real-time relative 3D key point coordinates;
the 3D pose estimation fully convolutional neural network then feeds the operator's real-time relative 3D key point coordinates together with the baseline comparison information into the ergonomics analysis module, which performs posture calculation, evaluation and early warning.
Further, the ergonomics analysis module calculates the relative angles of the trunk, neck, left arm and right arm from the operator's real-time relative 3D key point coordinates and the baseline comparison information;
for the trunk sagittal relative angle, a trunk facing direction vector is calculated from the coordinates of the left shoulder, right shoulder and pelvis; the facing vector monitored during virtual assembly is compared with the calibrated facing vector, and the change of the facing direction in the vertical plane gives the operator's real-time trunk sagittal relative angle; for the trunk coronal relative angle, the left-shoulder-to-right-shoulder vector is taken as the trunk lateral direction vector; the lateral vector monitored during virtual assembly is compared with the calibrated lateral vector, and the change of the lateral direction in the vertical plane gives the operator's real-time trunk coronal relative angle;
for the neck sagittal relative angle, the center point P1 of the left eye, right eye and nose and the center point P2 of the left and right ears are first calculated, and the vector from P2 to P1 is taken as the head facing direction vector; the facing vector monitored during virtual assembly is compared with the calibrated facing vector, and the change of the head facing direction in the vertical plane gives the operator's real-time neck sagittal relative angle; for the neck coronal relative angle, the left-ear-to-right-ear vector V1 and the left-eye-to-right-eye vector V2 are calculated and their average is taken as the head lateral direction vector; the lateral vector monitored during virtual assembly is compared with the calibrated lateral vector, and the change of the head lateral direction in the vertical plane gives the operator's real-time neck coronal relative angle;
for the left arm sagittal relative angle, the left-shoulder-to-left-elbow direction vector is calculated; the midpoint P3 between the left and right shoulders and the midpoint P4 between the left and right hips are calculated, the vector from P3 to P4 gives the body vertical direction vector V3, the trunk facing direction vector V4 is calculated, and V3 and V4 define the trunk's relative sagittal plane; the left arm direction vector is projected onto this plane, and the angle between the projected vector and V3 gives the operator's real-time left arm sagittal relative angle; for the left arm coronal relative angle, the projection plane is replaced by the relative trunk coronal plane determined by the left shoulder, right shoulder and pelvis, and the operator's real-time left arm coronal relative angle is calculated in the same way;
for the right arm, the right-shoulder-to-right-elbow direction vector is used, and the right arm sagittal and coronal relative angles of the operator are calculated in the same manner.
Furthermore, the ergonomics analysis module defines several value ranges for the relative angles of the trunk, neck, left arm and right arm, each range corresponding to a score; the key point scores are determined from the relative angle values computed from the operator's real-time relative coordinate information, and all key point scores are then integrated and analyzed to obtain the first human-machine work efficiency evaluation data based on human posture.
Further, the ergonomics analysis module calculates the external moments borne by key human joints and limbs from the real-time relative 3D key point coordinates and the weight and size data of the current workpiece in the received real-time assembly simulation operation information;
for a workpiece held with one hand, the moments of the workpiece about the corresponding elbow, shoulder and waist are calculated directly; for a workpiece held with both hands, the workpiece weight is divided equally in two, the moments about the corresponding elbows and shoulders are calculated from each half weight, and the two half-weight waist moments are added together to give the moment of the complete workpiece about the waist.
Further, the process by which the ergonomics analysis module calculates the load index according to the NIOSH evaluation method and derives the second human-machine work efficiency evaluation data based on external force applied to the human body comprises the following steps:
from the real-time relative 3D key point coordinates, calculating the horizontal distance H from the operator's hands to the midpoint of the line between the two feet, the vertical distance V from the operator's hands to the midpoint of the line between the two feet, and the body rotation angle A given by the angle between the operator's left-hip-to-right-hip vector and left-shoulder-to-right-shoulder vector;
obtaining from the assembly simulation operation information the size of the current workpiece and the vertical distance D between the workpiece's starting position and its current position, and deriving the corresponding pick-up difficulty grade C;
recording and calculating the operation frequency F of the task in the virtual assembly process;
calculating the horizontal factor HM from the horizontal distance H, the vertical factor VM from the vertical distance V, the distance factor DM from the vertical travel distance D, the asymmetry factor AM from the body rotation angle A, the frequency factor FM from the operation frequency F, and the coupling factor CM from the pick-up difficulty grade C;
calculating the recommended weight limit RWL = LC × HM × VM × DM × FM × AM × CM, where LC is the load constant; and calculating the lifting index LI = m/RWL, where m is the mass of the currently operated workpiece.
Further, the smaller of the first and second human-machine work efficiency evaluation data is taken as the real-time comprehensive human-machine work efficiency data and recorded to the human-machine work efficiency scoring log.
The invention discloses an assembly simulation human-machine work efficiency evaluation method based on virtual reality, which is executed based on the assembly simulation human-machine work efficiency evaluation system;
the assembly simulation man-machine ergonomics evaluation method comprises the following steps:
s1, loading layout information and structural process information of an assembly unit into a virtual assembly simulation subsystem;
s2, reading the hierarchical structure and spatial layout information of the assembly units by using an assembly unit construction module, loading a three-dimensional model of an assembly operation environment and a station facility, and generating a basic operation scene consisting of non-assembly objects; reading the structured process information, loading three-dimensional models of parts, assemblies, tools and clamps required in all assembly procedures of the assembly unit, and interactively adjusting the placing position and the spatial layout of each object by an operator to generate a virtual assembly operation scene consistent with a real assembly environment;
s3, sequentially reading the assembly operation guide description, the information of the parts to be assembled and the information of the tool clamps to be used of the current assembly process from the assembly process route by using a process guide module, guiding an operator to select the corresponding parts to be assembled and the tool clamps from the virtual assembly operation scene generated in the step S2, and placing the parts to be assembled on a final assembly position according to assembly constraints and an assembly path required by the assembly process;
s4, recognizing collision interference in the assembly simulation operation process by using an interactive processing module, capturing the pick-up, movement, placement and fixation interactive actions of an operator on the parts and the tool clamp in the step S3, generating assembly simulation operation information including weight information, size information, the initial position and the current position of the picked-up parts and the tool in real time by combining the manufacture BOM data, and sending the assembly simulation operation information to the human efficiency monitoring and evaluating subsystem for analysis and calculation;
s5, capturing a virtual assembly real-time operation image of an operator by using a human body posture capturing module in the human-computer work efficiency monitoring and evaluating subsystem, and analyzing and processing the image by using a calculating and analyzing module to calculate the 3D coordinate information of the human body key points; calculating human body key part angles based on the RULA and the LUBA based on the obtained 3D coordinate information of the human body key points, and obtaining first human engineering efficiency evaluation data based on the human body posture in real time;
s6, calculating external moment borne by the key points and the limbs of the human body based on the 3D coordinate information of the key points of the human body obtained in the step S5 and the current workpiece weight and size data in the received real-time assembly simulation operation information, and calculating a load index according to an NIOSH (non-invasive surgery) evaluation method to obtain second human-machine work efficiency evaluation data based on the external force borne by the human body;
s7, calculating and analyzing real-time comprehensive human-machine work efficiency data through a calculating and analyzing module based on first human-machine work efficiency evaluation data of human body postures and second human-machine work efficiency evaluation data of external forces borne by human bodies, and recording the real-time comprehensive human-machine work efficiency data to a human-machine work efficiency scoring log; in the virtual assembly simulation process, if the real-time comprehensive human-computer work efficiency data is lower than a set standard, the early warning module carries out early warning reminding on an operator;
and S8, carrying out comprehensive statistical analysis on the human-machine efficiency scoring log data to obtain a virtual production line comprehensive human-machine efficiency evaluation result comprising a comprehensive human-machine efficiency statistical chart of the assembly unit, a final scoring table, risk factor analysis and station adjustment suggestions.
Beneficial effects:
firstly, the human-machine work efficiency evaluation system and method for the assembly simulation based on the virtual reality capture, collect and analyze the posture and the action of an operator based on a low-cost monocular camera in the process of the virtual assembly simulation by combining the virtual reality technology and the computer vision technology, estimate the human-machine work efficiency load by utilizing the structured process data, realize qualitative and quantitative human-machine work efficiency real-time comprehensive analysis, and help a process designer to find out the design defect of the assembly process through an evaluation result. The method greatly reduces the implementation cost and the operation difficulty of the man-machine work efficiency evaluation work of the assembly operation, does not need to perform trial assembly verification on a real assembly unit, improves the accuracy, comprehensiveness and objectivity of the man-machine work efficiency evaluation, and can be flexibly applied to various production assembly scenes.
Secondly, the system and method not only evaluate and analyze human posture with the RULA and LUBA evaluation methods but also, more importantly, consider the influence of the size and weight of the currently operated workpieces and tools on the human load, using the NIOSH method and the moment analysis of workpieces about key human points to evaluate the external forces borne by the body. This solves the inaccurate, incomplete and unreliable results of traditional methods that evaluate human posture alone. Compared with existing human posture data acquisition methods for work efficiency evaluation, the environmental requirements and cost of posture data acquisition are also greatly reduced.
Drawings
FIG. 1 is a schematic structural diagram of an assembly simulation human-machine ergonomics evaluation system based on virtual reality according to an embodiment of the present invention;
FIG. 2 is a flow chart of an assembly simulation human-machine ergonomics assessment method based on virtual reality according to an embodiment of the present invention;
fig. 3 is an architecture diagram of an assembly simulation human-machine ergonomics evaluation method based on virtual reality according to an embodiment of the present invention.
Detailed Description
The following examples are presented to enable one of ordinary skill in the art to more fully understand the present invention and are not intended to limit the invention in any way.
Fig. 1 is a schematic structural diagram of an assembly simulation ergonomic assessment system based on virtual reality according to an embodiment of the present invention. Referring to fig. 1, the assembly simulation ergonomic evaluation system includes a virtual assembly simulation subsystem and an ergonomic monitoring and evaluation subsystem. An operator executes virtual assembly operation under the guidance of the virtual assembly simulation subsystem, and man-machine work efficiency is evaluated through the man-machine work efficiency monitoring and evaluating subsystem.
The virtual assembly simulation subsystem simulates the assembly process in virtual reality: the operator performs assembly simulation in a virtual scene consistent with the actual assembly scene through immersive interaction, so as to identify problems such as assembly design defects, unreasonable station layout and unreasonable assembly processes. The subsystem is a virtual simulation environment developed on Unreal Engine and mainly comprises an assembly unit construction module, a process guidance module and an interaction processing module.
The assembly unit construction module reads the hierarchical structure and spatial layout information of the assembly units, loads the three-dimensional models of the assembly operation environment and the station facilities, and generates a basic operation scene consisting of non-assembly objects; and reading the structured process information, loading the three-dimensional models of parts, assemblies, tools and clamps required in all assembly steps of the assembly unit (station), and interactively adjusting the placing position and the spatial layout of each object by an operator to generate a virtual assembly operation scene consistent with a real assembly environment.
And the process guide module sequentially reads the assembly operation guide instruction, the information of the parts to be assembled and the information of the tool clamps to be used in the current working procedure step from the assembly process route, guides an operator to select the corresponding parts to be assembled and the corresponding tool clamps in the virtual assembly operation scene through VR equipment, and places the parts to be assembled on the final assembly position according to the assembly constraint and the assembly path required by the assembly process. The assembly simulation guide information provided by the process guide module comprises specific operation instructions such as selection of an assembly tool, picking, moving, placing, fixing action, assembly action path and the like of parts.
The interaction processing module identifies collision interference during assembly simulation, captures the operator's interactions with parts and tool clamps such as picking, moving, placing and fixing, combines manufacturing BOM data to generate in real time assembly simulation operation information including the weight, size, initial position and current position of the picked-up parts and tools, and sends this information to the human-machine work efficiency monitoring and evaluation subsystem for analysis and calculation. The interaction processing module also receives real-time alarm feedback from the monitoring and evaluation subsystem and reminds the operator, in real time, of assembly operations that may cause fatigue and musculoskeletal injury.
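For illustration only, the assembly simulation operation information could be packaged as a simple structured record like the sketch below; the field names and the JSON encoding are assumptions for this example, not the patent's actual data schema.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AssemblyOperationInfo:
    """One real-time record sent to the monitoring and evaluation subsystem.

    Field names are illustrative: weight and size would come from the
    manufacturing BOM, positions from the virtual scene (scene coordinates)."""
    part_id: str
    action: str                       # "pick" | "move" | "place" | "fix"
    weight_kg: float                  # from manufacturing BOM
    size_m: tuple                     # (length, width, height)
    initial_pos: tuple                # where the part was picked up
    current_pos: tuple                # position at this instant
    grip: str                         # "one_hand" | "two_hands"

    def to_json(self) -> str:
        return json.dumps(asdict(self))

# Example record: a 4.2 kg bracket being moved with both hands
msg = AssemblyOperationInfo("bracket-07", "move", 4.2, (0.30, 0.12, 0.05),
                            (1.0, 0.8, 0.9), (1.2, 0.6, 1.1), "two_hands")
print(msg.to_json())
```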
The human-machine work efficiency monitoring and evaluation subsystem analyzes and evaluates work efficiency from the assembly simulation operation information and computer vision information. It is built on an NVIDIA Jetson Nano edge computing platform and mainly comprises a human body posture capture module, a calculation and analysis module and an early warning module.
The human body posture capturing module captures and identifies the assembly operation action posture of an operator by using a monocular camera, and collects a virtual assembly operation image of the operator.
The calculation and analysis module processes the real-time images acquired by the human body posture capture module to obtain the posture information of the human body in the form of relative 3D coordinates. Using these computed relative 3D coordinates together with the assembly simulation operation information received in real time, and combining physiological mechanical limits with the rapid upper-limb assessment methods (RULA, LUBA), it produces real-time evaluation and statistical scoring of human-machine work efficiency. Compared with wearable motion capture equipment, a camera plus a machine vision algorithm costs less, deploys more flexibly and conveniently, and can be tested in a wide range of environments.
After parameter configuration and initial calibration are carried out by the early warning module, real-time evaluation result display and risk alarm reminding can be carried out on an operator in the virtual assembly process. The early warning module comprises an operation interface interaction submodule and an audio interaction submodule and is used for displaying the calculation information of the human-computer work efficiency to an operator in real time and assisting the operator to finish the standard and complete virtual operation testing step.
In the virtual assembly process of the system used in this embodiment, human posture recognition and work efficiency evaluation are completed using only the edge computing platform and a monocular camera, overcoming the traditional method's dependence on wearable posture capture equipment with its high cost, complex deployment and demanding environment requirements.
Referring to fig. 2 and fig. 3, the present embodiment further provides an assembly simulation human-machine ergonomics evaluation method based on virtual reality, which is implemented based on the aforementioned assembly simulation human-machine ergonomics evaluation system based on virtual reality, and the method includes the following steps:
s1, loading layout information and structural process information of the assembly unit into a virtual assembly simulation subsystem.
S2, reading the hierarchical structure and spatial layout information of the assembly units by using an assembly unit construction module, loading a three-dimensional model of an assembly operation environment and a station facility, and generating a basic operation scene consisting of non-assembly objects; and reading the structured process information, loading the three-dimensional models of parts, assemblies, tools and clamps required in all assembly steps of the assembly unit (station), and interactively adjusting the placing position and the spatial layout of each object by an operator to generate a virtual assembly operation scene consistent with a real assembly environment.
And S3, sequentially reading the assembly operation instruction, the information of the parts to be assembled and the information of the tool clamps to be used in the current working procedure step from the assembly process route by using the process guide module, guiding an operator to select the corresponding parts to be assembled and the corresponding tool clamps in the virtual assembly operation scene generated in the step S2 through VR equipment, and placing the parts to be assembled at the final assembly position according to the assembly constraint and the assembly path required by the assembly process. The assembly simulation guide information provided by the process guide module comprises specific operation instructions such as selection of an assembly tool, picking, moving, placing, fixing action, assembly action path and the like of parts.
S4, using the interaction processing module to identify collision interference during assembly simulation, capturing the operator's interactions with the parts and tool clamps in step S3 such as picking, moving, placing and fixing, combining the manufacturing BOM data to generate in real time assembly simulation operation information including the weight, size, initial position and current position of the picked-up parts and tools, and sending this information to the human-machine work efficiency monitoring and evaluation subsystem for analysis and calculation. The interaction processing module also receives real-time alarm feedback from the monitoring and evaluation subsystem and reminds the operator, in real time, of assembly operations that may cause fatigue and musculoskeletal injury.
S5, using the human body posture capture module in the monitoring and evaluation subsystem to capture the operator's real-time virtual assembly operation images acquired by the monocular camera, and feeding the real-time images into the 3D pose estimation fully convolutional neural network transfer-learned from ResNet-50 to obtain relative 3D coordinate data of the key human points. Based on the obtained 3D key point coordinates, the key body part angles are calculated according to RULA and LUBA, and first human-machine work efficiency evaluation data based on human posture are obtained in real time according to the specified evaluation rules. Capturing the operator's real-time working images with a camera and a machine vision algorithm costs less and deploys more flexibly than wearable motion capture equipment, and testing can be carried out in a wide range of environments.
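The patent names only the backbone and the transfer-learning approach; one plausible shape for such a network, sketched here purely as an assumption, is a ResNet-50 feature extractor followed by deconvolution layers that output per-joint volumetric heatmaps (the joint count, depth binning and layer sizes below are illustrative):

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

class Pose3DNet(nn.Module):
    """Illustrative 3D pose estimator: a pretrained ResNet-50 backbone
    (transfer learning) plus a deconvolution head that predicts one
    volumetric heatmap per joint; argmax over the volume gives (x, y, z)."""

    def __init__(self, n_joints=17, depth_bins=32):
        super().__init__()
        backbone = resnet50(weights="IMAGENET1K_V1")  # pretrained weights
        # keep conv1..layer4, drop the average pool and the classifier
        self.features = nn.Sequential(*list(backbone.children())[:-2])
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(2048, 256, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(256, 256, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(256, n_joints * depth_bins, kernel_size=1)
        self.n_joints, self.depth_bins = n_joints, depth_bins

    def forward(self, img):                       # img: (B, 3, 256, 256)
        h = self.head(self.deconv(self.features(img)))
        b, _, hh, ww = h.shape
        return h.view(b, self.n_joints, self.depth_bins, hh, ww)

net = Pose3DNet().eval()
with torch.no_grad():
    volume = net(torch.randn(1, 3, 256, 256))     # (1, 17, 32, 32, 32)
```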
Specifically, in the actual operation process, when the monocular camera is changed in position each time, the simulation operator stands in a normal comfortable posture at the center of the field, basic calibration operation is carried out, 20 effective three-dimensional coordinates are recorded, and the average value of the effective three-dimensional coordinates is obtained to serve as basic comparison information for human posture evaluation in the subsequent virtual reality assembly simulation. After the virtual assembly operation is started, the monocular camera captures an operation picture of an operator in real time, and relative coordinate information of the 3D key point of the operator is obtained through the method. And inputting the information and the basic information into a human-machine work efficiency calculation and analysis module based on the RULA and the LUBA to calculate, evaluate and early warn the human body posture.
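The calibration step described above amounts to averaging a number of valid key-point frames; a minimal sketch, assuming the pose network yields one (K, 3) key-point array per frame (or None on a failed detection):

```python
import numpy as np

def calibrate(keypoint_frames, n_required=20):
    """Average n_required valid key-point frames captured while the operator
    stands relaxed at the center of the work area; the result becomes the
    baseline for all subsequent relative-angle calculations."""
    valid = []
    for kp in keypoint_frames:
        if kp is not None and not np.isnan(kp).any():
            valid.append(np.asarray(kp, dtype=float))
        if len(valid) == n_required:
            break
    if len(valid) < n_required:
        raise RuntimeError("calibration failed: too few valid frames")
    return np.mean(valid, axis=0)        # (K, 3) baseline key points
```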
In the RULA- and LUBA-based work efficiency calculation and analysis module, the relative angles of the key body parts are calculated first, covering the trunk, neck, left arm and right arm.
For the sagittal plane relative angle calculation of the torso, the torso-facing direction vector is calculated with the coordinates of the left shoulder, right shoulder, and pelvis. And comparing the trunk facing direction vector obtained by monitoring and calculating during the virtual assembly operation with the trunk facing direction vector obtained by calibration, calculating the angle of the trunk facing direction changed on the vertical plane, and obtaining the real-time relative angle of the trunk sagittal plane of the operator in the virtual operation.
For the coronal relative angle calculation of the torso, the vector from the left shoulder to the right shoulder is taken as the torso lateral direction vector. And comparing the lateral direction vector of the trunk, which is obtained by monitoring and calculating during the virtual assembly operation, with the lateral direction vector of the trunk, which is obtained by calibration, calculating the angle of the lateral direction of the trunk, which changes on a vertical plane, and obtaining the real-time relative angle of the coronal plane of the trunk of the operator in the virtual operation.
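A sketch of these two trunk calculations, assuming a y-up coordinate system and named key points; the cross-product convention for the facing vector is an assumption, since the text does not fix it:

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def torso_vectors(kp):
    """Facing vector = normal of the shoulder/pelvis plane; lateral vector =
    left shoulder -> right shoulder. kp maps names to (3,) arrays."""
    facing = unit(np.cross(kp["r_shoulder"] - kp["l_shoulder"],
                           kp["pelvis"] - kp["l_shoulder"]))
    lateral = unit(kp["r_shoulder"] - kp["l_shoulder"])
    return facing, lateral

def vertical_plane_angle(v_now, v_cal, up=np.array([0.0, 1.0, 0.0])):
    """Change, in degrees, of a direction vector within the vertical plane:
    its elevation above the horizontal now minus at calibration time."""
    elev = lambda v: np.degrees(np.arcsin(np.clip(unit(v) @ up, -1.0, 1.0)))
    return elev(v_now) - elev(v_cal)

# trunk sagittal relative angle: vertical-plane change of the facing vector;
# trunk coronal relative angle:  vertical-plane change of the lateral vector
```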
For the calculation of the relative angle of the sagittal plane of the neck, the central points P1 of the left eye, the right eye and the nose are calculated, then the central points P2 of the left ear and the right ear are calculated, and the vectors from P2 to P1 are taken as the facing direction vectors of the head. And comparing the head direction-facing vector obtained by monitoring and calculating during the virtual assembly operation with the head direction-facing vector obtained by calibration, calculating the angle of the head direction-facing in the vertical plane, and obtaining the real-time relative angle of the neck sagittal plane of the operator in the virtual operation.
For the coronal relative angle calculation of the neck, a left-ear-to-right-ear vector V1 and a left-eye-to-right-eye direction V2 are calculated, and the average vector of V1 and V2 is taken as the lateral direction vector of the head. And comparing the lateral direction vector of the head, which is obtained by monitoring and calculating during the virtual assembly operation, with the lateral direction vector of the head, which is obtained by calibration, calculating the angle of the lateral direction of the head, which changes on a vertical plane, and obtaining the real-time relative angle of the coronal plane of the neck of the operator in the virtual operation.
For the sagittal plane relative angle calculation of the left arm, the left arm direction vector from the left shoulder to the left elbow is calculated; the midpoint P3 between the left and right shoulders and the midpoint P4 between the left and right hips are calculated, and the vector from P3 to P4 gives the vertical direction vector V3 of the human body. The facing direction vector V4 of the trunk is calculated as in the previous steps, and V3 and V4 define the trunk's relative sagittal plane. The left arm direction vector is projected onto this plane, and the angle between the projected vector and V3 gives the real-time left arm sagittal relative angle of the operator in virtual operation.
For the coronal plane relative angle calculation of the left arm, the projection plane is replaced by the relative trunk coronal plane determined by the left shoulder, right shoulder and pelvis, and the real-time left arm coronal relative angle of the operator is calculated in the same way.
The sagittal and coronal plane calculations are carried out in the same way for the right arm to obtain its real-time relative angles.
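The arm angles use a projection rather than a direct comparison; the following sketch follows the left-arm sagittal steps above, with key-point names assumed for illustration:

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def left_arm_sagittal_angle(kp):
    """Left-arm sagittal relative angle per the steps above (illustrative
    sketch; key-point names are assumptions). kp maps names to (3,) arrays."""
    arm = kp["l_elbow"] - kp["l_shoulder"]            # left arm direction
    p3 = (kp["l_shoulder"] + kp["r_shoulder"]) / 2    # shoulder midpoint
    p4 = (kp["l_hip"] + kp["r_hip"]) / 2              # hip midpoint
    v3 = unit(p4 - p3)                                # body vertical vector
    v4 = unit(np.cross(kp["r_shoulder"] - kp["l_shoulder"],
                       kp["pelvis"] - kp["l_shoulder"]))  # trunk facing vector
    n = unit(np.cross(v3, v4))            # normal of the relative sagittal plane
    proj = arm - (arm @ n) * n            # project the arm vector onto the plane
    cosang = np.clip(unit(proj) @ v3, -1.0, 1.0)
    return np.degrees(np.arccos(cosang))  # angle between projection and V3

# The coronal variant swaps the projection plane for the relative trunk
# coronal plane determined by the shoulders and pelvis; the right arm uses
# the right-shoulder-to-right-elbow vector.
```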
Based on the RULA and LUBA methods, real-time scoring is carried out as follows:
the attitude scores were initially all 5.
If the trunk sagittal angle is less than-10 ° (10 ° backward tilt), the length of time exceeds 1 second, and the trunk sagittal score is 2.
If the sagittal angle of the torso is between-10 and 20 (20 forward dip), the sagittal torso score is 5.
If the trunk sagittal angle is between 20 ° and 60 ° and the duration exceeds 5 seconds, the trunk sagittal score is 4.
If the trunk sagittal angle exceeds 60 deg., for more than 1 second, the trunk sagittal score is 2.
If the trunk coronal angle is between-10 ° and 10 °, the trunk coronal score is 5.
If the angle of the trunk coronal is less than-10 degrees or more than 10 degrees and the duration exceeds 1 second, the trunk coronal score is 3.
The torso score was the smaller of the torso sagittal score and the torso coronal score.
If the cervical sagittal angle is less than-5 ° (5 ° poured backwards), for more than 1 second, the cervical sagittal score is 2.
If the neck sagittal angle is between-5 ° and 10 ° (10 ° forward tilt), the neck sagittal score is 5.
If the cervical sagittal angle is between 10 ° and 20 ° for more than 5 seconds, the cervical sagittal score is 4.
If the neck sagittal angle exceeds 20 deg., and the duration exceeds 1 second, the neck sagittal score is 2.
If the cervical coronal angle is between-5 ° and 5 °, the cervical coronal score is 5.
If the neck coronal angle is less than-10 ° or greater than 10 ° and the duration exceeds 1 second, the neck coronal score is 3.
The neck score is the lesser of the cervical sagittal score and the cervical coronal score.
If the arm sagittal angle is less than-20 (20 for a 20. Backwards vertical lift), the length of time exceeds 1 second, and the arm sagittal score is 2.
If the sagittal angle of the arm is between-20 deg. and 20 deg. (20 deg. forward vertical lift), the sagittal score of the arm is 5.
If the sagittal angle of the arm is between 20 degrees and 45 degrees and the duration exceeds 10 seconds, the sagittal score of the arm is 4.
If the sagittal angle of the arm is between 45 degrees and 90 degrees and the duration exceeds 5 seconds, the sagittal score of the arm is 3.
If the arm sagittal angle exceeds 45 degrees and the duration exceeds 1 second, the arm sagittal score is 2.
If the coronal angle of the arm is between-20 ° and 20 °, the coronal score of the arm is 5.
If the coronal angle of the arm is less than-20 ° or greater than 20 ° and the duration exceeds 5 seconds, the coronal score of the arm is 3.
The arm score is the lesser of the neck arm score and the arm coronal score, and the left arm and right arm scores are the same standard.
The whole body score based on human ergonomics assessment data of human body posture is the average of all body part scores.
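As an illustrative implementation of the trunk rules above (the neck and arm rules follow the same pattern with their own thresholds); the fallback score of 5 for postures held shorter than the stated duration is an assumption, since the rules do not state it explicitly:

```python
def trunk_sagittal_score(angle_deg, held_s):
    """Trunk sagittal score using the thresholds listed above.
    angle_deg: relative sagittal angle; held_s: seconds the posture is held."""
    if angle_deg < -10 and held_s > 1:
        return 2                     # leaning back more than 10 degrees
    if -10 <= angle_deg <= 20:
        return 5                     # near-neutral posture
    if 20 < angle_deg <= 60 and held_s > 5:
        return 4
    if angle_deg > 60 and held_s > 1:
        return 2
    return 5                         # short/transient postures keep full score

def trunk_coronal_score(angle_deg, held_s):
    if -10 <= angle_deg <= 10:
        return 5
    return 3 if held_s > 1 else 5    # sustained lateral lean

def trunk_score(sag_deg, cor_deg, held_s):
    """Trunk score = the smaller of the sagittal and coronal scores."""
    return min(trunk_sagittal_score(sag_deg, held_s),
               trunk_coronal_score(cor_deg, held_s))

# The whole-body posture score is the average of all body-part scores.
```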
S6, calculating the external moments applied to key human joints and limbs based on the 3D key point coordinate information obtained in step S5 and the weight and size data of the current workpiece in the received real-time assembly simulation operation information, calculating a load index according to the NIOSH (National Institute for Occupational Safety and Health) evaluation method, and obtaining second human-machine work efficiency evaluation data based on the external force applied to the human body according to the specified evaluation rules.
Specifically, the current workpiece weight data in the assembly simulation operation information is received, and the moments of the operated workpiece about the elbow, shoulder and waist are calculated respectively.
If the workpiece is held by one hand, the moments of the workpiece on the corresponding elbow, shoulder and waist are calculated.
If the moment of the workpiece about the elbow is less than 2.5 N·m and the moment about the shoulder is less than 4 N·m, the arm score is 5 and the neck score is 5.
If the moment about the elbow is between 2.5 N·m and 5 N·m with no more than 8 N·m about the shoulder, or the moment about the shoulder is between 4 N·m and 8 N·m with no more than 5 N·m about the elbow, the arm score is 4 and the neck score is 5.
If the moment about the elbow is between 5 N·m and 7.5 N·m with no more than 12 N·m about the shoulder, or the moment about the shoulder is between 8 N·m and 12 N·m with no more than 7.5 N·m about the elbow, the arm score is 3 and the neck score is 4.
If the moment about the elbow is greater than 7.5 N·m or the moment about the shoulder is greater than 12 N·m, the arm score is 2 and the neck score is 3.
If the moment of the workpiece about the waist is less than 5 N·m, the trunk score is 5.
If the moment about the waist is between 5 N·m and 10 N·m, the trunk score is 4.
If the moment about the waist is between 10 N·m and 15 N·m, the trunk score is 3.
If the moment about the waist is greater than 15 N·m, the trunk score is 2.
For a workpiece held with both hands, the workpiece weight is divided equally in two, the moments about the corresponding elbows and shoulders are calculated separately, the two waist results are summed, and scoring proceeds as above.
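A minimal sketch of the moment calculation, simplified to a static gravity load with y assumed vertical: the moment of a held weight about a joint is the horizontal lever arm times the force. Key-point names and the one-side-per-call structure are assumptions:

```python
import numpy as np

G = 9.81  # m/s^2

def gravity_moment(joint_pos, hand_pos, force_n, up_axis=1):
    """|moment| (N·m) of a vertical load held at hand_pos about a joint:
    horizontal lever arm times force (an illustrative simplification of the
    moment analysis described above)."""
    lever = np.asarray(hand_pos, float) - np.asarray(joint_pos, float)
    lever[up_axis] = 0.0              # only the horizontal arm resists gravity
    return float(np.linalg.norm(lever) * force_n)

def workpiece_moments(kp, mass_kg, two_hands=False):
    """Moments about elbow, shoulder and waist for one side of the body.
    For a two-handed hold, call once per side with half the weight and sum
    only the two waist moments, as described above."""
    f = G * mass_kg * (0.5 if two_hands else 1.0)
    return {j: gravity_moment(kp[j], kp["hand"], f)
            for j in ("elbow", "shoulder", "waist")}
```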
Human body load scoring based on NIOSH is then carried out; the processing flow is as follows:
according to the 3D coordinate information of the key points of the human body, calculating the following information:
the horizontal distance H from the operator's hand to the midpoint of the centerline of the two feet is in cm.
The vertical distance V from the operator's hand to the midpoint of the center lines of the two feet is in cm.
The angle between the operator's left-hip-to-right-hip vector and left-shoulder-to-right-shoulder vector is calculated to obtain the body rotation angle A.
At the same time, the size of the current workpiece and the vertical distance D between the workpiece's starting position and its current position are received from the assembly simulation operation information, and the pick-up difficulty grade C is derived, graded as good, fair or poor.
The operating frequency F of a certain task in the virtual assembly process is recorded and calculated.
Calculating a level factor HM from H: if H < 25, HM =1; HM = (25/H) if H is more than or equal to 25 and less than or equal to 63; if H > 63, HM =0.
Calculating a vertical factor VM according to V: if V is less than or equal to 175, VM = (1-0.003 non-V-75 |); if H > 175, VM =0.
Calculating the distance factor DM from D: if D < 25, DM =1; if D is more than or equal to 25 and less than or equal to 175, DM =0.82+ (4.5/D); if D > 175, DM =0.
Calculating the asymmetry factor AM from A: if A ≤ 135°, AM = 1 − 0.0032A; if A > 135°, AM = 0.
The frequency factor FM is calculated from F as shown in Table 1.
[Table 1, giving FM as a function of the operation frequency F, appears only as an image in the source and is not reproduced here.]
Calculating the coupling factor CM from C: if C is good, CM = 1.0; if C is fair, CM = 0.95; if C is poor, CM = 0.9.
LC is the load constant: 18 kg for male operators and 15 kg for female operators for two-handed operation; for single-handed operation the corresponding values are 7 kg and 5 kg per hand.
Calculating the recommended weight limit RWL = LC × HM × VM × DM × FM × AM × CM.
Calculating the lifting index LI = m/RWL, where m is the mass of the currently operated workpiece.
If LI < 0.5, the NIOSH score is 5.0; if 0.5 ≤ LI < 0.75, the score is 4.0; if 0.75 ≤ LI < 0.85, the score is 3.0; if LI ≥ 0.85, the score is 2.0.
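The factor formulas and score thresholds above assemble into the following illustrative sketch, again not part of the disclosure. Because Table 1 survives only as an image, `frequency_factor` is a stand-in stub, and the good/fair/poor coupling labels follow the CM values given above.

```python
def frequency_factor(f_per_min):
    """Stand-in for the Table 1 lookup of FM from the operation
    frequency F; the table image is not reproduced in the source,
    so this constant is a placeholder assumption."""
    return 1.0

def niosh_score(H, V, D, A, F, coupling, mass_kg, male=True):
    """Load-index score per the rules above (illustrative sketch)."""
    HM = 1.0 if H < 25 else (25.0 / H if H <= 63 else 0.0)
    VM = 1 - 0.003 * abs(V - 75) if V <= 175 else 0.0
    DM = 1.0 if D < 25 else (0.82 + 4.5 / D if D <= 175 else 0.0)
    AM = 1 - 0.0032 * A if A <= 135 else 0.0
    FM = frequency_factor(F)
    CM = {"good": 1.0, "fair": 0.95, "poor": 0.9}[coupling]
    LC = 18.0 if male else 15.0  # two-handed load constant, kg

    RWL = LC * HM * VM * DM * FM * AM * CM           # recommended weight limit
    LI = mass_kg / RWL if RWL > 0 else float("inf")  # lifting index

    if LI < 0.5:
        return 5.0
    if LI < 0.75:
        return 4.0
    if LI < 0.85:
        return 3.0
    return 2.0
```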
The human-machine work efficiency score based on the external force applied to the human body is taken as the smaller of the human-body key point external moment score and the NIOSH score.
And S7, taking the smaller value of the first human-machine work efficiency evaluation data based on the posture of the human body and the second human-machine work efficiency evaluation data based on the external force applied to the human body as real-time comprehensive human-machine work efficiency scores, and recording the real-time comprehensive human-machine work efficiency scores to a human-machine work efficiency score log.
And S8, in the virtual assembly simulation process, if the human-machine work efficiency score is lower than the set standard, the early warning module gives an early warning to the operator; for example, if the score is 2, an audio alarm prompt is issued (a minimal sketch follows).
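A minimal sketch of the S7/S8 combination and alarm logic, assuming a simple callable warning hook in place of the system's audio prompt:

```python
def composite_score(posture_score, load_score, threshold=2, warn=print):
    """Take the lower of the posture-based and load-based evaluations
    (step S7) and trigger an early warning when it reaches the alarm
    threshold (step S8); `warn` stands in for the audio prompt."""
    score = min(posture_score, load_score)
    if score <= threshold:
        warn(f"Ergonomic risk: composite human-machine work efficiency score {score}")
    return score
```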
And S9, carrying out comprehensive statistical analysis on the human-machine work efficiency scoring log data to obtain a virtual production line comprehensive human-machine work efficiency evaluation result comprising a comprehensive human-machine work efficiency statistical chart of the assembly unit, a final scoring table, risk factor analysis and station adjustment suggestions.
The method adopted by this embodiment tightly couples the virtual assembly simulation subsystem with the human-machine work efficiency monitoring and evaluation subsystem. Human posture is evaluated and analyzed using the RULA and LUBA methods; more importantly, the influence of the size and weight of the currently operated workpieces and tools on human load is considered, and the NIOSH method together with moment analysis of the workpiece on the human body key points is used to evaluate the external forces borne by the body. This overcomes the inaccurate, incomplete and unintuitive results of traditional posture-only evaluation, while the computation is simple and the resource cost low, facilitating implementation and popularization.

Claims (8)

1. An assembly simulation man-machine work efficiency evaluation system based on virtual reality is characterized in that the assembly simulation man-machine work efficiency evaluation system comprises a virtual assembly simulation subsystem and a man-machine work efficiency monitoring and evaluation subsystem; the virtual assembly simulation subsystem comprises an assembly unit construction module, a process guide module and an interaction processing module;
the assembly unit construction module reads the hierarchical structure and spatial layout information of the assembly unit, loads the three-dimensional models of the assembly operation environment and the station facilities, and generates a basic operation scene formed by non-assembly objects; reads the structured process information, loads the three-dimensional models of the parts, assemblies, tools and clamps corresponding to all assembly procedures of the assembly unit, and generates a virtual assembly operation scene consistent with the real assembly environment; the process guidance module sequentially reads, from the assembly process route, the assembly operation guidance instruction, the information of the parts to be assembled and the related tool clamp information of the current assembly process, guides the operator to select the corresponding parts to be assembled and the corresponding tool clamps in the virtual assembly operation scene, and places the parts to be assembled at the final assembly position according to the assembly constraints and assembly path required by the assembly process; the interaction processing module identifies collision interference in the assembly simulation operation process, captures the operator's pick-up, movement, placement and fixation interactions with parts and tool clamps, generates in real time, in combination with manufacturing BOM data, assembly simulation operation information including weight information, size information, and the initial and current positions of the picked-up parts and tools, and sends the assembly simulation operation information to the human-machine work efficiency monitoring and evaluation subsystem for analysis and calculation;
the human-machine work efficiency monitoring and evaluating subsystem comprises a human body posture capturing module and a calculating and analyzing module;
the human body posture capturing module captures assembly operation posture video images of the operator in real time by using a monocular camera; the calculation and analysis module analyzes and processes the assembly operation posture video images collected by the human body posture capturing module to obtain 3D coordinate information of the operator's human body key points, calculates human body key part angles based on RULA and LUBA from the 3D coordinate information of the human body key points, and analyzes them to obtain first human-machine work efficiency evaluation data based on the human posture; it calculates the external moments of the human body key points and limbs based on the 3D coordinate information of the human body key points and the assembly simulation operation information sent by the interaction processing module, calculates a load index according to the NIOSH evaluation method, and analyzes it to obtain second human-machine work efficiency evaluation data based on the external forces applied to the human body; the first human-machine work efficiency evaluation data and the second human-machine work efficiency evaluation data are synthesized to obtain real-time comprehensive human-machine work efficiency data, which is recorded to the human-machine work efficiency scoring log;
the human-machine work efficiency monitoring and evaluating subsystem comprises an early warning module;
after parameter configuration and initial calibration are carried out by the early warning module, real-time evaluation results are displayed and risk alarms are issued to the operator during the virtual assembly process; the interaction processing module receives the risk alarm prompts sent by the early warning module in real time and prompts the operator in real time about assembly operations likely to cause fatigue and musculoskeletal injury.
2. The virtual reality based assembly simulation man-machine work efficiency evaluation system of claim 1, wherein the calculation and analysis module comprises a 3D pose estimation fully convolutional neural network obtained by transfer learning from the ResNet-50 network, and a human-machine work efficiency analysis module constructed based on RULA and LUBA;
each time the monocular camera is repositioned, the human body posture capturing module periodically captures posture images of the operator standing at the center of the field of view in a normal, relaxed and comfortable posture, sends the acquisition results to the 3D pose estimation fully convolutional neural network for a basic calibration operation, records several groups of valid 3D key point three-dimensional coordinates, and calculates their average as the baseline comparison information for human posture estimation in subsequent virtual reality assembly simulation; after the virtual assembly operation starts, the human body posture capturing module captures assembly operation posture video images of the operator in real time, which are fed into the 3D pose estimation fully convolutional neural network, and the real-time 3D key point coordinate information of the operator is obtained through calculation;
the 3D attitude estimation full convolution neural network inputs real-time coordinate information and basic comparison information of 3D key points of an operator to a man-machine work efficiency analysis module at the same time, and calculation, evaluation and early warning of human body attitude are carried out.
3. The virtual reality based assembly simulation man-machine work efficiency evaluation system of claim 2, wherein the human-machine work efficiency analysis module calculates the angles of the torso, neck, left arm and right arm from the real-time coordinate information of the operator's 3D key points and the baseline comparison information;
calculating the sagittal plane angle of the trunk, calculating a trunk facing direction vector according to the coordinates of the left shoulder, the right shoulder and the pelvis, comparing the trunk facing direction vector obtained by monitoring and calculating during virtual assembly operation with the trunk facing direction vector obtained by calibration, calculating the angle of the trunk facing direction changed on the vertical plane, and obtaining the real-time trunk sagittal plane angle of an operator in virtual operation; calculating the angle of the coronal plane of the trunk, taking the vector from the left shoulder to the right shoulder as a lateral direction vector of the trunk, comparing the lateral direction vector of the trunk obtained by monitoring and calculating during virtual assembly operation with the lateral direction vector of the trunk obtained by calibration, and calculating the angle of the lateral direction of the trunk changed on a vertical plane to obtain the real-time angle of the coronal plane of the trunk of the operator in virtual operation;
for the calculation of the sagittal plane angle of the neck, the central point P1 of the left eye, the right eye and the nose is calculated firstly, the central point P2 of the left ear and the right ear is calculated, the vector from P2 to P1 is taken as the vector of the facing direction of the head, the vector of the facing direction of the head obtained by monitoring and calculating during the virtual assembly operation is compared with the vector of the facing direction of the head obtained by calibration, the angle of the facing direction of the head changed on the vertical plane is calculated, and the real-time neck sagittal plane angle of an operator in the virtual operation is obtained; calculating the angle of the coronal plane of the neck, calculating a vector V1 from a left ear to a right ear and a direction V2 from a left eye to a right eye, taking an average vector of V1 and V2 as a lateral direction vector of the head, comparing the lateral direction vector of the head obtained by monitoring and calculating during virtual assembly operation with the lateral direction vector of the head obtained by calibration, calculating the angle of the lateral direction of the head changed on a vertical plane, and obtaining the real-time angle of the coronal plane of the neck of an operator in virtual operation;
for the calculation of the left arm sagittal plane angle, calculating the left arm direction vector from the left shoulder to the left elbow, calculating the midpoint P3 between the left shoulder and the right shoulder and the midpoint P4 between the left hip and the right hip, calculating the vector from P3 to P4 to obtain the vertical direction vector V3 of the human body, calculating the body facing direction vector V4, and expressing the sagittal plane of the body by V3 and V4; projecting the left arm direction vector onto the sagittal plane of the trunk, and calculating the included angle between the projected vector and V3 to obtain the real-time left arm sagittal plane angle of the operator in virtual operation; for the calculation of the left arm coronal plane angle, replacing the projection plane with the trunk coronal plane determined by the left shoulder, the right shoulder and the pelvis, and calculating the real-time left arm coronal plane angle of the operator in virtual operation;
for the calculation of the right arm sagittal plane angle, calculating the right arm direction vector from the right shoulder to the right elbow, calculating the midpoint P3 between the left shoulder and the right shoulder and the midpoint P4 between the left hip and the right hip, calculating the vector from P3 to P4 to obtain the vertical direction vector V3 of the human body, calculating the body facing direction vector V4, and expressing the sagittal plane of the body by V3 and V4; projecting the right arm direction vector onto the sagittal plane of the trunk, and calculating the included angle between the projected vector and V3 to obtain the real-time right arm sagittal plane angle of the operator in virtual operation; for the calculation of the right arm coronal plane angle, replacing the projection plane with the trunk coronal plane determined by the left shoulder, the right shoulder and the pelvis, and calculating the real-time right arm coronal plane angle of the operator in virtual operation.
4. The virtual reality based assembly simulation human-machine work efficiency evaluation system of claim 3, wherein the human-machine work efficiency analysis module is provided with a plurality of value ranges for the angles of the trunk, the neck, the left arm and the right arm, each value range is provided with a score, and the corresponding key point scores are obtained by judgment according to the values of the angles of the trunk, the neck, the left arm and the right arm corresponding to the real-time coordinate information of the operator; and then, integrating all the key point scoring analysis to obtain first human-machine efficiency evaluation data based on the human body posture.
5. The virtual reality based assembly simulation man-machine work efficiency evaluation system of claim 2, wherein the human-machine work efficiency analysis module calculates the external moments borne by the human body key nodes and limbs according to the real-time coordinate information of the 3D key points and the current workpiece weight and size data in the received real-time assembly simulation operation information;
for a workpiece held by a single hand, directly calculating the moments of the workpiece on the corresponding elbow, shoulder and waist; for a workpiece held by two hands, dividing the weight of the workpiece equally into two parts, calculating from each half the moments on the corresponding elbow and shoulder, and adding the two half-weight waist moments to obtain the moment of the complete workpiece on the waist.
6. The virtual reality based assembly simulation man-machine work efficiency evaluation system of claim 2, wherein the human-machine work efficiency analysis module calculates the load index according to the NIOSH evaluation method, and the process of obtaining the second human-machine work efficiency evaluation data based on the external forces to which the human body is subjected comprises the following steps:
according to the real-time coordinate information of the 3D key points, calculating the horizontal distance H from the operator's hands to the midpoint of the center lines of the two feet, the vertical distance V from the operator's hands to the midpoint of the center lines of the two feet, and the body rotation angle A of the operator, obtained as the included angle between the left-hip-to-right-hip vector and the left-shoulder-to-right-shoulder vector;
acquiring the size of the current workpiece and the vertical distance D between the workpiece's operation start position and its current position from the assembly simulation operation information, and obtaining the corresponding picking difficulty (coupling) grade C;
recording and calculating the operation frequency F of a task in the virtual assembly process;
calculating a horizontal factor HM according to the horizontal distance H from the hand of the operator to the middle point of the central lines of the two feet; calculating a vertical factor VM according to a vertical distance V from the hand of an operator to the middle points of the central lines of the two feet; calculating a distance factor DM according to the distance D between the workpiece operation start and the current position in the vertical direction; calculating an asymmetry factor AM according to the angle A of the body rotation of the operator; calculating a frequency factor FM based on the operating frequency F; calculating a coupling factor CM according to the picking difficulty level C;
calculating the recommended weight limit RWL = LC × HM × VM × DM × FM × AM × CM, where LC is the load constant; and calculating the lifting index LI = m/RWL, where m is the mass of the currently operated workpiece.
7. The virtual reality based assembly simulation man-machine work efficiency evaluation system of claim 1, wherein the minimum of the first human-machine work efficiency evaluation data and the second human-machine work efficiency evaluation data is taken as the real-time comprehensive human-machine work efficiency data and recorded to the human-machine work efficiency scoring log.
8. An assembly simulation man-machine work efficiency evaluation method based on virtual reality, characterized in that it is performed on the assembly simulation man-machine work efficiency evaluation system of any one of claims 1 to 7;
the assembly simulation man-machine work efficiency evaluation method comprises the following steps:
s1, loading the layout information and structured process information of the assembly unit into the virtual assembly simulation subsystem;
s2, reading the hierarchical structure and spatial layout information of the assembly units by using an assembly unit construction module, loading a three-dimensional model of an assembly operation environment and a station facility, and generating a basic operation scene consisting of non-assembly objects; reading the structured process information, loading three-dimensional models of parts, assemblies, tools and clamps required in all assembly procedures of the assembly unit, and interactively adjusting the placing position and the spatial layout of each object by an operator to generate a virtual assembly operation scene consistent with a real assembly environment;
s3, sequentially reading the assembly operation guide description, the information of the parts to be assembled and the information of the tool clamps to be used of the current assembly process from the assembly process route by using a process guide module, guiding an operator to select the corresponding parts to be assembled and the tool clamps from the virtual assembly operation scene generated in the step S2, and placing the parts to be assembled on a final assembly position according to assembly constraints and an assembly path required by the assembly process;
s4, recognizing collision interference in the assembly simulation operation process by using the interaction processing module, capturing the operator's pick-up, movement, placement and fixation interactions with the parts and tool clamps in step S3, generating in real time, in combination with the manufacturing BOM data, assembly simulation operation information including weight information, size information, and the initial and current positions of the picked-up parts and tools, and sending the assembly simulation operation information to the human-machine work efficiency monitoring and evaluation subsystem for analysis and calculation;
s5, capturing real-time virtual assembly operation images of the operator by using the human body posture capturing module in the human-machine work efficiency monitoring and evaluation subsystem, and analyzing and processing the images by using the calculation and analysis module to obtain the 3D coordinate information of the human body key points; calculating human body key part angles based on RULA and LUBA from the obtained 3D coordinate information of the human body key points, and obtaining in real time first human-machine work efficiency evaluation data based on the human posture;
s6, calculating the external moments borne by the human body key points and limbs based on the 3D coordinate information of the human body key points obtained in step S5 and the current workpiece weight and size data in the received real-time assembly simulation operation information, and calculating the load index according to the NIOSH evaluation method to obtain second human-machine work efficiency evaluation data based on the external force borne by the human body;
s7, calculating and analyzing real-time comprehensive human-machine work efficiency data through a calculating and analyzing module based on first human-machine work efficiency evaluation data of human body postures and second human-machine work efficiency evaluation data of external forces borne by human bodies, and recording the real-time comprehensive human-machine work efficiency data to a human-machine work efficiency scoring log; in the virtual assembly simulation process, if the real-time comprehensive human-computer work efficiency data is lower than a set standard, the early warning module carries out early warning reminding on an operator;
and S8, carrying out comprehensive statistical analysis on the human-machine work efficiency scoring log data to obtain a virtual production line comprehensive human-machine work efficiency evaluation result comprising a comprehensive human-machine work efficiency statistical chart of the assembly unit, a final scoring table, risk factor analysis and station adjustment strategies.