CN109919134B - Vision-based method for detecting abnormal behaviors of operating vehicle personnel - Google Patents

Vision-based method for detecting abnormal behaviors of operating vehicle personnel

Info

Publication number
CN109919134B
CN109919134B (application number CN201910229246.9A)
Authority
CN
China
Prior art keywords
person
node
interfered
vehicle
personnel
Prior art date
Legal status
Active
Application number
CN201910229246.9A
Other languages
Chinese (zh)
Other versions
CN109919134A (en)
Inventor
王奕然
吴嘉锟
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201910229246.9A
Publication of CN109919134A
Application granted
Publication of CN109919134B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Emergency Alarm Devices (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a vision-based method for detecting abnormal behaviors of operating-vehicle personnel, belonging to the field of visual identification. The method first acquires comprehensive image information of the driver and the surrounding persons in the vehicle; then 2D images are extracted and the skeletal nodes of the persons in all images are labeled using the deep learning algorithm in the OpenPose database; next, the interfering and interfered persons are defined according to the driving state of the vehicle, and the dangerous distance threshold of each key limb part of the interfered person is calculated. The spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person is then calculated based on the binocular ranging principle; finally, abnormal behavior of the interfering person is detected by judging how long the spatial distance remains within the dangerous distance threshold. The method is suitable for detecting abnormal behaviors of personnel in operating vehicles, provides a key judgment basis for warning of and timely alarming on abnormal in-vehicle behavior, and is of great significance for safeguarding the lives and property of persons in the vehicle.

Description

Vision-based method for detecting abnormal behaviors of operating vehicle personnel
Technical Field
The invention belongs to the technical field of visual identification, and particularly relates to a visual-based method for detecting abnormal behaviors of operating vehicle personnel.
Background
With the progress of Internet technology, the ride-hailing industry in China, mainly private-hire cars, express cars and unmanned-ticketing buses, has developed rapidly and brought great convenience to people. However, due to the lack of a strict regulatory system, there are increasing cases of serious injury and death inside vehicles caused by passengers interfering with the driver's normal driving, or by drivers assaulting passengers, during the journey. Therefore, a vision-based method for detecting abnormal behaviors of operating-vehicle personnel is sought, which detects the postures of persons in the vehicle in real time and automatically distinguishes abnormal in-vehicle behavior; it provides a key judgment basis for warning of and timely alarming on abnormal behavior and is of great significance for safeguarding the lives and property of persons in the vehicle.
In the actual operation of a vehicle, to safeguard passengers, a passenger generally shares his or her location with friends in real time through software so that rescue can arrive in time if the passenger is attacked. However, this approach is highly subjective and cannot guarantee that every passenger obtains an immediate warning when infringed upon. On the other hand, to ensure the driver's safety, a protective device with an ejection function is usually installed around the driver's seat. However, this approach still depends on the driver's subjective judgment of the specific situation and cannot effectively give timely early warning of abnormal passenger behavior. Therefore, a vision-based method for detecting abnormal behaviors of operating-vehicle personnel is urgently needed.
Research shows that vision-based detection of abnormal behavior of operating-vehicle personnel must satisfy basic conditions such as real-time monitoring of in-vehicle personnel posture, accurate calculation of relative personnel positions and timely judgment of abnormal behavior, which presents a considerable engineering challenge. Acquiring image information of in-vehicle personnel comprehensively through distributed cameras, extracting in-vehicle personnel postures in real time based on the OpenPose database, obtaining the relative spatial positions of in-vehicle personnel in combination with the binocular ranging principle, and judging from these positions whether abnormal behavior occurs together make vision-based detection of abnormal behavior of operating-vehicle personnel possible.
Patent CN201611267904.6 (Kang Yu, 2016) discloses a real-time dangerous-driving-behavior detection method based on deep learning; the method uses an image acquisition system to obtain driver information and judges abnormal driver behaviors such as smoking and holding a phone through deep learning. Patent CN201610264968.4 (Xie Zhonghua, a Chengdu remote-control technology company, 2016) discloses a fatigue detection method and system based on a video intelligent algorithm; the method acquires images of the driver and judges whether the driver is fatigued according to the opening and closing degree of the driver's mouth and eyes. However, neither method involves the detection of abnormal behavior between the driver and the passengers.
Disclosure of Invention
The invention aims to overcome the defects of existing methods and provides a vision-based method for detecting abnormal behaviors of operating-vehicle personnel. The method uses the deep learning algorithm in the OpenPose database to extract the skeletal node sequences of passengers and driver, providing a positional basis for judging the postures of persons in the vehicle; the spatial distances between key limb parts of persons in the vehicle are obtained based on the binocular ranging principle, laying the foundation for judging abnormal behavior; finally, abnormal behavior of operating-vehicle personnel is detected by calculating how long the distance between key limb parts of persons in the vehicle remains below the danger threshold.
The method uses distributed cameras to collect image information of the driver and surrounding persons in an operating vehicle; determines the interfering and interfered persons in the vehicle in combination with the running state of the operating vehicle; extracts the skeletal node sequences of persons in the vehicle using the deep learning algorithm in the OpenPose database; calculates the spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person based on the binocular ranging principle; and, with a danger threshold set, completes the detection of abnormal personnel behavior from the time the spatial distance stays within the danger threshold.
The technical scheme of the invention is as follows:
a vision-based method for detecting abnormal behaviors of operating-vehicle personnel: first, cameras are installed at the top of the operating vehicle to comprehensively acquire image information of the driver and surrounding persons in the vehicle; then, 2D images are extracted and the skeletal nodes of persons in all images are labeled using the deep learning algorithm in the OpenPose database; next, the interfering and interfered persons are defined in combination with the running state of the operating vehicle, and the dangerous distance threshold of each key limb part of the interfered person is calculated; then, the spatial distance between the wrist node of the interfering person and the key limb parts of the interfered person is calculated based on the binocular ranging principle; finally, abnormal behavior of the interfering person is detected by judging how long the spatial distance stays within the dangerous distance threshold. The specific steps are as follows:
first, the camera is installed and calibrated
First, two cameras i are arranged at the rearview mirror of the operating vehicle 21 and numbered i = 1, 2. The shooting angles of the two cameras are adjusted so that image information covering the driving position 20 and its surrounding area can be acquired directly. The scan rate of each camera must be greater than 5 frames/second.

Then, the two cameras are combined into a binocular vision camera I. The two cameras are calibrated independently using the OpenCV database, and binocular calibration of the binocular vision camera I is performed using the OpenCV database.
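For illustration, the two calibration sub-steps can be prototyped with OpenCV's standard routines. The sketch below is a minimal example under stated assumptions rather than the patent's own implementation: it assumes a checkerboard calibration target and synchronized image pairs from the two cameras, and the file paths, board geometry and square size are illustrative.

```python
# Minimal calibration sketch (assumptions: checkerboard target, synchronized
# image pairs in cam1/ and cam2/, 9x6 inner corners, 25 mm squares).
import glob
import cv2
import numpy as np

PATTERN = (9, 6)
SQUARE_MM = 25.0
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)

objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, pts1, pts2, size = [], [], [], None
for p1, p2 in zip(sorted(glob.glob("cam1/*.png")), sorted(glob.glob("cam2/*.png"))):
    g1 = cv2.cvtColor(cv2.imread(p1), cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(cv2.imread(p2), cv2.COLOR_BGR2GRAY)
    size = g1.shape[::-1]
    ok1, c1 = cv2.findChessboardCorners(g1, PATTERN)
    ok2, c2 = cv2.findChessboardCorners(g2, PATTERN)
    if ok1 and ok2:
        obj_pts.append(objp)
        pts1.append(cv2.cornerSubPix(g1, c1, (11, 11), (-1, -1), criteria))
        pts2.append(cv2.cornerSubPix(g2, c2, (11, 11), (-1, -1), criteria))

# Independent (monocular) calibration of each camera.
_, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, pts1, size, None, None)
_, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, pts2, size, None, None)

# Binocular calibration of the combined camera I: rotation R and translation T
# between the two cameras, keeping the monocular intrinsics fixed.
_, K1, D1, K2, D2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, pts1, pts2, K1, D1, K2, D2, size, flags=cv2.CALIB_FIX_INTRINSIC)
```

The intrinsics K1, D1, K2, D2 and the stereo extrinsics R, T obtained in this way are what the later binocular distance calculation relies on.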
Second, real-time extraction of the pose of the person in the operating vehicle 21
Firstly, the persons in the vehicle in the collected images are numbered according to the seat distribution in the operating vehicle 21. If there are m persons in the vehicle, the person in the driver's seat is numbered 1 and the remaining persons are numbered 2 to m in sequence. The number assigned to the same person must be consistent across the different cameras. The person with number j is denoted person j.

Then, the skeletal nodes of all persons in the collected images are labeled using the deep learning algorithm in the OpenPose database. The skeletal nodes k (k = 0, 1, …, 17) of person j (j = 1, 2, …, m) collected by the i-th camera (i = 1, 2) constitute the skeletal node set of person j, in which each element is the coordinate of skeletal node k of person j acquired by the i-th camera in the image coordinate system.

Finally, from the skeletal node set of person j acquired by the i-th camera, the coordinates of skeletal node k (k = 0, 1, …, 17) of person j (j = 1, 2, …, m) in the in-vehicle space coordinate system are obtained with the binocular-vision spatial coordinate calculation method in the OpenCV database; these coordinates are denoted (x_{j,k}, y_{j,k}, z_{j,k}).
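A minimal sketch of these two sub-steps is given below for illustration: the 18 skeletal nodes are extracted per camera with the OpenPose COCO body model loaded through OpenCV's DNN module, and the in-vehicle 3D coordinates are obtained by binocular triangulation. It is a sketch under assumptions, not the patent's implementation: the model file names are assumed, the heatmap peak search keeps only one person per node for brevity (full multi-person association in OpenPose is more involved), and K1, D1, K2, D2, R, T come from the calibration step above.

```python
# Sketch: per-camera skeletal-node extraction and binocular triangulation.
import cv2
import numpy as np

N_NODES = 18  # skeletal nodes k = 0 .. 17 (COCO ordering: 0 nose, 1 neck, ...)

def extract_keypoints(frame, net, in_size=368, conf_thr=0.1):
    """Return an (18, 2) array of pixel coordinates (NaN where not detected)."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1.0 / 255, (in_size, in_size),
                                 (0, 0, 0), swapRB=False, crop=False)
    net.setInput(blob)
    heatmaps = net.forward()                          # shape (1, C, H', W')
    pts = np.full((N_NODES, 2), np.nan, np.float32)
    for k in range(N_NODES):
        hm = heatmaps[0, k]
        _, conf, _, peak = cv2.minMaxLoc(hm)          # peak = (x, y) in the heatmap
        if conf > conf_thr:
            pts[k] = (w * peak[0] / hm.shape[1], h * peak[1] / hm.shape[0])
    return pts

def to_vehicle_coords(pts1, pts2, K1, D1, K2, D2, R, T):
    """Binocular 3D node coordinates (18, 3) in the first camera's frame (mm)."""
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, T.reshape(3, 1)])
    u1 = cv2.undistortPoints(pts1.reshape(-1, 1, 2), K1, D1, P=K1).reshape(-1, 2)
    u2 = cv2.undistortPoints(pts2.reshape(-1, 1, 2), K2, D2, P=K2).reshape(-1, 2)
    X = cv2.triangulatePoints(P1, P2, u1.T, u2.T)     # homogeneous, shape (4, 18)
    return (X[:3] / X[3]).T

# Assumed OpenPose COCO model files (prototxt + caffemodel).
pose_net = cv2.dnn.readNetFromCaffe("pose_deploy.prototxt",
                                    "pose_iter_440000.caffemodel")
```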
Third, calculation of dangerous distance threshold
First, the number of the interfering person in the vehicle is denoted p and the number of the interfered person is denoted q. Based on the ECU protocol, the running speed V of the vehicle is extracted through the OBD interface of the operating vehicle 21. When the operating vehicle 21 is running, i.e. |V| > 0, the driver is defined as the interfered person q, i.e. q = 1, and the remaining persons are interfering persons p, i.e. p ≠ 1. When the vehicle is not running, i.e. V = 0, either the driver is the interfered person q = 1 and the remaining persons are interfering persons p ≠ 1, or the driver is the interfering person p = 1 and the remaining persons are interfered persons q ≠ 1.
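For illustration only, the speed readout and the resulting role assignment can be sketched as follows; the patent does not prescribe a particular OBD library, so the python-OBD package, an ELM327-style adapter and the serial-port name are assumptions.

```python
# Sketch: read the running speed V over OBD-II and choose the (p, q) pairings.
import obd

connection = obd.OBD("/dev/ttyUSB0")          # assumed port of the OBD adapter
speed = connection.query(obd.commands.SPEED)  # vehicle speed (km/h)

def candidate_pairings(v_kmh, person_ids, driver_id=1):
    """Return the (interfering p, interfered q) pairings to be checked."""
    passengers = [j for j in person_ids if j != driver_id]
    if abs(v_kmh) > 0:
        # Vehicle moving: the driver is the interfered person q.
        return [(passengers, [driver_id])]
    # Vehicle stationary: interference in either direction is possible.
    return [(passengers, [driver_id]), ([driver_id], passengers)]

if not speed.is_null():
    pairings = candidate_pairings(speed.value.magnitude, person_ids=[1, 2])
```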
Then, the dangerous distance threshold of each key limb part of the interfered person q is calculated. The key limb parts comprise the head, the body and the arms; the skeletal nodes involved comprise the nose bone node 0, neck bone node 1, right shoulder bone node 2, right elbow bone node 3, right wrist bone node 4, left shoulder bone node 5, left elbow bone node 6, left wrist bone node 7, right hip bone node 8, right knee bone node 9, right ankle bone node 10, left hip bone node 11, left knee bone node 12, left ankle bone node 13, right eye node 14, left eye node 15, right temple bone node 16 and left temple bone node 17.

Each arm of the interfered person q is modeled as a cylinder of diameter c_q whose axis is the line connecting adjacent wrist, elbow and shoulder bone nodes k and k−1 (k = 3, 4, 6, 7); the body is modeled as a cylinder of diameter equal to the shoulder width b_q whose axis is the line through the nose bone node 0 and neck bone node 1; the head is modeled as a cylinder of diameter equal to the inter-temple distance a_q on the same axis through the nose bone node 0 and neck bone node 1.

The arm diameter c_q is set according to the actual situation.

The shoulder width b_q is the distance between the coordinates (x_{q,2}, y_{q,2}, z_{q,2}) of the right shoulder bone node 2 and the coordinates (x_{q,5}, y_{q,5}, z_{q,5}) of the left shoulder bone node 5 of the interfered person q in the in-vehicle space coordinate system of the i-th camera:

b_q = √[(x_{q,2} − x_{q,5})² + (y_{q,2} − y_{q,5})² + (z_{q,2} − z_{q,5})²]

The head width a_q of the interfered person q is the distance between the coordinates (x_{q,16}, y_{q,16}, z_{q,16}) of the right temple bone node 16 and the coordinates (x_{q,17}, y_{q,17}, z_{q,17}) of the left temple bone node 17 of the interfered person q in the in-vehicle space coordinate system of the i-th camera:

a_q = √[(x_{q,16} − x_{q,17})² + (y_{q,16} − y_{q,17})² + (z_{q,16} − z_{q,17})²]

Finally, the dangerous distance thresholds of the head, body and arms of the interfered person q are set as

A_q = a_q/2 + 30, B_q = b_q/2 + 50, C_q = c_q/2 + 50

where A_q, B_q and C_q are the dangerous distance thresholds of the head, body and arms of the interfered person q, respectively, in mm.
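A minimal sketch of this threshold calculation follows, assuming the interfered person's node coordinates are available as an (18, 3) array in millimetres (the node numbering follows the list above); the constants mirror the relations used in the embodiment.

```python
# Sketch: dangerous distance thresholds A_q, B_q, C_q (mm) for the interfered person q.
import numpy as np

R_SHOULDER, L_SHOULDER = 2, 5
R_TEMPLE, L_TEMPLE = 16, 17

def danger_thresholds(nodes_q, arm_diameter_mm=20.0):
    """nodes_q: (18, 3) in-vehicle coordinates of person q in millimetres."""
    b_q = np.linalg.norm(nodes_q[R_SHOULDER] - nodes_q[L_SHOULDER])  # shoulder width
    a_q = np.linalg.norm(nodes_q[R_TEMPLE] - nodes_q[L_TEMPLE])      # head width
    A_q = a_q / 2 + 30                # head threshold
    B_q = b_q / 2 + 50                # body threshold
    C_q = arm_diameter_mm / 2 + 50    # arm threshold
    return A_q, B_q, C_q
```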
Fourthly, judging abnormal behaviors of the personnel
Judging abnormal personnel behavior comprises: calculating the actual distances from the wrist position of the interfering person p to the head, body and arms of the interfered person q, and determining whether the interfering person p exhibits abnormal behavior.

The actual distance E_q between the wrist position of the interfering person p and the arm of the interfered person q is the distance d_c from the coordinates of the right wrist bone node 4 and left wrist bone node 7 of the interfering person p in the in-vehicle space coordinate system of the i-th camera to the line connecting adjacent skeletal nodes k and k−1 (k = 3, 4, 6, 7) on the arm of the interfered person q. The line connecting adjacent arm skeletal nodes of the interfered person q is

(x − x_{q,k}) / (x_{q,k−1} − x_{q,k}) = (y − y_{q,k}) / (y_{q,k−1} − y_{q,k}) = (z − z_{q,k}) / (z_{q,k−1} − z_{q,k})

where (x, y, z) denotes any point on the line connecting adjacent arm skeletal nodes of the interfered person q.
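The distance d_c from a wrist coordinate to such a line can be computed with a standard cross-product formula; the sketch below is an illustration, assuming all coordinates are 3-vectors in the in-vehicle space coordinate system (mm).

```python
# Sketch: distance from the interfering person's wrist to the line through two
# adjacent arm skeletal nodes of the interfered person.
import numpy as np

def point_to_line_distance(wrist, node_a, node_b):
    """Distance from `wrist` to the infinite line through node_a and node_b."""
    axis = node_b - node_a
    return np.linalg.norm(np.cross(wrist - node_a, axis)) / np.linalg.norm(axis)

# E_q would then be taken over the wrist nodes (4, 7) of the interfering person p
# and the arm segments (k, k-1), k = 3, 4, 6, 7, of the interfered person q.
```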
the head and the body of the interfered person q are set to be in the same axis. Implementing an actual distance D from the wrist position of the interfering person p to the head of the interfered person q q Actual distance from body F q The method comprises the following steps: coordinates of right wrist skeleton node 4 and left wrist skeleton node 7 of implementation interference person p in space coordinate system in ith camera middle vehicle
Figure BDA0002006198500000063
The node of the head and the neck of the person q to be interfered>
Figure BDA0002006198500000064
Distance between connecting linesFrom d ab . Wherein, the interfered person q is at the bone node of the head and neck
Figure BDA0002006198500000065
The connecting line between the two lines is as follows,
Figure BDA0002006198500000066
wherein (x) 1 ,y 1 ,z 1 ) Representing the bone nodes at the head and neck of an interfered person q
Figure BDA0002006198500000067
Any point on the inter-connecting line;
the formula for judging the wrist position of the interfering person p to be in the head or body of the interfered person q is as follows,
Figure BDA0002006198500000068
when the position of the wrist of the person implementing the interference p is positioned at the head of the person q to be interfered, delta>0; when the position of the wrist of the person implementing the interference p is at the body of the person q to be interfered, delta<0. Implementing the actual distance D between the wrist position of the interfering person p and the head of the interfered person q q Or the actual distance F from the body of the person q to be disturbed q In order to realize the purpose of the method,
Figure BDA0002006198500000069
and finally, judging whether the person in the vehicle has abnormal behavior. When the wrist position of the interfering person p is at the actual distance E from the arm and body of the interfered person q q 、F q Continuously less than a dangerous distance C q 、B q Exceeds 10s, the offender p takes an abnormal behavior. When the actual distance D between the wrist position of the interfering person p and the head of the interfered person q is implemented q Less than the dangerous distance A within 5s q Time of (2) is exceededAfter 1s, the implementing and interfering person p takes an abnormal behavior.
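For illustration, the two duration checks can be sketched as follows, assuming per-frame timestamps and one detector instance per interfering/interfered pair; this is a reading of the timing rules above, not the patent's own code.

```python
# Sketch: continuous 10 s check for arm/body and cumulative 1 s within 5 s for the head.
from collections import deque

class AbnormalBehaviorDetector:
    def __init__(self, continuous_s=10.0, window_s=5.0, cumulative_s=1.0):
        self.continuous_s = continuous_s
        self.window_s = window_s
        self.cumulative_s = cumulative_s
        self._since = {"arm": None, "body": None}   # start of continuous violation
        self._head_hits = deque()                   # (timestamp, frame duration)

    def _continuous(self, key, violated, t):
        if not violated:
            self._since[key] = None
            return False
        if self._since[key] is None:
            self._since[key] = t
        return (t - self._since[key]) > self.continuous_s

    def update(self, t, dt, E_q, F_q, D_q, A_q, B_q, C_q):
        # Arm and body: distance must stay below the threshold continuously for 10 s.
        abnormal = self._continuous("arm", E_q < C_q, t)
        abnormal |= self._continuous("body", F_q < B_q, t)
        # Head: cumulative time below A_q must exceed 1 s inside a sliding 5 s window.
        if D_q < A_q:
            self._head_hits.append((t, dt))
        while self._head_hits and t - self._head_hits[0][0] > self.window_s:
            self._head_hits.popleft()
        abnormal |= sum(d for _, d in self._head_hits) > self.cumulative_s
        return abnormal
```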
The beneficial effects of the invention are: the method is suitable for detecting abnormal behaviors of personnel in operating vehicles, provides a key judgment basis for warning of and timely alarming on abnormal in-vehicle behavior, and is of great significance for safeguarding the lives and property of persons in the vehicle.
Drawings
Fig. 1 is a schematic view of a mounting position of a camera in a vehicle.
Fig. 2 is a sequence number diagram of each skeleton node of a person.
Fig. 3 is an image captured within a service vehicle.
In the figures: 0 nose bone node; 1 neck bone node; 2 right shoulder bone node; 3 right elbow bone node; 4 right wrist bone node; 5 left shoulder bone node; 6 left elbow bone node; 7 left wrist bone node; 8 right hip bone node; 9 right knee bone node; 10 right ankle bone node; 11 left hip bone node; 12 left knee bone node; 13 left ankle bone node; 14 right eye node; 15 left eye node; 16 right temple bone node; 17 left temple bone node; 18 first camera; 19 second camera; 20 driving position; 21 operating vehicle; 22 person No. 1; 23 person No. 2.
FIG. 4 is a flow chart of the present invention.
Detailed Description
The following is a detailed description of the embodiments of the invention with reference to the accompanying drawings and claims.
First, the camera is installed and calibrated
First, two cameras are arranged on both sides of the front part of the roof of the operating vehicle 21: the first camera 18 and the second camera 19 are disposed at the vehicle rearview mirror, and their installation positions are shown in Fig. 1. The shooting angles of the first camera 18 and the second camera 19 are adjusted so that image information covering the driving position 20 and its surrounding area can be acquired directly. Both cameras are manufactured by BASLER, use CMOS chips and have a scan rate of 20 frames/second.

Then, the first camera 18 and the second camera 19 are combined into a binocular vision camera I. The first camera 18 and the second camera 19 are calibrated independently using the OpenCV database, and likewise binocular calibration of the binocular vision camera I is performed using the OpenCV database.
Second, the attitude of the person in the operating vehicle 21 is extracted in real time
Firstly, the persons in the vehicle in the collected images are numbered according to the seat distribution in the vehicle. The image captured in the operating vehicle 21 is shown in Fig. 3. There are two persons in the vehicle: the person in the driver's seat is person No. 1 (22) and the other is person No. 2 (23). The number assigned to the same person must be consistent across the different cameras. The person with number j is denoted person j.

Then, the skeletal nodes of all persons in the collected images, namely person No. 1 (22) and person No. 2 (23), are labeled using the deep learning algorithm in the OpenPose database. The skeletal nodes of person No. 1 (22) acquired by the first camera 18 form the skeletal node set of person No. 1 (22) in the image coordinate system, and the skeletal nodes of person No. 2 (23) acquired by the second camera 19 form the skeletal node set of person No. 2 (23) in the image coordinate system. The serial numbers of the skeletal nodes of person No. 1 (22) and person No. 2 (23) are shown in Fig. 2.

Finally, from the skeletal node sets of person No. 1 (22) and person No. 2 (23) collected by the first camera 18 and the second camera 19, the binocular-vision spatial coordinate calculation method in the OpenCV database is used to obtain the coordinates of the skeletal nodes of person No. 1 (22) and of person No. 2 (23) in the in-vehicle space coordinate system.
third, calculation of dangerous distance threshold
First, based on the ECU protocol, the running speed V = 30 of the vehicle is extracted through the vehicle OBD interface. Since |V| > 0, the driver is defined as the interfered person q, i.e. q = 1; the other person is the interfering person p, i.e. p = 2.

Then, the dangerous distance threshold of each key limb part of the interfered person q is calculated. The arm diameter is set to c_q = 20 mm. The shoulder width b_q is the distance between the coordinates of the right shoulder bone node 2 and the left shoulder bone node 5 of person q in the in-vehicle space coordinate system of the first camera 18, giving b_q = 370.6 mm. The head width a_q of person q is the distance between the coordinates of the right temple bone node 16 and the left temple bone node 17 of person q in the in-vehicle space coordinate system of the first camera 18, giving a_q = 163.9 mm.

Finally, the dangerous distance thresholds (in mm) of the head, body and arms of the interfered person q are: A_q = 163.9/2 + 30 = 111.95 mm, B_q = 370.6/2 + 50 = 235.3 mm, C_q = 20/2 + 50 = 60 mm.
Fourthly, judging abnormal behaviors of the personnel
The actual distance E_q between the wrist position of the interfering person p and the arm of the interfered person q is the distance d_c from the coordinates of the left wrist bone node 7 of person p in the in-vehicle space coordinate system of the first camera 18 to the line connecting adjacent arm skeletal nodes of person q: d_c = 19.37 mm.

The head and the body of the interfered person q are assumed to lie on the same axis. The actual distance D_q from the wrist position of the interfering person p to the head of the interfered person q and the actual distance F_q to the body are both obtained from the distance d_ab from the coordinates of the left wrist bone node 7 of person p in the in-vehicle space coordinate system of the first camera 18 to the line connecting the head and neck skeletal nodes of person q: d_ab = 245.5 mm. The discriminant for judging whether the wrist position of the interfering person p is at the head or the body of the interfered person q gives Δ = −16930 < 0, so F_q = 245.5 mm.

Finally, the actual distance F_q = 245.5 mm between the wrist position of person p and the body of person q is greater than the dangerous distance B_q = 235.3 mm. The actual distance E_q = 19.37 mm between the wrist position of person p and the arm of person q is less than the dangerous distance C_q = 60 mm, and this condition lasts for more than 10 s; therefore, the interfering person p is judged to exhibit abnormal behavior.
The vision-based method for detecting abnormal behavior of operating-vehicle personnel described above is only a preferred embodiment of the invention; all equivalent changes or modifications made according to the features and principles described in the claims of the invention are included within the scope of the invention.

Claims (1)

1. A vision-based method for detecting abnormal behaviors of operating vehicle personnel is characterized by comprising the following steps:
first, the camera is installed and calibrated
First, two cameras i are placed at the rearview mirror of the operating vehicle (21) and numbered i = 1, 2; the shooting angles of the two cameras are adjusted so that image information covering the driving position (20) and its surrounding area can be acquired directly; the scan rate of each camera must be greater than 5 frames/second;

then, the two cameras are combined into a binocular vision camera I; the two cameras are calibrated independently using the OpenCV database, and binocular calibration of the binocular vision camera I is performed using the OpenCV database;
second, the real-time extraction of the posture of the person in the operating vehicle (21)
Firstly, the persons in the vehicle in the collected images are numbered according to the seat distribution in the operating vehicle (21); if there are m persons in the vehicle, the person in the driver's seat is numbered 1 and the remaining persons are numbered 2 to m in sequence; the number assigned to the same person must be consistent across the different cameras; the person with number j is denoted person j;

then, the skeletal nodes of all persons in the collected images are labeled using the deep learning algorithm in the OpenPose database; the skeletal nodes k of person j collected by the i-th camera, where i = 1, 2, j = 1, 2, …, m and k = 0, 1, …, 17, constitute the skeletal node set of person j, in which each element is the coordinate of skeletal node k of person j acquired by the i-th camera in the image coordinate system;

finally, from the skeletal node set of person j acquired by the i-th camera, the coordinates of skeletal node k of person j in the in-vehicle space coordinate system, denoted (x_{j,k}, y_{j,k}, z_{j,k}), are obtained with the binocular-vision spatial coordinate calculation method in the OpenCV database;
Third, calculation of a dangerous distance threshold
Firstly, the number of the interfering person in the vehicle is denoted p and the number of the interfered person is denoted q; based on the ECU protocol, the running speed V of the vehicle is extracted through the OBD interface of the operating vehicle (21); when the operating vehicle (21) is running, i.e. |V| > 0, the driver is defined as the interfered person q, i.e. q = 1, and the remaining persons are interfering persons p, i.e. p ≠ 1; when the vehicle is not running, i.e. V = 0, either the driver is the interfered person q = 1 and the remaining persons are interfering persons p ≠ 1, or the driver is the interfering person p = 1 and the remaining persons are interfered persons q ≠ 1;

then, the dangerous distance threshold of each key limb part of the interfered person q is calculated; the key limb parts comprise the head, the body and the arms; the skeletal nodes involved comprise the nose bone node (0), neck bone node (1), right shoulder bone node (2), right elbow bone node (3), right wrist bone node (4), left shoulder bone node (5), left elbow bone node (6), left wrist bone node (7), right hip bone node (8), right knee bone node (9), right ankle bone node (10), left hip bone node (11), left knee bone node (12), left ankle bone node (13), right eye node (14), left eye node (15), right temple bone node (16) and left temple bone node (17);

each arm of the interfered person q is modeled as a cylinder of diameter c_q whose axis is the line connecting adjacent wrist, elbow and shoulder bone nodes k and k−1, where k = 3, 4, 6, 7; the body is modeled as a cylinder of diameter equal to the shoulder width b_q whose axis is the line through the nose bone node (0) and neck bone node (1); the head is modeled as a cylinder of diameter equal to the inter-temple distance a_q on the same axis through the nose bone node (0) and neck bone node (1);

the arm diameter c_q is set according to the actual situation;

the shoulder width b_q is the distance between the coordinates (x_{q,2}, y_{q,2}, z_{q,2}) of the right shoulder bone node (2) and the coordinates (x_{q,5}, y_{q,5}, z_{q,5}) of the left shoulder bone node (5) of the interfered person q in the in-vehicle space coordinate system of the i-th camera:

b_q = √[(x_{q,2} − x_{q,5})² + (y_{q,2} − y_{q,5})² + (z_{q,2} − z_{q,5})²];

the head width a_q of the interfered person q is the distance between the coordinates (x_{q,16}, y_{q,16}, z_{q,16}) of the right temple bone node (16) and the coordinates (x_{q,17}, y_{q,17}, z_{q,17}) of the left temple bone node (17) of the interfered person q in the in-vehicle space coordinate system of the i-th camera:

a_q = √[(x_{q,16} − x_{q,17})² + (y_{q,16} − y_{q,17})² + (z_{q,16} − z_{q,17})²];

finally, the dangerous distance thresholds of the head, body and arms of the interfered person q are set as

A_q = a_q/2 + 30, B_q = b_q/2 + 50, C_q = c_q/2 + 50,

where A_q, B_q and C_q are the dangerous distance thresholds of the head, body and arms of the interfered person q, respectively, in mm;
fourthly, judging abnormal behaviors of the personnel
The judgment of abnormal personnel behavior comprises: calculating the actual distances from the wrist position of the interfering person p to the head, body and arms of the interfered person q, and judging whether the interfering person p exhibits abnormal behavior;

the actual distance E_q between the wrist position of the interfering person p and the arm of the interfered person q is the distance d_c from the coordinates of the right wrist bone node (4) and left wrist bone node (7) of the interfering person p in the in-vehicle space coordinate system of the i-th camera to the line connecting adjacent skeletal nodes on the arm of the interfered person q; the line connecting adjacent arm skeletal nodes of the interfered person q is

(x − x_{q,k}) / (x_{q,k−1} − x_{q,k}) = (y − y_{q,k}) / (y_{q,k−1} − y_{q,k}) = (z − z_{q,k}) / (z_{q,k−1} − z_{q,k}), k = 3, 4, 6, 7,

where (x, y, z) denotes any point on the line connecting adjacent arm skeletal nodes of the interfered person q;

the head and the body of the interfered person q are assumed to lie on the same axis; the actual distance D_q from the wrist position of the interfering person p to the head of the interfered person q and the actual distance F_q to the body are obtained from the distance d_ab from the coordinates of the right wrist bone node (4) and left wrist bone node (7) of the interfering person p in the in-vehicle space coordinate system of the i-th camera to the line connecting the head and neck skeletal nodes of the interfered person q; the line connecting the head and neck skeletal nodes of the interfered person q is

(x1 − x_{q,0}) / (x_{q,1} − x_{q,0}) = (y1 − y_{q,0}) / (y_{q,1} − y_{q,0}) = (z1 − z_{q,0}) / (z_{q,1} − z_{q,0}),

where (x1, y1, z1) denotes any point on the line connecting the head and neck skeletal nodes of the interfered person q;

whether the wrist position of the interfering person p is at the head or at the body of the interfered person q is judged by a discriminant Δ calculated from the wrist coordinates of the interfering person p and the head and neck skeletal node coordinates of the interfered person q; when the wrist position of the interfering person p is at the head of the interfered person q, Δ > 0; when it is at the body of the interfered person q, Δ < 0; the actual distance D_q from the wrist position of the interfering person p to the head of the interfered person q, or the actual distance F_q to the body of the interfered person q, is then D_q = d_ab when Δ > 0 and F_q = d_ab when Δ < 0;

finally, whether any person in the vehicle exhibits abnormal behavior is judged; when the actual distance E_q or F_q between the wrist position of the interfering person p and the arm or body of the interfered person q remains continuously below the dangerous distance C_q or B_q for more than 10 s, the interfering person p is judged to exhibit abnormal behavior; when the actual distance D_q between the wrist position of the interfering person p and the head of the interfered person q stays below the dangerous distance A_q for a cumulative time exceeding 1 s within 5 s, the interfering person p is judged to exhibit abnormal behavior.
CN201910229246.9A 2019-03-25 2019-03-25 Vision-based method for detecting abnormal behaviors of operating vehicle personnel Active CN109919134B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910229246.9A CN109919134B (en) 2019-03-25 2019-03-25 Vision-based method for detecting abnormal behaviors of operating vehicle personnel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910229246.9A CN109919134B (en) 2019-03-25 2019-03-25 Vision-based method for detecting abnormal behaviors of operating vehicle personnel

Publications (2)

Publication Number Publication Date
CN109919134A CN109919134A (en) 2019-06-21
CN109919134B true CN109919134B (en) 2023-04-18

Family

ID=66966749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910229246.9A Active CN109919134B (en) 2019-03-25 2019-03-25 Vision-based method for detecting abnormal behaviors of operating vehicle personnel

Country Status (1)

Country Link
CN (1) CN109919134B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110751100A (en) * 2019-10-22 2020-02-04 北京理工大学 Auxiliary training method and system for stadium
CN112686090B (en) * 2020-11-04 2024-02-06 北方工业大学 Intelligent monitoring system for abnormal behavior in bus
CN112434564B (en) * 2020-11-04 2023-06-27 北方工业大学 Detection system for abnormal aggregation behavior in bus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104442566A (en) * 2014-11-13 2015-03-25 长安大学 Vehicle inside passenger dangerous state alarming device and alarming method
CN105551182A (en) * 2015-11-26 2016-05-04 吉林大学 Driving state monitoring system based on Kinect human body posture recognition
JP2016200910A (en) * 2015-04-08 2016-12-01 日野自動車株式会社 Driver state determination device
CN107665326A (en) * 2016-07-29 2018-02-06 奥的斯电梯公司 Monitoring system, passenger transporter and its monitoring method of passenger transporter
CN108446600A (en) * 2018-02-27 2018-08-24 上海汽车集团股份有限公司 A kind of vehicle driver's fatigue monitoring early warning system and method
CN108986400A (en) * 2018-09-03 2018-12-11 深圳市尼欧科技有限公司 A kind of third party based on image procossing, which multiplies, drives safety automatic-alarming method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104442566A (en) * 2014-11-13 2015-03-25 长安大学 Vehicle inside passenger dangerous state alarming device and alarming method
JP2016200910A (en) * 2015-04-08 2016-12-01 日野自動車株式会社 Driver state determination device
CN105551182A (en) * 2015-11-26 2016-05-04 吉林大学 Driving state monitoring system based on Kinect human body posture recognition
CN107665326A (en) * 2016-07-29 2018-02-06 奥的斯电梯公司 Monitoring system, passenger transporter and its monitoring method of passenger transporter
CN108446600A (en) * 2018-02-27 2018-08-24 上海汽车集团股份有限公司 A kind of vehicle driver's fatigue monitoring early warning system and method
CN108986400A (en) * 2018-09-03 2018-12-11 深圳市尼欧科技有限公司 A kind of third party based on image procossing, which multiplies, drives safety automatic-alarming method

Also Published As

Publication number Publication date
CN109919134A (en) 2019-06-21

Similar Documents

Publication Publication Date Title
CN109919134B (en) Vision-based method for detecting abnormal behaviors of operating vehicle personnel
CN208344074U (en) A kind of comprehensive DAS (Driver Assistant System) of the automobile based on machine vision
CN111417990B (en) System and method for vehicle fleet management in a fleet of vehicles using driver-oriented imaging devices to monitor driver behavior
CN104442566B (en) A kind of passenger&#39;s precarious position warning device and alarm method
US8593519B2 (en) Field watch apparatus
CN103770780B (en) A kind of active safety systems of vehicles alarm shield device
CN107697069A (en) Fatigue of automobile driver driving intelligent control method
JP2020114377A (en) System and method detecting problematic health situation
CN106571015A (en) Driving behavior data collection method based on Internet
US11514688B2 (en) Drowsiness detection system
US20180012090A1 (en) Visual learning system and method for determining a driver&#39;s state
CN105383381B (en) For the control method for vehicle and its device of driving safety
CN111645694B (en) Driver driving state monitoring system and method based on attitude estimation
CN102555982A (en) Safety belt wearing identification method and device based on machine vision
CN105252973B (en) For the temperature monitoring method of automobile, device and equipment
KR20180119258A (en) Driver state sensing system, driver state sensing method, and vehicle including thereof
CN212484555U (en) Fatigue driving multi-source information detection system
CN106341661A (en) Patrol robot
CN103700220A (en) Fatigue driving monitoring device
WO2020161610A2 (en) Adaptive monitoring of a vehicle using a camera
CN107571735A (en) A kind of vehicle drivers status monitoring system and monitoring method
CN106650635A (en) Method and system for detecting rearview mirror viewing behavior of driver
CN111523386B (en) High-speed railway platform door monitoring and protecting method and system based on machine vision
CN114005088A (en) Safety rope wearing state monitoring method and system
CN107170190B (en) A kind of dangerous driving warning system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant