CN106372621A - Face recognition-based fatigue driving detection method - Google Patents

Face recognition-based fatigue driving detection method

Info

Publication number
CN106372621A
CN106372621A (application CN201610869723.4A)
Authority
CN
China
Prior art keywords
eye
eyes
face
image
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201610869723.4A
Other languages
Chinese (zh)
Inventor
陈泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fangchenggang Port District Gaochuang Information Technology Co Ltd
Original Assignee
Fangchenggang Port District Gaochuang Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fangchenggang Port District Gaochuang Information Technology Co Ltd filed Critical Fangchenggang Port District Gaochuang Information Technology Co Ltd
Priority to CN201610869723.4A priority Critical patent/CN106372621A/en
Publication of CN106372621A publication Critical patent/CN106372621A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168 Feature extraction; Face representation
    • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G06V40/175 Static expression
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/06 Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20068 Projection on vertical or horizontal image axis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person
    • G06T2207/30201 Face

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a face recognition-based fatigue driving detection method. The method includes the following steps: human face video images are acquired; an integral projection algorithm is applied to the first frame to locate the eyes and mouth, and the left-eye, right-eye and mouth images are extracted; an adaptive particle filtering-based eye tracking algorithm is used to track the eyes; an improved horizontal projection-based method is used to recognize the eye and mouth states; eye and mouth fatigue information is extracted, and the PERCLOS value, blink frequency, eye closing time and number of yawns are calculated; and driver fatigue is judged from the PERCLOS value, blink frequency, eye closing time and number of yawns. To address the shortcomings of the original horizontal projection method, an improved horizontal projection method is adopted to recognize the eye and mouth states, so high recognition accuracy and adaptability are achieved, and the fatigue judgment algorithm improves the accuracy of the judgment.

Description

Face recognition-based fatigue driving detection method
Technical field
The present invention relates to a fatigue driving detection method based on face recognition.
Background art
Fatigue driving refers to the phenomenon in which a driver, because of long hours behind the wheel, insufficient sleep, illness or similar factors, experiences blurred vision, slowed reactions, distracted attention and stiff movements; in other words, after prolonged continuous driving, the driver's physiological and psychological functions become disordered and driving ability declines. A driver who drives for long periods without rest easily becomes fatigued; fatigue weakens attention, judgment, decision-making and motor awareness, and continuing to drive then leads to delayed or premature actions, pauses in operation, or poorly timed corrections, which easily cause traffic accidents. For the sake of safety, one should not drive while fatigued.
According to the latest statistics, a lapse of attention lasting 3 s causes about 80% of such traffic accidents, mainly lane departures and rear-end collisions, and the large number of traffic accidents has seriously affected our lives. Research shows that if a warning is given to the driver 1.5 s before a road traffic accident occurs, 90% of such accidents can be avoided. Hence the saying that fatigue driving is as fierce as a tiger. Therefore, besides reasonable scheduling of driving time, effective early-warning technology is needed to reduce fatigue-related accidents as much as possible.
Fatigue driving has become the "number one killer" of traffic safety, driver fatigue monitoring systems are in wide demand, and there is an urgent need to improve fatigue driving detection technology.
Summary of the invention
The technical problem to be solved by the present invention is to provide a fatigue driving detection method based on face recognition.
The fatigue driving detection method based on face recognition comprises the following steps:
S1: acquiring facial video images;
S2: applying an integral projection algorithm to the first frame to locate the eyes and mouth, and extracting the left-eye, right-eye and mouth images;
S3: tracking the eyes with an adaptive particle filtering-based eye tracking algorithm: the eye positions accurately located in the first frame in step S2 are used as the initial information for tracking; if tracking fails, the method returns to step S2 to relocate the eye positions, and the above operations are repeated until the video sequence ends, so that the eye positions in every frame are obtained (a code sketch of this tracking loop is given after the step list);
S4: recognizing the eye and mouth states with an improved horizontal projection-based recognition method;
S5: extracting eye and mouth fatigue information, and calculating the PERCLOS value, blink frequency, eye closure time and number of yawns;
S6: judging driver fatigue by combining the PERCLOS value, blink frequency, eye closure time and number of yawns.
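The patent does not spell out the adaptive particle filter of step S3, so the following is only a minimal Python sketch of the track-and-relocate loop under stated assumptions: OpenCV and NumPy are available, the parameter `locate_fn` stands in for the step-S2 projection localiser, and the template-similarity particle weights are an illustrative simplification rather than the patent's adaptive scheme.

```python
# Minimal sketch of the step-S3 eye-tracking loop (illustrative, not the patent's exact filter).
import cv2
import numpy as np

class ParticleFilterTracker:
    """Tiny particle filter that tracks one eye box by patch similarity."""
    def __init__(self, gray, box, n_particles=100):
        self.box = np.asarray(box, dtype=float)           # (x, y, w, h)
        self.particles = np.tile(self.box[:2], (n_particles, 1))
        self.template = self._patch(gray, self.box)

    def _patch(self, gray, box):
        x, y, w, h = [int(v) for v in box]
        return gray[max(y, 0):y + h, max(x, 0):x + w].astype(float)

    def update(self, gray, motion_std=5.0):
        """Diffuse particles, weight by similarity to the template, resample.
        Returns False when every particle has lost the target (tracking failure)."""
        self.particles += np.random.normal(0.0, motion_std, self.particles.shape)
        w, h = self.box[2], self.box[3]
        weights = np.zeros(len(self.particles))
        for i, (px, py) in enumerate(self.particles):
            patch = self._patch(gray, (px, py, w, h))
            if patch.shape == self.template.shape:
                diff = np.mean((patch - self.template) ** 2)
                weights[i] = np.exp(-diff / 1000.0)
        if weights.sum() < 1e-9:
            return False
        weights /= weights.sum()
        self.box[:2] = weights @ self.particles           # weighted mean position
        idx = np.random.choice(len(weights), len(weights), p=weights)
        self.particles = self.particles[idx]              # resampling step
        return True

def track_eyes(video_path, locate_fn):
    """locate_fn(gray) returns a list of (x, y, w, h) eye boxes or None on failure;
    the step-S2 projection localiser (sketched after step S2-5) can be plugged in here."""
    cap = cv2.VideoCapture(video_path)
    trackers, positions = None, []
    while True:
        ok, frame = cap.read()
        if not ok:
            break                                         # video sequence ends
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if trackers is None:                              # (re)locate the eyes, as in step S2
            boxes = locate_fn(gray)
            if not boxes:
                continue
            trackers = [ParticleFilterTracker(gray, b) for b in boxes]
            continue
        if all(t.update(gray) for t in trackers):
            positions.append([t.box.copy() for t in trackers])
        else:
            trackers = None                               # tracking failed: back to step S2
    cap.release()
    return positions
```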
Further, the eye and mouth positioning described in step S2 is carried out as follows:
S2-1: perform vertical integral projection on the acquired face image; the boundary points of the peak of the vertical projection curve are the left and right face boundary coordinates, and the image between the left and right face boundaries is extracted according to these coordinates;
S2-2: perform horizontal integral projection on the image extracted in step S2-1; the first peak (maximum) of the horizontal integral projection curve corresponds to the forehead region, and the following four valleys correspond to the vertical coordinates of the eyebrows, eyes, nose and mouth respectively, denoted y1 for the eyebrows, y2 for the eyes, y3 for the nose and y4 for the mouth;
S2-3: extract the image between [y1, y2 + (y2 - y1)], i.e. the eyebrow and eye region; perform vertical integral projection on this region, and the two valleys of the resulting integral projection curve correspond to the abscissas of the left-eye and right-eye centres, denoted x1 and x2 respectively;
S2-4: in parallel with step S2-3, extract the image between [y3, y4 + (y4 - y3)], i.e. the mouth region; perform vertical integral projection on this region, and the valley of the resulting integral projection curve corresponds to the abscissa of the mouth centre, denoted x3;
S2-5: extract the left-eye, right-eye and mouth images according to the obtained coordinates of the left eye, right eye and mouth (a projection-based localisation sketch follows).
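Only as a minimal sketch under simplifying assumptions (a grayscale frontal face, dark facial features producing valleys in the intensity projections, naive peak/valley picking instead of full curve analysis), steps S2-1 to S2-5 could look roughly like this; all function names and thresholds are illustrative, not taken from the patent.

```python
import numpy as np

def vertical_projection(gray):
    """Column-wise intensity sum (integral projection onto the x axis)."""
    return gray.sum(axis=0).astype(float)

def horizontal_projection(gray):
    """Row-wise intensity sum (integral projection onto the y axis)."""
    return gray.sum(axis=1).astype(float)

def face_bounds(gray, frac=0.6):
    """S2-1: left/right face boundaries = edges of the main peak of the vertical projection."""
    vp = vertical_projection(gray)
    cols = np.where(vp > frac * vp.max())[0]
    return cols.min(), cols.max()

def feature_rows(face):
    """S2-2: y1..y4 = the four deepest valleys of the horizontal projection below the forehead peak."""
    hp = horizontal_projection(face)
    forehead = int(np.argmax(hp))                       # forehead row (first/strongest peak)
    valleys = [r for r in range(forehead + 1, len(hp) - 1)
               if hp[r] < hp[r - 1] and hp[r] <= hp[r + 1]]
    deepest = sorted(valleys, key=lambda r: hp[r])[:4]  # eyebrows, eyes, nose, mouth
    return sorted(deepest)                              # top-to-bottom: y1, y2, y3, y4

def locate_parts(gray):
    left, right = face_bounds(gray)
    face = gray[:, left:right]
    y1, y2, y3, y4 = feature_rows(face)
    eye_band = face[y1:y2 + (y2 - y1)]                  # S2-3: eyebrow + eye band
    vp_eyes = vertical_projection(eye_band)
    half = len(vp_eyes) // 2
    x1 = int(np.argmin(vp_eyes[:half]))                 # left-eye centre abscissa
    x2 = half + int(np.argmin(vp_eyes[half:]))          # right-eye centre abscissa
    mouth_band = face[y3:y4 + (y4 - y3)]                # S2-4: mouth band
    x3 = int(np.argmin(vertical_projection(mouth_band)))  # mouth centre abscissa
    return (y1, y2, y3, y4), (x1, x2, x3)               # S2-5: crop eye/mouth patches from these
```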
Further, the eye and mouth state recognition method described in step S4 comprises the following steps:
S4-1: for the left-eye image extracted in step S2, perform horizontal projection to obtain its horizontal projection curve and compute the curve height h_l and width w_l; the first frame in which h_l exceeds a set value is taken as the open-eye reference image, and its h_l, w_l and h_l/w_l are computed and saved, denoted H_l, W_l and K_l respectively.
The open/closed state of the left eye in subsequent frames is determined as follows:
1) when h_l > H_l, the eye is open;
2) when h_l < H_l, compute k_l = h_l / w_l;
3) when k_l is smaller than a preset proportion of K_l, the eye is closed; otherwise it is open;
S4-2: for the right-eye image extracted in step S2, perform horizontal projection to obtain its horizontal projection curve and compute the curve height h_r and width w_r; the first frame in which h_r exceeds a set value is taken as the open-eye reference image, and its h_r, w_r and h_r/w_r are computed and saved, denoted H_r, W_r and K_r respectively.
The open/closed state of the right eye in subsequent frames is determined as follows:
1) when h_r > H_r, the eye is open;
2) when h_r < H_r, compute k_r = h_r / w_r;
3) when k_r is smaller than a preset proportion of K_r, the eye is closed; otherwise it is open;
S4-3: for the mouth image extracted in step S2, perform horizontal projection to obtain its horizontal projection curve and compute the curve height h_m and width w_m; the first frame in which h_m exceeds a set value is taken as the fully-open-mouth reference image, and its h_m, w_m and h_m/w_m are computed and saved, denoted H_m, W_m and K_m respectively.
The open/closed state of the mouth in subsequent frames is determined as follows:
1) when h_m > H_m, the mouth is fully open;
2) when h_m < H_m, compute k_m = h_m / w_m;
3) when k_m is smaller than a preset proportion of K_m, the mouth is completely closed; otherwise it is half open (a recognition sketch follows);
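A minimal sketch of the improved horizontal-projection state recognition in S4-1 to S4-3, under one reasonable reading of "height" and "width" of the projection curve (maximum value and non-zero extent of a dark-pixel row count); the binarisation threshold, the open-frame threshold and the "preset proportion" are illustrative values, not taken from the patent.

```python
import numpy as np

def projection_height_width(region, dark_thresh=80):
    """Horizontal projection of an eye/mouth patch: count dark pixels per row,
    then read off the curve height (max count) and width (number of non-zero rows)."""
    mask = (region < dark_thresh).astype(int)
    curve = mask.sum(axis=1)                      # row-wise (horizontal) projection
    return int(curve.max()), int(np.count_nonzero(curve))

class StateRecognizer:
    """Improved horizontal-projection state recognition for one eye or the mouth."""
    def __init__(self, open_thresh, ratio=0.6):
        self.open_thresh = open_thresh   # h value above which the first frame counts as open
        self.ratio = ratio               # illustrative 'preset proportion' of K
        self.H = self.W = self.K = None  # reference values from the first open frame

    def feed(self, region):
        h, w = projection_height_width(region)
        if self.H is None:                        # still waiting for the reference frame
            if h > self.open_thresh:
                self.H, self.W, self.K = h, w, h / w
            return "open"                         # assume open until a reference exists
        if h > self.H:
            return "open"                         # rule 1): h above the reference height
        k = h / max(w, 1)                         # rule 2): compute k = h / w
        return "closed" if k < self.ratio * self.K else "open"   # rule 3)

# Usage sketch: one recognizer per part; for the mouth, "open" maps to fully open
# and "closed" to closed-or-half-open.
left_eye = StateRecognizer(open_thresh=30)
mouth = StateRecognizer(open_thresh=50)
```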
Further, the PERCLOS value, blink frequency, eye closure time and yawn count described in step S5 are computed as follows:
1) the PERCLOS value is computed as
f_p = \frac{\sum_{x=1}^{r} t_x}{t_0},
where f_p is the percentage of the set time period during which the eyes are closed, t_x is the x-th duration from t_2 to t_3 (t_2 being the moment the degree of eye opening falls to 20%, t_3 the moment the eyes open to 20% again), and r is the number of eye closures of the driver within the measurement time t_0;
2) the blink frequency is computed as
f_b = \frac{60}{t_{bf}},
where f_b is the blink frequency and t_bf is the time interval between two blinks;
if the blink frequency f_b falls below or rises above the chosen thresholds, the driver is regarded as fatigued;
3) the eye closure time is computed as
ect = m_{close} \times \frac{1}{w},
where ect is the eye closure time, m_close is the longest number of consecutive closed-eye frames within one minute, and w is the video acquisition rate (in frames/s);
if the eye closure time exceeds the chosen threshold of 3 s, the driver is regarded as fatigued;
4) yawn count: the time during which the mouth stays fully open is computed; if this time exceeds a set threshold it is counted as one yawn; if the number of yawns within the unit time exceeds a set threshold, the driver is considered to be in a fatigued state (see the sketch after these formulas);
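A minimal sketch of the step-S5 measures, assuming per-frame eye and mouth states (e.g. from the recognizer sketched above) sampled at w frames per second over one measurement window; PERCLOS follows f_p = (Σ t_x) / t_0, the blink frequency is expressed directly as blinks per minute (equivalent to 60/t_bf for evenly spaced blinks), and the 0.6 s yawn threshold is the value quoted later in S6-3.

```python
import numpy as np

def fatigue_metrics(eye_closed, mouth_open, fps, yawn_min_s=0.6):
    """eye_closed / mouth_open: per-frame booleans over the measurement window.
    Returns (PERCLOS, blinks per minute, longest eye-closure time in s, yawn count)."""
    eye_closed = np.asarray(eye_closed, dtype=bool)
    mouth_open = np.asarray(mouth_open, dtype=bool)
    t0 = len(eye_closed) / fps                        # measurement time t_0 in seconds

    # PERCLOS: fraction of the window spent with the eyes closed
    perclos = eye_closed.sum() / len(eye_closed)

    # Blink frequency: count closed -> open transitions, scaled to blinks per minute
    transitions = np.flatnonzero(eye_closed[:-1] & ~eye_closed[1:])
    blink_freq = 60.0 * len(transitions) / t0

    # Eye closure time: longest consecutive run of closed-eye frames, in seconds
    longest = run = 0
    for c in eye_closed:
        run = run + 1 if c else 0
        longest = max(longest, run)
    eye_closure_time = longest / fps                  # ect = m_close * (1 / w)

    # Yawn count: a fully-open mouth lasting longer than yawn_min_s counts as one yawn
    yawns = run = 0
    for o in mouth_open:
        run = run + 1 if o else 0
        if run == int(yawn_min_s * fps) + 1:          # run has just exceeded the threshold
            yawns += 1
    return perclos, blink_freq, eye_closure_time, yawns
```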
Further, the fatigue judgment described in step S6 is specifically as follows (a decision sketch follows):
S6-1: first judge the eye state; when the eyes are closed, check whether the closed-eye time exceeds 3 s; if it does, the driver is judged to be driving fatigued;
S6-2: if the closed-eye time is less than 3 s, check whether the PERCLOS value f_p over 30 s exceeds 40%; if f_p > 40%, the driver is judged to be driving fatigued;
S6-3: if f_p < 40%, check whether the number of yawns within the unit time exceeds 15, each yawn lasting more than 0.6 s; if the yawn count exceeds 15, the driver is judged to be driving fatigued; otherwise the driver is considered to be driving normally and detection starts again.
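The three-stage judgment above maps onto a short decision cascade; the sketch below assumes the metrics returned by the function in the previous sketch and uses the thresholds quoted in S6-1 to S6-3 (the yawn limit is treated as a count per detection window, since the source does not state its unit precisely).

```python
def is_fatigued(eye_closure_time, perclos, yawns,
                close_limit_s=3.0, perclos_limit=0.40, yawn_limit=15):
    """Step-S6 cascade: eye closure time, then PERCLOS over 30 s, then yawn count."""
    if eye_closure_time > close_limit_s:     # S6-1: closed-eye time above 3 s
        return True
    if perclos > perclos_limit:              # S6-2: PERCLOS above 40%
        return True
    if yawns > yawn_limit:                   # S6-3: too many yawns of > 0.6 s
        return True
    return False                             # normal driving; detection restarts
```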
The beneficial effects of the invention are as follows:
Addressing the shortcomings of the original horizontal projection method, the present invention recognizes the eye and mouth states with an improved horizontal projection algorithm, extracts four kinds of information, namely the PERCLOS value, blink frequency, eye closure time and yawn count, and finally judges whether the driver is fatigued by considering the four kinds of information together; this yields higher recognition accuracy and adaptability, and the fatigue judgment algorithm improves the accuracy of the judgment.
Specific embodiment
The present invention is further illustrated by the following specific examples, which are not intended to limit the invention.
Experiment 1: locating the eyes and mouth with the integral projection algorithm of the present invention.
Face images of 3 subjects (denoted subject 1, subject 2 and subject 3) were acquired in the driving cab; each subject showed different expression changes during acquisition and 200 images were collected per subject, giving 600 colour face images in total. One image of subject 1 was selected and its eyes and mouth were located with the integral projection algorithm of the present invention; the experimental data obtained are shown in Table 1 below.
Table 1: values of y1, y2, y3, y4 and x1, x2, x3 obtained by projection
Parameter  y1   y2   y3   y4   x1   x2   x3
Value      90   139  216  263  82   200  134
Eye and mouth localisation experiments were then carried out on the 600 face images of the three subjects, and the eye and mouth images were extracted; the results are shown in Table 2 below.
Table 2: eye and mouth localisation accuracy
Part    Number tested  Correct  Incorrect  Accuracy (%)
Eyes    600            558      42         93.0
Mouth   600            570      30         95.0
The missed eye detections were mainly caused by occlusion of the eyes, while the missed and false mouth detections were mainly caused by hand interference or occlusion.
The experimental results show that the integral projection algorithm of the present invention locates the eyes and mouth with high accuracy and can accurately locate and extract the eye and mouth images from a frontal face.
Experiment 2: recognising the eye and mouth states with the improved horizontal projection method of the present invention.
Images of subject 2 collected in Experiment 1 were used for eye and mouth state recognition. The first frame satisfying h_l > 30 and h_r > 30 was taken as the open-eye reference image, and the first frame satisfying h_m > 50 as the fully-open-mouth reference image; the resulting parameter values H_l, W_l, K_l, H_r, W_r, K_r, H_m, W_m and K_m are shown in Table 3 below.
Table 3: parameter values of the initial-frame open-eye and fully-open-mouth reference images
The eye and mouth parameters and states in subsequent frames were then determined by the method according to the invention; 7 of the frames were used in the experiment, where a is a closed-left-eye image, b an open-left-eye image, c a closed-right-eye image, d an open-right-eye image, e a closed-mouth image, f a half-open-mouth image and g a fully-open-mouth image. The results are shown in Table 4 below.
Table 4: parameters and state recognition results for the eyes and mouth in subsequent frames
The experimental results show that for image c the projection curve gives h_r = 47 > H_r = 32, so it was recognised as an open-eye state although it is actually a closed-eye state; the main reason is that interference around the eye enlarged the height of its horizontal projection and caused the misjudgment.
Experiment 3: fatigue judgment experiment.
Video images of 6 subjects were each acquired for 4 minutes in the driving cab; the video of each subject was split into 1-minute segments and 3 segments were selected per subject, with a frame rate of 15 frames/s, a frame width of 640 and a frame height of 180, each segment containing 900 frames. Three of the segments were chosen to compute the ect value, the f_p value, the f_b value and the yawn count; the final fatigue judgment results and the subjects' actual fatigue states are shown in Table 5 below.
Table 5: fatigue judgment results for the subjects
The test results for the 6 subjects show that the accuracy of this fatigue judgment method is 94.4%. Because the sample space collected in this experiment is limited, with only 6 subjects tested, the recognition accuracy is affected; if the sample space were enlarged, a better recognition result would be obtained.

Claims (5)

1. A fatigue driving detection method based on face recognition, characterised by comprising the following steps:
S1: acquiring facial video images;
S2: applying an integral projection algorithm to the first frame to locate the eyes and mouth, and extracting the left-eye, right-eye and mouth images;
S3: tracking the eyes with an adaptive particle filtering-based eye tracking algorithm, wherein the eye positions accurately located in the first frame in step S2 are used as the initial information for tracking; if tracking fails, the method returns to step S2 to relocate the eye positions, and the above operations are repeated until the video sequence ends, so that the eye positions in every frame are obtained;
S4: recognizing the eye and mouth states with an improved horizontal projection-based recognition method;
S5: extracting eye and mouth fatigue information, and calculating the PERCLOS value, blink frequency, eye closure time and number of yawns;
S6: judging driver fatigue by combining the PERCLOS value, blink frequency, eye closure time and number of yawns.
2. The fatigue driving detection method based on face recognition according to claim 1, characterised in that the eye and mouth positioning described in step S2 is carried out as follows:
S2-1: performing vertical integral projection on the acquired face image, wherein the boundary points of the peak of the vertical projection curve are the left and right face boundary coordinates, and extracting the image between the left and right face boundaries according to these coordinates;
S2-2: performing horizontal integral projection on the image extracted in step S2-1, wherein the first peak (maximum) of the horizontal integral projection curve corresponds to the forehead region, and the following four valleys correspond to the vertical coordinates of the eyebrows, eyes, nose and mouth respectively, denoted y1 for the eyebrows, y2 for the eyes, y3 for the nose and y4 for the mouth;
S2-3: extracting the image between [y1, y2 + (y2 - y1)], namely the eyebrow and eye region, and performing vertical integral projection on this region, wherein the two valleys of the resulting integral projection curve correspond to the abscissas of the left-eye and right-eye centres, denoted x1 and x2 respectively;
S2-4: in parallel with step S2-3, extracting the image between [y3, y4 + (y4 - y3)], namely the mouth region, and performing vertical integral projection on this region, wherein the valley of the resulting integral projection curve corresponds to the abscissa of the mouth centre, denoted x3;
S2-5: extracting the left-eye, right-eye and mouth images according to the obtained coordinates of the left eye, right eye and mouth.
3. The fatigue driving detection method based on face recognition according to claim 1, characterised in that the eye and mouth state recognition method described in step S4 comprises the following steps:
S4-1: for the left-eye image extracted in step S2, performing horizontal projection to obtain its horizontal projection curve and computing the curve height h_l and width w_l, taking the first frame in which h_l exceeds a set value as the open-eye reference image, and computing and saving its h_l, w_l and h_l/w_l, denoted H_l, W_l and K_l respectively;
determining the open/closed state of the left eye in subsequent frames as follows:
1) when h_l > H_l, the eye is open;
2) when h_l < H_l, compute k_l = h_l / w_l;
3) when k_l is smaller than a preset proportion of K_l, the eye is closed; otherwise it is open;
S4-2: for the right-eye image extracted in step S2, performing horizontal projection to obtain its horizontal projection curve and computing the curve height h_r and width w_r, taking the first frame in which h_r exceeds a set value as the open-eye reference image, and computing and saving its h_r, w_r and h_r/w_r, denoted H_r, W_r and K_r respectively;
determining the open/closed state of the right eye in subsequent frames as follows:
1) when h_r > H_r, the eye is open;
2) when h_r < H_r, compute k_r = h_r / w_r;
3) when k_r is smaller than a preset proportion of K_r, the eye is closed; otherwise it is open;
S4-3: for the mouth image extracted in step S2, performing horizontal projection to obtain its horizontal projection curve and computing the curve height h_m and width w_m, taking the first frame in which h_m exceeds a set value as the fully-open-mouth reference image, and computing and saving its h_m, w_m and h_m/w_m, denoted H_m, W_m and K_m respectively;
determining the open/closed state of the mouth in subsequent frames as follows:
1) when h_m > H_m, the mouth is fully open;
2) when h_m < H_m, compute k_m = h_m / w_m;
3) when k_m is smaller than a preset proportion of K_m, the mouth is completely closed; otherwise it is half open.
4. The fatigue driving detection method based on face recognition according to claim 1, characterised in that the PERCLOS value, blink frequency, eye closure time and yawn count described in step S5 are computed as follows:
1) the PERCLOS value is computed as f_p = \frac{\sum_{x=1}^{r} t_x}{t_0}, where f_p is the percentage of the set time period during which the eyes are closed, t_x is the x-th duration from t_2 to t_3 (t_2 being the moment the degree of eye opening falls to 20%, t_3 the moment the eyes open to 20% again), and r is the number of eye closures of the driver within the measurement time t_0;
2) the blink frequency is computed as f_b = \frac{60}{t_{bf}}, where f_b is the blink frequency and t_bf is the time interval between two blinks; if the blink frequency f_b falls below or rises above the chosen thresholds, the driver is regarded as fatigued;
3) the eye closure time is computed as ect = m_{close} \times \frac{1}{w}, where ect is the eye closure time, m_close is the longest number of consecutive closed-eye frames within one minute, and w is the video acquisition rate (in frames/s); if the eye closure time exceeds the chosen threshold of 3 s, the driver is regarded as fatigued;
4) yawn count: the time during which the mouth stays fully open is computed; if this time exceeds a set threshold it is counted as one yawn; if the number of yawns within the unit time exceeds a set threshold, the driver is considered to be in a fatigued state.
5. The fatigue driving detection method based on face recognition according to claim 1, characterised in that the fatigue judgment described in step S6 is specifically as follows:
S6-1: first judging the eye state; when the eyes are closed, checking whether the closed-eye time exceeds 3 s; if it does, the driver is judged to be driving fatigued;
S6-2: if the closed-eye time is less than 3 s, checking whether the PERCLOS value f_p over 30 s exceeds 40%; if f_p > 40%, the driver is judged to be driving fatigued;
S6-3: if f_p < 40%, checking whether the number of yawns within the unit time exceeds 15, each yawn lasting more than 0.6 s; if the yawn count exceeds 15, the driver is judged to be driving fatigued; otherwise the driver is considered to be driving normally and detection starts again.
CN201610869723.4A 2016-09-30 2016-09-30 Face recognition-based fatigue driving detection method Withdrawn CN106372621A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610869723.4A CN106372621A (en) 2016-09-30 2016-09-30 Face recognition-based fatigue driving detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610869723.4A CN106372621A (en) 2016-09-30 2016-09-30 Face recognition-based fatigue driving detection method

Publications (1)

Publication Number Publication Date
CN106372621A true CN106372621A (en) 2017-02-01

Family

ID=57898527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610869723.4A Withdrawn CN106372621A (en) 2016-09-30 2016-09-30 Face recognition-based fatigue driving detection method

Country Status (1)

Country Link
CN (1) CN106372621A (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107085715A (en) * 2017-05-19 2017-08-22 武汉理工大学 A kind of television set intelligently detects the dormant system and method for user
CN107480629A (en) * 2017-08-11 2017-12-15 常熟理工学院 A kind of method for detecting fatigue driving and device based on depth information
CN107679468A (en) * 2017-09-19 2018-02-09 浙江师范大学 A kind of embedded computer vision detects fatigue driving method and device
CN107704849A (en) * 2017-10-28 2018-02-16 上海爱优威软件开发有限公司 The face identification method and system of double verification function
CN107741784A (en) * 2017-10-09 2018-02-27 济南大学 A kind of amusement exchange method suitable for leaden paralysis patient
CN107798295A (en) * 2017-09-27 2018-03-13 杭州分数科技有限公司 Driving based reminding method, device and equipment
CN107992831A (en) * 2017-12-07 2018-05-04 深圳云天励飞技术有限公司 Fatigue state detection method, device, electronic equipment and storage medium
CN108363968A (en) * 2018-01-31 2018-08-03 上海瀚所信息技术有限公司 A kind of tired driver driving monitoring system and method based on key point extraction
CN108460345A (en) * 2018-02-08 2018-08-28 电子科技大学 A kind of facial fatigue detection method based on face key point location
CN108545080A (en) * 2018-03-20 2018-09-18 北京理工大学 Driver Fatigue Detection and system
CN108596022A (en) * 2018-03-13 2018-09-28 苏州奥科德瑞智能科技有限公司 A kind of mobile image identifying system
CN108647616A (en) * 2018-05-01 2018-10-12 南京理工大学 Real-time drowsiness detection method based on facial characteristics
CN108830240A (en) * 2018-06-22 2018-11-16 广州通达汽车电气股份有限公司 Fatigue driving state detection method, device, computer equipment and storage medium
CN109063545A (en) * 2018-06-13 2018-12-21 五邑大学 A kind of method for detecting fatigue driving and device
CN109119095A (en) * 2018-08-31 2019-01-01 平安科技(深圳)有限公司 Level of fatigue recognition methods, device, computer equipment and storage medium
CN109215293A (en) * 2018-11-23 2019-01-15 深圳市元征科技股份有限公司 A kind of method for detecting fatigue driving, device and vehicle-mounted terminal equipment
CN109308445A (en) * 2018-07-25 2019-02-05 南京莱斯电子设备有限公司 A kind of fixation post personnel fatigue detection method based on information fusion
CN109447025A (en) * 2018-11-08 2019-03-08 北京旷视科技有限公司 Fatigue detection method, device, system and computer readable storage medium
CN109522820A (en) * 2018-10-29 2019-03-26 江西科技学院 A kind of fatigue monitoring method, system, readable storage medium storing program for executing and mobile terminal
CN109635851A (en) * 2018-11-23 2019-04-16 武汉风行在线技术有限公司 A kind of smart television human fatigue detection system and method based on face multiple features fusion
CN109849660A (en) * 2019-01-29 2019-06-07 合肥革绿信息科技有限公司 A kind of vehicle safety control system
CN109934199A (en) * 2019-03-22 2019-06-25 扬州大学 A kind of Driver Fatigue Detection based on computer vision and system
CN109977930A (en) * 2019-04-29 2019-07-05 中国电子信息产业集团有限公司第六研究所 Method for detecting fatigue driving and device
CN110021147A (en) * 2019-05-07 2019-07-16 四川九洲视讯科技有限责任公司 A kind of method for detecting fatigue driving demarcated based on machine learning and numerical value
CN110210445A (en) * 2019-06-12 2019-09-06 广东工业大学 A kind of fatigue state detection method, device, equipment and the medium of target object
CN110276273A (en) * 2019-05-30 2019-09-24 福建工程学院 Merge the Driver Fatigue Detection of facial characteristics and the estimation of image pulse heart rate
CN110705453A (en) * 2019-09-29 2020-01-17 中国科学技术大学 Real-time fatigue driving detection method
CN110751810A (en) * 2019-10-29 2020-02-04 深圳联安通达科技有限公司 Fatigue driving detection method and device
CN111104817A (en) * 2018-10-25 2020-05-05 中车株洲电力机车研究所有限公司 Fatigue detection method based on deep learning
WO2020140723A1 (en) * 2018-12-30 2020-07-09 广州市百果园信息技术有限公司 Method, apparatus and device for detecting dynamic facial expression, and storage medium
CN111582086A (en) * 2020-04-26 2020-08-25 湖南大学 Fatigue driving identification method and system based on multiple characteristics
CN112052775A (en) * 2020-08-31 2020-12-08 同济大学 Fatigue driving detection method based on gradient histogram video recognition technology
CN112528843A (en) * 2020-12-07 2021-03-19 湖南警察学院 Motor vehicle driver fatigue detection method fusing facial features
CN113095297A (en) * 2021-05-11 2021-07-09 昆明理工大学 Fatigue detection method based on one-dimensional projection tracking eye movement rate
CN115311819A (en) * 2022-10-10 2022-11-08 南京智慧交通信息股份有限公司 Intelligent bus intelligent vehicle-mounted real-time warning system and method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
赵琳 (Zhao Lin): "基于人脸识别的疲劳驾驶监控方法研究" [Research on a fatigue driving monitoring method based on face recognition], Master's thesis, Changchun University of Technology (《长春工业大学硕士学位论文》) *

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107085715A (en) * 2017-05-19 2017-08-22 武汉理工大学 A kind of television set intelligently detects the dormant system and method for user
CN107480629A (en) * 2017-08-11 2017-12-15 常熟理工学院 A kind of method for detecting fatigue driving and device based on depth information
CN107679468A (en) * 2017-09-19 2018-02-09 浙江师范大学 A kind of embedded computer vision detects fatigue driving method and device
CN107798295A (en) * 2017-09-27 2018-03-13 杭州分数科技有限公司 Driving based reminding method, device and equipment
CN107741784A (en) * 2017-10-09 2018-02-27 济南大学 A kind of amusement exchange method suitable for leaden paralysis patient
CN107704849A (en) * 2017-10-28 2018-02-16 上海爱优威软件开发有限公司 The face identification method and system of double verification function
CN107992831A (en) * 2017-12-07 2018-05-04 深圳云天励飞技术有限公司 Fatigue state detection method, device, electronic equipment and storage medium
CN108363968A (en) * 2018-01-31 2018-08-03 上海瀚所信息技术有限公司 A kind of tired driver driving monitoring system and method based on key point extraction
CN108460345A (en) * 2018-02-08 2018-08-28 电子科技大学 A kind of facial fatigue detection method based on face key point location
CN108596022A (en) * 2018-03-13 2018-09-28 苏州奥科德瑞智能科技有限公司 A kind of mobile image identifying system
CN108596022B (en) * 2018-03-13 2021-09-28 广东合晟网络科技有限公司 Movable image recognition system
CN108545080A (en) * 2018-03-20 2018-09-18 北京理工大学 Driver Fatigue Detection and system
CN108647616A (en) * 2018-05-01 2018-10-12 南京理工大学 Real-time drowsiness detection method based on facial characteristics
CN109063545B (en) * 2018-06-13 2021-11-12 五邑大学 Fatigue driving detection method and device
CN109063545A (en) * 2018-06-13 2018-12-21 五邑大学 A kind of method for detecting fatigue driving and device
CN108830240A (en) * 2018-06-22 2018-11-16 广州通达汽车电气股份有限公司 Fatigue driving state detection method, device, computer equipment and storage medium
CN109308445A (en) * 2018-07-25 2019-02-05 南京莱斯电子设备有限公司 A kind of fixation post personnel fatigue detection method based on information fusion
CN109308445B (en) * 2018-07-25 2019-06-25 南京莱斯电子设备有限公司 A kind of fixation post personnel fatigue detection method based on information fusion
CN109119095B (en) * 2018-08-31 2023-06-06 平安科技(深圳)有限公司 Fatigue grade identification method, device, computer equipment and storage medium
CN109119095A (en) * 2018-08-31 2019-01-01 平安科技(深圳)有限公司 Level of fatigue recognition methods, device, computer equipment and storage medium
CN111104817A (en) * 2018-10-25 2020-05-05 中车株洲电力机车研究所有限公司 Fatigue detection method based on deep learning
CN109522820A (en) * 2018-10-29 2019-03-26 江西科技学院 A kind of fatigue monitoring method, system, readable storage medium storing program for executing and mobile terminal
CN109447025A (en) * 2018-11-08 2019-03-08 北京旷视科技有限公司 Fatigue detection method, device, system and computer readable storage medium
CN109635851A (en) * 2018-11-23 2019-04-16 武汉风行在线技术有限公司 A kind of smart television human fatigue detection system and method based on face multiple features fusion
CN109215293A (en) * 2018-11-23 2019-01-15 深圳市元征科技股份有限公司 A kind of method for detecting fatigue driving, device and vehicle-mounted terminal equipment
WO2020140723A1 (en) * 2018-12-30 2020-07-09 广州市百果园信息技术有限公司 Method, apparatus and device for detecting dynamic facial expression, and storage medium
CN109849660A (en) * 2019-01-29 2019-06-07 合肥革绿信息科技有限公司 A kind of vehicle safety control system
CN109934199A (en) * 2019-03-22 2019-06-25 扬州大学 A kind of Driver Fatigue Detection based on computer vision and system
CN109977930B (en) * 2019-04-29 2021-04-02 中国电子信息产业集团有限公司第六研究所 Fatigue driving detection method and device
CN109977930A (en) * 2019-04-29 2019-07-05 中国电子信息产业集团有限公司第六研究所 Method for detecting fatigue driving and device
CN110021147A (en) * 2019-05-07 2019-07-16 四川九洲视讯科技有限责任公司 A kind of method for detecting fatigue driving demarcated based on machine learning and numerical value
CN110276273A (en) * 2019-05-30 2019-09-24 福建工程学院 Merge the Driver Fatigue Detection of facial characteristics and the estimation of image pulse heart rate
CN110276273B (en) * 2019-05-30 2024-01-02 福建工程学院 Driver fatigue detection method integrating facial features and image pulse heart rate estimation
CN110210445A (en) * 2019-06-12 2019-09-06 广东工业大学 A kind of fatigue state detection method, device, equipment and the medium of target object
CN110705453A (en) * 2019-09-29 2020-01-17 中国科学技术大学 Real-time fatigue driving detection method
CN110751810A (en) * 2019-10-29 2020-02-04 深圳联安通达科技有限公司 Fatigue driving detection method and device
CN111582086A (en) * 2020-04-26 2020-08-25 湖南大学 Fatigue driving identification method and system based on multiple characteristics
CN112052775A (en) * 2020-08-31 2020-12-08 同济大学 Fatigue driving detection method based on gradient histogram video recognition technology
CN112528843A (en) * 2020-12-07 2021-03-19 湖南警察学院 Motor vehicle driver fatigue detection method fusing facial features
CN113095297A (en) * 2021-05-11 2021-07-09 昆明理工大学 Fatigue detection method based on one-dimensional projection tracking eye movement rate
CN113095297B (en) * 2021-05-11 2022-07-15 昆明理工大学 Fatigue detection method based on one-dimensional projection tracking eye movement rate
CN115311819A (en) * 2022-10-10 2022-11-08 南京智慧交通信息股份有限公司 Intelligent bus intelligent vehicle-mounted real-time warning system and method thereof
CN115311819B (en) * 2022-10-10 2023-01-20 南京智慧交通信息股份有限公司 Intelligent bus intelligent vehicle-mounted real-time warning system and method thereof

Similar Documents

Publication Publication Date Title
CN106372621A (en) Face recognition-based fatigue driving detection method
CN108446600A A kind of vehicle driver's fatigue monitoring early warning system and method
CN109308445B (en) A kind of fixation post personnel fatigue detection method based on information fusion
CN103340637A (en) System and method for driver alertness intelligent monitoring based on fusion of eye movement and brain waves
CN104183091B (en) System for adjusting sensitivity of fatigue driving early warning system in self-adaptive mode
CN107292251B (en) Driver fatigue detection method and system based on human eye state
CN110119676A (en) A kind of Driver Fatigue Detection neural network based
CN111616718B (en) Method and system for detecting fatigue state of driver based on attitude characteristics
CN106529496B (en) A kind of method of engine drivers in locomotive depot real-time video fatigue detecting
CN104574817A (en) Machine vision-based fatigue driving pre-warning system suitable for smart phone
CN108230619A (en) Method for detecting fatigue driving based on multi-feature fusion
CN106080194A (en) The method for early warning of anti-fatigue-driving and system
CN107085715A (en) A kind of television set intelligently detects the dormant system and method for user
CN105574487A (en) Facial feature based driver attention state detection method
CN101305913A (en) Face beauty assessment method based on video
CN101877051A (en) Driver attention state monitoring method and device
CN106650635A (en) Method and system for detecting rearview mirror viewing behavior of driver
CN109740477A (en) Study in Driver Fatigue State Surveillance System and its fatigue detection method
CN104068868A (en) Method and device for monitoring driver fatigue on basis of machine vision
CN103198616A (en) Method and system for detecting fatigue driving based on head and neck movement feature recognition of driver
CN106904169A (en) Traffic safety method for early warning and device
CN108021875A A kind of vehicle driver's personalization fatigue monitoring and method for early warning
CN110427830A Driver's abnormal driving real-time detection system for state and method
Du et al. Driver fatigue detection based on eye state analysis
CN110148092A The analysis method of teenager's sitting posture based on machine vision and emotional state

Legal Events

Date Code Title Description
C06  Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20170201)