CN113761982A - Fatigue driving identification system and identification method thereof - Google Patents


Info

Publication number
CN113761982A
CN113761982A (application CN202010503035.2A)
Authority
CN
China
Prior art keywords
face
information
image
unit
module
Prior art date
Legal status
Pending
Application number
CN202010503035.2A
Other languages
Chinese (zh)
Inventor
黄铭嘉
Current Assignee
Fuhong Shanghai Electronic Technology Co Ltd
Yaohong Jiaxing Electronic Technology Co ltd
MKD Tech Inc
Original Assignee
Fuhong Shanghai Electronic Technology Co Ltd
Yaohong Jiaxing Electronic Technology Co ltd
MKD Tech Inc
Priority date
Filing date
Publication date
Application filed by Fuhong Shanghai Electronic Technology Co Ltd, Yaohong Jiaxing Electronic Technology Co ltd, MKD Tech Inc filed Critical Fuhong Shanghai Electronic Technology Co Ltd
Priority to CN202010503035.2A
Publication of CN113761982A
Legal status: Pending

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • G08B21/06Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms

Landscapes

  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a fatigue driving identification system and an identification method thereof. An optical identification module and a monitoring module are mounted in the vehicle; the optical identification module comprises an image capturing unit, an image receiving unit, an estimation operation processing unit, a transmission unit and a power supply unit. The image capturing unit emits laser beams or LED beams onto the face of the subject and captures facial image information, the laser wavelength being between 940 nm and 1400 nm; the transmission unit transmits eye identification and head position information to the monitoring module, and the optical identification module performs time-of-flight ranging and identification steps and procedures on the subject's face. When the driver is fatigued or dozing off, the driver can thus be notified in real time, avoiding traffic accidents caused by fatigued driving.

Description

Fatigue driving identification system and identification method thereof
Technical Field
The present invention relates to identification systems, and particularly to a fatigue driving identification system and a method thereof.
Background
At present, roughly 20% of traffic accidents and one quarter of fatal major traffic accidents are caused by driver fatigue, illness and fever, or drunk driving. Preventing fatigued, sick and drunk driving is therefore of great significance for reducing traffic accidents and protecting drivers' personal safety. For fatigue driving, current detection methods fall mainly into subjective evaluation and objective evaluation.
Subjective evaluation is self-assessment by the driver or assessment of the driver by others; it is clearly subject to human bias and inconsistent evaluation standards, so objective evaluation is needed. Research shows that several indices of the human electrocardiographic (ECG) signal are closely related to fatigue, and existing approaches monitor a driver's ECG either by skin-attached contact electrodes or by non-contact detection with a common grounding electrode, in order to detect fatigue driving.
However, the traditional skin-contact ECG method is unsuitable for drivers. The non-contact method proceeds as follows: a detection device is built whose capacitive electrode, made from a double-sided PCB, is placed on the driver's seat as the signal electrode; the grounding electrode, made from a thick copper sheet, is mounted on the outer edge of the left half of the steering wheel, connected to analog ground in the circuit, and contacted directly by the driver's left hand while driving. The driver drives normally in ordinary clothes; the ECG signal is coupled through the capacitive electrode to a post-stage amplifier, and the digitized signal is captured by an analog-to-digital converter (ADC) and sent to a host computer, reducing interference from the motion of the driver's upper body. Although theoretically feasible, this non-contact method falls short in practice: the PCB capacitive electrode cannot easily withstand vehicle-body impacts once the driver is seated, and the grounding electrode still requires left-hand contact, so truly non-contact detection is not achieved.
A method for preventing fatigue driving should not only detect the driver's fatigue level (for example via ECG) for preventive warning, but also warn the driver in real time once a fatigue state is detected, particularly when inattention and reduced reaction capacity make it difficult for the driver to control the vehicle, so as to avoid traffic accidents.
Observing the above-mentioned deficiencies, the inventors have made the present invention.
Disclosure of Invention
The embodiment of the invention provides a fatigue driving identification system and an identification method thereof, which can quickly acquire the face information and the eye information of a driver, identify the fatigue state and send out a warning message to remind the driver in real time so as to improve the safety in the driving process and reduce the occurrence of traffic accidents.
In order to solve the technical problems, the invention adopts the following technical scheme:
a fatigue driving identification system, comprising: a vehicle; the optical identification module is arranged on at least one side post in the vehicle, the side post is positioned on one side of a driving seat in the vehicle, the optical identification module comprises a plurality of image acquisition units, at least one image receiving unit, an estimation operation processing unit, at least one transmission unit and a power supply unit, and the optical identification module irradiates the face of an object to be detected; the image capturing unit emits a plurality of laser beams or a plurality of LED light beams to the face of the object to be detected and captures a plurality of face image information, and the wavelength of the laser beams is between 940nm and 1400 nm; the image receiving unit is electrically connected with the image capturing unit and used for obtaining face scale information by utilizing at least one face comparison image information to correct the scale, converting the laser beam or the LED light beam into a plurality of structural light beams by corresponding transmission and reflection transmission paths, and calculating to form a plurality of pieces of face image information of the object to be detected and a plurality of pieces of light beam flight time information; the estimation operation processing unit is electrically connected with the image receiving unit, receives the image information of the face of the object to be detected and the flight time information of the light beam provided by the image receiving unit, and calculates at least one eye identification information, at least one head position information and at least one eye closing time information of the face of the object to be detected in a measuring environment; the transmission unit is electrically connected with the image acquisition unit, the image receiving unit and the estimation operation processing unit and transmits the eye identification information to at least 
one monitoring module; and the power supply unit is electrically connected with the transmission unit.
Preferably, the wavelength of the laser beam emitted by the image capturing unit is between 1310nm and 1350 nm.
Preferably, the projection distance of the laser beam or the LED beam from the image capturing unit is between 1 m and 3 m.
Preferably, the structured light beam is projected on an image projection area and forms a plurality of superimposed patterns, the image projection area has a plurality of sub-projection areas which are arranged in an array and adjacent to each other, the pattern distribution of the superimposed patterns on each sub-projection area is different from each other, and image information of the object to be measured and flight time information of the light beam are calculated and formed.
Preferably, the horizontal viewing angle range of the monitoring viewing angle of the optical recognition module is between 0 degree and 120 degrees, and the vertical viewing angle range is between 0 degree and 90 degrees.
Preferably, the transmission unit includes at least one antenna, at least one signal transmission element and at least one power transmission element, and the transmission unit is used for transmitting information of wireless signals or wired signals.
Preferably, the monitoring module is a vehicle computer, the vehicle is provided with a central control processing module, and the transmission unit is connected to at least one local area network bus in the central control processing module; the bus is selected from one or a combination of a Local Interconnect Network (LIN) bus and a Controller Area Network (CAN) bus, and the transmission unit can also be connected to the monitoring module through the signal transmission element via Ethernet.
In order to achieve the above object, the present invention provides a method for identifying fatigue driving, comprising the following steps: the image capturing unit of the optical identification module emits the laser beam or the LED beam toward the face of the object to be detected, the wavelength of the laser beam being between 940 nm and 1350 nm; the face of the object to be detected is scanned and the facial image information is captured; the image receiving unit obtains face scale information by using at least one face comparison image information to correct the scale; the laser beam or the LED beam is converted into the structured light beam along its corresponding transmission and reflection path, and the facial image information of the object to be detected and the flight time information of the light beam are calculated and formed; the estimation operation processing unit receives the facial image information of the object to be measured and the flight time information of the light beam provided by the image receiving unit, and calculates the eye identification information of the face of the object to be measured in the measuring environment; the optical identification module carries out a comparison data procedure according to the eye identification information, wherein the comparison data procedure determines the closing degree of the eyes of the face of the object to be detected and determines the gaze position of those eyes; when the eyes of the face of the object to be detected are closed or semi-closed within a set time sensing interval, the transmission unit sends an alarm notification message to the monitoring module; and when the eye gaze position of the face of the object to be detected is downward or upward, the transmission unit sends an alarm notification message to the monitoring module.
Preferably, the wavelength of the laser beam emitted by the image capturing units is between 1310 nm and 1350 nm, the projection distance of the laser beam or the LED beam is between 1 m and 3 m, and the time sensing interval is 2 to 5 seconds; the structured light beam is projected onto an image projection area and forms a plurality of superimposed patterns, the image projection area has a plurality of sub-projection areas arranged in an array and adjacent to each other, the pattern distribution of the superimposed patterns in each sub-projection area differs from the others, and image information of the object to be measured and flight time information of the light beam are calculated and formed.
Preferably, the optical identification module performs the comparison data procedure according to the head position information, wherein the comparison data procedure identifies a head inclination position of the object to be detected; when the head of the object to be detected is lowered, the transmission unit sends an alarm notification message to the monitoring module; and when the head of the object to be detected leans to the left or right, the transmission unit sends an alarm notification message to the monitoring module.
The invention has the beneficial effects that:
the fatigue driving identification system and the identification method thereof provided by the invention mainly set the optical identification module through the side post group of the vehicle, identify the face of the object to be detected by the optical identification module, quickly obtain the eye identification information of the face of the object to be detected, identify the fatigue state, and send out the warning message in real time to remind the driver, so as to improve the safety in the driving process and reduce the occurrence of traffic accidents.
Drawings
FIG. 1 is a schematic system diagram of a first embodiment of the present invention;
FIG. 2 is a schematic view of a first embodiment of the present invention in a set position;
FIG. 3 is a system architecture diagram illustrating a first embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating an identification of a projection area of a display image according to a first embodiment of the present invention;
FIG. 5 is a diagram illustrating an identification status of an optical identification device according to a first embodiment of the present invention;
FIG. 6 is a schematic view illustrating an identification range of an image capturing unit according to a first embodiment of the present invention;
FIG. 7 is a diagram illustrating the recognition result of the image capturing unit according to the first embodiment of the present invention;
FIG. 8 is a diagram illustrating the recognition result of the image capturing unit according to the first embodiment of the present invention;
FIG. 9 is a system architecture diagram illustrating a second embodiment of the present invention;
FIG. 10 is a flowchart illustrating an identification method according to a first embodiment of the present invention;
FIG. 11A is a flowchart illustrating an identification method according to a second embodiment of the present invention;
fig. 11B is a flowchart illustrating an identification method according to a second embodiment of the invention.
Description of reference numerals:
100 fatigue driving identification system
10 vehicle
11 side pillar
20 optical identification module
21 image capturing unit
22 image receiving unit
221 image projection area
2211 sub projection area
222 superimposed pattern
23 estimation operation processing unit
231 eye identification information
232 head position information
24 transmission unit
25 power supply unit
30 monitoring module
40 central control processing module
50 warning module
200 object to be measured
S1 step 1
S2 step 2
S3 step 3
S4 step 4
S5 step 5
S6 step 6
S6A step 6A
S61A step 61A
S6B step 6B
S61B step 61B
S6C step 6C
S61C step 61C
S6D step 6D
S61D step 61D
θ1 horizontal angle of view
θ2 vertical angle of view
Detailed Description
Hereinafter, an embodiment of a fatigue driving recognition system according to a first embodiment of the present invention will be described with reference to the drawings.
Referring to fig. 1 to 4, fig. 1 is a system diagram of a first embodiment of the present invention, fig. 2 is a schematic diagram of a setting position of the first embodiment of the present invention, fig. 3 is a schematic diagram of a system architecture of the first embodiment of the present invention, and fig. 4 is a schematic diagram of an image projection area identification in the first embodiment of the present invention. The invention discloses a fatigue driving identification system 100, comprising:
a vehicle 10.
An optical recognition module 20, assembled on at least one side pillar 11 inside the vehicle 10, where the side pillar 11 is located at one side of a driver seat in the vehicle 10, the optical recognition module 20 includes a plurality of image capturing units 21, at least one image receiving unit 22, an estimation and calculation processing unit 23, at least one transmission unit 24, and a power supply unit 25, and the optical recognition module 20 irradiates a face of an object 200 to be tested.
The image capturing unit 21 emits a plurality of laser beams or a plurality of LED beams toward the face of the object 200 to be measured and captures a plurality of pieces of facial image information, the wavelength of the laser beams being between 940 nm and 1400 nm. It should be further noted that, in the present embodiment, the wavelength of the laser beam emitted by the image capturing unit 21 is between 1310 nm and 1350 nm.
In addition, the image capturing unit 21 is provided with a plurality of laser diodes (not shown), mainly made of AlGaInAs and InGaAsP materials and requiring operating power from tens of mW up to 1 W; most importantly, the invention uses a single-mode, frequency-stabilized laser diode with a 1310 nm beam wavelength as the emitting light source. The laser diodes are a plurality of semiconductor lasers or light-emitting diodes arranged in an array; the semiconductor lasers are, for example, Vertical-Cavity Surface-Emitting Lasers (VCSELs) or photonic crystal lasers.
Furthermore, a laser beam wavelength between 1310 nm and 1350 nm is used because lasers at multiple wavelengths can be generated simultaneously; the wavelength distribution is based on the center frequency, with additional wavelengths generated every 3 nm or 6 nm (for example 1318/1315/1312/1309 nm if the center wavelength is 1315 nm). More specifically, the laser exhibits multi-mode (multi-wavelength) characteristics in use. Importantly, the eyes of ordinary users are sensitive to light from violet at 400 nm to red at 700 nm; the laser beam here has a longer wavelength and is invisible to the naked eye, so the eyes of the object 200 to be measured are protected.
Referring to fig. 5 and 6 in conjunction with fig. 4, fig. 5 is a schematic view of an identification state of an optical identification device according to a first embodiment of the present invention, and fig. 6 is a schematic view of an identification range of an image capturing unit according to the first embodiment of the present invention. The monitoring view angle (FOV) of the optical identification module 20 has a horizontal view angle θ1 ranging from 0 to 120 degrees and a vertical view angle θ2 ranging from 0 to 90 degrees.
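As a hedged, minimal sketch (not part of the patent; the function name and the angle conventions are assumptions), a check that a detected face direction lies inside this monitoring field of view could look like:

```python
# Illustrative FOV containment test for the optical identification module.
# Angles are assumed to be measured from 0 at one edge of the FOV.

def within_fov(h_angle_deg: float, v_angle_deg: float,
               h_max: float = 120.0, v_max: float = 90.0) -> bool:
    """True when a direction lies inside the monitoring view angle:
    horizontal 0 to 120 degrees, vertical 0 to 90 degrees."""
    return 0.0 <= h_angle_deg <= h_max and 0.0 <= v_angle_deg <= v_max

print(within_fov(60.0, 45.0))   # True: well inside the FOV
print(within_fov(130.0, 45.0))  # False: outside horizontally
```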
The image receiving unit 22 is electrically connected to the image capturing unit 21, and is configured to obtain face scale information by using at least one face comparison image information to correct the scale, convert the laser beam or the LED beam into a plurality of structural beams by corresponding transmission and reflection paths, and calculate to form a plurality of object face image information and a plurality of beam flight time information.
In this embodiment, as shown in fig. 4, the structured light beam is projected on an image projection area 221 and forms a plurality of superimposed patterns 222, the image projection area 221 has a plurality of sub-projection areas 2211 arranged in an array and adjacent to each other, the pattern distribution of the superimposed patterns 222 in each sub-projection area is different from each other, and image information of the object to be measured and flight time information of the light beam are calculated and formed. More specifically, the present invention can identify the facial information of the object 200 according to the distribution and variation of the pattern in each projection area.
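As an illustrative aside (not from the patent; the function name and grid parameters are assumed), mapping a projected point to the sub-projection area that contains it, for an array of adjacent sub-regions, might be sketched as:

```python
# Hypothetical sketch: locate the sub-projection area containing a point,
# for a cols x rows array of adjacent sub-regions tiling the projection area.

def subregion_index(x: float, y: float, width: float, height: float,
                    cols: int, rows: int) -> tuple[int, int]:
    """Return the (col, row) of the sub-projection area containing (x, y).
    Points on the far edges are clamped into the grid."""
    col = min(int(x / width * cols), cols - 1)
    row = min(int(y / height * rows), rows - 1)
    return col, row

# A point near the right edge, near the top, of a unit projection area
# tiled into a 4x4 array of sub-regions:
print(subregion_index(0.9, 0.1, 1.0, 1.0, 4, 4))  # (3, 0)
```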
In addition, the face information calculation also involves the laser beam reflected from the target reaching the image receiving unit 22 (for example, a photodetector); the generated electrical signal stops the timer, giving the beam's flight time. Measuring the round-trip time of flight Δt gives the distance R to the reflection point by the following equation:
R = (1/(2n)) · c · Δt
where c is the speed of light in vacuum and n is the refractive index of the propagation medium (about 1 for air). Two factors affect the range resolution ΔR: the uncertainty δΔt in measuring Δt, and the spatial error w = cτ resulting from the pulse width τ.
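A minimal sketch of the range equation above, assuming air as the medium; the function names are illustrative, not from the patent:

```python
# Sketch of the time-of-flight range equation R = (1/(2n)) * c * dt.
# Names and defaults are assumptions for illustration only.

C_VACUUM = 299_792_458.0  # speed of light in vacuum, m/s


def tof_distance(delta_t_s: float, n: float = 1.0) -> float:
    """Distance R to the reflection point from round-trip flight time.

    n is the refractive index of the propagation medium (about 1 for air).
    """
    return (1.0 / (2.0 * n)) * C_VACUUM * delta_t_s


def pulse_width_spatial_error(tau_s: float) -> float:
    """Spatial error w = c * tau contributed by a pulse of width tau_s."""
    return C_VACUUM * tau_s


# A round trip of ~10 ns in air corresponds to roughly 1.5 m.
print(round(tof_distance(10e-9), 3))  # 1.499
```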
The estimation processing unit 23 is electrically connected to the image receiving unit 22, and the estimation processing unit 23 receives the image information of the face of the object to be measured and the flight time information of the light beam provided by the image receiving unit 22, and calculates at least one eye identification information 231, at least one head position information 232 and at least one eye closure time information of the face of the object 200 in the measurement environment.
The transmission unit 24 is electrically connected to the image capturing unit 21, the image receiving unit 22 and the estimation processing unit 23, and the transmission unit 24 transmits the eye identification information 231, the head position information 232 and the eye closing time data to at least one monitoring module 30. In the present embodiment, as shown in fig. 1, the transmission unit 24 includes at least one antenna, at least one signal transmission element and at least one power transmission element, and the transmission unit 24 is used for information transmission of wireless signals or wired signals.
It should be further noted that the antenna of the transmission unit 24 is used for wireless signal transmission, selected from one of the Bluetooth, third-generation mobile communication (3G), fourth-generation mobile communication (4G), wireless local area network (Wi-Fi/WLAN) and fifth-generation mobile communication (5G) wireless communication protocols; the signal transmission element of the transmission unit 24 is used for wired signal transmission over Ethernet. In this embodiment, as shown in fig. 3, the monitoring module 30 is a vehicle computer, the vehicle 10 is provided with a central control processing module 40, and the transmission unit 24 is connected to at least one local area network bus in the central control processing module 40; the bus is selected from one or a combination of a Local Interconnect Network (LIN) bus and a Controller Area Network (CAN) bus, and the transmission unit 24 can also be connected to the monitoring module 30 through the signal transmission element via Ethernet. In addition, the central control processing module 40 is selected from one or a combination of a central control console and an instrument panel.
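The patent does not specify any frame layout; purely as a hypothetical illustration, an alarm notification could be packed into a fixed 8-byte, CAN-style payload with Python's struct module (the field layout here is an assumption):

```python
import struct

# Hypothetical 8-byte payload for an alarm notification message:
# alarm type (u8), eye-closure flag (u8), head-pose code (u8),
# closure duration in tenths of a second (u16, big-endian), 3 pad bytes.
ALARM_FATIGUE = 0x01


def pack_alarm(alarm_type: int, eyes_closed: bool,
               head_pose: int, closure_tenths: int) -> bytes:
    """Pack an alarm notification into a fixed 8-byte payload."""
    return struct.pack(">BBBHxxx", alarm_type, int(eyes_closed),
                       head_pose, closure_tenths)


payload = pack_alarm(ALARM_FATIGUE, True, 0, 25)  # eyes closed for 2.5 s
print(len(payload))  # 8
```

An 8-byte payload matches the maximum data field of a classic CAN frame, which is why this size was chosen for the sketch.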
The power supply unit 25 is electrically connected to the transmission unit 24. In the present embodiment, the power supply unit 25 can be an internal replaceable battery or an external power line, but the present invention is not limited thereto, and in another preferred embodiment, the power supply unit 25 can be a replaceable battery.
For a further understanding of the structure, the techniques employed and the intended effects of the invention, reference is made to the following detailed description of the preferred embodiments, from which it is believed the invention will be more fully and specifically understood:
referring to fig. 7 and 8 in combination with fig. 3 to 6, fig. 7 and 8 are schematic diagrams illustrating recognition results of an image capturing unit according to a first embodiment of the present invention. The image capturing unit 21 of the optical identification module 20 emits the laser beam or the LED beam to the face of the object 200, the wavelength of the laser beam is 940nm to 1350nm, the face of the object 200 is scanned, and the image information of the face is captured, the image receiving unit 22 obtains face scale information by using at least one face comparison image information to correct the scale, the laser beam or the LED light beam correspondingly transmits and reflects a transmission path to convert the structural light beam and calculate to form face image information of the object to be detected and flight time information of the light beam, the estimation processing unit 23 receives the image information of the face of the object to be measured and the flight time information of the light beam provided by the image receiving unit 22, and calculates the eye identification information 231, the head position information 232 and the eye-closing time data of the face of the object 200 in the measurement environment.
It should be further noted that, as shown in fig. 3, fig. 7 and fig. 8, the estimation operation processing unit 23 calculates the eye identification information 231 and the head position information 232 of the face of the object 200 in the measurement environment. When the degree of eye closure in the eye identification information 231 is closed or semi-closed, the monitoring module sends an alarm notification message to the central control processing module 40; likewise, when the eye-closure time in the eye identification information 231 reaches 2 to 5 seconds, or the head in the head position information 232 is lowered or leaning left or right, the monitoring module sends an alarm notification message to the central control processing module 40, so as to remind the driver in real time.
Therefore, it can be further understood from the above description that, with the configuration of the optical identification module 20, the image capturing unit 21 scans the face of the object 200, and the image receiving unit 22 obtains face scale information from at least one face comparison image to calibrate the scale. The laser beam is converted into the structured light beam along its transmission and reflection path, and the facial image information of the object and the flight time information of the light beam are calculated. The estimation operation processing unit 23 receives this information from the image receiving unit 22 and calculates the eye identification information 231, the head position information 232 and the eye-closure time data of the face of the object 200 in the measurement environment, so as to identify the fatigue degree of the object 200 in real time. When the optical identification module 20 recognizes a fatigue state of the object 200, the transmission unit 24 sends the alarm notification message to the monitoring module 30, and the monitoring module 30 transmits it to the central control processing module 40 to notify the driver in real time, so as to reduce the occurrence of traffic accidents.
(embodiment 2)
Referring to fig. 9, fig. 9 is a schematic diagram of a system architecture according to a second embodiment of the present invention. Compared with the first embodiment, the main structural difference of the second embodiment is that the fatigue driving identification system 100 further includes at least one warning module 50 electrically connected to the monitoring module 30. When the optical identification module 20 identifies a fatigue state of the object 200, the transmission unit 24 sends the warning notification message to the monitoring module 30 and simultaneously activates the warning module 50 to remind the driver. In this embodiment, the warning module 50 is a speaker, which sounds to remind the driver. However, the present invention is not limited thereto; in another preferred embodiment, the warning module 50 is a smart phone or a steering wheel. When the warning module 50 is a smart phone, the transmission unit 24 sends the warning notification message to the monitoring module 30 and simultaneously to the warning module 50, which sounds and vibrates to remind the driver; when the warning module 50 is a steering wheel, the transmission unit 24 sends the warning notification message to the monitoring module 30 and simultaneously activates the warning module 50, which vibrates, and since the driver's hands hold the wheel, the driver is notified in real time. In addition, the warning module 50 can be one or a combination of a speaker, a smart phone and a steering wheel.
Therefore, the second embodiment can not only achieve the effects of the first embodiment, but also provide different structures, and the second embodiment can increase the convenience and practicability in use and improve the driving safety of the driver through the arrangement of the warning module 50, so that the occurrence of traffic accidents is reduced.
(step flow of the identification method of embodiment 1)
Referring to fig. 10 in combination with fig. 1 to 9, fig. 10 is a schematic flow chart illustrating an identification method according to a first embodiment of the present invention. The present invention further provides an identification method of the fatigue driving identification system 100 based on the above fatigue driving identification system 100, comprising the following steps:
step S1: the image capturing unit 21 of the optical identification module 20 emits the laser beam or the LED beam to the face of the object 200, and the wavelength of the laser beam is between 940nm and 1350 nm.
Step S2: the face of the object 200 is scanned and the image information of the face is captured.
Step S3: the image receiving unit 22 uses at least one piece of face comparison image information to obtain face scale information for scale correction.
Step S4: the laser beam or LED light beam is converted into structured light beams along the corresponding transmission and reflection paths, and the face image information of the object 200 and the time-of-flight information of the light beams are computed.
Step S5: the estimation processing unit 23 receives the face image information and the beam time-of-flight information provided by the image receiving unit 22, and calculates the eye identification information 231 of the face of the object 200 in the measurement environment.
Step S6: the optical identification module 20 performs a comparison data procedure according to the eye identification information 231, which determines the degree of closure of the eyes of the face of the object 200 and determines the gaze position of those eyes. In this embodiment, step S6 includes the following steps:
step S6A: the eyes of the face of the object 200 are closed or semi-closed within a predetermined time sensing interval, the time sensing interval being 2 to 5 seconds.
Step S61A: the transmission unit 24 sends an alarm notification message to the monitoring module 30.
Step S6B: the eye gaze position of the face of the object 200 is downward or upward.
Step S61B: the transmission unit 24 sends an alarm notification message to the monitoring module 30.
In this embodiment, when the eyes of the face of the object 200 are closed or semi-closed, or when the eye gaze position of the face of the object 200 is downward or upward, and the monitoring module 30 is a vehicle computer, the alarm notification message produces an alarm pattern on the operation interface of the central control processing module 40. In addition, the monitoring module 30 forwards the alarm notification message and simultaneously activates the warning module 50, which renders the alarm as a sound or a vibration.
When the determinations in steps S6A and S6B both find that the eyes are neither closed nor semi-closed and that the eye gaze position is neither downward nor upward, steps S1 to S6 (including steps S6A and S6B) are repeated in sequence.
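Steps S1 through S6B above amount to a sense-and-decide loop. The following Python sketch shows one pass of that loop under stated assumptions: `capture_frame`, `eye_state`, and `gaze_direction` are hypothetical stand-ins for the outputs of the image capturing unit 21 and estimation processing unit 23, and the 2-second lower bound mirrors the patent's 2-to-5-second sensing interval.

```python
def run_identification_cycle(capture_frame, eye_state, gaze_direction, send_alarm,
                             closure_window_s=(2.0, 5.0)):
    """One pass of steps S1-S6: capture the face image (S1-S4), derive the
    eye identification information (S5), then run the comparison data
    procedure (S6A/S6B). Returns True when an alarm was sent."""
    frame = capture_frame()               # S1-S4: scan face, build ToF image
    state, closed_for = eye_state(frame)  # S5: eye state and closure duration (s)
    gaze = gaze_direction(frame)

    # S6A: eyes closed or semi-closed throughout the sensing interval (2-5 s)
    if state in ("closed", "semi-closed") and closed_for >= closure_window_s[0]:
        send_alarm("eyes closed or semi-closed")  # S61A: notify monitoring module
        return True
    # S6B: eye gaze position drifting downward or upward
    if gaze in ("down", "up"):
        send_alarm("gaze position down or up")    # S61B: notify monitoring module
        return True
    return False  # no fatigue sign: caller repeats S1-S6 in sequence
```

The caller would invoke this function repeatedly, matching the patent's "repeat steps S1 to S6B" behavior when neither condition fires.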
Please refer to fig. 11A and 11B, which are schematic flow charts of an identification method according to a second embodiment of the present invention. In step S6, the optical identification module 20 further performs the comparison data procedure according to the head position information 232 to recognize the inclined position of the head of the object 200, wherein step S6 includes the following steps:
step S6C: the head of the object 200 is lowered.
Step S61C: the transmission unit 24 sends an alarm notification message to the monitoring module 30.
Step S6D: the head of the object 200 leans to the left or right side.
Step S61D: the transmission unit 24 sends an alarm notification message to the monitoring module 30.
In addition, when none of the determinations in steps S6A, S6B, S6C, and S6D finds eye closure or semi-closure, a downward or upward eye gaze position, or a lowered or sideways-leaning head, steps S1 to S6 (including steps S6A through S6D) are repeated in sequence.
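The second embodiment's additional head checks (steps S6C and S6D) can be sketched in the same style. The angle thresholds below are illustrative assumptions only; the patent states just the qualitative conditions (head lowered, head leaning to the left or right side).

```python
def check_head_position(head_pitch_deg, head_roll_deg, send_alarm,
                        pitch_limit=20.0, roll_limit=20.0):
    """Steps S6C/S6D of the second embodiment: flag a lowered head or a
    head leaning to one side, based on head position information.
    Negative pitch means the head drops forward; roll is sideways lean.
    The 20-degree limits are hypothetical, not from the patent."""
    if head_pitch_deg < -pitch_limit:      # S6C: head lowered (nodding off)
        send_alarm("head lowered")         # S61C: notify monitoring module
        return True
    if abs(head_roll_deg) > roll_limit:    # S6D: head leaning left or right
        send_alarm("head leaning to one side")  # S61D: notify monitoring module
        return True
    return False
```

In practice this check would run after the eye checks of steps S6A/S6B, with the loop repeating from S1 whenever no condition fires.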
It should be noted that the laser wavelength used for identification in a typical mobile phone or monitoring device is 940 nm, and infrared laser light at this wavelength has been shown medically to be harmful to the human eye, potentially causing cataracts and retinal burns. By contrast, the laser beam used in the present invention has a wavelength of 1310 nm and is therefore harmless to the driver's eyes.
It should also be noted that when the optical identification module 20 of the present invention emits the LED light beam, the red-light wavelength of the LED light beam ranges from 620 nm to 900 nm.
The features of the invention and their expected effects are further set forth below:
the fatigue driving identification system 100 and identification method of the present invention mainly mount the optical identification module 20 on the side pillar 11 of the vehicle 10. The optical identification module 20 identifies the face of the object 200, rapidly obtains the eye identification information 231, the head position information 232, and the eye-closing time data of that face, identifies the fatigue state, and sends a warning message in real time to remind the driver, thereby improving safety during driving and reducing the occurrence of traffic accidents.
The invention has the following implementation efficacy and technical efficacy:
first, according to the present invention, through the arrangement of the optical identification module 20, the fatigue driving identification system 100 performs identification by time-of-flight ranging, which accurately identifies the face of the object 200 at a low computational cost.
Secondly, according to the present invention, through the arrangement of the optical identification module 20, the monitoring module 30, and the central control processing module 40, the image capturing unit 21 of the optical identification module 20 scans the face of the object 200. When the optical identification module 20 identifies that the object 200 is in a fatigue state, the transmission unit 24 sends the alarm notification information to the monitoring module 30, which relays it to the central control processing module 40 to notify the driver in real time, thereby reducing the occurrence of traffic accidents.
Thirdly, through the arrangement of the image capturing unit 21, the wavelength of the laser beam emitted by the image capturing unit 21 is between 1310 nm and 1350 nm, reducing injury to the driver's eyes.
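Time-of-flight ranging, the measurement principle behind the first efficacy point, recovers distance from the round trip of a light pulse: distance = c · t / 2, where c is the speed of light and t the measured round-trip time. A minimal sketch (illustrative only; the function name is not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second, in vacuum

def tof_distance(round_trip_time_s):
    """Time-of-flight ranging: the light beam travels to the face and back,
    so the distance to the face is half the round trip at the speed of
    light. A face ~1 m away returns light in roughly 6.7 nanoseconds."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

Because each pixel needs only this single multiplication, ToF depth imaging carries the low computational load the patent attributes to the optical identification module 20.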
While the preferred embodiments of the present invention have been described, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the following claims.

Claims (10)

1. A fatigue driving identification system, comprising:
a vehicle; and
the optical identification module is assembled on at least one side post in the vehicle, the side post is positioned on one side of a driving seat in the vehicle, the optical identification module comprises a plurality of image acquisition units, at least one image receiving unit, an estimation operation processing unit, at least one transmission unit and a power supply unit, and the optical identification module irradiates the face of an object to be detected;
the image capturing unit emits a plurality of laser beams or a plurality of LED light beams to the face of the object to be detected and captures a plurality of face image information, and the wavelength of the laser beams is 940nm to 1400 nm;
the image receiving unit is electrically connected with the image capturing unit, obtains face scale information by using at least one piece of face comparison image information to correct the scale, converts the laser beam or the LED light beam into a plurality of structured light beams along corresponding transmission and reflection paths, and calculates a plurality of pieces of face image information of the object to be detected and a plurality of pieces of beam time-of-flight information;
the estimation operation processing unit is electrically connected with the image receiving unit, receives the image information of the face of the object to be measured and the flight time information of the light beam provided by the image receiving unit, and calculates at least one eye identification information, at least one head position information and at least one eye closing time data of the face of the object to be measured in a measuring environment;
the transmission unit is electrically connected with the image acquisition unit, the image receiving unit and the estimation operation processing unit, and transmits the eye identification information, the head position information and the eye closing time data to at least one monitoring module;
the power supply unit is electrically connected with the transmission unit.
2. The fatigue driving recognition system of claim 1, wherein the wavelength of the laser beam emitted by the image capturing unit is between 1310nm and 1350 nm.
3. The fatigue driving recognition system of claim 1, wherein the emission distance of the laser beam or the LED light beam emitted by the image capturing unit is between 1 m and 3 m.
4. The fatigue driving identification system of claim 1, wherein the structured light beams are projected onto an image projection area and form a plurality of superimposed patterns, the image projection area has a plurality of sub-projection areas arranged in an array and adjacent to each other, the pattern distribution of the superimposed patterns differs in each sub-projection area, and the image information of the object to be detected and the beam time-of-flight information are calculated therefrom.
5. The fatigue driving identification system of claim 1, wherein the monitoring angle of view of the optical identification module is in a range of 0 to 120 degrees horizontally and 0 to 90 degrees vertically.
6. The fatigue driving identification system of claim 1, wherein the transmission unit comprises at least one antenna, at least one signal transmission element and at least one power transmission element, and the transmission unit is configured to transmit information via wireless or wired signals.
7. The fatigue driving identification system of claim 6, wherein the monitoring module is a vehicle computer, the vehicle has a central processing module, the transmission module is connected to at least one control local area network bus in the central processing module, the control local area network bus is selected from one or a combination of a serial communication network and a control bus network, and the transmission unit is connected to the monitoring module via the signal transmission device via an ethernet network.
8. A method for identifying fatigue driving, applying the fatigue driving identification system of claim 1, comprising:
the image capturing unit of the optical identification module emits the laser beam or the LED light beam to the face of the object to be detected, and the wavelength of the laser beam is 940nm to 1400 nm;
scanning the face of the object to be detected and capturing the face image information;
the image receiving unit obtains face scale information by utilizing at least one face comparison image information to correct the scale;
the laser beam or the LED light beam is converted into the structured light beams along corresponding transmission and reflection paths, and the face image information of the object to be detected and the beam time-of-flight information are calculated;
the estimation operation processing unit receives the image information of the face of the object to be measured and the flight time information of the light beam provided by the image receiving unit and calculates the eye identification information of the face of the object to be measured in a measuring environment;
the optical identification module carries out a comparison data program according to the eye identification information, wherein the comparison data program is used for determining the closing degree of the eyes of the face of the object to be detected and determining the fixation position of the eyes of the face of the object to be detected;
when the eyes of the face of the object to be detected are closed or semi-closed in a set time sensing interval, the transmission unit sends out alarm notification information to the monitoring module; and
when the eye fixation position of the face of the object to be detected is downward or upward, the transmission unit sends out alarm notification information to the monitoring module.
9. The method of claim 8, wherein the wavelength of the laser beam emitted by the image capturing unit is 1310nm to 1350nm, the distance between the laser beam or the LED beam is 1m to 3m, and the time sensing interval is 2 seconds to 5 seconds;
the structured light beam is projected in an image projection area and images a plurality of superposition patterns, the image projection area is provided with a plurality of sub projection areas which are arranged in an array and are adjacent to each other, the pattern distribution of the superposition patterns in each sub projection area is different from each other, and image information of an object to be detected and flight time information of the light beam are calculated and formed.
10. The method according to claim 8, wherein the optical recognition module performs the comparison data procedure according to the head position information, the comparison data procedure being to recognize a head inclination position of the object;
when the head of the object to be detected is lowered, the transmission unit sends out alarm notification information to the monitoring module; and
and when the head of the object to be detected leans to the left side or the right side, the transmission unit sends out alarm notification information to the monitoring module.
CN202010503035.2A 2020-06-04 2020-06-04 Fatigue driving identification system and identification method thereof Pending CN113761982A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010503035.2A CN113761982A (en) 2020-06-04 2020-06-04 Fatigue driving identification system and identification method thereof

Publications (1)

Publication Number Publication Date
CN113761982A true CN113761982A (en) 2021-12-07

Family

ID=78783868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010503035.2A Pending CN113761982A (en) 2020-06-04 2020-06-04 Fatigue driving identification system and identification method thereof

Country Status (1)

Country Link
CN (1) CN113761982A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855754A (en) * 2011-06-30 2013-01-02 由田新技股份有限公司 Vehicle management monitoring system and method thereof
CN208239783U (en) * 2018-06-05 2018-12-14 深圳奥比中光科技有限公司 VCSEL array light source and its projector, depth camera
CN109383525A (en) * 2017-08-10 2019-02-26 欧姆龙株式会社 Driver status grasps device, driver status grasps system and method
CN109614892A (en) * 2018-11-26 2019-04-12 青岛小鸟看看科技有限公司 A kind of method for detecting fatigue driving, device and electronic equipment
CN109643366A (en) * 2016-07-21 2019-04-16 戈斯蒂冈有限责任公司 For monitoring the method and system of the situation of vehicle driver
WO2019228105A1 (en) * 2018-06-01 2019-12-05 Boe Technology Group Co., Ltd. Computer-implemented method of alerting driver of vehicle, apparatus for alerting driver of vehicle, vehicle, and computer-program product
JP2020009749A (en) * 2018-06-29 2020-01-16 株式会社リコー Light source, projection device, measurement device, robot, electronic apparatus, moving body, and molding device

Similar Documents

Publication Publication Date Title
US11334066B2 (en) Safety monitoring apparatus and method thereof for human-driven vehicle
US20180206771A1 (en) Eye closure detection using structured illumination
Chang et al. Design and implementation of a drowsiness-fatigue-detection system based on wearable smart glasses to increase road safety
US20140293053A1 (en) Safety monitoring apparatus and method thereof for human-driven vehicle
US8725311B1 (en) Driver health and fatigue monitoring system and method
JP2020504295A (en) 3D time-of-flight active reflection detection system and method
CN104224204B (en) A kind of Study in Driver Fatigue State Surveillance System based on infrared detection technology
JP5689874B2 (en) Gaze control device, ophthalmic device, method of operating gaze control device for ophthalmic device for controlling eye gaze, computer program or evaluation unit
US11312384B2 (en) Personalized device and method for monitoring a motor vehicle driver
WO2015174963A1 (en) Driver health and fatigue monitoring system and method
WO2015175435A1 (en) Driver health and fatigue monitoring system and method
JP2020114377A (en) System and method detecting problematic health situation
CN101132729A (en) Measuring alertness
WO2014204567A1 (en) Imaging-based monitoring of stress and fatigue
US20080074618A1 (en) Fatigue detection device using encoded light signals
Jiang et al. Driversonar: Fine-grained dangerous driving detection using active sonar
WO2009156937A1 (en) Sensing apparatus for sensing a movement
CN113761982A (en) Fatigue driving identification system and identification method thereof
JP2024517704A (en) Method and device for controlling charging lid of electric vehicle, and electric vehicle
TWI727819B (en) Fatigue driving identification system and its identification method
CN109895782A (en) Intelligent automobile seat and working method
CN114492656A (en) Fatigue degree monitoring system based on computer vision and sensor
CN203689651U (en) Vehicle-mounted fatigue driving early warning system based on infrared imaging technology
CN112494045A (en) Driver fatigue detection method and device
KR101548629B1 (en) System for measuring living body signal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination