CN211066569U - Action carrier auxiliary system - Google Patents

Action carrier auxiliary system

Info

Publication number
CN211066569U
CN211066569U (application CN201920753471.8U)
Authority
CN
China
Prior art keywords
lens
image
module
lens element
driver
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201920753471.8U
Other languages
Chinese (zh)
Inventor
张永明
赖建勋
刘燿维
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ability Opto Electronics Technology Co Ltd
Original Assignee
Ability Opto Electronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ability Opto Electronics Technology Co Ltd
Application granted
Publication of CN211066569U
Current legal status: Active
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/08 Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816 Measuring devices for examining respiratory frequency
    • A61B5/082 Evaluation by breath analysis, e.g. determination of the chemical composition of exhaled breath
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/14532 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring glucose, e.g. by tissue impedance measurement
    • A61B5/14542 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring blood gases
    • A61B5/14546 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring analytes not otherwise provided for, e.g. ions, cytochromes
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893 Cars
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • B60K28/063 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver preventing starting of vehicles
    • B60K28/066 Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver actuating a signalling device
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/12 Limiting control by the driver depending on vehicle state, e.g. interlocking means for the control input for preventing unsafe operation
    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005 Handover processes
    • B60W60/0059 Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G02B13/18 Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
    • G02F OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
    • G02F1/15 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on an electrochromic effect
    • G02F1/153 Constructional details
    • G02F1/155 Electrodes
    • G02F1/157 Structural association of cells with optical devices, e.g. reflectors or illuminating devices
    • G02F1/163 Operation of electrochromic cells, e.g. electrodeposition cells; Circuit arrangements therefor
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/20 Workers
    • A61B2503/22 Motor vehicles operators, e.g. drivers, pilots, captains
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0233 Special features of optical sensors or probes classified in A61B5/00
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/163 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/681 Wristwatch-type devices
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04 Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • B60R2011/0001 Arrangements for holding or mounting articles, not otherwise provided for characterised by position
    • B60R2011/0003 Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
    • B60W2040/0818 Inactivity or incapacity of driver
    • B60W2040/0836 Inactivity or incapacity of driver due to alcohol
    • B60W2040/0872 Driver physiology
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143 Alarm means
    • B60W2050/146 Display means
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40 Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2540/00 Input parameters relating to occupants
    • B60W2540/22 Psychological state; Stress level or workload
    • B60W2540/221 Physiology, e.g. weight, heartbeat, health or special needs
    • B60W2540/24 Drug level, e.g. alcohol
    • B60W2540/26 Incapacity

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Nonlinear Science (AREA)
  • Physiology (AREA)
  • Cardiology (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Pulmonology (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychiatry (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Signal Processing (AREA)
  • Vascular Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Emergency Medicine (AREA)
  • Mathematical Physics (AREA)

Abstract

A mobile carrier auxiliary system includes a driving state detection device and a warning device. The driving state detection device includes a physiological state detection module, a storage module and an operation module. The physiological state detection module detects at least one physiological state of a driver. The storage module is disposed in a mobile carrier and stores allowable parameters corresponding to the at least one physiological state. The operation module is disposed in the mobile carrier and is in signal connection with the physiological state detection module and the storage module, so as to determine whether the physiological state of the driver exceeds the allowable parameters and to generate a corresponding detection signal. The warning device generates warning information when it receives a detection signal indicating that the physiological state of the driver exceeds the allowable parameters. The physiological state of the driver can thereby be judged effectively.

Description

Action carrier auxiliary system
Technical Field
The present invention relates to an auxiliary system for a mobile carrier, and more particularly to an auxiliary system capable of determining the physiological status of a driver.
Background
With the rapid growth of high-frequency commercial activity and transportation logistics, people rely ever more heavily on mobile carriers such as automobiles, and drivers pay increasing attention to protecting their lives and property while driving; besides the performance and riding comfort of a mobile carrier, buyers generally consider whether it provides sufficient safety protection devices or auxiliary devices. Against this backdrop, automobile manufacturers and automotive equipment designers have developed various driving safety protection and auxiliary devices to improve driving safety, such as rearview mirrors, driving recorders and global positioning systems that can display real-time images of objects in the driver's blind-spot areas or record the driving path at any time, and the like.
In addition, as digital cameras have become part of daily life and computer vision has advanced in recent years, imaging technology has been applied to driving assistance systems, in the hope that artificial intelligence can reduce the incidence of traffic accidents.
For a mobile carrier that must be driven by a person, the mobile carrier is controlled by the driver. When the driver is in good physical condition, the driver can control the mobile carrier so that it travels safely; when the driver is in poor physical condition, the driver may be unable to control the mobile carrier or may operate it incorrectly. For example, a driver may suddenly suffer a myocardial infarction while the vehicle is moving. If the mobile carrier can then no longer be operated, or is operated incorrectly, the mobile carrier will be damaged in minor cases, and the lives of the people involved will be endangered in severe cases.
Therefore, how to develop an auxiliary system that can accurately detect the physiological status of the driver, and thereby improve driving safety, has become an important issue.
SUMMARY OF THE UTILITY MODEL
An aspect of the present invention is directed to a mobile carrier assistance system, which includes a driving state detection device and a warning device. The driving state detection device includes a physiological state detection module, a storage module and an operation module. The physiological state detection module detects at least one physiological state of a driver. The storage module is disposed in a mobile carrier and stores allowable parameters corresponding to the at least one physiological state. The operation module is disposed in the mobile carrier and is in signal connection with the physiological state detection module and the storage module, so as to determine whether the physiological state of the driver exceeds the allowable parameters and to generate a corresponding detection signal. The warning device is electrically connected to the operation module and generates warning information when it receives a detection signal indicating that the physiological state of the driver exceeds the allowable parameters.
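For illustration only (this sketch is not part of the patent disclosure), the comparison performed by the operation module can be pictured as follows: each measured physiological state is checked against the allowable parameters held by the storage module, and any out-of-range value yields a detection signal that drives the warning device. All names, states and thresholds below are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical allowable parameters, as stored by the storage module.
ALLOWABLE = {
    "heart_rate_bpm": (45.0, 160.0),
    "breath_rate_rpm": (8.0, 30.0),
    "systolic_bp_mmhg": (85.0, 180.0),
}

@dataclass
class DetectionSignal:
    state: str        # which physiological state was evaluated
    value: float      # measured value
    exceeded: bool    # True if the value lies outside the allowable range

def evaluate(states):
    """Operation module: compare each measured state against its allowable parameters."""
    signals = []
    for name, value in states.items():
        low, high = ALLOWABLE[name]
        signals.append(DetectionSignal(name, value, not (low <= value <= high)))
    return signals

def warn(signals):
    """Warning device: emit warning information for every out-of-range detection signal."""
    for s in signals:
        if s.exceeded:
            print(f"WARNING: {s.state} = {s.value} is outside the allowable range")

# A heart rate of 175 bpm exceeds the hypothetical allowable range and triggers a warning.
warn(evaluate({"heart_rate_bpm": 175.0, "breath_rate_rpm": 16.0, "systolic_bp_mmhg": 120.0}))
```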
In addition, the physiological state detection module further includes an image capturing module for capturing at least one driving image of the driver in the mobile carrier, and the operation module judges, according to the driving image, whether the physiological state of the driver exceeds the allowable parameters and generates a corresponding detection signal. The image capturing module includes a lens set, and the lens set includes at least two lenses with refractive power. The lens set further satisfies the following conditions: 1.0 ≤ f/HEP ≤ 10.0; 0 deg < HAF ≤ 150 deg; and 0.9 ≤ 2(ARE/HEP) ≤ 2.0, where f is the focal length of the lens set; HEP is the entrance pupil diameter of the lens set; HAF is half of the maximum view angle of the lens set; and ARE is the contour curve length obtained along the contour of any lens surface in the lens set, starting at the intersection of that surface with the optical axis and ending at the point on the surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis.
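As a hedged illustration of how these three design conditions might be verified for a candidate lens prescription (the function name and the example values are assumptions, not taken from the patent):

```python
def satisfies_basic_conditions(f_mm, hep_mm, haf_deg, are_mm):
    """Check 1.0 <= f/HEP <= 10.0, 0 deg < HAF <= 150 deg and 0.9 <= 2*(ARE/HEP) <= 2.0."""
    cond_fhep = 1.0 <= f_mm / hep_mm <= 10.0          # ratio of focal length to entrance pupil diameter
    cond_haf = 0.0 < haf_deg <= 150.0                 # half of the maximum view angle
    cond_are = 0.9 <= 2.0 * (are_mm / hep_mm) <= 2.0  # contour curve length versus entrance pupil
    return cond_fhep and cond_haf and cond_are

# Hypothetical prescription: f = 4.0 mm, HEP = 2.0 mm, HAF = 45 deg, ARE = 1.05 mm.
print(satisfies_basic_conditions(4.0, 2.0, 45.0, 1.05))  # True
```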
The aforesaid lens set uses its structural dimensions together with the refractive power of two or more lenses and combinations of convex and concave surfaces (convex or concave refers in principle to the change of the geometry of the object-side or image-side surface of each lens at different heights from the optical axis), thereby effectively increasing the amount of light admitted into the optical imaging system while widening the view angle of the optical imaging lens; in this way, the optical imaging system can have a certain relative illuminance and improved total pixels and imaging quality.
In an embodiment of the present invention, the lens assembly further satisfies the following condition: 0.9 ≤ ARS/EHD ≤ 2.0, where ARS is the contour curve length obtained along the contour of any lens surface of any lens in the lens assembly, starting at the intersection of that surface with the optical axis and ending at the maximum effective radius of the surface; and EHD is the maximum effective radius of any surface of any lens in the lens assembly.
In one embodiment of the present invention, the lens assembly further satisfies the following conditions: PLTA ≤ 100 μm; PSTA ≤ 100 μm; NLTA ≤ 100 μm; NSTA ≤ 100 μm; SLTA ≤ 100 μm; SSTA ≤ 100 μm; and |TDT| < 250%. Here HOI is defined as the maximum imaging height perpendicular to the optical axis on the imaging plane of the image capture module; PLTA is the lateral aberration of the longest operating wavelength of visible light of the positive direction tangential fan of the image capture module passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; PSTA is the lateral aberration of the shortest operating wavelength of visible light of the positive direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; NLTA is the lateral aberration of the longest operating wavelength of visible light of the negative direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; NSTA is the lateral aberration of the shortest operating wavelength of visible light of the negative direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; SLTA is the lateral aberration of the longest operating wavelength of visible light of the sagittal fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; SSTA is the lateral aberration of the shortest operating wavelength of visible light of the sagittal fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; and TDT is the TV distortion of the image capture module.
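A similar sketch, again with hypothetical names and example values, expresses the aberration tolerances of this embodiment as a single check:

```python
def within_aberration_tolerances(lateral_aberrations_um, tdt_percent):
    """Check PLTA, PSTA, NLTA, NSTA, SLTA, SSTA <= 100 um and |TDT| < 250 %."""
    return (all(abs(v) <= 100.0 for v in lateral_aberrations_um.values())
            and abs(tdt_percent) < 250.0)

# Hypothetical ray-trace results at 0.7 HOI (micrometres) and a hypothetical TV distortion (percent).
aberrations = {"PLTA": 12.0, "PSTA": -8.5, "NLTA": 15.2, "NSTA": -6.1, "SLTA": 9.8, "SSTA": -4.3}
print(within_aberration_tolerances(aberrations, tdt_percent=1.8))  # True
```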
In an embodiment of the present invention, the lens assembly includes four lens elements with refractive power, namely, in order from an object side to an image side, a first lens element, a second lens element, a third lens element and a fourth lens element, and the lens assembly satisfies the following condition: 0.1 ≤ InTL/HOS ≤ 0.95, where HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fourth lens element.
In an embodiment of the present invention, the lens assembly includes five lens elements with refractive power, namely, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element and a fifth lens element, and the lens assembly satisfies the following condition: 0.1 ≤ InTL/HOS ≤ 0.95, where HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fifth lens element.
In an embodiment of the present invention, the lens assembly includes six lens elements with refractive power, namely, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element and a sixth lens element, and the lens assembly satisfies the following condition: 0.1 ≤ InTL/HOS ≤ 0.95, where HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the sixth lens element.
In an embodiment of the present invention, the lens assembly includes seven lens elements with refractive power, namely, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element, a sixth lens element and a seventh lens element, and the lens assembly satisfies the following condition: 0.1 ≤ InTL/HOS ≤ 0.95, where HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the seventh lens element.
In an embodiment of the present invention, the lens assembly further includes an aperture, and the aperture satisfies the following condition: 0.2 ≤ InS/HOS ≤ 1.1, where InS is the distance on the optical axis from the aperture to the imaging surface of the image capture module, and HOS is the distance on the optical axis from the lens surface of the lens assembly farthest from the imaging surface to the imaging surface.
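The two length ratios used in the embodiments above, InTL/HOS and InS/HOS, are plain quotients of on-axis distances. A minimal sketch with hypothetical dimensions (not values from the patent):

```python
def layout_ratios(hos_mm, intl_mm, ins_mm):
    """Return the on-axis length ratios used above: InTL/HOS and InS/HOS."""
    return {"InTL/HOS": intl_mm / hos_mm, "InS/HOS": ins_mm / hos_mm}

# Hypothetical layout: HOS = 6.0 mm, InTL = 4.5 mm, InS = 5.2 mm.
ratios = layout_ratios(hos_mm=6.0, intl_mm=4.5, ins_mm=5.2)
print(0.1 <= ratios["InTL/HOS"] <= 0.95)  # True
print(0.2 <= ratios["InS/HOS"] <= 1.1)    # True
```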
The embodiments of the present invention use the following terms and symbols for the element parameters of the optical imaging system of the lens set; they serve as a reference for the description below:
Lens parameters related to length or height
The maximum imaging height of the optical imaging system is denoted by HOI; the height of the optical imaging system (i.e., the distance on the optical axis from the object-side surface of the first lens to the imaging surface) is denoted by HOS; the distance from the object-side surface of the first lens to the image-side surface of the last lens of the optical imaging system is denoted by InTL; the distance from a fixed diaphragm (aperture) of the optical imaging system to the imaging surface is denoted by InS; the distance between the first lens and the second lens of the optical imaging system is denoted by IN12 (for example); and the thickness of the first lens of the optical imaging system on the optical axis is denoted by TP1 (for example).
Material dependent lens parameters
The Abbe number of the first lens of the optical imaging system is denoted by NA1 (for example); the refractive index of the first lens is denoted by Nd1 (for example).
Viewing angle dependent lens parameters
The viewing angle is denoted AF; half of the viewing angle is denoted by HAF; the chief ray angle is denoted MRA.
Lens parameters related to entrance and exit pupils
The entrance pupil diameter of the optical imaging system is denoted by HEP. The maximum effective radius of any surface of a single lens refers to the vertical height between the optical axis and the intersection point at which the incident light ray passing through the outermost edge of the entrance pupil at the maximum view angle of the system crosses that lens surface (Effective Half Diameter, EHD). For example, the maximum effective radius of the object-side surface of the first lens is denoted by EHD11 and the maximum effective radius of the image-side surface of the first lens is denoted by EHD12. The maximum effective radius of the object-side surface of the second lens is denoted by EHD21 and the maximum effective radius of the image-side surface of the second lens is denoted by EHD22. The maximum effective radius of any surface of the remaining lenses in the optical imaging system is denoted in the same manner.
Parameters relating to lens surface profile arc length and surface profile
The maximum effective radius contour curve length of any surface of a single lens refers to the arc length of the curve measured from the intersection point of that lens surface with the optical axis of the optical imaging system, as a starting point, along the surface contour of the lens, to the end point at its maximum effective radius; it is denoted by ARS. For example, the contour curve length of the maximum effective radius of the object-side surface of the first lens is denoted by ARS11 and the contour curve length of the maximum effective radius of the image-side surface of the first lens is denoted by ARS12. The contour curve length of the maximum effective radius of the object-side surface of the second lens is denoted by ARS21 and the contour curve length of the maximum effective radius of the image-side surface of the second lens is denoted by ARS22. The contour curve length of the maximum effective radius of any surface of the remaining lenses in the optical imaging system is denoted in the same manner.
The contour curve length of 1/2 entrance pupil diameter (HEP) of any surface of a single lens refers to the arc length of the curve measured from the intersection point of that lens surface with the optical axis of the optical imaging system, as a starting point, along the surface contour of the lens, to the coordinate point on that surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis; it is denoted by ARE. For example, the contour curve length of 1/2 entrance pupil diameter (HEP) of the object-side surface of the first lens is denoted by ARE11, and the contour curve length of 1/2 entrance pupil diameter (HEP) of the image-side surface of the first lens is denoted by ARE12. The contour curve length of 1/2 entrance pupil diameter (HEP) of the object-side surface of the second lens is denoted by ARE21, and the contour curve length of 1/2 entrance pupil diameter (HEP) of the image-side surface of the second lens is denoted by ARE22. The contour curve length of 1/2 entrance pupil diameter (HEP) of any surface of the remaining lenses in the optical imaging system is denoted in the same manner.
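Because ARS and ARE are arc lengths measured along a surface profile, they can be approximated numerically from the sag function z(h) of the surface. The sketch below is illustrative only (the spherical sag and all dimensions are assumed, not taken from the patent); it integrates ds = sqrt(dh^2 + dz^2) out to the maximum effective radius and out to half the entrance pupil diameter:

```python
import math

def contour_curve_length(sag, h_end, steps=10000):
    """Arc length of a surface profile z = sag(h), from the optical axis (h = 0) out to h = h_end."""
    total, prev_h, prev_z = 0.0, 0.0, sag(0.0)
    for i in range(1, steps + 1):
        h = h_end * i / steps
        z = sag(h)
        total += math.hypot(h - prev_h, z - prev_z)
        prev_h, prev_z = h, z
    return total

# Hypothetical spherical surface with radius of curvature R = 5 mm: z(h) = R - sqrt(R^2 - h^2).
R = 5.0
sphere = lambda h: R - math.sqrt(R * R - h * h)

EHD = 2.0   # hypothetical maximum effective radius (mm)
HEP = 2.4   # hypothetical entrance pupil diameter (mm)

ARS = contour_curve_length(sphere, EHD)        # contour length out to the maximum effective radius
ARE = contour_curve_length(sphere, HEP / 2.0)  # contour length out to 1/2 the entrance pupil diameter
print(round(ARS, 3), round(ARE, 3))            # roughly 2.058 and 1.212 for this surface
print(round(ARS / EHD, 3))                     # roughly 1.029, within the 0.9 <= ARS/EHD <= 2.0 condition
```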
Parameters related to lens profile depth
The distance between the intersection point of the object-side surface of the sixth lens element on the optical axis and the end point of the maximum effective radius of the object-side surface of the sixth lens element, which is horizontal to the optical axis, is represented by InRS61 (depth of maximum effective radius); the distance between the intersection point of the image-side surface of the sixth lens element on the optical axis and the end point of the maximum effective radius of the image-side surface of the sixth lens element, which is horizontal to the optical axis, is represented by InRS62 (depth of maximum effective radius). The depth (amount of depression) of the maximum effective radius of the object-side or image-side surface of the other lens is expressed in a manner comparable to that described above.
Parameters relating to lens surface shape
The critical point C refers to a point on the surface of the particular lens that is tangent to a tangent plane perpendicular to the optical axis, except for the intersection with the optical axis. For example, the perpendicular distance between the critical point C51 on the object-side surface of the fifth lens element and the optical axis is HVT51 (for example), the perpendicular distance between the critical point C52 on the image-side surface of the fifth lens element and the optical axis is HVT52 (for example), the perpendicular distance between the critical point C61 on the object-side surface of the sixth lens element and the optical axis is HVT61 (for example), and the perpendicular distance between the critical point C62 on the image-side surface of the sixth lens element and the optical axis is HVT62 (for example). The representation of the critical point on the object-side or image-side surface of the other lens and its perpendicular distance from the optical axis is comparable to the above.
The inflection point on the object-side surface of the seventh lens closest to the optical axis is IF711, and the amount of depression of this point is SGI711 (for example); SGI711 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface of the seventh lens with the optical axis and the inflection point on the object-side surface of the seventh lens closest to the optical axis, and the vertical distance between the point IF711 and the optical axis is HIF711 (for example). The inflection point on the image-side surface of the seventh lens closest to the optical axis is IF721, and the amount of depression of this point is SGI721 (for example); SGI721 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the image-side surface of the seventh lens with the optical axis and the inflection point on the image-side surface of the seventh lens closest to the optical axis, and the vertical distance between the point IF721 and the optical axis is HIF721 (for example).
The second inflection point from the optical axis on the object-side surface of the seventh lens is IF712, and the amount of depression of this point is SGI712 (for example); SGI712 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface of the seventh lens with the optical axis and the second inflection point from the optical axis on that surface, and the vertical distance between the point IF712 and the optical axis is HIF712 (for example). The second inflection point from the optical axis on the image-side surface of the seventh lens is IF722, and the amount of depression of this point is SGI722 (for example); SGI722 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the image-side surface of the seventh lens with the optical axis and the second inflection point from the optical axis on that surface, and the vertical distance between the point IF722 and the optical axis is HIF722 (for example).
The third inflection point from the optical axis on the object-side surface of the seventh lens is IF713, and the amount of depression of this point is SGI713 (for example); SGI713 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface of the seventh lens with the optical axis and the third inflection point from the optical axis on that surface, and the vertical distance between the point IF713 and the optical axis is HIF713 (for example). The third inflection point from the optical axis on the image-side surface of the seventh lens is IF723, and the amount of depression of this point is SGI723 (for example); SGI723 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the image-side surface of the seventh lens with the optical axis and the third inflection point from the optical axis on that surface, and the vertical distance between the point IF723 and the optical axis is HIF723 (for example).
The fourth inflection point from the optical axis on the object-side surface of the seventh lens is IF714, and the amount of depression of this point is SGI714 (for example); SGI714 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface of the seventh lens with the optical axis and the fourth inflection point from the optical axis on that surface, and the vertical distance between the point IF714 and the optical axis is HIF714 (for example). The fourth inflection point from the optical axis on the image-side surface of the seventh lens is IF724, and the amount of depression of this point is SGI724 (for example); SGI724 is the horizontal displacement distance, parallel to the optical axis, between the intersection point of the image-side surface of the seventh lens with the optical axis and the fourth inflection point from the optical axis on that surface, and the vertical distance between the point IF724 and the optical axis is HIF724 (for example).
The inflection points on the object-side or image-side surfaces of the other lenses, their vertical distances from the optical axis and their amounts of depression are denoted in the same manner as above.
Aberration-related variables
The optical distortion of an optical imaging system is denoted by ODT; its TV distortion is denoted by TDT, and the degree of aberration offset between 50% and 100% of the imaging field of view can further be defined; the spherical aberration offset is denoted by DFS; the coma aberration offset is denoted by DFC.
The contour curve length of any surface of a single lens within the maximum effective radius range affects the ability of that surface to correct aberrations and the optical path differences between the light beams of each field of view: the longer the contour curve length, the better the aberration correction capability, but the manufacturing difficulty also increases. Therefore, the contour curve length of any surface of a single lens within the maximum effective radius range must be controlled, in particular the ratio (ARS/TP) between the contour curve length (ARS) of the surface within the maximum effective radius range and the thickness (TP) on the optical axis of the lens to which the surface belongs. For example, the contour curve length of the maximum effective radius of the object-side surface of the first lens is denoted by ARS11, the thickness of the first lens on the optical axis is TP1, and the ratio of the two is ARS11/TP1; the contour curve length of the maximum effective radius of the image-side surface of the first lens is denoted by ARS12, and its ratio to TP1 is ARS12/TP1. The contour curve length of the maximum effective radius of the object-side surface of the second lens is denoted by ARS21, the thickness of the second lens on the optical axis is TP2, and the ratio of the two is ARS21/TP2; the contour curve length of the maximum effective radius of the image-side surface of the second lens is denoted by ARS22, and its ratio to TP2 is ARS22/TP2. The relationship between the contour curve length of the maximum effective radius of any surface of the remaining lenses in the optical imaging system and the thickness (TP) on the optical axis of the lens to which that surface belongs is expressed in the same manner.
The lateral aberration of the longest operating wavelength of visible light of the positive direction tangential fan of the optical imaging system passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by PLTA; the lateral aberration of the shortest operating wavelength of visible light of the positive direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by PSTA; the lateral aberration of the longest operating wavelength of visible light of the negative direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by NLTA; the lateral aberration of the shortest operating wavelength of visible light of the negative direction tangential fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by NSTA; the lateral aberration of the longest operating wavelength of visible light of the sagittal fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by SLTA; and the lateral aberration of the shortest operating wavelength of visible light of the sagittal fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI is denoted by SSTA.
The profile curve length of any surface of a single lens within the height range of 1/2 the entrance pupil diameter (HEP) particularly affects that surface's ability to correct aberrations in the shared region of the fields of view and the optical path differences between the light beams of the respective fields of view. A longer profile curve length gives a higher aberration-correction ability, but also increases manufacturing difficulty. The profile curve length of any surface of a single lens within the height range of 1/2 the entrance pupil diameter (HEP) must therefore be controlled, in particular the ratio (ARE/TP) between the profile curve length (ARE) of that surface within the 1/2 HEP height range and the thickness (TP) on the optical axis of the lens to which the surface belongs. For example, the profile curve length of the 1/2 HEP height of the object-side surface of the first lens is denoted by ARE11, the thickness of the first lens on the optical axis is TP1, and their ratio is ARE11/TP1; the profile curve length of the 1/2 HEP height of the image-side surface of the first lens is denoted by ARE12, and its ratio to TP1 is ARE12/TP1. The profile curve length of the 1/2 HEP height of the object-side surface of the second lens is denoted by ARE21, the thickness of the second lens on the optical axis is TP2, and their ratio is ARE21/TP2; the profile curve length of the 1/2 HEP height of the image-side surface of the second lens is denoted by ARE22, and its ratio to TP2 is ARE22/TP2. The ratios between the 1/2 HEP profile curve length of any surface of the remaining lenses in the optical imaging system and the thickness (TP) of the lens to which that surface belongs are expressed in the same way.
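For illustration only, the two ratio families just defined (ARS/TP and ARE/TP) can be evaluated with a short calculation. The sketch below is a hedged example; the function name and all numeric values are hypothetical and are not data from any embodiment of this disclosure.

```python
# Minimal sketch of the ratio definitions ARS/TP and ARE/TP described above.
# All numeric values are hypothetical examples, not data from an embodiment.

def profile_ratio(profile_curve_length_mm: float, lens_thickness_mm: float) -> float:
    """Ratio of a surface profile curve length to the on-axis thickness TP
    of the lens to which that surface belongs."""
    return profile_curve_length_mm / lens_thickness_mm

# Hypothetical first-lens values (mm)
ARS11, ARS12 = 2.10, 1.85   # profile curve lengths at the maximum effective radius
ARE11, ARE12 = 0.95, 0.90   # profile curve lengths at the 1/2 HEP height
TP1 = 1.50                  # thickness of the first lens on the optical axis

print("ARS11/TP1 =", profile_ratio(ARS11, TP1))
print("ARS12/TP1 =", profile_ratio(ARS12, TP1))
print("ARE11/TP1 =", profile_ratio(ARE11, TP1))
print("ARE12/TP1 =", profile_ratio(ARE12, TP1))
```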
By means of the mobile carrier assistance system, the physiological state of the driver can be measured and a corresponding detection signal can be generated to the warning device, and the warning device can generate corresponding warning information according to the detection signal for subsequent use and processing, thereby further improving driving safety. For example, the warning device generates the warning information when it receives a detection signal indicating a physiological abnormality.
Drawings
The above and other features of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1A is a block diagram of a mobile carrier assistance system according to a first system embodiment of the present invention;
fig. 1B is a schematic view of a mobile carrier auxiliary system according to a first system embodiment of the present invention disposed on a mobile carrier;
fig. 1C is a schematic view of a wearable device according to a first system embodiment of the present invention.
Fig. 1D is a schematic perspective view of an electronic rearview mirror for vehicle according to an embodiment of the present invention.
Fig. 1E is a schematic short-side cross-sectional view of a display device according to a first system embodiment of the present invention;
fig. 2A is a schematic diagram of a first optical embodiment of the present invention;
fig. 2B is a graph showing spherical aberration, astigmatism and optical distortion of the first optical embodiment of the present invention from left to right in sequence;
FIG. 3A is a schematic diagram of a second optical embodiment of the present invention;
fig. 3B is a graph showing spherical aberration, astigmatism and optical distortion of the second optical embodiment of the present invention from left to right in sequence;
FIG. 4A is a schematic diagram of a third optical embodiment of the present invention;
fig. 4B is a graph showing spherical aberration, astigmatism and optical distortion of a third optical embodiment of the present invention from left to right in sequence;
FIG. 5A is a schematic diagram of a fourth optical embodiment of the present invention;
fig. 5B is a graph showing spherical aberration, astigmatism and optical distortion of a fourth optical embodiment of the present invention from left to right in sequence;
fig. 6A is a schematic diagram illustrating a fifth optical embodiment of the present invention;
fig. 6B is a graph showing spherical aberration, astigmatism and optical distortion of a fifth optical embodiment of the present invention from left to right in sequence;
fig. 7A is a schematic diagram illustrating a sixth optical embodiment of the present invention;
fig. 7B is a graph showing the spherical aberration, astigmatism and optical distortion of the sixth optical embodiment of the present invention from left to right in sequence.
Description of reference numerals: optical imaging system 10, 20, 30, 40, 50, 60
Aperture 100, 200, 300, 400, 500, 600
First lens 110, 210, 310, 410, 510, 610
Object side surfaces 112, 212, 312, 412, 512, 612
Image side 114, 214, 314, 414, 514, 614
Second lens 120, 220, 320, 420, 520, 620
Object side surfaces 122, 222, 322, 422, 522, 622
Image side surfaces 124, 224, 324, 424, 524, 624
Third lens 130, 230, 330, 430, 530, 630
Object side 132, 232, 332, 432, 532, 632
Image side 134, 234, 334, 434, 534, 634
Fourth lens 140, 240, 340, 440, 540
Object side surfaces 142, 242, 342, 442, 542
Image side 144, 244, 344, 444, 544
Fifth lens 150, 250, 350, 450
Object side 152, 252, 352, 452
Image side 154, 254, 354, 454
Sixth lens 160, 260, 360
Object side 162, 262, 362
Image side 164, 264, 364
Seventh lens 270
Object side 272
Image side surface 274
Infrared filter 180, 280, 380, 480, 570, 670
Imaging planes 190, 290, 390, 490, 580, 680
Image sensing devices 192, 292, 392, 492, 590, 690
Action vehicle 0000
Gear shift device 0001
Driver seat 0002
Safety belt 0003
Power system 0004
Steering wheel 0005
Mobile carrier auxiliary system 0006
Driving state detecting device 0010
Physiological state detection module 0011
Image capturing module 0012
Rhythm detection module 0013
Blood pressure detecting module 0014
Blood component detection module 0015
Alcohol concentration detection module 0016
Respiration rate detecting module 0017
Brightness sensor 0018
Storage module 0020
Operational module 0022
Update module 0030
Warning device 0040
Wearable device 0042
Body 0043
Control device 0050
Starting device 0052
Start button 0053
Vehicle state detecting device 0060
Warning element 0062
Display device 0064
Vehicle electronic rearview mirror 0100
Housing 0110
Glare sensor 0112
Frame glue 0114
First light-transmitting element 0120
First light-receiving surface 0122
First light emitting surface 0124
Second light-transmitting element 0130
Second light-receiving surface 0132
Second light emitting surface 0134
Electro-optic dielectric layer 0140
Light-transmitting electrode 0150
Transparent conductive layer 0160
Electric connector 0170
Control element 0180
Reflective layer 0190
Detailed Description
The main design content of the mobile carrier auxiliary system includes system implementation design and optical implementation design, and the following description is related to the system embodiment:
fig. 1A is a block diagram of a mobile vehicle assistance system 0006 according to a first system embodiment of the present invention, and fig. 1B shows a vehicle as an example of the mobile vehicle 0000. As shown in the figures, the mobile vehicle assistance system 0006 of the present embodiment at least includes a driving state detection device 0010 and a warning device 0040. The driving state detection device 0010 includes a physiological state detection module 0011, a storage module 0020 and an operation module 0022; the physiological state detection module 0011 detects at least one physiological state of a driver of the mobile vehicle 0000. The storage module 0020 and the operation module 0022 are disposed in the mobile vehicle 0000, and the storage module 0020 stores at least one allowable parameter corresponding to the at least one physiological state and at least one actuation mode for the case where the allowable parameter is exceeded. The operation module 0022 is in wired or wireless signal connection with the physiological state detection module 0011 and the storage module 0020; the operation module 0022 determines whether the physiological states of the driver exceed the allowable parameters and generates a corresponding detection signal. The operation module 0022 can be a controller, such as an MCU or a DSP.
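As a rough illustration of the comparison performed by the operation module 0022, the following sketch checks detected physiological states against allowable parameters stored in the storage module 0020 and produces a detection signal. The dictionary keys, threshold values and function name are assumptions introduced for this example only, not the claimed implementation.

```python
# Hedged sketch of the comparison performed by the operation module 0022.
# Each detected physiological state is checked against its allowable
# parameter from the storage module 0020, and a detection signal is built.
# Names and threshold values are illustrative assumptions only.

ALLOWABLE_PARAMETERS = {          # contents of the storage module (hypothetical)
    "heart_rate_bpm": (50, 110),  # (lower bound, upper bound)
    "eye_closed_s": (0.0, 1.0),
    "breath_alcohol_mg_l": (0.0, 0.15),
}

def generate_detection_signal(states: dict) -> dict:
    """Return a detection signal flagging every state outside its allowed range."""
    exceeded = {}
    for name, value in states.items():
        low, high = ALLOWABLE_PARAMETERS[name]
        if not (low <= value <= high):
            exceeded[name] = value
    return {"abnormal": bool(exceeded), "exceeded": exceeded}

signal = generate_detection_signal(
    {"heart_rate_bpm": 128, "eye_closed_s": 0.4, "breath_alcohol_mg_l": 0.02})
print(signal)   # {'abnormal': True, 'exceeded': {'heart_rate_bpm': 128}}
```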
In this embodiment, the physiological status detecting module 0011 can include a plurality of modules, the plurality of modules include an image capturing module 0012, a rhythm detecting module 0013, a blood pressure detecting module 0014, a blood component detecting module 0015, an alcohol concentration detecting module 0016, and a respiration rate detecting module 0017, and the storage module 0020 stores allowable parameters corresponding to the physiological status detected by each module. The storage module 0020 can be electrically connected to an update module 0030, and the plurality of allowable parameters stored in the storage module 0020 can be updated by the update module 0030.
The warning device 0040 is electrically connected to the operation module 0022 and the storage module 0020, and the warning device 0040 generates a warning message according to the received detection signal, and more particularly, generates the warning message for subsequent use when receiving the detection signal indicating that the plurality of physiological states of the driver exceed the allowable parameter.
The image capturing module 0012 is disposed in the mobile carrier 0000 and configured to capture at least a driving image of the driver in the mobile carrier 0000. The image capturing module 0012 includes a lens set and an image sensor, wherein the lens set includes at least two lenses with refractive power for imaging onto the image sensor to generate driving images. The conditions of the lens assembly will be described in the various optical embodiments.
The physiological state detection module of this embodiment further includes a brightness sensor 0018 electrically connected to the image capturing module 0012 for detecting at least the brightness in the direction in which the image capturing module 0012 captures images. When the brightness detected by the brightness sensor 0018 is greater than an upper threshold, the image capturing module 0012 captures the driving image with a reduced amount of incoming light; when the brightness detected by the brightness sensor 0018 is less than a lower threshold, the image capturing module 0012 captures the driving image with an increased amount of incoming light. In this way, a driving image with appropriate brightness can be obtained, and overexposure or underexposure can be avoided.
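A minimal sketch of the brightness-dependent capture behaviour described above is given below; the lux thresholds and exposure scaling factors are assumptions chosen for illustration, not values disclosed in this specification.

```python
# Sketch of the exposure behaviour described above: above an upper
# threshold the image capturing module reduces the amount of incoming
# light, and below a lower threshold it increases it.
# Threshold values and the exposure steps are assumptions.

UPPER_LUX = 2000.0
LOWER_LUX = 50.0

def adjust_light_intake(measured_lux: float, current_exposure: float) -> float:
    """Return a new relative exposure setting (1.0 = unchanged)."""
    if measured_lux > UPPER_LUX:
        return current_exposure * 0.8   # let in less light
    if measured_lux < LOWER_LUX:
        return current_exposure * 1.25  # let in more light
    return current_exposure

print(adjust_light_intake(3500.0, 1.0))  # 0.8
print(adjust_light_intake(20.0, 1.0))    # 1.25
```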
In this embodiment, the operation module 0022 analyzes, from the driving image, physiological states of the driver such as whether the driver's line of sight is directed toward the traveling direction of the mobile vehicle 0000, the duration of a line-of-sight change, the frequency of line-of-sight changes, the eye-closing time, and the blinking frequency, and the storage module 0020 stores allowable parameters corresponding to at least one of the line-of-sight direction, the duration of a line-of-sight change, the frequency of line-of-sight changes, the eye-closing time, and the blinking frequency. The operation module 0022 determines from the driving image whether these physiological states of the driver exceed the allowable parameters, and generates the corresponding detection signal to the warning device 0040.
The physiological state detected by the rhythm detection module 0013 is the heart rhythm or a change in the heart rhythm, and the storage module 0020 stores allowable parameters corresponding to the heart rhythm or the change in the heart rhythm. The operation module 0022 determines whether this physiological state of the driver exceeds the allowable parameter according to the detection result of the rhythm detection module 0013 and generates the corresponding detection signal. The warning device 0040 generates the warning information when it receives a detection signal indicating that one of the physiological states detected by the image capturing module 0012 and the rhythm detection module 0013 exceeds the allowable parameter. In practice, the warning information can instead be generated only when detection signals indicating that the physiological states detected by both the image capturing module 0012 and the rhythm detection module 0013 exceed the allowable parameters are received, so as to achieve double confirmation and make the determination more accurate.
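The single-confirmation and double-confirmation behaviours described in this paragraph can be summarized by the following sketch; the function name and boolean flags are illustrative assumptions.

```python
# Sketch of single- versus double-confirmation for the image capturing
# module 0012 and the rhythm detection module 0013. The boolean inputs
# stand for the "exceeds allowable parameter" flags carried by the
# detection signals (an assumed convention for this example).

def should_warn(image_abnormal: bool, rhythm_abnormal: bool,
                double_confirm: bool = False) -> bool:
    """Return True when the warning device 0040 should emit warning information."""
    if double_confirm:
        return image_abnormal and rhythm_abnormal   # both must exceed
    return image_abnormal or rhythm_abnormal        # either one suffices

print(should_warn(True, False))                       # True  (single confirmation)
print(should_warn(True, False, double_confirm=True))  # False (double confirmation)
```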
The blood pressure detecting module 0014 is configured to be touched by the driver, the detected physiological status of the driver is blood pressure or blood pressure change, and the storage module 0020 stores allowable parameters corresponding to the blood pressure or the blood pressure change. The operation module 0022 determines whether the physiological status of the driver exceeds the allowable parameter according to the detection result of the blood pressure detection module 0014 and generates the corresponding detection signal. When the warning device 0040 receives the detection signal that one of the physiological states detected by the image capturing module 0012 and the blood pressure detecting module 0014 exceeds the allowable parameter, the warning information is generated. In practice, the warning message can be generated when the detection signals that the physiological states detected by the image capturing module 0012 and the blood pressure detecting module 0014 both exceed the allowable parameters are received, so as to achieve double confirmation and make the determination more accurate.
The physiological status of the driver detected by the blood component detecting module 0015 is alcohol concentration, blood oxygen concentration or blood sugar concentration in blood, and the storage module 0020 stores allowable parameters corresponding to the alcohol concentration, the blood oxygen concentration or the blood sugar concentration. The operation module 0022 determines whether the physiological status of the driver exceeds the allowable parameter according to the detection result of the blood component detection module 0015 and generates the corresponding detection signal. When the warning device 0040 receives the detection signal that one of the physiological states detected by the image capturing module 0012 and the blood component detecting module 0015 exceeds the allowable parameter, the warning information is generated. In practice, the warning message may be generated when the detection signals indicating that the physiological statuses detected by the image capturing module 0012 and the blood component detecting module 0015 both exceed the allowable parameters are received, so as to achieve double confirmation and make the determination more accurate.
The physiological state of the driver detected by the alcohol concentration detection module 0016 is the alcohol concentration in the exhaled air or in the blood, and the storage module 0020 stores allowable parameters corresponding to the alcohol concentration. The operation module 0022 determines whether this physiological state of the driver exceeds the allowable parameter according to the detection result of the alcohol concentration detection module 0016 and generates the corresponding detection signal. The warning device 0040 generates the warning information when it receives a detection signal indicating that the physiological state detected by the alcohol concentration detection module 0016 exceeds the allowable parameter. Incidentally, the alcohol concentration detection module 0016 can be disposed on a gear shift device 0001 of the mobile vehicle 0000; the driver operates the gear shift device 0001 to switch the driving state of the mobile vehicle 0000, and when the driver operates the gear shift device 0001, the driver's hand contacts the alcohol concentration detection module 0016 on the gear shift device 0001.
In practice, the physiological status detecting module 0011 may also include at least one of the modules. In addition, at least one of the modules of the physiological status detecting module 0011 can be disposed on a wearable device 0042 (for example, the watch of fig. 1C) of the driver, and can enter or leave the mobile vehicle 0000 along with the driver. The physiological status detecting module 0011 wirelessly transmits the detection result to the calculating module 0022. For example, the image capturing module 0012 is located on the body 0043 of the wearable device 0042, the image capturing direction is outside the body 0043, and the heart rate detecting module 0013, the blood pressure detecting module 0014, the blood component detecting module 0015, the alcohol concentration detecting module 0016, etc. may also be disposed on the body 0043.
The respiration rate detection module 0017 is used for detecting the respiration rate of the driver, and the storage module 0020 stores allowable parameters corresponding to the respiration rate. The operation module 0022 determines whether this physiological state of the driver exceeds the allowable parameter according to the result detected by the respiration rate detection module 0017 and generates the corresponding detection signal. For example, the respiration rate detection module 0017 may be disposed on the seat back of the driver seat 0002 at a position corresponding to the driver's upper back to detect the rise and fall of the upper back during respiration, or disposed on the seat belt 0003 to detect the rise and fall of the driver's chest during respiration; when the respiration rate is lower than a predetermined rate, the driver is determined to be fatigued or dozing. The warning device 0040 generates the warning information when it receives a detection signal indicating that one of the physiological states detected by the image capturing module 0012 and the respiration rate detection module 0017 exceeds the allowable parameter. In practice, the warning information can instead be generated only when detection signals indicating that the physiological states detected by both the image capturing module 0012 and the respiration rate detection module 0017 exceed the allowable parameters are received, so as to achieve double confirmation and make the determination more accurate.
In this embodiment, the mobile vehicle assistance system 0006 further includes a control device 0050 and an activation device 0052, wherein the control device 0050 is disposed on the mobile vehicle and electrically connected to the computing module 0022 and the storage module 0020. The control device 0050 reads the corresponding action mode from the storage module to control the mobile vehicle according to the received detection signal indicating whether the plurality of physiological conditions of the driver exceed the allowable parameters.
The activation device 0052 is in wired or wireless signal connection with the control device 0050, and the driver can operate the activation device 0052 to activate or deactivate a power system 0004 of the mobile vehicle 0000, such as the engine of a fuel-powered vehicle or the motor of an electric vehicle. The activation device 0052 can cooperate with the control device 0050 to control the mobile vehicle 0000 appropriately. When the mobile vehicle 0000 is in the state where the power system 0004 is turned off and the driver operates the activation device 0052 to activate the power system 0004, the control device 0050 can control the mobile vehicle 0000 differently according to different physiological states of the driver. When the control device 0050 receives detection signals indicating that the physiological states of the driver do not exceed the allowable parameters, the control device 0050 controls the mobile vehicle 0000 in the actuation mode that allows the power system 0004 to be activated, so that the driver can drive the mobile vehicle 0000; conversely, when the control device 0050 receives a detection signal indicating that at least one physiological state of the driver exceeds the allowable parameter, the control device 0050 controls the mobile vehicle 0000 in the actuation mode that disables the power system 0004, so as to avoid dangerous driving when the driver is in a state unsuitable for driving.
For example, to prevent the driver from driving the mobile vehicle 0000 after drinking, when the mobile vehicle 0000 is in the state where the power system 0004 is turned off and the driver operates the activation device 0052 to activate the power system 0004 and the control device 0050 receives the detection signal that the physiological state detected by the alcohol concentration detection module 0016 does not exceed the allowable parameter, the control device 0050 controls the mobile vehicle 0000 in an actuation mode allowing the power system 0004 to be activated, so that the driver can drive the mobile vehicle 0000; on the contrary, when the control device 0050 receives the detection signal that the physiological state detected by the alcohol concentration detection module 0016 exceeds the allowable parameter, the control device 0050 controls the mobile vehicle 0000 in the activation mode in which the power system 0004 is disabled, so that the driver cannot drive the mobile vehicle 0000.
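A hedged sketch of the start-up gating just described is shown below; it assumes the detection signal carries an "abnormal" flag, as in the earlier sketch, which is an illustrative convention rather than the claimed implementation.

```python
# Sketch of the start-up gating: the control device 0050 allows the
# power system 0004 to start only when no physiological state exceeds
# its allowable parameter (for example, the alcohol-concentration check).
# Function and field names are illustrative assumptions.

def can_start_power_system(detection_signal: dict) -> bool:
    """Return True if the actuation mode 'allow start' should be selected."""
    return not detection_signal.get("abnormal", False)

print(can_start_power_system({"abnormal": False}))   # True: start allowed
print(can_start_power_system(
    {"abnormal": True, "exceeded": {"breath_alcohol_mg_l": 0.4}}))  # False: start disabled
```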
In this embodiment, the activation device 0052 is disposed in the mobile vehicle 0000, for example, near the driver seat 0002, and the activation device 0052 has an activation button 0053 for the driver to press to operate the activation device 0052 to activate or deactivate the power system 0004. At least one module of the physiological status detecting module 0011 can be disposed on the activation button 0053. The alcohol concentration detecting module 0016 is disposed on the activating device 0052, preferably, the activating button 0053, and when the driver presses the activating button 0053, the finger simultaneously touches the alcohol concentration detecting module 0016, and the alcohol concentration detecting module 0016 transmits the detected result to the operation module 0022 in a wired or wireless transmission manner. For another example, the heart rate detecting module 0013 may be disposed on the activation button 0053, when the driver presses the activation button 0053, the finger touches the heart rate detecting module 0013 at the same time, and the heart rate detecting module 0013 transmits the detected physiological state (heart rate or heart rate change) of the driver to the computing module 0022 in a wired or wireless transmission manner.
In practice, the activation device 0052 may also be a remote controller of the mobile vehicle 0000, and at least one module of the physiological status detection module 0011 may be disposed on the remote controller, for example, the heart rate detection module 0013 may be disposed on the remote controller, when the driver holds the remote controller, the hand touches the heart rate detection module 0013, and the heart rate detection module 0013 transmits the detected physiological status (heart rate or heart rate change) of the driver to the operation module 0022 in a wireless transmission manner.
In order to avoid dangerous driving caused by physical discomfort of the driver when the mobile vehicle 0000 is in the activated state of the power system 0004, when the control device 0050 receives a detection signal indicating that at least one of the physiological states of the driver exceeds the allowable parameter for a predetermined time, the control device 0050 controls the mobile vehicle 0000 in an automatic driving mode to take over for human driving.
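The timed take-over rule described above can be sketched as follows; the predetermined time of 10 seconds is an assumed example value, not a value specified by this disclosure.

```python
# Sketch of the timed take-over rule: if an abnormal detection signal
# persists beyond a predetermined time while the power system is on,
# the control device switches the mobile carrier to automatic driving.
# The 10-second value is an assumption for illustration.

PREDETERMINED_TIME_S = 10.0

def select_driving_mode(abnormal_duration_s: float) -> str:
    return "automatic" if abnormal_duration_s >= PREDETERMINED_TIME_S else "manual"

print(select_driving_mode(3.0))    # manual
print(select_driving_mode(12.5))   # automatic
```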
The mobile vehicle assistance system 0006 further includes a vehicle state detecting device 0060; the vehicle state detecting device 0060 is disposed on the mobile vehicle 0000, is electrically connected to the control device 0050, and detects the movement state of the mobile vehicle 0000 to generate a state signal. In practice, the vehicle state detecting device 0060 may comprise at least one of a steering angle sensor, an inertia measuring device and a speed sensor, wherein the steering angle sensor detects the steering angle of the mobile vehicle 0000, the inertia measuring device detects the acceleration, the inclination angle or the yaw rate of the mobile vehicle 0000, and the speed sensor detects the speed of the mobile vehicle 0000. The vehicle state detecting device 0060 outputs the corresponding state signal according to the detection result of at least one of the speed sensor, the steering angle sensor and the inertia measuring device.
The control device 0050 controls the mobile vehicle 0000 to drive automatically according to the state signal. For example, the mobile carrier 0000 is controlled according to the automatic driving mode and the state signal of its current moving speed, so as to avoid discomfort to the occupants of the mobile carrier 0000 caused by the change of operating mode.
In order to prompt the driver according to the warning information, the mobile vehicle assistance system 0006 further includes a warning element 0062 and a display device 0064 electrically connected to the warning device 0040. The warning element 0062 receives the warning information and generates corresponding light, sound, vibration or physical contact when the warning device 0040 sends the warning information. The warning element 0062 can be a buzzer and/or a light emitting diode (LED), which can be disposed on the left and right sides of the mobile vehicle, for example in the inner and outer areas of the mobile vehicle 0000 adjacent to the driver seat, such as the A-pillar, the left/right rearview mirror, the dashboard or the front windshield, so as to be activated in accordance with the detection condition of the mobile vehicle 0000. The warning element 0062 can also be a vibrator or a seat belt tensioner, wherein the vibrator can be disposed on the driver seat 0002, the steering wheel 0005 or the seat belt 0003 to generate vibration or physical contact perceivable by the driver, and the seat belt tensioner acts on the seat belt 0003 to produce physical contact with the driver.
The display device 0064 is used to display the warning information, for example in the form of at least one of an image and text. Either the warning element 0062 or the display device 0064 is used to indicate that the physiological state of the driver is unsuitable for driving the mobile vehicle 0000.
Fig. 1D is a schematic perspective view of the display device of the present embodiment, which is an electronic rearview mirror 0100 for vehicle with a display, and fig. 1E is a schematic cross-sectional view of the short side of fig. 1D. The electronic rearview mirror 0100 for vehicle of the present invention can be installed on a mobile vehicle such as an automobile to assist its travel or to provide information related to its travel. The electronic rearview mirror 0100 for vehicle can be an interior rearview mirror installed inside the vehicle or an exterior rearview mirror installed outside the vehicle, both of which are used to assist the vehicle driver in knowing the positions of other vehicles; the present invention is not limited thereto. Moreover, the mobile vehicle is not limited to an automobile and may be another kind of vehicle, for example a land vehicle, an aircraft or a watercraft.
The electronic rearview mirror 0100 for vehicle is assembled in a housing 0110, and the housing 0110 has an opening (not shown). Specifically, the opening of the casing 0110 overlaps the reflective layer 0190 of the electronic rearview mirror 0100 for vehicle (fig. 1D), so that the external light can be transmitted to the reflective layer 0190 located inside the casing 0110 after passing through the opening, and the electronic rearview mirror 0100 for vehicle functions as a mirror. When the driver of the vehicle is driving, the driver faces the opening, for example, and the driver can view the external light reflected by the vehicular electronic rearview mirror 0100 to know the position of the vehicle behind.
With continued reference to fig. 1E, the electronic rearview mirror 0100 for vehicle comprises a first transparent element 0120 and a second transparent element 0130, wherein the first transparent element 0120 faces the driver, and the second transparent element 0130 is disposed at a side far from the driver. Specifically, the first transparent element 0120 and the second transparent element 0130 are transparent substrates, and the material thereof may be glass, for example. However, the material of the first transparent element 0120 and the second transparent element 0130 can also be, for example, plastic, quartz, PET substrate or other applicable materials, wherein the PET substrate has the characteristics of low cost, easy manufacturing, and thinness besides the packaging and protection effects.
In this embodiment, the first light-transmitting element 0120 includes a first light-receiving surface 0122 and a first light-emitting surface 0124, and an external light image from behind the driver is incident on the first light-transmitting element 0120 through the first light-receiving surface 0122 and exits from the first light-emitting surface 0124. The second light-transmitting element 0130 includes a second light-receiving surface 0132 and a second light-emitting surface 0134; the second light-receiving surface 0132 faces the first light-emitting surface 0124, and a gap, surrounded by a sealant 0114, is formed between the first light-emitting surface 0124 and the second light-receiving surface 0132. The external light image exits from the first light-emitting surface 0124 into the second light-transmitting element 0130 and then exits from the second light-emitting surface 0134.
An electro-optic medium layer 0140 is disposed in the gap formed by the first light-emitting surface 0124 of the first light-transmitting element 0120 and the second light-receiving surface 0132 of the second light-transmitting element 0130. At least one light-transmitting electrode 0150 is disposed between the first light-transmitting element 0120 and the electro-optic medium layer 0140, and the electro-optic medium layer 0140 is disposed between the first light-transmitting element 0120 and at least one reflective layer 0190. A transparent conductive layer 0160 is disposed between the first light-transmitting element 0120 and the electro-optic medium layer 0140, and another transparent conductive layer 0160 is disposed between the second light-transmitting element 0130 and the electro-optic medium layer 0140. One electrical connector 0170 is connected to the transparent conductive layer 0160 and another electrical connector 0170 is connected to the light-transmitting electrode 0150; the light-transmitting electrode 0150 is electrically connected to the electro-optic medium layer 0140 directly or through the other transparent conductive layer 0160, so as to transmit electric energy to the electro-optic medium layer 0140 and change its transparency. If the intensity of an external light image is too strong, a glare effect will impair the driver's vision and endanger driving safety. When an external light image with brightness exceeding a predetermined value occurs, such as strong headlight from a vehicle behind, the glare sensor 0112 electrically connected to the control element 0180 receives the light energy and converts it into a signal; the control element 0180 analyzes whether the brightness of the external light image exceeds the predetermined brightness and, if glare is occurring, supplies electric energy to the electro-optic medium layer 0140 through the electrical connector 0170 to produce the anti-glare effect.
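A minimal sketch of the anti-glare decision made by the control element 0180 is given below; the normalized glare threshold and the function name are assumptions for illustration only.

```python
# Sketch of the anti-glare decision: the control element 0180 compares
# the brightness reported by the glare sensor 0112 with a predetermined
# value and, if exceeded, supplies power to the electro-optic medium
# layer 0140 to dim the mirror. The threshold value is an assumption.

GLARE_THRESHOLD = 0.75   # normalized sensor reading, hypothetical

def update_mirror_mode(glare_reading: float) -> str:
    """Return which mode the electronic rearview mirror should switch to."""
    if glare_reading > GLARE_THRESHOLD:
        return "anti-glare"   # enable the electro-optic medium layer (dimmed)
    return "mirror"           # leave the layer disabled (fully reflective)

print(update_mirror_mode(0.9))   # anti-glare
print(update_mirror_mode(0.3))   # mirror
```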
The transparent electrode 0150 and the reflective layer 0190 may cover the surface of the first transparent element 0120 and the surface of the second transparent element 0130, for example, globally, but the present invention is not limited thereto. In this embodiment, the material of the light-transmitting electrode 0150 can be selected from metal oxides, such as: indium tin oxide, indium zinc oxide, aluminum tin oxide, aluminum zinc oxide, indium germanium zinc oxide, other suitable oxides, or stacked layers of at least two of the foregoing. In addition, the reflective layer 0190 may have conductivity, and the reflective layer 0190 may include at least one material selected from a group consisting of silver (Ag), copper (Cu), aluminum (Al), titanium (Ti), chromium (Cr), and molybdenum (Mo), or an alloy thereof, or include silicon dioxide or a transparent conductive material. Alternatively, the transparent electrode 0150 and the reflective layer 0190 may also include other types of materials, and the invention is not limited thereto.
The electro-optic medium layer 0140 may be made of organic materials, or inorganic materials, but the invention is not limited thereto. In this embodiment, the electro-optic medium layer 0140 may be made of electrochromic material (electrochromic material), and is disposed between the first light-transmitting element 0120 and the second light-transmitting element 0130, and between the first light-transmitting element 0120 and the reflective layer 0190. Specifically, the light-transmissive electrode 0150 is disposed between the first light-transmissive element 0120 and the electro-optic dielectric layer 0140 (electrochromic material layer EC), and the reflective layer 0190 of the embodiment may be disposed between the second light-transmissive element 0130 and the electro-optic dielectric layer 0140. In addition, in the embodiment, the electronic rearview mirror 0100 further includes sealant 0114. The sealant 0114 is disposed between the first transparent element 0120 and the second transparent element 0130 and surrounds the electro-optic dielectric layer 0140. The sealant 0114, the first light-transmitting element 0120 and the second light-transmitting element 0130 encapsulate the electro-optic dielectric layer 0140 together.
In this embodiment, the transparent conductive layer 0160 is disposed between the electro-optic medium layer 0140 and the reflective layer 0190. Specifically, the transparent conductive layer 0160 can serve as an anti-oxidation layer for the reflective layer 0190 and can prevent the electro-optic medium layer 0140 from directly contacting the reflective layer 0190, so that the reflective layer 0190 is protected from corrosion by organic materials and the electronic rearview mirror 0100 for vehicle of this embodiment has a long service life. In addition, the sealant 0114, the light-transmitting electrode 0150 and the transparent conductive layer 0160 jointly encapsulate the electro-optic medium layer 0140. In this embodiment, the transparent conductive layer 0160 includes at least one material selected from the group consisting of indium tin oxide (ITO), indium zinc oxide (IZO), an aluminum-doped zinc oxide thin film (Al-doped ZnO) and fluorine-doped tin oxide.
In this embodiment, the electronic rearview mirror 0100 for vehicle can be selectively provided with an electrical connector 0170 such as a wire or a conductive structure connected to the transparent electrode 0150 and the reflective layer 0190 respectively. The transparent electrode 0150 and the reflective layer 0190 may be electrically connected to at least one control element 0180 providing a driving signal by the conductive wires or structures, respectively, to drive the electro-optic medium layer 0140.
When the electro-optic dielectric layer 0140 is enabled, the electro-optic dielectric layer 0140 undergoes an electrochemical redox reaction to change its energy level, thereby exhibiting a dull (dim) state. When the external light passes through the opening of the casing 0110 and reaches the electro-optic medium layer 0140, the external light is absorbed by the electro-optic medium layer 0140 in an extinction state, so that the electronic rearview mirror 0100 for the vehicle is switched to the anti-glare mode. On the other hand, when the electro-optic medium layer 0140 is disabled, the electro-optic medium layer 0140 is in a light transmitting state. At this time, the external light passing through the opening of the casing 0110 passes through the electro-optic medium layer 0140 and is reflected by the reflective layer 0190, so that the electronic rearview mirror 0100 for vehicle is switched to the mirror mode.
In particular, the first light transmitting element 0120 has a first light receiving surface 0122 remote from the second light transmitting element 0130. For example, the external light from the other vehicle behind enters the electronic rearview mirror 0100 through the first light-receiving surface 0122, and the electronic rearview mirror 0100 reflects the external light so that the external light leaves the electronic rearview mirror 0100 through the first light-receiving surface 0122. In addition, the human eyes of the vehicle driver can receive the external light reflected by the electronic rearview mirror 0100 for the vehicle, and further know the position of the other vehicle behind. In addition, the reflective layer 0190 may have optical properties of partially transmitting and partially reflecting by selecting appropriate materials and designing appropriate film thicknesses.
The display of the electronic rearview mirror 0100 for vehicle may be an LCD or an LED display, and the display may be disposed inside or outside the casing 0110, for example on the side of the second light-transmitting element 0130 away from the first light-transmitting element 0120, such as on the second light-emitting surface 0134 of the second light-transmitting element 0130. Since the reflective layer 0190 has the optical property of partially transmitting and partially reflecting, the image light emitted from the display can pass through the reflective layer 0190, so that the user can view the image displayed on the display, which is used to display the warning information.
Besides prompting the driver, at least one piece of emergency contact information may also be stored in advance in the storage module 0020, and when the warning device 0040 generates the warning information, the warning information is sent to the emergency contact. For example, the warning device 0040 may have a communication module that can be connected to a telecommunication network or the internet, and the warning information is sent via the communication module to an electronic device (e.g., a mobile phone, a computer, etc.) corresponding to the emergency contact information. Thereby, the emergency contact is informed that the physiological state of the driver is not suitable for driving the mobile vehicle 0000.
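The notification path described in this paragraph can be sketched as follows; the contact entry, function name and use of print as a stand-in transport are placeholders, since the actual communication module and network protocol are not specified here.

```python
# Sketch of the notification path: when warning information is generated,
# it is also forwarded to each emergency contact stored in the storage
# module 0020 through a communication module. The transport used here
# (a plain function call) is a placeholder assumption.

EMERGENCY_CONTACTS = ["+886-9xx-xxx-xxx"]   # hypothetical entries in the storage module

def send_warning(message: str, transmit=print) -> None:
    """Forward the warning message to every stored emergency contact."""
    for contact in EMERGENCY_CONTACTS:
        transmit(f"to {contact}: {message}")

send_warning("Driver physiological state abnormal; not suitable for driving.")
```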
In the mobile carrier assistance system of the first system embodiment, the physiological status detecting module 0011 can detect the physiological status of the driver and generate a corresponding detecting signal to the warning device 0040, and the warning device generates warning information for subsequent use when receiving the detecting signal indicating that the plurality of physiological statuses of the driver exceed the allowable parameters (i.e., receiving the detecting signal indicating that the physiological status is abnormal).
Possible optical embodiments of the lens assembly are described below. The optical imaging system formed by the lens assembly of the present invention can be designed with three operating wavelengths, namely 486.1 nm, 587.5 nm and 656.2 nm, wherein 587.5 nm is the primary reference wavelength from which the technical features are extracted. The optical imaging system can also be designed with five operating wavelengths, namely 470 nm, 510 nm, 555 nm, 610 nm and 650 nm, wherein 555 nm is the primary reference wavelength from which the technical features are extracted.
The ratio of the focal length f of the optical imaging system to the focal length fp of each lens with positive refractive power is PPR, and the ratio of the focal length f of the optical imaging system to the focal length fn of each lens with negative refractive power is NPR. The sum of the PPRs of all the lenses with positive refractive power is ΣPPR, and the sum of the NPRs of all the lenses with negative refractive power is ΣNPR. Controlling the total refractive power and the total length of the optical imaging system is easier when the following condition is satisfied: 0.5 ≦ ΣPPR/|ΣNPR| ≦ 15; preferably, the following condition may be satisfied: 1 ≦ ΣPPR/|ΣNPR| < 3.0.
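As an illustration of the ΣPPR/|ΣNPR| condition above, the following sketch evaluates the ratio for a hypothetical set of focal lengths; none of the numbers are taken from an embodiment of this disclosure.

```python
# Sketch of the refractive-power ratio condition above.
# The focal lengths below are hypothetical; the check simply evaluates
# SigmaPPR / |SigmaNPR| and tests the stated range.

f_system = 4.0                            # focal length of the whole system (assumed)
positive_lens_focal_lengths = [3.2, 5.5]  # fp of lenses with positive refractive power
negative_lens_focal_lengths = [-4.8]      # fn of lenses with negative refractive power

sigma_ppr = sum(f_system / fp for fp in positive_lens_focal_lengths)
sigma_npr = sum(f_system / fn for fn in negative_lens_focal_lengths)
ratio = sigma_ppr / abs(sigma_npr)

print(round(ratio, 3), 0.5 <= ratio <= 15)   # value and whether the condition holds
```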
The optical imaging system may further include an image sensor disposed on the imaging surface. Half of the diagonal length of the effective sensing area of the image sensor (i.e. the imaging height of the optical imaging system or the maximum image height) is HOI, and the distance from the object-side surface of the first lens to the imaging surface on the optical axis is HOS, which satisfies the following conditions: HOS/HOI is less than or equal to 50; and HOS/f is more than or equal to 0.5 and less than or equal to 150. Preferably, the following conditions are satisfied: HOS/HOI is more than or equal to 1 and less than or equal to 40; and HOS/f is more than or equal to 1 and less than or equal to 140. Therefore, the optical imaging system can be kept miniaturized and can be carried on light and thin portable electronic products.
In addition, in the optical imaging system of the present invention, at least one aperture stop can be provided as required to reduce stray light, which helps improve image quality.
In the optical imaging system of the present invention, the stop can be configured as a front stop or a middle stop, wherein a front stop means that the stop is disposed between the object and the first lens element, and a middle stop means that the stop is disposed between the first lens element and the imaging plane. If the stop is a front stop, a longer distance can be created between the exit pupil of the optical imaging system and the imaging plane to accommodate more optical elements, and the efficiency with which the image sensing element receives images can be increased; if the stop is a middle stop, the field of view of the system can be expanded, so that the optical imaging system has the advantage of a wide-angle lens. The distance between the stop and the imaging plane is InS, which satisfies the following condition: 0.1 ≦ InS/HOS ≦ 1.1. Thus, the optical imaging system can be kept compact while having wide-angle characteristics.
In the optical imaging system of the present invention, the distance between the object-side surface of the first lens element and the image-side surface of the sixth lens element is InTL, and the total thickness of all the lens elements with refractive power on the optical axis is ΣTP, which satisfy the following condition: 0.1 ≦ ΣTP/InTL ≦ 0.9. Thereby, the contrast of the system imaging and the yield of lens manufacturing can both be taken into account, and an appropriate back focal length is provided to accommodate other elements.
The radius of curvature of the object-side surface of the first lens is R1, and the radius of curvature of the image-side surface of the first lens is R2, which satisfy the following condition: 0.001 ≦ |R1/R2| ≦ 25. Thereby, the first lens element has an appropriate positive refractive power strength, and spherical aberration is prevented from increasing too rapidly. Preferably, the following condition is satisfied: 0.01 ≦ |R1/R2| ≦ 12.
The radius of curvature of the object-side surface of the sixth lens is R11, and the radius of curvature of the image-side surface of the sixth lens is R12, which satisfy the following conditions: -7< (R11-R12)/(R11+ R12) < 50. Therefore, astigmatism generated by the optical imaging system is favorably corrected.
The first lens and the second lens are separated by a distance IN12 on the optical axis, which satisfies the following condition: IN12/f ≦ 60, thereby improving the chromatic aberration of the lens and improving the performance of the lens.
The distance between the fifth lens element and the sixth lens element is IN56, which satisfies the following condition: IN56/f is less than or equal to 3.0, which is helpful to improve the chromatic aberration of the lens to improve the performance of the lens.
The thicknesses of the first lens element and the second lens element on the optical axis are TP1 and TP2, respectively, which satisfy the following conditions: (TP1+ IN12)/TP2 is more than or equal to 0.1 and less than or equal to 10. Therefore, the method is beneficial to controlling the manufacturing sensitivity of the optical imaging system and improving the performance of the optical imaging system.
The thicknesses of the fifth lens element and the sixth lens element on the optical axis are TP5 and TP6, respectively, and the distance between the two lens elements on the optical axis is IN56, which satisfies the following conditions: 0.1 ≦ (TP6+ IN56)/TP5 ≦ 15, thereby contributing to control of the sensitivity of the optical imaging system fabrication and reducing the overall system height.
The thicknesses of the second lens element, the third lens element and the fourth lens element on the optical axis are TP2, TP3 and TP4, respectively, the distance between the third lens element and the fourth lens element on the optical axis is IN34, and the distance between the fourth lens element and the fifth lens element on the optical axis is IN45, which satisfy the following condition: 0.1 ≦ TP4/(IN34+TP4+IN45) < 1. Thereby, the aberration generated as the incident light advances is corrected slightly layer by layer, and the total height of the system is reduced.
In the optical imaging system of the present invention, the vertical distance between the critical point C61 of the object-side surface of the sixth lens and the optical axis is HVT61, the vertical distance between the critical point C62 of the image-side surface of the sixth lens and the optical axis is HVT62, the horizontal displacement distance on the optical axis from the intersection point of the object-side surface of the sixth lens on the optical axis to the critical point C61 is SGC61, and the horizontal displacement distance on the optical axis from the intersection point of the image-side surface of the sixth lens on the optical axis to the critical point C62 is SGC62; the following conditions can be satisfied: 0 mm ≦ HVT61 ≦ 3 mm; 0 mm < HVT62 ≦ 6 mm; 0 ≦ HVT61/HVT62; 0 mm ≦ |SGC61| ≦ 0.5 mm; 0 mm < |SGC62| ≦ 2 mm; and 0 ≦ |SGC62|/(|SGC62|+TP6) ≦ 0.9. Thereby, the aberration of the off-axis field of view can be effectively corrected.
The optical imaging system of the present invention satisfies the following condition: 0.2 ≦ HVT62/HOI ≦ 0.9. Preferably, the following condition is satisfied: 0.3 ≦ HVT62/HOI ≦ 0.8. Thereby, aberration correction of the peripheral field of view of the optical imaging system is facilitated.
The optical imaging system of the present invention satisfies the following condition: 0 ≦ HVT62/HOS ≦ 0.5. Preferably, the following condition is satisfied: 0.2 ≦ HVT62/HOS ≦ 0.45. Thereby, aberration correction of the peripheral field of view of the optical imaging system is facilitated.
In the optical imaging system of the present invention, the horizontal displacement distance parallel to the optical axis between the intersection point of the object-side surface of the sixth lens on the optical axis and the inflection point of the object-side surface of the sixth lens nearest to the optical axis is denoted by SGI611, and the horizontal displacement distance parallel to the optical axis between the intersection point of the image-side surface of the sixth lens on the optical axis and the inflection point of the image-side surface of the sixth lens nearest to the optical axis is denoted by SGI621, which satisfy the following conditions: 0 < SGI611/(SGI611+TP6) ≦ 0.9; 0 < SGI621/(SGI621+TP6) ≦ 0.9. Preferably, the following conditions are satisfied: 0.1 ≦ SGI611/(SGI611+TP6) ≦ 0.6; 0.1 ≦ SGI621/(SGI621+TP6) ≦ 0.6.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface of the sixth lens element on the optical axis and an inflection point of the object-side surface of the sixth lens element second close to the optical axis is represented by SGI612, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface of the sixth lens element on the optical axis and an inflection point of the image-side surface of the sixth lens element second close to the optical axis is represented by SGI622, which satisfies the following conditions: 0< SGI612/(SGI612+ TP6) ≦ 0.9; 0< SGI622/(SGI622+ TP6) ≦ 0.9. Preferably, the following conditions are satisfied: SGI612/(SGI612+ TP6) is more than or equal to 0.1 and less than or equal to 0.6; SGI622/(SGI622+ TP6) is more than or equal to 0.1 and less than or equal to 0.6.
The vertical distance between the inflection point of the object-side surface of the sixth lens nearest to the optical axis and the optical axis is denoted by HIF611, and the vertical distance between the inflection point of the image-side surface of the sixth lens nearest to the optical axis and the optical axis is denoted by HIF621, which satisfy the following conditions: 0.001 mm ≦ |HIF611| ≦ 5 mm; 0.001 mm ≦ |HIF621| ≦ 5 mm. Preferably, the following conditions are satisfied: 0.1 mm ≦ |HIF611| ≦ 3.5 mm; 1.5 mm ≦ |HIF621| ≦ 3.5 mm.
The vertical distance between the second inflection point of the object-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF612, and the vertical distance between the second inflection point of the image-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF622, which satisfy the following conditions: 0.001 mm ≦ |HIF612| ≦ 5 mm; 0.001 mm ≦ |HIF622| ≦ 5 mm. Preferably, the following conditions are satisfied: 0.1 mm ≦ |HIF622| ≦ 3.5 mm; 0.1 mm ≦ |HIF612| ≦ 3.5 mm.
The vertical distance between the third inflection point of the object-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF613, and the vertical distance between the third inflection point of the image-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF623, which satisfy the following conditions: 0.001 mm ≦ |HIF613| ≦ 5 mm; 0.001 mm ≦ |HIF623| ≦ 5 mm. Preferably, the following conditions are satisfied: 0.1 mm ≦ |HIF623| ≦ 3.5 mm; 0.1 mm ≦ |HIF613| ≦ 3.5 mm.
The vertical distance between the fourth inflection point of the object-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF614, and the vertical distance between the fourth inflection point of the image-side surface of the sixth lens near the optical axis and the optical axis is denoted by HIF624, which satisfy the following conditions: 0.001 mm ≦ |HIF614| ≦ 5 mm; 0.001 mm ≦ |HIF624| ≦ 5 mm. Preferably, the following conditions are satisfied: 0.1 mm ≦ |HIF624| ≦ 3.5 mm; 0.1 mm ≦ |HIF614| ≦ 3.5 mm.
In an implementation of the optical imaging system of the present invention, lenses with high dispersion coefficients and lenses with low dispersion coefficients can be arranged in an alternating manner, which helps correct the chromatic aberration of the optical imaging system.
The equation for the above aspheric surfaces is:
z = ch^2/[1 + [1 - (k+1)c^2h^2]^0.5] + A4h^4 + A6h^6 + A8h^8 + A10h^10 + A12h^12 + A14h^14 + A16h^16 + A18h^18 + A20h^20 + …  (1)
where z is the position value, referenced to the surface vertex, at height h along the optical axis direction; k is the conic coefficient; c is the reciprocal of the radius of curvature; and A4, A6, A8, A10, A12, A14, A16, A18 and A20 are higher-order aspheric coefficients.
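Equation (1) can be transcribed directly into a short sag function, as sketched below; the coefficient values used in the example call are hypothetical and serve only to show how the even-order terms are accumulated.

```python
# Direct transcription of aspheric equation (1) above as a function of the
# height h. The coefficient values in the example call are hypothetical.

def aspheric_sag(h, c, k, coeffs):
    """z(h) for equation (1); coeffs = [A4, A6, A8, ...] for h^4, h^6, h^8, ..."""
    z = c * h**2 / (1 + (1 - (k + 1) * c**2 * h**2) ** 0.5)
    for i, a in enumerate(coeffs):
        z += a * h ** (4 + 2 * i)
    return z

# Example: radius of curvature 5 mm (c = 1/5), conic coefficient k = -1,
# a single fourth-order coefficient A4 = 1e-4, evaluated at h = 1 mm.
print(aspheric_sag(1.0, 1 / 5.0, -1.0, [1e-4]))
```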
In the optical imaging system provided by the present invention, the lenses may be made of plastic or glass. When a lens is made of plastic, production cost and weight can be effectively reduced; when a lens is made of glass, the thermal effect can be controlled and the design space for the refractive power configuration of the optical imaging system can be increased. In addition, the object-side surfaces and image-side surfaces of the first through seventh lenses in the optical imaging system may be aspheric, which provides more control variables; besides being used to reduce aberrations, this can even reduce the number of lenses used compared with traditional glass lenses, and therefore can effectively reduce the total height of the optical imaging system of the present invention.
Furthermore, in the optical imaging system provided by the present invention, if a lens surface is described as convex, it means in principle that the surface is convex in the paraxial region; if a lens surface is described as concave, it means in principle that the surface is concave in the paraxial region.
The optical imaging system of the present invention can further be applied, as required, to optical systems with movable focus, and it combines good aberration correction with good imaging quality, thereby broadening its range of applications.
The optical imaging system of the present invention may further include, as required, a driving module; the driving module can be coupled to the lenses and cause the lenses to be displaced. The driving module may be a voice coil motor (VCM) used to drive the lens for focusing, or an optical image stabilization element (OIS) used to reduce the frequency of defocusing caused by lens vibration during shooting.
In the optical imaging system of the present invention, at least one of the first lens, the second lens, the third lens, the fourth lens, the fifth lens, the sixth lens and the seventh lens may further be made, as required, a light-filtering element that filters out light with wavelengths shorter than 500 nm; this can be achieved by coating at least one surface of that lens with a film having the filtering function, or by forming the lens itself from a material capable of filtering out short wavelengths.
The imaging plane may be a curved surface (for example, a spherical surface with a radius of curvature), which helps reduce the incident angle required for focusing the light onto the imaging plane; in addition to helping shorten the total track length (TTL) for miniaturization of the optical imaging system, this also helps improve the relative illumination.
According to the above embodiments, specific examples are provided in conjunction with the following optical examples and will be described in detail with reference to the drawings.
First optical embodiment
Referring to fig. 2A and fig. 2B, fig. 2A is a schematic diagram of an optical imaging system 10 with a lens assembly according to a first optical embodiment of the present invention, and fig. 2B is a graph of spherical aberration, astigmatism and optical distortion of the optical imaging system 10 according to the first optical embodiment in order from left to right. In fig. 2A, the optical imaging system 10 includes, in order from an object side to an image side, a first lens element 110, an aperture stop 100, a second lens element 120, a third lens element 130, a fourth lens element 140, a fifth lens element 150, a sixth lens element 160, an ir-pass filter 180, an image plane 190 and an image sensor 192.
The first lens element 110 with negative refractive power has a concave object-side surface 112 and a concave image-side surface 114, and is aspheric, and the object-side surface 112 has two inflection points. The profile curve length for the maximum effective radius of the object-side surface of the first lens is denoted as ARS11 and the profile curve length for the maximum effective radius of the image-side surface of the first lens is denoted as ARS 12. The contour curve length for the 1/2 entrance pupil diameter (HEP) of the object-side surface of the first lens is denoted as ARE11, and the contour curve length for the 1/2 entrance pupil diameter (HEP) of the image-side surface of the first lens is denoted as ARE 12. The thickness of the first lens on the optical axis is TP 1.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 112 of the first lens element 110 on the optical axis and an inflection point of the object-side surface 112 of the first lens element 110 closest to the optical axis is represented by SGI111, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 114 of the first lens element 110 on the optical axis and an inflection point of the image-side surface 114 of the first lens element 110 closest to the optical axis is represented by SGI121, which satisfies the following conditions: SGI111 ═ 0.0031 mm; | SGI111 |/(| SGI111 | + TP1) | 0.0016.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 112 of the first lens element 110 on the optical axis and a second inflection point close to the optical axis of the object-side surface 112 of the first lens element 110 is represented by SGI112, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 114 of the first lens element 110 on the optical axis and a second inflection point close to the optical axis of the image-side surface 114 of the first lens element 110 is represented by SGI122, which satisfies the following conditions: SGI 112-1.3178 mm; | SGI112 |/(| SGI112 | + TP1) | -0.4052.
The vertical distance between the optical axis and the inflection point of the nearest optical axis of the object-side surface 112 of the first lens element 110 is represented by HIF111, and the vertical distance between the optical axis and the inflection point of the nearest optical axis of the image-side surface 114 of the first lens element 110 to the image-side surface 114 of the first lens element 110 is represented by HIF121, which satisfies the following conditions: HIF 111-0.5557 mm; HIF111/HOI is 0.1111.
The vertical distance between the second paraxial inflection point of the object-side surface 112 of the first lens element 110 and the optical axis is denoted by HIF112, and the vertical distance between the second paraxial inflection point of the image-side surface 114 of the first lens element 110 and the optical axis from the intersection point of the image-side surface 114 of the first lens element 110 and the optical axis is denoted by HIF122, which satisfies the following conditions: HIF 112-5.3732 mm; HIF112/HOI 1.0746.
The second lens element 120 with positive refractive power has a convex object-side surface 122 and a convex image-side surface 124, and is aspheric, and the object-side surface 122 has a inflection point. The profile curve length for the maximum effective radius of the object-side surface of the second lens is denoted as ARS21 and the profile curve length for the maximum effective radius of the image-side surface of the second lens is denoted as ARS 22. The contour curve length for the 1/2 entrance pupil diameter (HEP) of the object-side surface of the second lens is denoted as ARE21, and the contour curve length for the 1/2 entrance pupil diameter (HEP) of the image-side surface of the second lens is denoted as ARE 22. The second lens has a thickness TP2 on the optical axis.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 122 of the second lens element 120 on the optical axis and an inflection point of the nearest optical axis of the object-side surface 122 of the second lens element 120 is represented by SGI211, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 124 of the second lens element 120 on the optical axis and an inflection point of the nearest optical axis of the image-side surface 124 of the second lens element 120 is represented by SGI221, which satisfies the following conditions: SGI 211-0.1069 mm; | SGI211 |/(| SGI211 | + TP2) | -0.0412; SGI221 ═ 0 mm; | SGI221 |/(| SGI221 | + TP2) | 0.
The vertical distance between the optical axis and the inflection point of the nearest optical axis of the object-side surface 122 of the second lens element 120 is represented by HIF211, and the vertical distance between the optical axis and the inflection point of the nearest optical axis of the image-side surface 124 of the second lens element 120 to the image-side surface 124 of the second lens element 120 is represented by HIF221, which satisfies the following conditions: HIF 211-1.1264 mm; HIF211/HOI 0.2253; HIF221 ═ 0 mm; HIF221/HOI is 0.
The third lens element 130 with negative refractive power has a concave object-side surface 132 and a convex image-side surface 134, and is aspheric, and the object-side surface 132 and the image-side surface 134 have inflection points. The maximum effective radius of the object-side surface of the third lens has a profile curve length represented by ARS31 and the maximum effective radius of the image-side surface of the third lens has a profile curve length represented by ARS 32. The contour curve length for the 1/2 entrance pupil diameter (HEP) of the object-side surface of the third lens is denoted as ARE31, and the contour curve length for the 1/2 entrance pupil diameter (HEP) of the image-side surface of the third lens is denoted as ARE 32. The thickness of the third lens on the optical axis is TP 3.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 132 of the third lens element 130 on the optical axis and an inflection point of the object-side surface 132 of the third lens element 130 closest to the optical axis is represented by SGI311, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 134 of the third lens element 130 on the optical axis and an inflection point of the image-side surface 134 of the third lens element 130 closest to the optical axis is represented by SGI321, which satisfies the following conditions: SGI 311-0.3041 mm; | SGI311 |/(| SGI311 | + TP3) | -0.4445; SGI 321-0.1172 mm; | SGI321 |/(| SGI321 | + TP3) | -0.2357.
The vertical distance between the optical axis and the inflection point of the object-side surface 132 of the third lens element 130 closest to the optical axis is denoted by HIF311, and the vertical distance between the optical axis and the inflection point of the image-side surface 134 of the third lens element 130 closest to the optical axis is denoted by HIF321, which satisfies the following conditions: HIF311 1.5907 mm; HIF311/HOI 0.3181; HIF 321-1.3380 mm; HIF 321/HOI 0.2676.
The fourth lens element 140 with positive refractive power has a convex object-side surface 142 and a concave image-side surface 144, and is aspheric, wherein the object-side surface 142 has two inflection points and the image-side surface 144 has one inflection point. The profile curve length for the maximum effective radius of the object-side surface of the fourth lens is denoted as ARS41 and the profile curve length for the maximum effective radius of the image-side surface of the fourth lens is denoted as ARS 42. The contour curve length of the 1/2 entrance pupil diameter (HEP) of the object-side surface of the fourth lens is denoted as ARE41, and the contour curve length of the 1/2 entrance pupil diameter (HEP) of the image-side surface of the fourth lens is denoted as ARE 42. The thickness of the fourth lens element on the optical axis is TP 4.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 142 of the fourth lens element 140 on the optical axis and an inflection point of the object-side surface 142 of the fourth lens element 140 closest to the optical axis is represented by SGI411, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 144 of the fourth lens element 140 on the optical axis and an inflection point of the image-side surface 144 of the fourth lens element 140 closest to the optical axis is represented by SGI421, which satisfies the following conditions: SGI411 ═ 0.0070 mm; | SGI411 |/(| SGI411 | + TP4) | 0.0056; SGI421 ═ 0.0006 mm; | SGI421 |/(| SGI421 | + TP4) | 0.0005.
A horizontal displacement distance parallel to the optical axis between the intersection point of the object-side surface 142 of the fourth lens element 140 on the optical axis and the second inflection point of the object-side surface 142 of the fourth lens element 140 close to the optical axis is indicated by SGI412, and a horizontal displacement distance parallel to the optical axis between the intersection point of the image-side surface 144 of the fourth lens element 140 on the optical axis and the second inflection point of the image-side surface 144 of the fourth lens element 140 close to the optical axis is indicated by SGI422, which satisfies the following conditions: SGI412 ═ -0.2078 mm; | SGI412 |/(| SGI412 | + TP4) | -0.1439.
The vertical distance between the inflection point of the object-side surface 142 of the fourth lens element 140 closest to the optical axis and the optical axis is denoted by HIF411, and the vertical distance between the inflection point of the image-side surface 144 of the fourth lens element 140 closest to the optical axis and the optical axis is denoted by HIF421, which satisfies the following conditions: HIF411 mm 0.4706 mm; HIF411/HOI 0.0941; HIF421 of 0.1721 mm; HIF 421/HOI ═ 0.0344.
The vertical distance between the second paraxial inflection point of the object-side surface 142 of the fourth lens element 140 and the optical axis is denoted by HIF412, and the vertical distance between the second paraxial inflection point of the image-side surface 144 of the fourth lens element 140 and the optical axis from the intersection point of the image-side surface 144 of the fourth lens element 140 and the optical axis to the image-side surface 144 of the fourth lens element 140 is denoted by HIF422, which satisfies the following conditions: HIF412 ═ 2.0421 mm; HIF412/HOI 0.4084.
The fifth lens element 150 with positive refractive power has a convex object-side surface 152 and a convex image-side surface 154, and is aspheric, wherein the object-side surface 152 has two inflection points and the image-side surface 154 has one inflection point. The maximum effective radius of the object-side surface of the fifth lens has a contour curve length represented by ARS51 and the maximum effective radius of the image-side surface of the fifth lens has a contour curve length represented by ARS 52. The contour curve length of the 1/2 entrance pupil diameter (HEP) of the object-side surface of the fifth lens is denoted as ARE51, and the contour curve length of the 1/2 entrance pupil diameter (HEP) of the image-side surface of the fifth lens is denoted as ARE 52. The thickness of the fifth lens element on the optical axis is TP 5.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 152 of the fifth lens element 150 on the optical axis and an inflection point of the nearest optical axis of the object-side surface 152 of the fifth lens element 150 is represented by SGI511, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 154 of the fifth lens element 150 on the optical axis and an inflection point of the nearest optical axis of the image-side surface 154 of the fifth lens element 150 is represented by SGI521, which satisfies the following conditions: SGI 511-0.00364 mm; | SGI511 |/(| SGI511 | + TP5) | 0.00338; SGI521 ═ 0.63365 mm; | SGI521 |/(| SGI521 | + TP5) | -0.37154.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 152 of the fifth lens element 150 on the optical axis and a second inflection point close to the optical axis of the object-side surface 152 of the fifth lens element 150 is represented by SGI512, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 154 of the fifth lens element 150 on the optical axis and a second inflection point close to the optical axis of the image-side surface 154 of the fifth lens element 150 is represented by SGI522, which satisfies the following conditions: SGI512 ═ 0.32032 mm; | SGI512 |/(| SGI512 | + TP5) | -0.23009.
A horizontal displacement distance parallel to the optical axis between the intersection point of the object-side surface 152 of the fifth lens element 150 on the optical axis and the third inflection point near the optical axis of the object-side surface 152 of the fifth lens element 150 is denoted by SGI513, and a horizontal displacement distance parallel to the optical axis between the intersection point of the image-side surface 154 of the fifth lens element 150 on the optical axis and the third inflection point near the optical axis of the image-side surface 154 of the fifth lens element 150 is denoted by SGI523, which satisfies the following conditions: SGI513 ═ 0 mm; | SGI513 |/(| SGI513 | + TP5) | 0; SGI523 ═ 0 mm; | SGI523 |/(| SGI523 | + TP5) | 0.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 152 of the fifth lens element 150 on the optical axis and a fourth inflection point of the object-side surface 152 of the fifth lens element 150 near the optical axis is denoted by SGI514, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 154 of the fifth lens element 150 on the optical axis and a fourth inflection point of the image-side surface 154 of the fifth lens element 150 near the optical axis is denoted by SGI524, which satisfies the following conditions: SGI514 ═ 0 mm; | SGI514 |/(| SGI514 | + TP5) | 0; SGI524 ═ 0 mm; | SGI524 |/(| SGI524 | + TP5) | 0.
The vertical distance between the inflection point of the nearest optical axis of the object-side surface 152 of the fifth lens 150 and the optical axis is represented by HIF511, and the vertical distance between the inflection point of the nearest optical axis of the image-side surface 154 of the fifth lens 150 and the optical axis is represented by HIF521, which satisfies the following conditions: HIF 511-0.28212 mm; HIF 511/HOI 0.05642; HIF521 ═ 2.13850 mm; HIF521/HOI 0.42770.
The vertical distance between the second paraxial inflection of object-side surface 152 of fifth lens element 150 and the optical axis is denoted by HIF512, and the vertical distance between the second paraxial inflection of image-side surface 154 of fifth lens element 150 and the optical axis is denoted by HIF522, satisfying the following conditions: HIF 512-2.51384 mm; HIF 512/HOI 0.50277.
The vertical distance between the third near-optic axis inflection point of the object-side surface 152 of the fifth lens 150 and the optical axis is denoted by HIF513, and the vertical distance between the third near-optic axis inflection point of the image-side surface 154 of the fifth lens 150 and the optical axis is denoted by HIF523, which satisfies the following conditions: HIF513 ═ 0 mm; HIF513/HOI ═ 0; HIF523 ═ 0 mm; HIF523/HOI ═ 0.
The vertical distance between the fourth off-curve of the object-side surface 152 of the fifth lens element 150 near the optical axis and the optical axis is denoted by HIF514, and the vertical distance between the fourth off-curve of the image-side surface 154 of the fifth lens element 150 near the optical axis and the optical axis is denoted by HIF524, which satisfy the following conditions: HIF514 ═ 0 mm; HIF514/HOI ═ 0; HIF524 ═ 0 mm; HIF524/HOI ═ 0.
The sixth lens element 160 with negative refractive power has a concave object-side surface 162 and a concave image-side surface 164, wherein the object-side surface 162 has two inflection points and the image-side surface 164 has one inflection point. Therefore, the angle of each field of view incident on the sixth lens can be effectively adjusted to improve aberration. The maximum effective radius of the sixth lens object-side surface has a contour curve length represented by ARS61 and the maximum effective radius of the sixth lens image-side surface has a contour curve length represented by ARS 62. The contour curve length of the 1/2 entrance pupil diameter (HEP) of the object-side surface of the sixth lens is denoted as ARE61, and the contour curve length of the 1/2 entrance pupil diameter (HEP) of the image-side surface of the sixth lens is denoted as ARE 62. The thickness of the sixth lens element on the optical axis is TP 6.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 162 of the sixth lens element 160 on the optical axis and an inflection point of the object-side surface 162 of the sixth lens element 160 closest to the optical axis is represented by SGI611, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 164 of the sixth lens element 160 on the optical axis and an inflection point of the image-side surface 164 of the sixth lens element 160 closest to the optical axis is represented by SGI621, which satisfies the following conditions: SGI611 ═ 0.38558 mm; | SGI611 |/(| SGI611 | + TP6) | -0.27212; SGI 621-0.12386 mm; | SGI621 |/(| SGI621 | + TP6) | -0.10722.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 162 of the sixth lens element 160 on the optical axis and a second inflection point near the optical axis of the object-side surface 162 of the sixth lens element 160 is denoted by SGI612, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 164 of the sixth lens element 160 on the optical axis and a second inflection point near the optical axis of the image-side surface 164 of the sixth lens element 160 is denoted by SGI621, which satisfies the following conditions: SGI612 ═ -0.47400 mm; | SGI612 |/(| SGI612 | + TP6) | -0.31488; SGI622 ═ 0 mm; | SGI622 |/(| SGI622 | + TP6) | 0.
The vertical distance between the inflection point of the object-side surface 162 of the sixth lens element 160 closest to the optical axis and the optical axis is HIF611, and the vertical distance between the inflection point of the image-side surface 164 of the sixth lens element 160 closest to the optical axis and the optical axis is HIF621, which satisfies the following conditions: HIF611 ═ 2.24283 mm; HIF 611/HOI 0.44857; HIF 621-1.07376 mm; HIF621/HOI 0.21475.
The vertical distance between the second near-optic axis inflection point of the object-side surface 162 of the sixth lens element 160 and the optical axis is denoted by HIF612, and the vertical distance between the second near-optic axis inflection point of the image-side surface 164 of the sixth lens element 160 and the optical axis is denoted by HIF622, which satisfy the following conditions: HIF612 ═ 2.48895 mm; HIF 612/HOI 0.49779.
The vertical distance between the third near-optic axis inflection point of the object-side surface 162 of the sixth lens element 160 and the optical axis is denoted by HIF613, and the vertical distance between the third near-optic axis inflection point of the image-side surface 164 of the sixth lens element 160 and the optical axis is denoted by HIF623, which satisfy the following conditions: HIF613 ═ 0 mm; HIF613/HOI ═ 0; HIF623 ═ 0 mm; HIF623/HOI is 0.
The vertical distance between the fourth inflection point near the optical axis of the object-side surface 162 of the sixth lens element 160 and the optical axis is denoted by HIF614, and the vertical distance between the fourth inflection point near the optical axis of the image-side surface 164 of the sixth lens element 160 and the optical axis is denoted by HIF624, which satisfy the following conditions: HIF614 ═ 0 mm; HIF614/HOI ═ 0; HIF624 ═ 0 mm; HIF624/HOI ═ 0.
The infrared filter 180 is made of glass, and is disposed between the sixth lens element 160 and the image plane 190 without affecting the focal length of the optical imaging system 10.
In the optical imaging system 10 of the present embodiment, the focal length is f, the entrance pupil diameter is HEP, and half of the maximum field of view is HAF; their values are as follows: f = 4.075 mm; f/HEP = 1.4; HAF = 50.001 degrees; and tan(HAF) = 1.1918.
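As an illustrative aid only (not part of the original disclosure), the short Python sketch below recomputes the entrance pupil diameter and tan(HAF) from the values quoted above; the variable names are chosen purely for readability.

    import math

    f = 4.075          # focal length of the first optical embodiment, in mm
    f_over_hep = 1.4   # quoted aperture ratio f/HEP
    haf_deg = 50.001   # half of the maximum field of view, in degrees

    hep = f / f_over_hep                       # entrance pupil diameter HEP, about 2.91 mm
    tan_haf = math.tan(math.radians(haf_deg))  # about 1.1918, matching the quoted value

    print(f"HEP = {hep:.3f} mm, tan(HAF) = {tan_haf:.4f}")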
In this embodiment, the focal length of the first lens element 110 is f1 and the focal length of the sixth lens element 160 is f6, which satisfy the following conditions: f1 = -7.828 mm; |f/f1| = 0.52060; f6 = -4.886 mm; and |f1| > |f6|.
In the optical imaging system 10 of the present embodiment, the focal lengths of the second lens element 120 through the fifth lens element 150 are f2, f3, f4 and f5, respectively, which satisfy the following conditions: |f2| + |f3| + |f4| + |f5| = 95.50815 mm; |f1| + |f6| = 12.71352 mm; and |f2| + |f3| + |f4| + |f5| > |f1| + |f6|.
The ratio of the focal length f of the optical imaging system 10 to the focal length fp of each lens element with positive refractive power is PPR, and the ratio of the focal length f of the optical imaging system 10 to the focal length fn of each lens element with negative refractive power is NPR. In the optical imaging system 10 of the present embodiment, the sum of the PPRs of all lens elements with positive refractive power is ΣPPR = f/f2 + f/f4 + f/f5 = 1.63290, the sum of the NPRs of all lens elements with negative refractive power is ΣNPR = |f/f1| + |f/f3| + |f/f6| = 1.51305, and ΣPPR/|ΣNPR| = 1.07921. The following conditions are also satisfied: |f/f2| = 0.69101; |f/f3| = 0.15834; |f/f4| = 0.06883; |f/f5| = 0.87305; |f/f6| = 0.83412.
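Purely as a cross-check (an editorial illustration, not part of the original disclosure), the sketch below adds up the |f/fi| ratios listed above and reproduces ΣPPR, ΣNPR and their quotient.

    # |f/fi| ratios quoted for the first optical embodiment
    ppr = [0.69101, 0.06883, 0.87305]   # positive-power lens elements: f/f2, f/f4, f/f5
    npr = [0.52060, 0.15834, 0.83412]   # negative-power lens elements: |f/f1|, |f/f3|, |f/f6|

    sum_ppr = sum(ppr)                  # about 1.6329
    sum_npr = sum(npr)                  # about 1.5131
    print(sum_ppr, sum_npr, sum_ppr / sum_npr)  # quotient about 1.0792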
In the optical imaging system 10 of the present embodiment, the distance from the object-side surface 112 of the first lens element 110 to the image-side surface 164 of the sixth lens element 160 is InTL, the distance from the object-side surface 112 of the first lens element 110 to the imaging plane 190 is HOS, the distance from the aperture stop 100 to the imaging plane 190 is InS, half of the diagonal length of the effective sensing area of the image sensor 192 is HOI, and the distance from the image-side surface 164 of the sixth lens element to the imaging plane 190 is BFL, which satisfy the following conditions: InTL + BFL = HOS; HOS = 19.54120 mm; HOI = 5.0 mm; HOS/HOI = 3.90824; HOS/f = 4.7952; InS = 11.685 mm; and InS/HOS = 0.59794.
In the optical imaging system 10 of the present embodiment, the total thickness of all lens elements with refractive power on the optical axis is ΣTP, which satisfies the following conditions: ΣTP = 8.13899 mm and ΣTP/InTL = 0.52477. Thereby, an appropriate back focal length can be provided to accommodate other elements while both the contrast of the system imaging and the yield of lens manufacturing are maintained.
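The length-related conditions above can be verified with a few lines of arithmetic. The sketch below is an editorial illustration only; in particular, InTL and BFL are back-solved from the quoted ΣTP/InTL ratio and from InTL + BFL = HOS, rather than taken from the original tables.

    hos, hoi, f_len = 19.54120, 5.0, 4.075      # HOS, HOI and focal length, in mm
    ins, sum_tp, tp_ratio = 11.685, 8.13899, 0.52477

    print(hos / hoi)      # about 3.908  (HOS/HOI)
    print(hos / f_len)    # about 4.795  (HOS/f)
    print(ins / hos)      # about 0.598  (InS/HOS)

    intl = sum_tp / tp_ratio   # about 15.51 mm, inferred from the quoted ratio
    bfl = hos - intl           # about 4.03 mm, inferred from InTL + BFL = HOS
    print(intl, bfl)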
In the optical imaging system 10 of the present embodiment, the radius of curvature of the object-side surface 112 of the first lens element 110 is R1, and the radius of curvature of the image-side surface 114 of the first lens element 110 is R2, which satisfy the following condition: |R1/R2| = 8.99987. Thereby, the first lens element 110 has a suitable strength of refractive power, preventing spherical aberration from increasing too quickly.
In the optical imaging system 10 of the present embodiment, the radius of curvature of the object-side surface 162 of the sixth lens element 160 is R11, and the radius of curvature of the image-side surface 164 of the sixth lens element 160 is R12, which satisfy the following condition: (R11 - R12)/(R11 + R12) = 1.27780. Thereby, astigmatism generated by the optical imaging system 10 is favorably corrected.
In the optical imaging system 10 of the present embodiment, the sum of the focal lengths of all lens elements with positive refractive power is ΣPP, which satisfies the following conditions: ΣPP = f2 + f4 + f5 = 69.770 mm; and f5/(f2 + f4 + f5) = 0.067. Thereby, the positive refractive power of a single lens element can be properly distributed to the other positive lens elements, so that significant aberration generated as the incident light travels through the system is suppressed.
In the optical imaging system 10 of the present embodiment, the sum of the focal lengths of all lens elements with negative refractive power is ΣNP, which satisfies the following conditions: ΣNP = f1 + f3 + f6 = -38.451 mm; and f6/(f1 + f3 + f6) = 0.127. Thereby, the negative refractive power of the sixth lens element 160 can be properly distributed to the other negative lens elements, so that significant aberration generated as the incident light travels through the system is suppressed.
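As a further editorial cross-check (not part of the original disclosure), the individual focal lengths can be back-solved from the |f/fi| ratios quoted earlier, with the signs taken from the stated refractive powers, and the sums above then follow directly.

    f = 4.075
    # back-solved focal lengths; signs follow the stated refractive powers
    f1, f3, f6 = -f / 0.52060, -f / 0.15834, -f / 0.83412   # negative-power lens elements
    f2, f4, f5 = f / 0.69101, f / 0.06883, f / 0.87305      # positive-power lens elements

    print(f2 + f4 + f5)           # about 69.77 mm  (ΣPP)
    print(f5 / (f2 + f4 + f5))    # about 0.067
    print(f1 + f3 + f6)           # about -38.45 mm (ΣNP)
    print(f6 / (f1 + f3 + f6))    # about 0.127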
In the optical imaging system 10 of the present embodiment, the distance between the first lens element 110 and the second lens element 120 on the optical axis is IN12, which satisfies the following conditions: IN12 = 6.418 mm; IN12/f = 1.57491. This helps to improve the chromatic aberration of the lens elements and thereby enhance their performance.
In the optical imaging system 10 of the present embodiment, the distance between the fifth lens element 150 and the sixth lens element 160 on the optical axis is IN56, which satisfies the following conditions: IN56 = 0.025 mm; IN56/f = 0.00613. This helps to improve the chromatic aberration of the lens elements and thereby enhance their performance.
In the optical imaging system 10 of the present embodiment, the thicknesses of the first lens element 110 and the second lens element 120 on the optical axis are TP1 and TP2, respectively, which satisfy the following conditions: TP1 = 1.934 mm; TP2 = 2.486 mm; and (TP1 + IN12)/TP2 = 3.36005. Thereby, the manufacturing sensitivity can be controlled and the performance of the optical imaging system 10 can be improved.
In the optical imaging system 10 of the present embodiment, the thicknesses of the fifth lens element 150 and the sixth lens element 160 on the optical axis are TP5 and TP6, respectively, and the distance between these two lens elements on the optical axis is IN56, which satisfy the following conditions: TP5 = 1.072 mm; TP6 = 1.031 mm; and (TP6 + IN56)/TP5 = 0.98555. Thereby, the manufacturing sensitivity of the optical imaging system 10 can be controlled and the overall system height reduced.
In the optical imaging system 10 of the present embodiment, the distance between the third lens element 130 and the fourth lens element 140 on the optical axis is IN34, and the distance between the fourth lens element 140 and the fifth lens element 150 on the optical axis is IN45, which satisfy the following conditions: IN34 = 0.401 mm; IN45 = 0.025 mm; and TP4/(IN34 + TP4 + IN45) = 0.74376. Thereby, the aberration generated as the incident light travels through the system can be corrected slightly, layer by layer, and the total height of the system can be reduced.
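The thickness and air-gap conditions quoted in the preceding paragraphs can likewise be checked directly; the sketch below is an editorial illustration, and the value of TP4 is back-solved from the quoted ratio rather than taken from Table I.

    tp1, tp2, tp5, tp6 = 1.934, 2.486, 1.072, 1.031       # lens thicknesses, mm
    in12, in34, in45, in56 = 6.418, 0.401, 0.025, 0.025   # on-axis air gaps, mm
    tp4 = 1.2366   # assumed value, consistent with TP4/(IN34 + TP4 + IN45) = 0.74376

    print((tp1 + in12) / tp2)          # about 3.360
    print((tp6 + in56) / tp5)          # about 0.985
    print(tp4 / (in34 + tp4 + in45))   # about 0.744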
In the optical imaging system 10 of the present embodiment, a horizontal displacement distance between an intersection point of the object-side surface 152 of the fifth lens element 150 on the optical axis and the maximum effective radius position of the object-side surface 152 of the fifth lens element 150 on the optical axis is InRS51, a horizontal displacement distance between an intersection point of the image-side surface 154 of the fifth lens element 150 on the optical axis and the maximum effective radius position of the image-side surface 154 of the fifth lens element 150 on the optical axis is InRS52, and a thickness of the fifth lens element 150 on the optical axis is TP5, which satisfies the following conditions: InRS 51-0.34789 mm; InRS 52-0.88185 mm; | InRS51 |/TP 5 ═ 0.32458 and | InRS52 |/TP 5 ═ 0.82276. Therefore, the lens is beneficial to the manufacture and the molding of the lens, and the miniaturization of the lens is effectively maintained.
In the optical imaging system 10 of the present embodiment, the vertical distance between a critical point of the object-side surface 152 of the fifth lens element 150 and the optical axis is HVT51, and the vertical distance between a critical point of the image-side surface 154 of the fifth lens element 150 and the optical axis is HVT52, which satisfy the following conditions: HVT51 = 0.515349 mm; HVT52 = 0 mm.
In the optical imaging system 10 of the present embodiment, a horizontal displacement distance between an intersection point of the object-side surface 162 of the sixth lens element 160 on the optical axis and the maximum effective radius position of the object-side surface 162 of the sixth lens element 160 on the optical axis is InRS61, a horizontal displacement distance between an intersection point of the image-side surface 164 of the sixth lens element 160 on the optical axis and the maximum effective radius position of the image-side surface 164 of the sixth lens element 160 on the optical axis is InRS62, and a thickness of the sixth lens element 160 on the optical axis is TP6, which satisfies the following conditions: InRS 61-0.58390 mm; InRS62 ═ 0.41976 mm; | InRS61 |/TP 6 ═ 0.56616 and | InRS62 |/TP 6 ═ 0.40700. Therefore, the lens is beneficial to the manufacture and the molding of the lens, and the miniaturization of the lens is effectively maintained.
In the optical imaging system 10 of the present embodiment, the vertical distance between a critical point of the object-side surface 162 of the sixth lens element 160 and the optical axis is HVT61, and the vertical distance between a critical point of the image-side surface 164 of the sixth lens element 160 and the optical axis is HVT62, which satisfy the following conditions: HVT61 = 0 mm; HVT62 = 0 mm.
In the optical imaging system 10 of the present embodiment, the following condition is satisfied: HVT51/HOI = 0.1031. Thereby, aberration correction of the peripheral field of view of the optical imaging system 10 is facilitated.
In the optical imaging system 10 of the present embodiment, the following condition is satisfied: HVT51/HOS = 0.02634. Thereby, aberration correction of the peripheral field of view of the optical imaging system 10 is facilitated.
In the optical imaging system 10 of the present embodiment, the third lens element 130 and the sixth lens element 160 have negative refractive power; the Abbe number of the second lens element 120 is NA2, the Abbe number of the third lens element 130 is NA3, and the Abbe number of the sixth lens element 160 is NA6, which satisfy the following condition: NA6/NA2 ≤ 1. Thereby, correction of the chromatic aberration of the optical imaging system 10 is facilitated.
In the optical imaging system 10 of the present embodiment, the TV distortion of the optical imaging system 10 at the time of image formation is TDT, and the optical distortion at the time of image formation is ODT, which satisfy the following conditions: TDT = 2.124%; and ODT = 5.076%.
Reference is made to Table I and Table II below.
[Table I is reproduced as an image in the original publication.]
TABLE II: Aspheric coefficients of the first optical embodiment
[Table II is reproduced as an image in the original publication.]
According to the first and second tables, the following values related to the length of the profile curve can be obtained:
[Values reproduced as an image in the original publication.]
Table I lists the detailed structural data of the first optical embodiment shown in FIG. 2A, wherein the units of the radius of curvature, thickness, distance and focal length are millimeters (mm), and surfaces 0-16 sequentially represent the surfaces from the object side to the image side. Table II lists the aspheric data of the first optical embodiment, where k is the conic coefficient in the aspheric surface equation and A1-A20 are the aspheric coefficients of order 1 to 20 of each surface. In addition, the tables of the following optical embodiments correspond to the schematic diagrams and aberration graphs of those embodiments, and the definitions of the data in the tables are the same as those of Table I and Table II of the first optical embodiment, so they are not repeated here. Moreover, the mechanical element parameters of the following optical embodiments are the same as those of the first optical embodiment.
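Where the aspheric coefficients of Table II would be applied, a minimal sketch of the sag evaluation is given below. It assumes the common even-asphere formula; the exact form should be taken from the aspheric surface equation given in the specification, and the sample numbers are illustrative only, not values from the tables.

    import math

    def aspheric_sag(h, c, k, coeffs):
        """Sag z(h) of an aspheric surface at height h from the optical axis.

        c is the vertex curvature (1/R), k the conic coefficient, and coeffs maps
        the polynomial order i to the aspheric coefficient A_i (A1-A20 in Table II).
        Assumes z = c*h^2 / (1 + sqrt(1 - (1 + k)*c^2*h^2)) + sum(A_i * h^i).
        """
        base = c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * h * h))
        return base + sum(a * h ** i for i, a in coeffs.items())

    # illustrative call with made-up values
    print(aspheric_sag(1.0, 1 / 10.0, -1.0, {4: 1.0e-4, 6: -2.0e-6}))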
Second optical embodiment
Referring to fig. 3A and fig. 3B, fig. 3A is a schematic diagram of an optical imaging system 20 of a lens assembly according to a second optical embodiment of the present invention, and fig. 3B is a graph of spherical aberration, astigmatism and optical distortion of the optical imaging system 20 of the second optical embodiment in order from left to right. In fig. 3A, the optical imaging system 20 includes, in order from an object side to an image side, an aperture stop 200, a first lens element 210, a second lens element 220, a third lens element 230, a fourth lens element 240, a fifth lens element 250, a sixth lens element 260, a seventh lens element 270, an ir-filter 280, an image plane 290, and an image sensor 292.
The first lens element 210 with negative refractive power has a convex object-side surface 212 and a concave image-side surface 214, and is made of glass.
The second lens element 220 with negative refractive power has a concave object-side surface 222 and a convex image-side surface 224.
The third lens element 230 with positive refractive power has a convex object-side surface 232 and a convex image-side surface 234.
The fourth lens element 240 with positive refractive power has a convex object-side surface 242 and a convex image-side surface 244.
The fifth lens element 250 with positive refractive power has a convex object-side surface 252 and a convex image-side surface 254, and is made of glass.
The sixth lens element 260 with negative refractive power has a concave object-side surface 262 and a concave image-side surface 264. Therefore, the angle of incidence of each field of view on the sixth lens element 260 can be effectively adjusted to improve aberration.
The seventh lens element 270 with negative refractive power has a convex object-side surface 272 and a convex image-side surface 274. Thereby, the back focal length is advantageously shortened to maintain miniaturization. In addition, the angle of incidence of the light rays in the off-axis field can be effectively suppressed, and the aberration of the off-axis field can be further corrected.
The infrared filter 280 is made of glass, and is disposed between the seventh lens element 270 and the image plane 290 without affecting the focal length of the optical imaging system 20.
Please refer to the following table three and table four.
[Table III is reproduced as an image in the original publication.]
TABLE IV: Aspheric coefficients of the second optical embodiment
[Table IV is reproduced as an image in the original publication.]
In the second optical embodiment, the aspheric surface profile is expressed by the same equation as in the first optical embodiment. In addition, the parameters below are defined in the same way as in the first optical embodiment and are not repeated here.
According to Table III and Table IV, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table III and Table IV, the following values related to the profile curve length can be obtained:
[Values reproduced as an image in the original publication.]
According to Table III and Table IV, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
third optical embodiment
Referring to fig. 4A and 4B, fig. 4A is a schematic diagram of an optical imaging system 30 with a lens assembly according to a third optical embodiment of the present invention, and fig. 4B is a graph sequentially showing the spherical aberration, astigmatism and optical distortion of the optical imaging system 30 according to the third optical embodiment from left to right. In fig. 4A, the optical imaging system 30 includes, in order from an object side to an image side, a first lens element 310, a second lens element 320, a third lens element 330, an aperture stop 300, a fourth lens element 340, a fifth lens element 350, a sixth lens element 360, an infrared filter 380, an image plane 390 and an image sensor 392.
The first lens element 310 with negative refractive power has a convex object-side surface 312 and a concave image-side surface 314.
The second lens element 320 with negative refractive power has a concave object-side surface 322 and a convex image-side surface 324.
The third lens element 330 with positive refractive power has a convex object-side surface 332 and a convex image-side surface 334, which are both aspheric, and the image-side surface 334 has a point of inflection.
The fourth lens element 340 with negative refractive power has a concave object-side surface 342 and a concave image-side surface 344, which are both aspheric, and the image-side surface 344 has an inflection point.
The fifth lens element 350 with positive refractive power has a convex object-side surface 352 and a convex image-side surface 354.
The sixth lens element 360 with negative refractive power has a convex object-side surface 362 and a concave image-side surface 364, and both the object-side surface 362 and the image-side surface 364 have an inflection point. Thereby, the back focal length is advantageously shortened to maintain miniaturization. In addition, the angle of incidence of the light rays in the off-axis field can be effectively suppressed, and the aberration of the off-axis field can be further corrected.
The infrared filter 380 is made of glass, and is disposed between the sixth lens element 360 and the image plane 390 without affecting the focal length of the optical imaging system 30.
Please refer to table five and table six below.
[Table V is reproduced as an image in the original publication.]
TABLE VI: Aspheric coefficients of the third optical embodiment
[Table VI is reproduced as an image in the original publication.]
In the third optical embodiment, the aspheric surface profile is expressed by the same equation as in the first optical embodiment. In addition, the parameters below are defined in the same way as in the first optical embodiment and are not repeated here.
According to Table V and Table VI, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table V and Table VI, the following values related to the profile curve length can be obtained:
[Values reproduced as an image in the original publication.]
According to Table V and Table VI, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
fourth optical embodiment
Referring to fig. 5A and 5B, fig. 5A is a schematic diagram of an optical imaging system 40 of a lens assembly according to a fourth optical embodiment of the present invention, and fig. 5B is a graph of spherical aberration, astigmatism and optical distortion of the optical imaging system 40 of the fourth optical embodiment in order from left to right. In fig. 5A, the optical imaging system 40 includes, in order from an object side to an image side, a first lens element 410, a second lens element 420, an aperture stop 400, a third lens element 430, a fourth lens element 440, a fifth lens element 450, an ir-filter 480, an image plane 490 and an image sensor 492.
The first lens element 410 with negative refractive power has a convex object-side surface 412 and a concave image-side surface 414, and is made of glass.
The second lens element 420 with negative refractive power has a concave object-side surface 422 and a concave image-side surface 424, which are both aspheric, and the object-side surface 422 has an inflection point.
The third lens element 430 with positive refractive power has a convex object-side surface 432 and a convex image-side surface 434, and is aspheric, and the object-side surface 432 has an inflection point.
The fourth lens element 440 with positive refractive power has a convex object-side surface 442 and a convex image-side surface 444, which are both aspheric, and the object-side surface 442 has a inflection point.
The fifth lens element 450 with negative refractive power has a concave object-side surface 452 and a concave image-side surface 454, which are both aspheric, and the object-side surface 452 has two inflection points. Thereby, the back focal length is advantageously shortened to maintain miniaturization.
The infrared filter 480 is made of glass, and is disposed between the fifth lens element 450 and the image plane 490 without affecting the focal length of the optical imaging system 40.
Please refer to table seven and table eight below.
[Table VII is reproduced as an image in the original publication.]
TABLE VIII: Aspheric coefficients of the fourth optical embodiment
[Table VIII is reproduced as an image in the original publication.]
In the fourth optical embodiment, the aspheric surface profile is expressed by the same equation as in the first optical embodiment. In addition, the parameters below are defined in the same way as in the first optical embodiment and are not repeated here.
According to Table VII and Table VIII, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table VII and Table VIII, the following values related to the profile curve length can be obtained:
[Values reproduced as an image in the original publication.]
According to Table VII and Table VIII, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
fifth optical embodiment
Referring to fig. 6A and 6B, fig. 6A is a schematic diagram illustrating an optical imaging system 50 of a lens assembly according to a fifth optical embodiment of the present invention, and fig. 6B is a graph sequentially showing the spherical aberration, astigmatism and optical distortion of the optical imaging system 50 according to the fifth optical embodiment from left to right. In fig. 6A, the optical imaging system 50 includes, in order from an object side to an image side, an aperture stop 500, a first lens element 510, a second lens element 520, a third lens element 530, a fourth lens element 540, an infrared filter 570, an image plane 580 and an image sensor 590.
The first lens element 510 with positive refractive power has a convex object-side surface 512 and a convex image-side surface 514, and is aspheric, and the object-side surface 512 has an inflection point.
The second lens element 520 with negative refractive power has a convex object-side surface 522 and a concave image-side surface 524, and is aspheric, wherein the object-side surface 522 has two inflection points and the image-side surface 524 has one inflection point.
The third lens element 530 with positive refractive power has a concave object-side surface 532 and a convex image-side surface 534, and is aspheric, and the object-side surface 532 has three inflection points and the image-side surface 534 has one inflection point.
The fourth lens element 540 with negative refractive power is made of plastic, has a concave object-side surface 542 and a concave image-side surface 544, and is aspheric, wherein the object-side surface 542 has two inflection points and the image-side surface 544 has one inflection point.
The ir filter 570 is made of glass, and is disposed between the fourth lens element 540 and the image plane 580 without affecting the focal length of the optical imaging system 50.
Please refer to table nine and table ten below.
[Table IX is reproduced as an image in the original publication.]
TABLE X: Aspheric coefficients of the fifth optical embodiment
[Table X is reproduced as an image in the original publication.]
In the fifth optical embodiment, the aspheric surface profile is expressed by the same equation as in the first optical embodiment. In addition, the parameters below are defined in the same way as in the first optical embodiment and are not repeated here.
According to Table IX and Table X, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table IX and Table X, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table IX and Table X, the following values related to the profile curve length can be obtained:
[Values reproduced as an image in the original publication.]
sixth optical embodiment
Referring to fig. 7A and 7B, fig. 7A is a schematic diagram of an optical imaging system 60 of a lens assembly according to a sixth optical embodiment of the present invention, and fig. 7B is a graph illustrating spherical aberration, astigmatism and optical distortion of the optical imaging system 60 of the sixth optical embodiment in order from left to right. In fig. 7A, the optical imaging system 60 includes, in order from an object side to an image side, a first lens element 610, an aperture stop 600, a second lens element 620, a third lens element 630, an ir-filter 670, an image plane 680 and an image sensor 690.
The first lens element 610 with positive refractive power has a convex object-side surface 612 and a concave image-side surface 614.
The second lens element 620 with negative refractive power has a concave object-side surface 622 and a convex image-side surface 624, and is aspheric, and the image-side surface 624 has an inflection point.
The third lens element 630 with positive refractive power has a convex object-side surface 632 and a convex image-side surface 634, both of which are aspheric; the object-side surface 632 has two inflection points and the image-side surface 634 has one inflection point.
The infrared filter 670 is made of glass, and is disposed between the third lens element 630 and the image plane 680 without affecting the focal length of the optical imaging system 60.
Please refer to the following table eleven and table twelve.
[Table XI is reproduced as an image in the original publication.]
TABLE XII: Aspheric coefficients of the sixth optical embodiment
[Table XII is reproduced as an image in the original publication.]
In the sixth optical embodiment, the aspheric surface profile is expressed by the same equation as in the first optical embodiment. In addition, the parameters below are defined in the same way as in the first optical embodiment and are not repeated here.
According to Table XI and Table XII, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table XI and Table XII, the following conditional values can be obtained:
[Values reproduced as an image in the original publication.]
According to Table XI and Table XII, the following values related to the profile curve length can be obtained:
[Values reproduced as an image in the original publication.]
the utility model discloses a visual demand of optical imaging system reaches the required mechanism space of reduction by the lens of different numbers.
Although the present invention has been described with reference to the above embodiments, it is to be understood that the invention is not limited to the disclosed embodiments, but may be modified and practiced by those skilled in the art without departing from the spirit and scope of the invention.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims and their equivalents.

Claims (40)

1. A mobile carrier assistance system, comprising:
a driving state detection device, which comprises a physiological state detection module, a storage module and an operation module; the physiological state detection module is used for detecting at least one physiological state of a driver; the storage module is arranged in a mobile carrier and stores allowable parameters corresponding to the at least one physiological state; the operation module is arranged in the mobile carrier and is in signal connection with the physiological state detection module and the storage module so as to detect whether the plurality of physiological states of the driver exceed the allowable parameters and generate a corresponding detection signal; and
the warning device is electrically connected with the operation module and used for generating warning information when receiving the detection signals that the plurality of physiological states of the driver exceed the allowable parameters;
the physiological state detection module comprises an image capturing module for capturing at least one driving image of the driver in the mobile carrier, and the operation module judges whether the plurality of physiological states of the driver exceed the allowable parameters according to the driving image and generates corresponding detection signals; the method is characterized in that:
the image capturing module comprises a lens set, and the lens set comprises at least two lenses with refractive power; in addition, the lens group further satisfies the following condition:
1.0≤f/HEP≤10.0;
0deg < HAF ≤ 150 deg; and
0.9≤2(ARE/HEP)≤2.0
wherein f is the focal length of the lens group; HEP is the entrance pupil diameter of the lens group; HAF is half of the maximum field of view of the lens group; and ARE is the profile curve length obtained along the contour of any lens surface in the lens group, starting at the intersection of that lens surface with the optical axis and ending at the position on the surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis.
2. The system of claim 1, further comprising a control device disposed on the mobile carrier and electrically connected to the operation module and the storage module, wherein the storage module further stores actuation modes corresponding to whether the allowable parameters are exceeded, and the control device, according to the received detection signal indicating whether the plurality of physiological states of the driver exceed the allowable parameters, reads the corresponding actuation mode from the storage module to control the mobile carrier.
3. The system of claim 2, further comprising an activation device in signal communication with the control device, wherein the driver operates the activation device to activate or deactivate a power system of the mobile vehicle; when the mobile carrier is in a state that the power system is closed, the driver operates the starting device to start the power system, and the control device receives detection signals that the plurality of physiological states of the driver do not exceed the allowable parameters, the mobile carrier is controlled in an actuating mode allowing the power system to be started; when the mobile carrier is in a state that the power system is closed, the driver operates the starting device to start the power system, and the control device receives a detection signal that at least one physiological state of the driver exceeds the allowable parameter, the mobile carrier is controlled in an actuating mode of prohibiting the starting of the power system.
4. The system of claim 3, wherein the physiological status detection module is disposed on the activation device.
5. The system of claim 4, wherein the activation device is disposed in the mobile carrier.
6. The system of claim 3, wherein when the mobile carrier is in a state in which the power system has been started and the control device receives, for a predetermined time, a detection signal indicating that at least one of the physiological states of the driver exceeds the allowable parameter, the control device controls the mobile carrier in an autonomous driving mode.
7. The mobile carrier assistance system of claim 6, further comprising a vehicle state detection device; the vehicle state detection device is disposed on the mobile carrier and is used for detecting the traveling state of the mobile carrier and generating a state signal; and the control device is further electrically connected with the vehicle state detection device and is used for controlling the mobile carrier to drive autonomously according to the state signal.
8. The system of claim 1, wherein the storage module further stores at least one emergency contact message, and the alert device is further electrically connected to the storage module for sending the alert message to the emergency contact message.
9. The system of claim 1, further comprising a prompting device electrically connected to the warning device for generating a corresponding light, sound, vibration or physical contact to the driver when the warning device issues the warning information.
10. The system of claim 1, further comprising a display device electrically connected to the warning device for displaying the warning message.
11. The system of claim 10, wherein the display device displays the warning message as at least one of an image and a text.
12. The mobile carrier assistance system of claim 10, wherein the display device is an electronic rearview mirror for a vehicle.
13. The mobile carrier assistance system of claim 12 wherein the display device comprises:
a first light-transmitting element having:
a first light-receiving surface; and
a first light-emitting surface, wherein an image enters the first light-transmitting element through the first light-receiving surface and exits through the first light-emitting surface;
a second light-transmitting element disposed on the first light-emitting surface, forming a gap with the first light-transmitting element, and including:
a second light-receiving surface; and
a second light-emitting surface, wherein the image enters the second light-transmitting element through the second light-receiving surface and exits through the second light-emitting surface;
an electro-optic medium layer arranged in the gap formed between the first light-emitting surface of the first light-transmitting element and the second light-receiving surface of the second light-transmitting element;
at least one light-transmitting electrode arranged between the first light-transmitting element and the electro-optic medium layer;
at least one reflecting layer, wherein the electro-optic medium layer is configured between the first light-transmitting element and the reflecting layer;
at least one transparent conductive layer configured between the electro-optic medium layer and the reflecting layer;
at least one electrical connector, which is connected with the electro-optic medium layer and transmits an electric energy to the electro-optic medium layer to change the transparency of the electro-optic medium layer; and
and a control element connected with the electrical connector, the control element controlling the electrical connector to provide the electric energy to the electro-optic medium layer when light exceeding a certain brightness appears in the image.
14. The system of claim 1, wherein the physiological status detection module is disposed on a wearable device of the driver to enter or leave the mobile carrier with the driver.
15. The system of claim 1, wherein the physiological states of the driver analyzed by the operation module from the driving image are whether the direction of the driver's line of sight is toward the traveling direction of the mobile carrier, the duration of line-of-sight changes, the frequency of line-of-sight changes, the duration of eye closure, and the frequency of blinking, and the storage module stores the corresponding allowable parameters.
16. The system of claim 1, wherein the physiological state detection module further comprises a heart rhythm detection module for being touched by the driver, the detected physiological state of the driver is the heart rhythm or a change in heart rhythm, and the storage module stores the corresponding allowable parameters.
17. The system of claim 16, wherein the warning device generates the warning message when the detection signals indicating that the physiological status detected by the image capturing module and the cardiac rhythm detecting module both exceed the allowable parameter are received by the warning device.
18. The system of claim 16, wherein the warning device generates the warning message when receiving the detection signal indicating that one of the physiological states detected by the image capturing module and the cardiac rhythm detecting module exceeds the allowable parameter.
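Claims 17 and 18 differ only in whether the warning requires both detection signals, or either one, to exceed its allowable parameter. A minimal sketch of the two policies (names assumed for illustration):

```python
def should_warn(image_exceeds, other_module_exceeds, require_both):
    """Decide whether to generate the warning message from two detection signals."""
    if require_both:                               # claim-17 style: both modules must exceed
        return image_exceeds and other_module_exceeds
    return image_exceeds or other_module_exceeds   # claim-18 style: either module suffices

print(should_warn(True, False, require_both=True))   # -> False
print(should_warn(True, False, require_both=False))  # -> True
```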
19. The system of claim 1, wherein the physiological status detection module further comprises a blood pressure detection module configured to be touched by the driver, the detected physiological status of the driver is the blood pressure or a change in blood pressure, and the storage module stores the corresponding allowable parameters.
20. The system of claim 19, wherein the warning device generates the warning message when it receives detection signals indicating that the physiological statuses detected by both the image capture module and the blood pressure detection module exceed their allowable parameters.
21. The system of claim 19, wherein the warning device generates the warning message when it receives a detection signal indicating that the physiological status detected by either the image capture module or the blood pressure detection module exceeds its allowable parameter.
22. The system of claim 1, wherein the physiological status detection module further comprises a blood component detection module, the detected physiological status of the driver is the alcohol concentration, blood oxygen concentration, or blood glucose concentration in the blood, and the storage module stores the corresponding allowable parameters.
23. The system of claim 22, wherein the warning device generates the warning message when it receives detection signals indicating that the physiological statuses detected by both the image capture module and the blood component detection module exceed their allowable parameters.
24. The system of claim 22, wherein the warning device generates the warning message when it receives a detection signal indicating that the physiological status detected by either the image capture module or the blood component detection module exceeds its allowable parameter.
25. The system of claim 14, wherein the physiological status detection module further comprises an alcohol concentration detection module, the detected physiological status of the driver is the alcohol concentration in exhaled breath or in blood, the storage module stores the corresponding allowable parameters, and the warning device generates the warning message when it receives a detection signal indicating that the physiological status detected by the alcohol concentration detection module exceeds the allowable parameter.
26. The system of claim 25, further comprising a control device and an activation device, wherein the control device is disposed on the mobile carrier and is electrically connected to the computing module and the storage module; the activation device is in signal connection with the control device, and the driver operates the activation device to start or stop a power system of the mobile carrier; when the power system of the mobile carrier is off, the driver operates the activation device to start the power system, and the control device receives a detection signal indicating that the physiological status detected by the alcohol concentration detection module does not exceed the allowable parameter, the control device controls the mobile carrier in an actuation mode that allows the power system to be started; and when the power system of the mobile carrier is off, the driver operates the activation device to start the power system, and the control device receives a detection signal indicating that the physiological status detected by the alcohol concentration detection module exceeds the allowable parameter, the control device controls the mobile carrier in an actuation mode that prohibits the power system from being started.
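Claim 26 describes a start-up interlock keyed to the alcohol-concentration detection signal. The following sketch illustrates that decision; the alcohol values and threshold are assumptions, not figures from the patent.

```python
def decide_start(power_system_on, start_requested, alcohol_level, allowable_level=0.02):
    """Return the actuation mode chosen by the control device for a start request."""
    if power_system_on or not start_requested:
        return "no-change"
    if alcohol_level > allowable_level:
        return "prohibit-start"   # actuation mode prohibiting power-system start-up
    return "allow-start"          # actuation mode allowing power-system start-up

print(decide_start(False, True, alcohol_level=0.05))  # -> prohibit-start
print(decide_start(False, True, alcohol_level=0.00))  # -> allow-start
```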
27. The system of claim 26, wherein the alcohol concentration detection module is disposed on the activation device, the activation device is disposed in the mobile carrier and has an activation button, and the driver operates the activation device to start or stop the power system by pressing the activation button.
28. The system of claim 25, wherein the alcohol concentration detection module is disposed on a gear-shifting device of the mobile carrier, the driver operates the gear-shifting device to switch the driving state of the mobile carrier, and when the driver operates the gear-shifting device, the driver's hand contacts the alcohol concentration detection module on the gear-shifting device.
29. The system of claim 25, wherein the physiological status detection module is disposed on a wearable device of the driver and can enter or leave the mobile carrier with the driver.
30. The system of claim 1, wherein the physiological status detection module further comprises a respiration rate detection module for detecting the respiration rate of the driver, the detected physiological status of the driver is the respiration rate, and the storage module stores the corresponding allowable parameters.
31. The system of claim 30, wherein the warning device generates the warning message when it receives detection signals indicating that the physiological statuses detected by both the image capture module and the respiration rate detection module exceed their allowable parameters.
32. The system of claim 1, further comprising an update module electrically connected to the storage module for updating the plurality of allowable parameters stored in the storage module.
33. The system of claim 1, wherein the physiological status detection module further comprises a brightness sensor electrically connected to the image capture module for detecting brightness at least in a direction in which the image capture module captures the image, and when the brightness detected by the brightness sensor is greater than an upper threshold, the image capture module captures the driving image in a manner of reducing the amount of light entering, and when the brightness detected by the brightness sensor is less than a lower threshold, the image capture module captures the driving image in a manner of increasing the amount of light entering.
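Claim 33 ties the amount of light admitted by the image capture module to upper and lower brightness thresholds. A minimal sketch of such exposure control follows; the thresholds and step factor are assumptions for illustration.

```python
def adjust_light_intake(measured_brightness, current_exposure,
                        upper=1000.0, lower=50.0, step=0.8):
    """Return the next exposure setting given the brightness seen by the sensor."""
    if measured_brightness > upper:
        return current_exposure * step      # reduce the amount of light entering
    if measured_brightness < lower:
        return current_exposure / step      # increase the amount of light entering
    return current_exposure                 # within both thresholds: leave unchanged

print(adjust_light_intake(1500.0, current_exposure=1.0))  # -> 0.8
print(adjust_light_intake(20.0, current_exposure=1.0))    # -> 1.25
```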
34. The system of claim 1, wherein the lens assembly further satisfies the following conditions:
0.9 ≤ ARS/EHD ≤ 2.0; wherein ARS is the length of the contour curve measured along the contour of any lens surface of any lens element in the lens group, starting at the intersection of that surface with the optical axis and ending at the maximum effective radius of that surface; and EHD is the maximum effective radius of any surface of any lens element in the lens group.
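Claim 34 relates a contour-curve length to an effective radius. The sketch below shows how ARS can be approximated numerically for an assumed spherical sag profile and the ratio checked; the radius values are illustrative only and are not taken from the patent.

```python
import math

def arc_length(sag, ehd, steps=10000):
    """Arc length of the surface profile (r, sag(r)) from r = 0 to r = ehd."""
    total, prev_r, prev_z = 0.0, 0.0, sag(0.0)
    for i in range(1, steps + 1):
        r = ehd * i / steps
        z = sag(r)
        total += math.hypot(r - prev_r, z - prev_z)
        prev_r, prev_z = r, z
    return total

ehd = 2.0                                   # assumed maximum effective radius (mm)
radius_of_curvature = 5.0                   # assumed spherical surface (mm)
sag = lambda r: radius_of_curvature - math.sqrt(radius_of_curvature**2 - r**2)
ars = arc_length(sag, ehd)
print(round(ars / ehd, 3), 0.9 <= ars / ehd <= 2.0)   # -> 1.029 True
```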
35. The system of claim 1, wherein the lens assembly further satisfies the following conditions:
PLTA ≤ 100 μm; PSTA ≤ 100 μm; NLTA ≤ 100 μm; NSTA ≤ 100 μm; SLTA ≤ 100 μm; SSTA ≤ 100 μm; and |TDT| < 250%;
wherein HOI is the maximum imaging height perpendicular to the optical axis on the imaging plane of the image capture module; PLTA is the transverse aberration of the longest visible working wavelength of the positive meridional ray fan of the image capture module passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; PSTA is the transverse aberration of the shortest visible working wavelength of the positive meridional ray fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; NLTA is the transverse aberration of the longest visible working wavelength of the negative meridional ray fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; NSTA is the transverse aberration of the shortest visible working wavelength of the negative meridional ray fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; SLTA is the transverse aberration of the longest visible working wavelength of the sagittal ray fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; SSTA is the transverse aberration of the shortest visible working wavelength of the sagittal ray fan passing through the edge of the entrance pupil and incident on the imaging plane at 0.7 HOI; and TDT is the TV distortion of the image capture module during image formation.
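For readers checking a candidate design against claim 35, a small sketch such as the following can tabulate ray-trace results against the stated limits; all numeric values in the example are assumptions, not data from this patent.

```python
# Design-rule check for the claim-35 aberration limits: every 0.7-HOI transverse
# aberration (in micrometres) within 100 um and |TDT| below 250 percent.
limits_um = {"PLTA": 100, "PSTA": 100, "NLTA": 100, "NSTA": 100, "SLTA": 100, "SSTA": 100}

def meets_claim_35(aberrations_um, tdt_percent):
    """True when every listed transverse aberration and the TV distortion are within limits."""
    return all(abs(aberrations_um[name]) <= limit for name, limit in limits_um.items()) \
        and abs(tdt_percent) < 250

example = {"PLTA": 35.2, "PSTA": 28.9, "NLTA": 41.0, "NSTA": 30.5, "SLTA": 22.7, "SSTA": 18.4}
print(meets_claim_35(example, tdt_percent=1.5))   # -> True
```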
36. The system of claim 1, wherein the lens assembly comprises four lens elements with refractive power, in order from an object side to an image side, a first lens element, a second lens element, a third lens element and a fourth lens element, and the lens assembly satisfies the following conditions:
0.1 ≤ InTL/HOS ≤ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging plane of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fourth lens element.
37. The system of claim 1, wherein the lens assembly comprises five lens elements with refractive power, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element and a fifth lens element, and the lens assembly satisfies the following conditions:
0.1≤InTL/HOS≤0.95;
wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging plane of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fifth lens element.
38. The system of claim 1, wherein the lens assembly comprises six lens elements with refractive power, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element and a sixth lens element, and the lens assembly satisfies the following conditions:
0.1≤InTL/HOS≤0.95;
wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging plane, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the sixth lens element.
39. The system of claim 1, wherein the lens assembly comprises seven lens elements with refractive power, in order from an object side to an image side, a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element, a sixth lens element and a seventh lens element, and the lens assembly satisfies the following condition: 0.1 ≤ InTL/HOS ≤ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging plane of the image capture module, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the seventh lens element.
40. The system of claim 1, wherein the lens assembly further comprises an aperture, and the following condition is satisfied: 0.2 ≤ InS/HOS ≤ 1.1; wherein InS is the distance on the optical axis from the aperture to the imaging plane of the image capture module, and HOS is the distance on the optical axis from the lens surface of the lens group farthest from the imaging plane to the imaging plane.
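Claims 36 through 40 express first-order layout conditions as ratios of axial distances. A minimal sketch of such a check follows; the example distances are assumptions, not lens data from this patent.

```python
def meets_layout_conditions(intl_mm, ins_mm, hos_mm):
    """Check 0.1 <= InTL/HOS <= 0.95 and 0.2 <= InS/HOS <= 1.1 for axial distances in mm."""
    return 0.1 <= intl_mm / hos_mm <= 0.95 and 0.2 <= ins_mm / hos_mm <= 1.1

# Example: a 6 mm lens stack in an 8 mm total track, aperture 5 mm from the imaging plane.
print(meets_layout_conditions(intl_mm=6.0, ins_mm=5.0, hos_mm=8.0))  # -> True
```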
CN201920753471.8U 2019-04-30 2019-05-23 Action carrier auxiliary system Active CN211066569U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW108115010 2019-04-30
TW108115010A TWI749323B (en) 2019-04-30 2019-04-30 Mobile Vehicle Assist System

Publications (1)

Publication Number Publication Date
CN211066569U true CN211066569U (en) 2020-07-24

Family

ID=71636491

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910435646.5A Pending CN111839478A (en) 2019-04-30 2019-05-23 Action carrier auxiliary system
CN201920753471.8U Active CN211066569U (en) 2019-04-30 2019-05-23 Action carrier auxiliary system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201910435646.5A Pending CN111839478A (en) 2019-04-30 2019-05-23 Action carrier auxiliary system

Country Status (3)

Country Link
US (1) US11498419B2 (en)
CN (2) CN111839478A (en)
TW (1) TWI749323B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2601343B (en) * 2020-11-27 2024-04-10 Continental Automotive Tech Gmbh An adaptive warning system and method thereof
WO2023233297A1 (en) * 2022-05-31 2023-12-07 Gentex Corporation Respiration monitoring system using a structured light
CN116269394B (en) * 2023-02-13 2024-03-05 湖南汽车工程职业学院 Wearable human body driving fatigue monitoring device

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2773521B1 (en) * 1998-01-15 2000-03-31 Carlus Magnus Limited METHOD AND DEVICE FOR CONTINUOUSLY MONITORING THE DRIVER'S VIGILANCE STATE OF A MOTOR VEHICLE, IN ORDER TO DETECT AND PREVENT A POSSIBLE TREND AS IT GOES TO SLEEP
US6575902B1 (en) * 1999-01-27 2003-06-10 Compumedics Limited Vigilance monitoring system
US7027621B1 (en) * 2001-03-15 2006-04-11 Mikos, Ltd. Method and apparatus for operator condition monitoring and assessment
AU2003211065A1 (en) * 2002-02-19 2003-09-09 Volvo Technology Corporation System and method for monitoring and managing driver attention loads
DE10322458A1 (en) * 2003-05-16 2004-12-02 Daimlerchrysler Ag Method and device for influencing the stress of a driver in a motor vehicle
JP4697185B2 (en) * 2006-12-04 2011-06-08 トヨタ自動車株式会社 Arousal level determination device and arousal level determination method
DE102007047709A1 (en) * 2007-10-05 2009-04-09 Robert Bosch Gmbh Alcohol immobilizer with emergency driving option
US8411245B2 (en) * 2009-02-06 2013-04-02 Gentex Corporation Multi-display mirror system and method for expanded view around a vehicle
EP2885151B1 (en) * 2012-08-14 2022-10-12 Volvo Lastvagnar AB Method for determining the operational state of a driver
KR20140046327A (en) * 2012-10-10 2014-04-18 삼성전자주식회사 Multi display apparatus, input pen, multi display apparatus controlling method and multi display system
TWI546215B (en) * 2013-02-21 2016-08-21 啟碁科技股份有限公司 Drunk driving prevention method and apparatus
US9751534B2 (en) * 2013-03-15 2017-09-05 Honda Motor Co., Ltd. System and method for responding to driver state
US10153796B2 (en) * 2013-04-06 2018-12-11 Honda Motor Co., Ltd. System and method for capturing and decontaminating photoplethysmopgraphy (PPG) signals in a vehicle
AU2013206671B2 (en) * 2013-07-03 2015-05-14 Safemine Ag Operator drowsiness detection in surface mines
JP6208003B2 (en) * 2013-12-25 2017-10-04 富士フイルム株式会社 Imaging lens and imaging apparatus
US20160035273A1 (en) * 2014-08-01 2016-02-04 Gentex Corporation Electroluminescent device
US9302584B2 (en) * 2014-08-25 2016-04-05 Verizon Patent And Licensing Inc. Drowsy driver prevention systems and methods
WO2016073617A1 (en) * 2014-11-06 2016-05-12 Maven Machines, Inc. Wearable device and system for monitoring physical behavior of a vehicle operator
US9775565B1 (en) * 2014-11-21 2017-10-03 Tammy Berg-Neuman Device and system for monitoring operator biometric condition and blood alcohol presence to prevent driving of a vehicle by an alcohol or otherwise impaired operator
US9402577B2 (en) * 2014-12-23 2016-08-02 Automotive Research & Test Center Driver's fatigue detection system and method
TWI565611B (en) * 2015-03-26 2017-01-11 神達電腦股份有限公司 Driving safety detecting method, system, in-vehicle computer and intelligent wearable device
TWI583989B (en) * 2015-07-02 2017-05-21 先進光電科技股份有限公司 Optical image capturing system
TWI601973B (en) * 2015-07-15 2017-10-11 先進光電科技股份有限公司 Optical image capturing system
TWI574222B (en) * 2015-11-06 2017-03-11 飛捷科技股份有限公司 Physiological status monitoring system integrated with low energy bluetooth mesh network
US10076287B2 (en) * 2015-12-29 2018-09-18 Automotive Research & Testing Center Method for monitoring physiological status of vehicle driver
JP2020512616A (en) * 2017-02-10 2020-04-23 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Driver and passenger health and sleep interactions
TWI638739B (en) * 2017-06-15 2018-10-21 正修學校財團法人正修科技大學 An interruptible power system for preventing drunk driving
TWM557687U (en) * 2017-09-05 2018-04-01 Liao Jia De Anti-drunk driving control device capable of improving driving safety
US10618522B2 (en) * 2018-03-27 2020-04-14 Hong Kong Productivity Council (HKPC) Drowsiness detection and intervention system and method
US10611382B2 (en) * 2018-06-11 2020-04-07 Honda Motor Co., Ltd. Methods and systems for generating adaptive instructions
TWM585412U (en) * 2019-04-30 2019-10-21 先進光電科技股份有限公司 Assisting system for mobile vehicles

Also Published As

Publication number Publication date
TWI749323B (en) 2021-12-11
CN111839478A (en) 2020-10-30
US11498419B2 (en) 2022-11-15
TW202042183A (en) 2020-11-16
US20200346545A1 (en) 2020-11-05

Similar Documents

Publication Publication Date Title
CN209895073U (en) Action carrier auxiliary system
CN209911624U (en) Action carrier auxiliary system
CN210376835U (en) Action carrier auxiliary system
CN215871629U (en) Action carrier auxiliary system
CN211066569U (en) Action carrier auxiliary system
CN210075422U (en) Action carrier auxiliary system
CN209858830U (en) Action carrier auxiliary system
CN210376834U (en) Action carrier auxiliary system
CN111416951A (en) Action carrier auxiliary system and vehicle auxiliary system
CN209895117U (en) Action carrier auxiliary system and automobile-used electron rear-view mirror
CN211335968U (en) Action carrier auxiliary system
CN209784640U (en) Action carrier auxiliary system
TWM585412U (en) Assisting system for mobile vehicles
TWM585220U (en) Assisting system for mobile vehicles
TWM583396U (en) Assisting system for mobile vehicles

Legal Events

Date Code Title Description
GR01 Patent grant