WO2019176391A1 - Drive recorder, display control method, and program - Google Patents
Drive recorder, display control method, and program
- Publication number
- WO2019176391A1 (PCT/JP2019/004421)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- occupant
- vehicle
- imaging condition
- camera
- Prior art date
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
- G07C5/085—Registering performance data using electronic data carriers
- G07C5/0866—Registering performance data using electronic data carriers the electronic data carrier being a digital video recorder in combination with video camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/29—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area inside the vehicle, e.g. for viewing passengers or cargo
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/593—Recognising seat occupancy
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0816—Indicating performance data, e.g. occurrence of a malfunction
- G07C5/0825—Indicating performance data, e.g. occurrence of a malfunction using optical means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/188—Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8006—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying scenes of vehicle interior, e.g. for monitoring passengers or cargo
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30268—Vehicle interior
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
Definitions
- the present invention relates to a drive recorder, a display control method, and a program.
- The drive recorder may capture images of the interior of the vehicle at the same time, in addition to images of the exterior of the vehicle such as the sides or rear.
- For example, a camera captures an image of the rear seat, which is difficult for the driver to see while driving, and a child on the rear seat is displayed on a display device installed at a position easy for the driver to see.
- Patent Document 1 discloses an apparatus that images both the interior and exterior of a vehicle using a single wide-angle camera.
- When a single camera images a plurality of areas, the optimal values of imaging conditions such as exposure and white balance may differ from area to area, making it difficult to image all of the areas satisfactorily.
- the present invention has been made in view of the above circumstances, and an object thereof is to provide a technique for appropriately capturing a plurality of areas with a single camera.
- A drive recorder according to one aspect includes: an image acquisition unit that acquires image data captured by a camera attached to a vehicle; an imaging condition determination unit that determines, based on a first image area of the image data acquired by the image acquisition unit, a first imaging condition relating to at least one of exposure, color, and brightness of the camera, and determines, based on a second image area different from the first image area, a second imaging condition relating to at least one of exposure, color, and brightness of the camera; an imaging control unit that causes the camera to capture, in temporally different frames, a first image using the first imaging condition and a second image using the second imaging condition; an image recording unit that records moving image data based on the first image; an occupant detection unit that detects an occupant of the vehicle; an image processing unit that, based on the second image including the occupant detected by the occupant detection unit, generates an occupant image in a display mode in which the visibility of the image area including the occupant is higher than in the second image; and a display control unit that causes a display device to display a moving image based on the occupant image.
- Another aspect of the present invention is a display control method.
- The display control method includes the steps of: acquiring image data captured by a camera attached to a vehicle; determining, based on a first image area of the acquired image data, a first imaging condition relating to at least one of exposure, color, and brightness of the camera; determining, based on a second image area different from the first image area, a second imaging condition relating to at least one of exposure, color, and brightness of the camera; causing the camera to capture, in temporally different frames, a first image using the first imaging condition and a second image using the second imaging condition; recording moving image data based on the first image; detecting an occupant of the vehicle; generating, based on the second image including the detected occupant, an occupant image in which the visibility of the image portion including the occupant is higher than in the second image; and displaying a moving image based on the occupant image on a display device.
- According to the present invention, the state of the occupant can be appropriately displayed using the camera of the drive recorder.
- FIG. 1 is a diagram schematically showing a vehicle 70 on which the drive recorder 10 according to the embodiment is mounted.
- the drive recorder 10 acquires and records image data from the camera 42 that images both the inside and the outside of the vehicle 70, and causes the display device 50 to display an image as necessary.
- The camera 42 is an ultra-wide-angle camera having an angle of view of about 150 to 180 degrees.
- the camera 42 images the front of the vehicle through the windshield 72 of the vehicle 70 and also images the driver 80 seated on the front seat 74 of the vehicle 70 and the passenger 82 seated on the rear seat 76.
- The attachment position of the camera 42 is not particularly limited; for example, it can be attached at the position of the rear-view mirror of the vehicle 70.
- FIG. 2 is a diagram illustrating an example of an image captured by the camera 42.
- the front area A includes roads and oncoming vehicles on which the vehicle 70 travels
- the side area B includes left and right window frames of the vehicle 70 and scenery that can be seen through the windows.
- the rear region C includes a driver 80 who sits in the driver's seat, a passenger 81 who sits in the passenger seat, a passenger 82 who sits in the rear seat, and the like.
- the camera 42 is not limited to the one that captures an ultra-wide-angle image as shown in FIG. 2, and may be any camera as long as it has an imaging range both inside and outside the vehicle 70.
- the camera may be installed so as to image the interior of the vehicle 70 and the rear of the vehicle.
- a camera may be installed so as to capture the rear of the vehicle from the position of the rear view mirror of the vehicle 70.
- a camera may be installed on the rear side of the front seat 74 of the vehicle 70 or the ceiling of the rear seat 76 so that both the rear seat 76 and the rear side of the vehicle are imaged by the camera.
- In the present embodiment, when a moving image is captured, different imaging conditions are applied to the inside and the outside of the vehicle in temporally different frames.
- Specifically, a first image using a first imaging condition based on an image area outside the vehicle interior and a second image using a second imaging condition based on an image area inside the vehicle interior are captured in different frames.
- a moving image with high visibility outside the vehicle and a moving image with high visibility inside the vehicle can be provided.
- a moving image suitable for monitoring the state of the passenger can be provided. For example, it is possible to provide a clear moving image that captures the state of a child sitting in the rear seat.
- FIG. 3 is a block diagram schematically showing the functional configuration of the drive recorder 10.
- Each functional block shown in the figure can be realized in hardware by elements such as a computer CPU and memory or by a mechanical device, and in software by a computer program or the like; here they are drawn as functional blocks realized by their cooperation. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by combinations of hardware and software.
- the drive recorder 10 includes an acquisition unit 12, an imaging control unit 14, an input / output unit 16, and a processing unit 18.
- the acquisition unit 12 includes a vehicle information acquisition unit 20 and an image acquisition unit 22.
- the input / output unit 16 includes an operation receiving unit 24, a display control unit 26, a sound control unit 28, and an image recording unit 30.
- the processing unit 18 includes an occupant detection unit 32, an imaging condition determination unit 34, a safety determination unit 35, a notification content determination unit 36, and an image processing unit 38.
- the vehicle information acquisition unit 20 acquires information related to the vehicle 70 and information related to the surrounding situation of the vehicle 70 from the in-vehicle device 40.
- Examples of the in-vehicle device 40 include a vehicle speed sensor, a steering angle sensor, an acceleration sensor, an in-vehicle camera, a radar sensor, a position information sensor (GPS sensor), a navigation device, and a seating sensor, but the in-vehicle device 40 is not limited to these.
- The vehicle information acquisition unit 20 may acquire such information through the CAN (Controller Area Network) of the vehicle 70.
- the image acquisition unit 22 acquires image data captured by the camera 42 described above.
- the imaging control unit 14 controls the operation of the camera 42.
- the imaging control unit 14 causes the camera 42 to capture an image according to the imaging condition determined by the imaging condition determination unit 34.
- the imaging control unit 14 causes the camera 42 to capture a first image using the first imaging condition and a second image using the second imaging condition in temporally different frames when capturing a moving image.
- the imaging control unit 14 may cause the camera 42 to alternately capture the first image and the second image.
- For example, the imaging control unit 14 causes the first image to be captured in odd-numbered frames and the second image to be captured in even-numbered frames.
- The camera 42 preferably has a high frame rate. For example, by using a 60 fps (frames per second) camera, both the first image and the second image can be captured at 30 fps.
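The alternating-frame schedule described above can be sketched as follows. This is an illustrative sketch, not code from the publication; the function names and the convention of counting frames from 1 are assumptions.

```python
def condition_for_frame(frame_index: int) -> str:
    """Return which imaging condition to apply to a given frame.

    Odd-numbered frames (1st, 3rd, ...) use the first imaging condition
    (exterior reference); even-numbered frames use the second (interior).
    Frames are counted from 1, matching the description above.
    """
    return "first" if frame_index % 2 == 1 else "second"


def effective_rate(camera_fps: int, num_conditions: int = 2) -> float:
    """Frame rate of each per-condition stream after interleaving.

    With a 60 fps camera and two alternating conditions, each stream
    is captured at 30 fps.
    """
    return camera_fps / num_conditions
```

For example, `condition_for_frame(1)` yields `"first"` and `effective_rate(60)` yields `30.0`, matching the 60 fps camera producing two 30 fps streams.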
- the operation reception unit 24 receives an operation input from the input device 48.
- the display control unit 26 causes the display device 50 to display an image generated by the image processing unit 38.
- the display device 50 is a display device such as a liquid crystal display, and is attached to the position of the center console or dashboard of the vehicle 70.
- the display device 50 may be attached to the position of the camera 42.
- The input device 48 is a so-called touch panel sensor provided in the display area of the display device 50. Note that the input device 48 need not be a touch type and may instead be configured by buttons or the like arranged around the display device 50.
- the voice control unit 28 outputs a voice corresponding to the notification content determined by the notification content determination unit 36 to the speaker 52.
- the voice control unit 28 outputs, for example, a notification voice for alerting the vehicle traveling to the speaker 52.
- the image recording unit 30 causes the recording device 54 to record image data acquired by the image acquisition unit 22, image data generated by performing image processing on the acquired image data, and the like.
- the recording device 54 is, for example, a flash memory or a hard disk.
- the recording device 54 may be provided in an external device such as a smartphone or a tablet connected by wireless communication such as Wi-Fi (registered trademark).
- the occupant detection unit 32 detects an occupant boarding the vehicle 70 based on the information acquired by the acquisition unit 12.
- the occupant detection unit 32 detects an occupant based on information on the seating sensor acquired by the vehicle information acquisition unit 20, for example.
- the occupant detection unit 32 detects, for example, the presence or absence of a occupant in the rear seat of the vehicle 70, and detects the occupant's seating position (eg, left side, center, right side) when the occupant is in the rear seat.
- the occupant detection unit 32 may detect the presence or absence of the occupant and the seating position of the occupant based on the image data acquired by the image acquisition unit 22.
- the type and position of an occupant to be detected by the occupant detection unit 32 may be specified by the user through an operation input from the input device 48.
- For example, a specific seat such as a child seat, or an infant or child, may be designated as the detection target.
- an animal kept as a pet such as a dog or a cat may be a detection target, and the “occupant” in the present embodiment may be a concept including both humans and animals.
- objects such as valuables mounted on the vehicle 70 may be detected.
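The seating-sensor-based detection described above can be sketched as follows. The sensor interface and the seat names (`rear_left`, `rear_center`, `rear_right`) are assumptions for illustration, not from the publication.

```python
def detect_rear_occupants(seat_sensor: dict) -> dict:
    """Detect rear-seat occupants from seating-sensor data.

    seat_sensor maps a seat name to a boolean (occupied or not), as a
    stand-in for the information the vehicle information acquisition
    unit obtains from the seating sensor.
    Returns whether any rear-seat occupant is present and the occupied
    seating positions (left, center, right).
    """
    rear_seats = ["rear_left", "rear_center", "rear_right"]
    positions = [s for s in rear_seats if seat_sensor.get(s)]
    return {"present": bool(positions), "positions": positions}
```

A real implementation could fall back to image-based detection on the acquired camera frames when no seating sensor is available, as the description allows.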
- the imaging condition determination unit 34 determines the imaging condition of the camera 42 based on the image data acquired by the image acquisition unit 22.
- the imaging condition determination unit 34 sets a reference point or a reference area on an image captured by the camera 42, and determines an imaging condition so that conditions such as exposure and color in the reference point or reference area are optimal.
- the imaging condition determining unit 34 determines the first imaging condition with reference to the position on the image outside the vehicle, and determines the second imaging condition with reference to the position on the image inside the vehicle.
- the first reference point 84 is set in the front area A of the image shown in FIG. 2, and the first imaging condition is determined so that the luminance and chromaticity at the first reference point 84 are optimal.
- The imaging condition determination unit 34 determines the first imaging condition and the second imaging condition so that feedback control that suppresses sudden changes in luminance or chromaticity at the reference point or reference region is realized in the images captured using each imaging condition.
- The imaging condition determination unit 34 may determine the second imaging condition by setting a second reference point at the position of the occupant detected by the occupant detection unit 32; for example, a second reference point may be set at the position of the passenger 82 in FIG. 2.
- The determination of the imaging conditions includes both determining the camera shutter speed and aperture value that set the exposure, and determining image processing methods that compensate the brightness and color of the image after capture, such as automatic gain control (AGC; Auto Gain Control) and automatic white balance control (AWC; Auto White Balance Control).
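The reference-region exposure determination with the feedback control mentioned above can be sketched as follows. This is a minimal sketch under assumed values: the target luminance of 128 and the smoothing factor of 0.2 are illustrative, not from the publication.

```python
def mean_luminance(region: list) -> float:
    """Average luminance of a reference region (pixel values 0-255)."""
    return sum(region) / len(region)


def next_exposure(current_exposure: float, region: list,
                  target: float = 128.0, smoothing: float = 0.2) -> float:
    """Update exposure so the reference region approaches the target
    luminance, moving only a fraction of the way per frame so that
    brightness does not change abruptly (the feedback control above)."""
    measured = mean_luminance(region)
    # Multiplicative correction that would hit the target exactly
    ideal = current_exposure * (target / max(measured, 1.0))
    # Apply only a fraction of the correction per update
    return current_exposure + smoothing * (ideal - current_exposure)
```

The first imaging condition would run this loop on a reference region outside the vehicle, and the second imaging condition on a region at the detected occupant's position; a full implementation would treat white balance gains analogously.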
- the safety determination unit 35 determines the level of vehicle travel safety based on the information acquired by the acquisition unit 12.
- The safety determination unit 35 determines the safety level based on the speed, steering angle, and acceleration of the vehicle 70, the presence or absence of obstacles such as other vehicles and structures around the vehicle, the distance to such obstacles, the position of the host vehicle relative to the travel lane, the attributes of the road being traveled, and the like.
- The safety determination unit 35 determines safety at, for example, three levels. Level 1 is a state regarded as "safe", corresponding to situations such as waiting at a traffic light at an intersection or traveling on a straight road such as an expressway or highway with no obstacles in the vicinity.
- Level 2 is a state that is regarded as “careful”, and corresponds to a situation in which other vehicles exist in the vicinity of the vehicle while traveling on a curved road or in the vicinity of an intersection, although the possibility of collision is low.
- Level 3 is a state that is considered “warning required”, where there is another vehicle close enough to predict the possibility of collision, or where the host vehicle deviates significantly from the lane in which it is traveling.
- The safety determination is not limited to three levels and may use two levels such as "normal" and "caution". For example, in addition to the level 1 situations described above, the state may be determined as "normal" when traveling on a general road at a normal speed, such as within the speed limit. The safety determination may also use four or more levels.
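The three-level classification above can be sketched as follows. The publication names the factors (obstacle distance, lane deviation, road shape) without numeric criteria, so every threshold here is an illustrative assumption.

```python
def safety_level(obstacle_distance_m, lane_deviation_m: float,
                 near_curve_or_intersection: bool = False) -> int:
    """Return 1 (safe), 2 (caution), or 3 (warning required).

    obstacle_distance_m is None when no obstacle is detected.
    The 10 m, 0.5 m, and 50 m thresholds are assumed for illustration.
    """
    # Level 3: a collision is foreseeable, or the lane is left significantly
    if obstacle_distance_m is not None and obstacle_distance_m < 10.0:
        return 3
    if lane_deviation_m > 0.5:
        return 3
    # Level 2: curved road / intersection, or other vehicles nearby
    if near_curve_or_intersection:
        return 2
    if obstacle_distance_m is not None and obstacle_distance_m < 50.0:
        return 2
    # Level 1: straight road with no obstacles in the vicinity
    return 1
```

Adding a fourth level or collapsing to two ("normal" / "caution") only changes the branch structure, as the description notes.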
- The safety determination unit 35 may determine safety by acquiring the vehicle's travel speed and acceleration and operation information such as accelerator, brake, and steering from the vehicle, or by acquiring peripheral information such as the attributes of the road being traveled, traffic congestion information, and the brightness around the vehicle from a sensor or navigation system provided in the vehicle.
- the safety determination unit 35 may determine safety using information acquired from a device outside the vehicle by wireless communication or the like.
- The safety level may also be determined depending on whether driving support functions such as Adaptive Cruise Control (ACC) and Lane Keep Assist (LKAS) are in operation; for example, the safety level may be determined to be relatively high while a driving support function is in operation.
- the notification content determination unit 36 determines the notification content for alerting the vehicle traveling according to the safety level determined by the safety determination unit 35.
- the notification content determination unit 36 determines the content to be notified to the driver when the safety determination unit 35 determines that it is a safety level (for example, level 3) that requires a warning to the driver.
- The notification content determination unit 36 determines a "forward collision warning" notification when the distance to an obstacle ahead of the host vehicle falls to or below a reference value corresponding to the vehicle speed, and determines a "lane departure warning" notification when the amount by which the host vehicle's position departs from the travel lane exceeds a reference value.
- the notification content determination unit 36 may determine the notification of “overspeed warning” when the traveling speed of the vehicle exceeds the speed limit, or when an operation such as sudden start, sudden braking, or sudden steering is performed. Notification of “vehicle behavior warning” may be determined.
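The notification selection above can be sketched as follows. The speed-dependent reference distance (1 m per km/h) and the 0.5 m lane-deviation threshold are assumptions; the publication only states that the reference value corresponds to the vehicle speed.

```python
def notification(vehicle_speed_kmh: float, obstacle_distance_m,
                 lane_deviation_m: float, speed_limit_kmh: float):
    """Return the warning to notify, or None if no warning applies.

    Checks are ordered by urgency; obstacle_distance_m is None when
    no forward obstacle is detected.
    """
    # Reference distance grows with speed (assumed: 1 m per km/h)
    if (obstacle_distance_m is not None
            and obstacle_distance_m <= vehicle_speed_kmh * 1.0):
        return "forward collision warning"
    if lane_deviation_m > 0.5:
        return "lane departure warning"
    if vehicle_speed_kmh > speed_limit_kmh:
        return "overspeed warning"
    return None
```

The returned string would drive both the warning image shown by the display control unit 26 and the notification sound output by the voice control unit 28.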
- As the notification, a notification image (warning image) is displayed on the display device 50, and a notification sound is output from the speaker 52.
- As the notification image, an image imitating a sign indicating danger can be used: for example, an image displaying a "!" character or mark on a yellow background, or an image displaying text such as "danger" or "forward warning".
- As the notification sound, a buzzer sound, an alarm sound, or the like can be used, and a voice that reads out a message such as "please pay attention to the front" or "slow down" may also be used. The display size of the notification image and the volume of the notification sound may be changed according to the determined safety level and notification content.
- the image processing unit 38 generates an image to be displayed on the display device 50 and image data to be recorded on the recording device 54 based on the information acquired by the acquisition unit 12.
- the image processing unit 38 classifies the image data acquired by the image acquisition unit 22 for each imaging condition, and generates separate moving image data.
- The image processing unit 38 generates the first moving image data using only the frames of the first image captured under the first imaging condition, and generates the second moving image data using only the frames of the second image captured under the second imaging condition.
- the first moving image data generated in this way is recorded in the recording device 54 as moving image data for a drive record.
- The second moving image data may also be recorded in the recording device 54.
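The de-interleaving of the captured frames into the two streams can be sketched as follows; frames are represented by placeholder objects, and the assumption is that capture starts with a frame under the first imaging condition, matching the odd/even scheduling described earlier.

```python
def split_streams(frames: list) -> tuple:
    """Split interleaved frames into two per-condition video streams.

    frames is the capture-order sequence: the 1st, 3rd, ... frames were
    taken under the first imaging condition (exterior reference) and
    become the drive-record video; the 2nd, 4th, ... frames were taken
    under the second imaging condition and become the occupant video.
    """
    first_video = frames[0::2]   # first imaging condition
    second_video = frames[1::2]  # second imaging condition
    return first_video, second_video
```

In a real recorder the split would be keyed by per-frame metadata rather than position, so that dropped frames do not swap the streams.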
- When the occupant detection unit 32 detects an occupant, the image processing unit 38 generates an occupant image based on the second image captured using the second imaging condition.
- the "occupant image" is an image for clearly displaying the state of the occupant: an image to which image processing has been applied so that the image area including the occupant is more visible than in the second image serving as the source data.
- the image processing unit 38 generates the occupant image through image processing that converts the polar-coordinate image shown in FIG. 2 into an orthogonal-coordinate image, cuts out and enlarges a partial area of the second image that includes the occupant, and adjusts chromaticity. The occupant image generated in this way is displayed on the display device 50 as a moving image for monitoring the occupant.
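Of the processing steps above, the crop-and-enlarge step can be illustrated on a plain 2D pixel grid. The polar-to-orthogonal remap and the chromaticity adjustment are omitted here, and all region coordinates are hypothetical; this is a sketch, not the unit's actual pipeline.

```python
def crop_and_enlarge(image, top, left, height, width, scale):
    """Cut a region out of a 2D pixel grid and enlarge it by an integer
    factor using nearest-neighbour repetition."""
    # Extract the partial area that includes the occupant.
    region = [row[left:left + width] for row in image[top:top + height]]
    # Enlarge: repeat each pixel `scale` times horizontally and vertically.
    enlarged = []
    for row in region:
        scaled_row = [px for px in row for _ in range(scale)]
        enlarged.extend([list(scaled_row) for _ in range(scale)])
    return enlarged
```

A real implementation would operate on camera frames with an image library rather than nested lists, but the visibility-enhancing idea (isolate the occupant area, make it larger) is the same.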
- the image processing unit 38 may not generate an occupant image when no occupant is detected by the occupant detection unit 32. Therefore, when an occupant is not detected by the occupant detection unit 32, a moving image based on the occupant image may not be displayed on the display device 50.
- the image processing unit 38 may change the display mode of the generated occupant image according to the safety level determined by the safety determination unit 35. For example, when safety is determined to be relatively high, such as "level 1", the occupant image is generated so that the moving image based on it is displayed continuously on the display device 50. When safety is determined to be relatively low, such as "level 2", the occupant image is generated so that the display time of the moving image is limited: for example, display and non-display periods alternate at fixed intervals (such as 1 or 2 seconds) so that the occupant image is not shown continuously.
- the image processing unit 38 may change the display time of the occupant image according to the vehicle speed, and may shorten the display time of the occupant image as the vehicle speed increases.
- the image processing unit 38 may suppress display of the occupant image while the vehicle's accelerator, brake, steering wheel, or the like is being operated.
- the image processing unit 38 may change the display mode of the occupant image when it is determined, based on map information or the like, that the vehicle is traveling toward an intersection or a curve, and may shorten the display time of the occupant image as the vehicle approaches the intersection or curve.
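The level-dependent display gating described above might look like the following sketch. The level semantics follow the examples in the text (level 1 continuous, level 2 blinking, level 3 hidden), while the function itself and the blink interval default are assumptions.

```python
def occupant_image_visible(safety_level, elapsed_seconds, blink_interval=1.0):
    """Decide whether the occupant image should be shown at a given moment.

    level 1: shown continuously.
    level 2: display and non-display periods alternate every blink_interval.
    level 3: hidden (a warning image would be shown instead).
    """
    if safety_level == 1:
        return True
    if safety_level == 2:
        return int(elapsed_seconds // blink_interval) % 2 == 0
    return False
```

The display controller would call this each frame with the time elapsed since the occupant image was first shown.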
- when safety is determined to be low, such as "level 3", the image processing unit 38 may hide the moving image based on the occupant image, or change the display so that the visibility of the occupant image is reduced. In this case, the image processing unit 38 may generate a warning image instead of the occupant image and display it on the display device 50, or may generate an image in which the warning image is superimposed on the occupant image and display that on the display device 50. By changing the display mode of the occupant image based on the determined safety level in this way, the driver can easily grasp the state of the rear seat, supporting safer driving of the vehicle.
- the safety determination unit 35 may switch the determination criterion used for determining the safety level according to whether or not an occupant image is displayed on the display device 50.
- the safety determination unit 35 holds a plurality of determination criteria: when the occupant image is not displayed on the display device 50, the normal first determination criterion is used, and when the occupant image is displayed on the display device 50, a second determination criterion different from the first determination criterion may be used.
- the second determination criterion is set so that the safety determined for the same event is relatively lower than under the first determination criterion; for example, some events that would be determined as "level 1" under the first criterion are instead determined as "level 2". In other words, safety is judged more strictly under the second determination criterion than under the first.
- while the occupant image is displayed, the driver's attention is likely to be directed to the display device 50. By judging the safety level strictly in such a situation, a warning or notification can be given early, and the display mode of the display device 50 can be changed more appropriately.
- a gaze detection unit that detects the driver's gaze direction may further be provided, and when it is detected that the driver is gazing at the occupant image, the determination criterion may be switched so that safety is judged to be relatively low.
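The switch between the first and second determination criteria could be sketched as below. The headway-time metric and the threshold values are purely hypothetical; the text only requires that the second criterion, applied while the occupant image is on screen, judge safety more strictly than the first.

```python
def safety_level(headway_seconds, occupant_image_shown):
    """Determine a safety level (1 = high, 3 = low) from time headway to the
    vehicle ahead, with a stricter criterion while the occupant image is shown.
    """
    # Stricter (second) criterion applies while the driver's attention may be
    # drawn to the display; hypothetical thresholds in seconds of headway.
    low, high = (1.5, 2.5) if occupant_image_shown else (1.0, 2.0)
    if headway_seconds >= high:
        return 1
    if headway_seconds >= low:
        return 2
    return 3
```

Note how the same event (2.2 s of headway) is judged level 1 under the first criterion but level 2 under the second, matching the example in the text.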
- FIG. 4 is a flowchart showing the flow of the display control method.
- the imaging condition determination unit 34 determines a first imaging condition relating to at least one of exposure and color of the camera 42 with reference to the outside of the vehicle, and a second imaging condition relating to at least one of exposure and color of the camera 42 with reference to the interior of the vehicle (S10).
- the imaging control unit 14 causes the camera 42 to capture the first image under the first imaging condition and the second image under the second imaging condition in temporally different frames, and the image acquisition unit 22 acquires the first and second image data (S12).
- the image recording unit 30 causes the recording device 54 to record moving image data based on the first image (S14).
- when an occupant is detected by the occupant detection unit 32 (Y in S16), the image processing unit 38 generates, based on the second image, an occupant image in a display mode in which the visibility of the image portion including the occupant is higher than in the second image (S18).
- the display control unit 26 causes the display device 50 to display a moving image based on the generated occupant image (S20).
- when no occupant is detected by the occupant detection unit 32 (N in S16), the processes of S18 and S20 are skipped.
- according to the present embodiment, a single camera can be shared to capture both a first image with high visibility of the outside of the vehicle and a second image with high visibility of the inside of the vehicle. Further, when an occupant is detected, an image suitable for occupant monitoring can be provided by generating and displaying an occupant image based on the second image. Both the drive-record function and the monitoring function can thus be provided at lower cost than when a dedicated camera for imaging the vehicle interior or the occupant is provided.
- the present invention has been described with reference to the above-described embodiment.
- the present invention is not limited to the above-described embodiment; configurations obtained by appropriately combining or replacing the configurations shown in the examples are also included in the present invention.
- the method of satisfactorily imaging both the outdoor and indoor areas of a vehicle with a single camera has been described.
- the above-described processing may be applied to satisfactorily capture each of a plurality of arbitrary image areas included in image data captured by a single camera.
- the camera attached to the vehicle may image only the outside of the vehicle, may image only the inside of the vehicle, or may image both the outside and inside of the vehicle.
- the imaging condition determination unit may determine individual imaging conditions related to at least one of camera exposure, color, and brightness for each of a plurality of arbitrary image areas included in the image data acquired by the image acquisition unit. For example, the first imaging condition may be determined for the first image area of the image data, and the second imaging condition may be determined for a second image area different from the first image area of the image data.
- the imaging condition determination unit may determine three or more imaging conditions. For example, the imaging condition determination unit may determine the third imaging condition for a third image area different from the first image area and the second image area of the image data. Then, the fourth imaging condition may be determined for a fourth image region different from the first image region, the second image region, and the third image region of the image data.
- the imaging condition determination unit may divide one piece of image data acquired by the image acquisition unit into n image areas and determine n imaging conditions corresponding to each of the n image areas.
- the n image areas may all be outside the vehicle, may all be inside the vehicle, or some may be outside and some inside the vehicle.
- with respect to the imaging condition based on the outside of the vehicle, the imaging condition determination unit may determine the imaging condition with reference to an image area that excludes the road surface on which the vehicle travels and the area where the sky appears, for example an area in which surrounding vehicles, passers-by, buildings, and the like appear.
- with respect to the imaging condition based on the interior of the vehicle, the imaging condition determination unit may determine the imaging condition with reference to an image area that includes an occupant.
- the imaging control unit may cause the camera to capture images using each of the plurality of imaging conditions determined by the imaging condition determination unit in temporally different frames.
- the first image using the first imaging condition and the second image using the second imaging condition may be captured by the camera in temporally different frames.
- the imaging control unit may cause the camera to capture the third image using the third imaging condition or the fourth image using the fourth imaging condition in a temporally different frame from that of the first image and the second image.
- the imaging control unit may cause the camera to capture n images corresponding to each of the n imaging conditions determined by the imaging condition determination unit in mutually different frames.
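Capturing n imaging conditions in mutually different frames, as described above, reduces to a simple round-robin selection of the condition to apply at each frame index. The condition labels used here are illustrative.

```python
def condition_for_frame(conditions, frame_index):
    """Return the imaging condition to apply to a given frame when n
    conditions are rotated one per frame, round-robin."""
    return conditions[frame_index % len(conditions)]

# Hypothetical example with three conditions: one exterior reference and
# one per detected occupant area.
conds = ["exterior", "occupant_A", "occupant_B"]
schedule = [condition_for_frame(conds, i) for i in range(6)]
```

Each condition thus receives every n-th frame, and demultiplexing by condition yields n per-condition streams at 1/n of the sensor frame rate.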
- the occupant detection unit may detect a plurality of occupants.
- the occupant detection unit may simultaneously detect, for example, the first occupant on the right side of the rear seat, the second occupant in the center of the rear seat, the third occupant on the left side of the rear seat, and the fourth occupant next to the driver seat.
- the image processing unit may generate an occupant image corresponding to each of the detected plurality of occupants.
- the image processing unit may generate, for each occupant, a plurality of occupant images in a display mode in which an image area including each of the occupants is highly visible.
- for example, a first occupant image with high visibility of the first occupant, a second occupant image with high visibility of the second occupant, a third occupant image with high visibility of the third occupant, and a fourth occupant image with high visibility of the fourth occupant may be generated. The images used to generate each occupant image may be captured under different imaging conditions, each optimized for the image area that includes the corresponding occupant.
- for example, the first occupant image may be generated based on a second image captured under an imaging condition based on the image area including the first occupant (for example, the second imaging condition), and the second occupant image may be generated based on a third image captured under an imaging condition based on the image area including the second occupant (for example, the third imaging condition).
- the display control unit may cause the display device to display moving images based on a plurality of occupant images corresponding to the plurality of occupants. For example, when two occupants, a first occupant and a second occupant, are detected, the display control unit may cause the display device to display a first moving image based on the first occupant image and a second moving image based on the second occupant image.
- the first moving image and the second moving image may be displayed simultaneously in different areas of the display device, or may be displayed on the display device separately at different timings. In the latter case, the period during which the first moving image is displayed and the period during which the second moving image is displayed may alternate; for example, the occupant being displayed may be switched every 5 seconds.
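The alternating display of two occupant videos could be time-multiplexed as in this sketch. The 5-second period follows the example above; the function and stream names are assumptions.

```python
def active_stream(streams, elapsed_seconds, period=5.0):
    """Pick which occupant video stream to show at a given time, cycling
    through the streams one per period."""
    slot = int(elapsed_seconds // period) % len(streams)
    return streams[slot]
```

With two streams and a 5-second period, the first occupant is shown during seconds 0 to 5, the second during seconds 5 to 10, and so on.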
- according to this modification, each of a plurality of arbitrary image areas can be imaged under appropriate imaging conditions using a single camera. For example, while an image suitable for drive-recorder recording is captured under the first imaging condition based on the outside of the vehicle, images suitable for monitoring a plurality of occupants can be captured under the second and third imaging conditions based on those occupants.
- SYMBOLS: 10 ... drive recorder, 12 ... acquisition unit, 14 ... imaging control unit, 18 ... processing unit, 20 ... vehicle information acquisition unit, 22 ... image acquisition unit, 26 ... display control unit, 30 ... image recording unit, 32 ... occupant detection unit, 34 ... imaging condition determination unit, 35 ... safety determination unit, 36 ... notification content determination unit, 38 ... image processing unit, 42 ... camera, 50 ... display device, 70 ... vehicle.
- according to the present invention, the state of an occupant can be appropriately displayed using the camera of a drive recorder.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
- Time Recorders, Drive Recorders, Access Control (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims (8)
- A drive recorder comprising: an image acquisition unit that acquires image data captured by a camera attached to a vehicle; an imaging condition determination unit that determines a first imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a first image area of the image data acquired by the image acquisition unit, and determines a second imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a second image area of the image data different from the first image area; an imaging control unit that causes the camera to capture, in temporally different frames, a first image using the first imaging condition and a second image using the second imaging condition; an image recording unit that records moving image data based on the first image; an occupant detection unit that detects an occupant of the vehicle; an image processing unit that generates, based on a second image including the occupant detected by the occupant detection unit, an occupant image in a display mode in which the visibility of the image area including the occupant is higher than in the second image; and a display control unit that causes a display device to display a moving image based on the occupant image.
- The drive recorder according to claim 1, wherein the imaging condition determination unit further determines a third imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a third image area of the image data acquired by the image acquisition unit, different from the first image area and the second image area; the imaging control unit causes the camera to capture a third image using the third imaging condition in a frame temporally different from the capture of the first image and the second image; the image processing unit further generates, based on a third image including another occupant different from the occupant detected by the occupant detection unit, another occupant image in a display mode in which the visibility of the image area including the other occupant is higher than in the third image; and the display control unit causes the display device to display the moving image based on the occupant image and a moving image based on the other occupant image.
- The drive recorder according to claim 1 or 2, wherein the occupant detection unit further detects the seating position of an occupant, and the imaging condition determination unit determines the second imaging condition with reference to the seating position of the occupant detected by the occupant detection unit.
- The drive recorder according to any one of claims 1 to 3, further comprising: a vehicle information acquisition unit that acquires at least one of information on the vehicle and information on conditions surrounding the vehicle; and a safety determination unit that determines a safety level of vehicle travel based on the information acquired by the vehicle information acquisition unit, wherein the image processing unit changes the display mode of the generated occupant image according to the safety level determined by the safety determination unit.
- The drive recorder according to any one of claims 1 to 3, further comprising: a vehicle information acquisition unit that acquires at least one of information on the vehicle and information on conditions surrounding the vehicle; a safety determination unit that determines a safety level of vehicle travel based on the information acquired by the vehicle information acquisition unit; and a notification content determination unit that determines, according to the safety level determined by the safety determination unit, notification content for calling attention regarding vehicle travel, wherein the safety determination unit, when a moving image based on the occupant image is displayed on the display device, determines the safety level using a determination criterion different from that used otherwise.
- The drive recorder according to claim 5, wherein, when notification content is determined by the notification content determination unit, the image processing unit generates an occupant image on which a notification image indicating the notification content is superimposed, and the display control unit causes the display device to display a moving image based on the occupant image on which the notification image is superimposed.
- A display control method comprising the steps of: acquiring image data captured by a camera attached to a vehicle; determining a first imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a first image area of the acquired image data, and determining a second imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a second image area of the acquired image data different from the first image area; causing the camera to capture, in temporally different frames, a first image using the first imaging condition and a second image using the second imaging condition; recording moving image data based on the first image; detecting an occupant of the vehicle; generating, based on a second image including the detected occupant, an occupant image in a display mode in which the visibility of the image portion including the occupant is higher than in the second image; and causing a display device to display a moving image based on the occupant image.
- A program causing a computer to realize: a function of acquiring image data captured by a camera attached to a vehicle; a function of determining a first imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a first image area of the acquired image data, and determining a second imaging condition relating to at least one of exposure, color, and brightness of the camera with reference to a second image area of the acquired image data different from the first image area; a function of causing the camera to capture, in temporally different frames, a first image using the first imaging condition and a second image using the second imaging condition; a function of recording moving image data based on the first image; a function of detecting an occupant of the vehicle; a function of generating, based on a second image including the detected occupant, an occupant image in a display mode in which the visibility of the image portion including the occupant is higher than in the second image; and a function of causing a display device to display a moving image based on the occupant image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201980012265.4A CN111699680B (zh) | 2018-03-15 | 2019-02-07 | 行车记录仪、显示控制方法以及存储介质 |
JP2020505676A JP6825743B2 (ja) | 2018-03-15 | 2019-02-07 | ドライブレコーダ、表示制御方法およびプログラム |
EP19768412.9A EP3767944B1 (en) | 2018-03-15 | 2019-02-07 | Drive recorder, display control method, and program |
US17/020,900 US11050929B2 (en) | 2018-03-15 | 2020-09-15 | Driver recorder, display control method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-047568 | 2018-03-15 | ||
JP2018047568 | 2018-03-15 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/020,900 Continuation US11050929B2 (en) | 2018-03-15 | 2020-09-15 | Driver recorder, display control method and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019176391A1 true WO2019176391A1 (ja) | 2019-09-19 |
Family
ID=67907697
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/004421 WO2019176391A1 (ja) | 2018-03-15 | 2019-02-07 | ドライブレコーダ、表示制御方法およびプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US11050929B2 (ja) |
EP (1) | EP3767944B1 (ja) |
JP (1) | JP6825743B2 (ja) |
CN (1) | CN111699680B (ja) |
WO (1) | WO2019176391A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021245730A1 (ja) * | 2020-06-01 | 2021-12-09 | 三菱電機株式会社 | 車載撮影制御装置、車載撮影制御システム、及び撮影制御方法 |
CN114556905A (zh) * | 2019-10-31 | 2022-05-27 | Jvc建伍株式会社 | 记录***、记录方法以及程序 |
WO2023175686A1 (ja) * | 2022-03-14 | 2023-09-21 | 日本電気株式会社 | 乗員情報取得装置、乗員情報取得システム、乗員情報取得方法及び非一時的なコンピュータ可読媒体 |
EP4207109A4 (en) * | 2020-08-25 | 2024-04-03 | Jvckenwood Corp | DRIVING RECORDER, CONTROL METHOD FOR DRIVING RECORDER, AND SHOOTING CONTROL PROGRAM |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3869788A1 (en) * | 2020-02-19 | 2021-08-25 | Ricoh Company, Ltd. | Image capturing device, image communication system, method for display control, and carrier means |
JP7470540B2 (ja) * | 2020-03-18 | 2024-04-18 | 本田技研工業株式会社 | 内部機器の異常判定装置、異常判定方法、及びプログラム |
US20220164958A1 (en) * | 2020-11-24 | 2022-05-26 | Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America | Method Of Using Camera For Both Internal And External Monitoring Of A Vehicle |
JP7140819B2 (ja) * | 2020-12-25 | 2022-09-21 | 本田技研工業株式会社 | 撮像装置 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010050012A1 (ja) * | 2008-10-29 | 2010-05-06 | 京セラ株式会社 | 車載用カメラモジュール |
JP2012228931A (ja) | 2011-04-26 | 2012-11-22 | Nissan Motor Co Ltd | 画像処理装置及び画像処理方法 |
JP2018121104A (ja) * | 2017-01-23 | 2018-08-02 | 株式会社ザクティ | ドライブレコーダ装置 |
JP2018196066A (ja) * | 2017-05-19 | 2018-12-06 | 株式会社ユピテル | ドライブレコーダー、ドライブレコーダー用表示装置及びプログラム |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004144512A (ja) * | 2002-10-22 | 2004-05-20 | Denso Corp | 乗員検知システム |
US7538864B2 (en) * | 2005-03-24 | 2009-05-26 | Hunter Engineering Company | Vehicle wheel alignment system scanned beam imaging sensor |
JP2009065624A (ja) * | 2007-09-10 | 2009-03-26 | Niles Co Ltd | 多方向撮像カメラ |
WO2009157446A1 (ja) * | 2008-06-24 | 2009-12-30 | トヨタ自動車株式会社 | 死角表示装置及び運転支援装置 |
US20100014711A1 (en) * | 2008-07-16 | 2010-01-21 | Volkswagen Group Of America, Inc. | Method for controlling an illumination in a vehicle interior in dependence on a head pose detected with a 3D sensor |
JP5471038B2 (ja) * | 2009-05-27 | 2014-04-16 | アイシン精機株式会社 | 校正目標検出装置と、校正目標を検出する校正目標検出方法と、校正目標検出装置のためのプログラム |
JP2011124879A (ja) * | 2009-12-11 | 2011-06-23 | Opt Kk | 車両用撮像システム及び車載用撮像装置 |
JP5682304B2 (ja) * | 2010-12-27 | 2015-03-11 | トヨタ自動車株式会社 | 画像提供装置 |
US8659408B2 (en) * | 2012-05-22 | 2014-02-25 | Delphi Technologies, Inc. | Object detection system and method using a camera and a multiple zone temperature sensor |
JP5994437B2 (ja) * | 2012-07-04 | 2016-09-21 | 株式会社デンソー | 車両周囲画像表示制御装置および車両周囲画像表示制御プログラム |
KR102149187B1 (ko) * | 2014-02-21 | 2020-08-28 | 삼성전자주식회사 | 전자 장치와, 그의 제어 방법 |
JP6481846B2 (ja) * | 2014-03-27 | 2019-03-13 | 日本精機株式会社 | 車両用警報装置 |
DE102014207994A1 (de) * | 2014-04-29 | 2015-10-29 | Conti Temic Microelectronic Gmbh | Vorrichtung zum Erkennen von Niederschlag für ein Kraftfahrzeug |
JP6384188B2 (ja) * | 2014-08-12 | 2018-09-05 | ソニー株式会社 | 車両用表示装置と表示制御方法および後方モニタリングシステム |
CN106170072B (zh) * | 2016-07-18 | 2022-06-10 | 中国科学院地理科学与资源研究所 | 视频采集***及其采集方法 |
-
2019
- 2019-02-07 WO PCT/JP2019/004421 patent/WO2019176391A1/ja active Application Filing
- 2019-02-07 CN CN201980012265.4A patent/CN111699680B/zh active Active
- 2019-02-07 JP JP2020505676A patent/JP6825743B2/ja active Active
- 2019-02-07 EP EP19768412.9A patent/EP3767944B1/en active Active
-
2020
- 2020-09-15 US US17/020,900 patent/US11050929B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010050012A1 (ja) * | 2008-10-29 | 2010-05-06 | 京セラ株式会社 | 車載用カメラモジュール |
JP2012228931A (ja) | 2011-04-26 | 2012-11-22 | Nissan Motor Co Ltd | 画像処理装置及び画像処理方法 |
JP2018121104A (ja) * | 2017-01-23 | 2018-08-02 | 株式会社ザクティ | ドライブレコーダ装置 |
JP2018196066A (ja) * | 2017-05-19 | 2018-12-06 | 株式会社ユピテル | ドライブレコーダー、ドライブレコーダー用表示装置及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3767944A4 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114556905A (zh) * | 2019-10-31 | 2022-05-27 | Jvc建伍株式会社 | 记录***、记录方法以及程序 |
EP4054182A4 (en) * | 2019-10-31 | 2022-12-21 | JVCKenwood Corporation | RECORDING DEVICE, RECORDING METHOD AND PROGRAM |
US11877099B2 (en) | 2019-10-31 | 2024-01-16 | Jvckenwood Corporation | Recording system, recording method, and program |
CN114556905B (zh) * | 2019-10-31 | 2024-05-10 | Jvc建伍株式会社 | 记录***、记录方法以及计算机可读存储介质 |
WO2021245730A1 (ja) * | 2020-06-01 | 2021-12-09 | 三菱電機株式会社 | 車載撮影制御装置、車載撮影制御システム、及び撮影制御方法 |
EP4207109A4 (en) * | 2020-08-25 | 2024-04-03 | Jvckenwood Corp | DRIVING RECORDER, CONTROL METHOD FOR DRIVING RECORDER, AND SHOOTING CONTROL PROGRAM |
WO2023175686A1 (ja) * | 2022-03-14 | 2023-09-21 | 日本電気株式会社 | 乗員情報取得装置、乗員情報取得システム、乗員情報取得方法及び非一時的なコンピュータ可読媒体 |
Also Published As
Publication number | Publication date |
---|---|
CN111699680B (zh) | 2022-03-01 |
US20200412944A1 (en) | 2020-12-31 |
CN111699680A (zh) | 2020-09-22 |
EP3767944A4 (en) | 2021-01-20 |
JP6825743B2 (ja) | 2021-02-03 |
US11050929B2 (en) | 2021-06-29 |
EP3767944A1 (en) | 2021-01-20 |
EP3767944B1 (en) | 2022-07-06 |
JPWO2019176391A1 (ja) | 2020-12-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019176391A1 (ja) | ドライブレコーダ、表示制御方法およびプログラム | |
EP3330148B1 (en) | Automatic driving system for vehicles | |
US10026319B2 (en) | Collision warning system | |
US9511730B1 (en) | Collision warning system | |
JP6354776B2 (ja) | 車両の制御装置 | |
JP4134939B2 (ja) | 車両周辺表示制御装置 | |
JP5664784B2 (ja) | 車両用情報伝達装置 | |
JP7047763B2 (ja) | 運転支援装置および方法、移動体、並びにプログラム | |
JP2017185946A (ja) | 車両の自動運転システム | |
JP2009244959A (ja) | 運転支援装置、運転支援方法 | |
US11167752B2 (en) | Driving assistant apparatus, driving assistant method, moving object, and program | |
JP6223143B2 (ja) | 運転支援装置および運転支援方法 | |
JP7282069B2 (ja) | 車両報知装置 | |
JP6658358B2 (ja) | 車両の制御装置 | |
JP2021041884A (ja) | 車両制御装置 | |
WO2018168050A1 (ja) | 集中度判定装置、集中度判定方法及び集中度判定のためのプログラム | |
JP7393738B2 (ja) | ドライバ状態推定装置 | |
US20230132456A1 (en) | Image processing device, mobile object, image processing method, and storage medium | |
WO2023176737A1 (ja) | 表示制御装置、表示制御方法 | |
WO2023058494A1 (ja) | 車両用制御装置及び車両用制御方法 | |
JP2023055197A (ja) | 車両用制御装置及び車両用制御方法 | |
JP2020102098A (ja) | 運転支援装置および運転支援方法 | |
CN117136157A (zh) | 自动驾驶控制装置、自动驾驶控制程序、提示控制装置以及提示控制程序 | |
CN115285114A (zh) | 车辆与车辆周围环境交互的***、方法、车辆及程序产品 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19768412 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020505676 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2019768412 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2019768412 Country of ref document: EP Effective date: 20201015 |