WO2013191079A1 - Vehicle periphery visual recognition device - Google Patents

Vehicle periphery visual recognition device

Info

Publication number
WO2013191079A1
WO2013191079A1 (PCT application PCT/JP2013/066332)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
cut
images
range
Prior art date
Application number
PCT/JP2013/066332
Other languages
English (en)
Japanese (ja)
Inventor
圭助 本田
英人 栗本
Original Assignee
市光工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 市光工業株式会社
Publication of WO2013191079A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/25Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the sides of the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • B60R2300/8026Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views in addition to a rear-view mirror system

Definitions

  • the present invention relates to a vehicle periphery visual recognition device that displays an image of a vehicle periphery captured by an imaging device on a display device and visually recognizes the vehicle periphery.
  • the conventional vehicle periphery visual recognition device of Patent Literature 1 cuts out the fisheye image to be used from a fisheye image captured by a fisheye camera, corrects the cut-out fisheye image into a normal image, and displays the corrected normal image on a display device.
  • the conventional vehicle periphery visual recognition apparatus of Patent Literature 2 captures an image of the side of the vehicle with an imaging unit and projects the captured image with a projector unit.
  • in these devices, the range of the displayed image and the range of the projected video are almost constant, so an image or video in a range suited to the vehicle situation cannot be obtained.
  • the problem to be solved by the present invention is that the conventional vehicle periphery visual recognition device cannot obtain images and videos in a range corresponding to the vehicle situation.
  • the vehicle periphery visual recognition device comprises an imaging device that captures an image of the periphery of the vehicle, a detection device that detects vehicle information, an image processing device that cuts out a plurality of images in the ranges to be used from the entire image of the vehicle periphery captured by the imaging device and selects at least one cut-out image from the plurality of cut-out images based on the vehicle information, and a display device that displays the at least one cut-out image selected by the image processing device.
  • the imaging device uses a wide-angle lens, and the image processing device selects at least one of the plurality of cut-out images based on vehicle information from the detection device and includes at least one image correction unit that corrects the wide-angle image into a normal image.
  • the imaging device uses a wide-angle lens, and the image processing device includes a plurality of image correction units that cut out a plurality of images in the ranges to be used from the entire-range image of the vehicle periphery captured by the imaging device and correct the wide-angle images from the imaging device into normal images, and a determination unit that selects at least one of the plurality of normal images corrected by the plurality of image correction units based on vehicle information from the detection device.
  • one of the plurality of cut-out images is a cut-out image of the rear of the vehicle covering a range substantially equal to, or wider than, the viewing range obtained by the vehicle rear-view mirror, and this cut-out image of the rear of the vehicle is always displayed on a part of the display device.
  • one of the plurality of cut-out images is a cut-out image of a range substantially equal to, or wider than, the range that becomes the driver's blind spot due to the front pillar of the vehicle.
  • this cut-out image is preferentially displayed on the display device when it is determined, based on vehicle information from the detection device, that the vehicle is traveling toward the blind spot of the front pillar.
  • This invention is characterized by comprising a user interface for the user to select at least one cut-out image from a plurality of cut-out images.
  • since the vehicle periphery visual recognition apparatus of this invention selects and displays, based on the vehicle information, at least one cut-out image from among the plurality of cut-out images around the vehicle, at least one cut-out image suited to the vehicle situation can be obtained.
  • the vehicle periphery visual recognition apparatus of this invention can provide, based on the vehicle information, the cut-out images that the driver needs as visual information, enables reliable visual recognition of the vehicle periphery, and can contribute to traffic safety.
  • FIG. 1 is a functional block diagram of an overall configuration showing Embodiment 1 of a vehicle periphery visual recognition apparatus according to the present invention.
  • FIG. 2 is an explanatory diagram showing an imaging range of an imaging device (fisheye camera) mounted on the right side of the vehicle.
  • FIG. 3 is an explanatory diagram showing an entire range image (fisheye image) on the right side of the periphery of the vehicle imaged by an imaging device (fisheye camera) mounted on the right side of the vehicle.
  • FIG. 4 is an explanatory diagram showing an image (fish-eye image) of the entire right range around the vehicle divided into a plurality of areas.
  • FIG. 5 is an explanatory diagram showing a plurality of areas of the entire image (fisheye image) on the right side around the vehicle for each area.
  • FIG. 6 is an explanatory diagram showing a plurality of areas on the right road surface around the vehicle corresponding to a plurality of areas in the entire image (fisheye image) on the right side around the vehicle.
  • FIG. 7 is a flowchart showing the operation.
  • FIG. 8 is a functional block diagram of an overall configuration showing Embodiment 2 of the vehicle periphery visual recognition apparatus according to the present invention.
  • front, rear, upper, lower, left, and right refer to the front, rear, upper, lower, left, and right of the vehicle C when the vehicle periphery visual recognition device according to the present invention is mounted on the vehicle C.
  • the symbol “F” indicates the front side of the vehicle C (the forward direction side of the vehicle C).
  • the symbol “B” indicates the rear side of the vehicle C.
  • the symbol “U” indicates the upper side when the front side F is viewed from the driver side.
  • the symbol “D” indicates the lower side when the front side F is viewed from the driver side.
  • the symbol “L” indicates the left side when the front side F is viewed from the driver side.
  • the symbol “R” indicates the right side when the front side F is viewed from the driver side.
  • the vehicle periphery visual recognition device includes an imaging device (camera) 1, a detection device 2, an image processing device (image processing ECU) 3, a display device (monitor) 4, and a user interface (hereinafter referred to as “UI”) 5.
  • the imaging device 1 is mounted on both the left and right sides of the vehicle C, for example on the mirror base of an outside mirror device (door mirror device) 6 mounted on the left and right doors of the vehicle C, or on the vehicle body of the vehicle C near the root of the front pillar (A-pillar) 10.
  • the imaging device 1 mounted on the right side R of the vehicle C will be described.
  • the imaging device 1 mounted on the left side L of the vehicle C has substantially the same configuration as the imaging device 1 mounted on the right side R, and its captured image is substantially symmetrical to the captured image of the imaging device 1 mounted on the right side R, so its description is omitted.
  • the imaging device 1 is connected to the image processing device 3.
  • the imaging device 1 captures information around the vehicle C, and outputs the captured information around the vehicle to the image processing device 3 as image data.
  • the imaging device 1 is a fish-eye camera using a wide-angle lens, for example a fish-eye lens. Therefore, the entire image A0 around the vehicle captured by the imaging device 1 forms a circle centered on the optical axis Z of the imaging device 1, as shown in FIGS. 3 to 5.
  • the optical axis Z of the imaging device 1 faces sideways (right side R) so as to be orthogonal, or nearly orthogonal, to the vehicle C as viewed from above; it may also be inclined somewhat in the front-rear direction with respect to the horizontal. As shown in FIG. 2B, the optical axis Z of the imaging device 1 is inclined slightly outward (right side R), as viewed from the front, with respect to the vertical line from the vehicle C down to the road surface G.
  • information on the periphery of the vehicle captured by the imaging device 1 is output to the image processing device 3 as image data, as shown in FIG. 3. That is, a part of the vehicle C (the portion on the lower side D of the curve C) forms the image data located on the lower side D of the entire-range image A0, the road surface G (the portion between the curve G and the curve C) forms the image data located in the middle of the entire-range image A0, and the space above the road surface G (the portion on the upper side of the curve G) forms the image data located on the upper side U of the entire-range image A0. Each of these is output to the image processing device 3.
  • since the optical axis Z of the imaging device 1 is inclined slightly outward, as viewed from the front, with respect to the vertical line from the vehicle C to the road surface G, the range (ratio) occupied by the vehicle C in the entire-range image A0 is smaller, and the range (ratio) occupied by the space above the road surface G is larger, than when the optical axis Z points straight down along that vertical line. The visual recognition range around the vehicle C is thereby widened.
  • since a part of the vehicle C is captured in the entire-range image A0, the relative positional relationship between the vehicle C and the information around the vehicle becomes clear.
  • a part of the vehicle C may be omitted from the entire image A0.
  • a part of the space on the road surface G may be omitted from the entire range of the image A0.
  • since the imaging device 1 uses a wide-angle lens such as a fish-eye lens, in the image captured by the imaging device 1 (a wide-angle image such as a fish-eye image) the portion close to the boundary of the imaging range (close to the circumference of the circle in FIGS. 3 to 5) is more distorted than the portion close to the center of the imaging range (close to the center of the circle, i.e., the optical axis Z, in FIGS. 3 to 5). A clearer image can therefore be obtained by leaving a certain margin at the boundary of the imaging range, that is, by not using the image close to the boundary.
  • the imaging range of the imaging device 1 may be used as much as possible.
  • the detection device 2 is connected to the image processing device 3.
  • the detection device 2 detects information on the vehicle C and outputs the detected vehicle information to the image processing device 3 as a detection signal.
  • the detection device 2 includes a vehicle speed sensor (not shown), a direction indication detection unit (not shown), a gear position detection unit (not shown), a steering angle sensor (not shown), a vehicle position detection unit (not shown), and an ultrasonic sensor (not shown).
  • the vehicle speed sensor detects the speed of the vehicle C, and outputs a vehicle speed signal whose pulse changes according to the vehicle speed to the image processing device 3.
  • the vehicle information detected by the vehicle speed sensor is the speed of the vehicle C.
  • the direction indication detection unit detects a direction indication operation performed by the driver and outputs a direction indication signal to the image processing device 3.
  • the direction indication detection unit is composed of, for example, a left turn signal switch and a right turn signal switch.
  • the left and right turn signal switches are turned ON by the driver when turning left or right at an intersection or the like, and are automatically turned OFF when the steering wheel returns past a predetermined angle after the turn.
  • the left and right turn signal switches output an ON signal (for example, a high-level signal) to the image processing device 3 when ON, and an OFF signal (for example, a low-level signal) when OFF.
  • the vehicle information detected by the direction indication detection unit is a left or right turn of the vehicle C at an intersection or the like.
  • the gear position detection unit detects a gear position switching operation performed by a driver and outputs a gear position signal to the image processing device 3.
  • the gear position detection unit includes, for example, a reverse position switch, a parking position switch, and a neutral position switch.
  • the reverse position switch outputs a reverse position signal to the image processing device 3 when the driver switches the gear to the reverse position to move the vehicle C backward.
  • the parking position switch outputs a parking position signal to the image processing device 3 when the driver switches the gear to the parking position to park the vehicle C.
  • the neutral position switch outputs a neutral position signal to the image processing device 3 when the driver switches the gear to the neutral position while the vehicle C is stopped.
  • the vehicle information detected by the gear position detection unit is the reverse or stop of the vehicle C.
  • the steering angle sensor detects the steering angle (synonymous with turning angle), the steering direction, and the angular velocity of the steering handle (not shown; steering wheel and similar terms are synonymous), and outputs a steering angle signal, a steering direction signal, and an angular velocity signal to the image processing device 3. That is, when the vehicle C travels on a curved road (a left-curving road or a right-curving road) or turns left or right at an intersection or the like, the steering angle sensor detects the steering angle (rotation angle), steering direction (rotation direction), and angular velocity (rotation speed) of the steering wheel operated by the driver and outputs the corresponding signals to the image processing device 3.
  • the vehicle information detected by the steering angle sensor is travel of the vehicle C on a curved road or a left or right turn at an intersection or the like.
  • the vehicle position detection unit detects the position of the vehicle C and outputs a vehicle position signal to the image processing device 3.
  • the vehicle position detection unit includes, for example, a GPS receiver (for example, a car navigation system) that receives position information signals output from GPS satellites or ground stations (such as electronic reference points).
  • the GPS receiver receives the position information signal and outputs it to the image processing device 3 as a vehicle position signal.
  • the vehicle information detected by the vehicle position detection unit is the position of the vehicle C.
  • the ultrasonic sensor detects reflected waves from objects (persons or things) around the vehicle C and outputs to the image processing device 3 an ultrasonic signal whose pulse changes according to the distance to the object.
  • the vehicle information detected by the ultrasonic sensor is an object (a person or an object) around the vehicle C.
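Taken together, the signals from these detection units amount to one bundle of vehicle information delivered to the image processing device 3. Below is a minimal sketch of such a bundle in Python; the class and field names are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class VehicleInfo:
    """Hypothetical bundle of the detection signals described above."""
    speed_kmh: float                      # vehicle speed sensor
    turn_signal: str                      # direction indication unit: "left"/"right"/"off"
    gear: str                             # gear position unit: "D", "R", "P", "N", ...
    steering_angle_deg: float             # steering angle sensor
    position: Tuple[float, float]         # vehicle position unit (lat, lon)
    obstacle_distance_m: Optional[float]  # ultrasonic sensor; None if no echo

# Example: forward travel with the right turn signal on
info = VehicleInfo(speed_kmh=35.0, turn_signal="right", gear="D",
                   steering_angle_deg=12.5, position=(35.68, 139.76),
                   obstacle_distance_m=None)
```

In a real system each field would be refreshed from its sensor's signal line; here the instance is filled with example values only.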
  • the image processing device 3 is connected to the imaging device 1, the detection device 2, and the display device 4, respectively.
  • the image processing device 3 cuts out a plurality of images, in this example six images A1, A2, A3, A4, A5, A6 (hereinafter “A1 to A6”), in the ranges to be used from the entire-range image A0 of the vehicle periphery captured by the imaging device 1, and selects at least one of the six cut-out images A1 to A6 based on the vehicle information from the detection device 2.
  • the first cut-out image (mirror view) A1 is a cut-out image of a range equal to or wider than the viewing range obtained by a vehicle rear-view mirror such as the outside mirror device (door mirror device) 6, that is, a cut-out image of the rear side of the vehicle C (the rear side B on the right side R).
  • the first cut-out image A1 is an image as visual information for visually recognizing the far rear B of the vehicle C.
  • when the outside mirror device 6 is not mounted on the vehicle C, it is essential that the first cut-out image A1 be always displayed on the display device 4 under any circumstances (the first cut-out image essential condition).
  • when the outside mirror device 6 is mounted on the vehicle C, the first cut-out image A1 need not always be displayed on the display device 4.
  • a second cut-out image (under-mirror view) A2 is a cut-out image of a range on the lower side D of the first cut-out image A1, that is, the vicinity of the rear-side-B tire of the vehicle C, which becomes a blind spot of the driver.
  • the second cut-out image A2 is an image as visual information used when the vehicle C moves forward at low speed, when the user performs an input operation on the UI 5, and when the vehicle C moves backward.
  • the third cut-out image (pillar view) A3 is a cut-out image in a range that is almost the same as or wider than the range that becomes the blind spot of the driver by the front pillar 10 of the vehicle C.
  • the third cut-out image A3 is an image as visual information for visually recognizing the range that becomes the driver's blind spot behind the front pillar 10 of the vehicle C when the vehicle C enters an intersection or the like and turns left or right, that is, the range obliquely ahead, from the front side F of the vehicle C to its side (right side R).
  • the fourth cut-out image (lateral view, right-side view) A4 is a cut-out image in a range substantially equal to or wider than the range that becomes the blind spot of the driver right next to the vehicle C (right side R).
  • the fourth cut-out image A4 is an image as visual information for visually recognizing the side of the vehicle C, that is, the range on the right side R of the vehicle C, which becomes the driver's blind spot when the vehicle C enters an intersection or the like and turns left or right.
  • the fifth cut-out image (side view) A5 is a cut-out image in the vicinity of the tire on the front side F of the vehicle C, which is a blind spot of the driver.
  • the fifth cut-out image A5 is an image as visual information for visually recognizing the range from near the front side F of the vehicle C to the lower side D of the vehicle C, which becomes the driver's blind spot when the vehicle C starts moving (forward or backward), travels on a narrow road, or pulls over to the road shoulder for parallel parking.
  • a sixth cut-out image (around view, overhead view) A6 is a cut-out image of a range substantially equal to, or wider than, the ranges that become the driver's blind spots directly beside the vehicle C (right side R) and directly below the vehicle C (lower side D).
  • the sixth cut-out image A6 is an image as visual information that allows the host vehicle (the vehicle C) to be viewed objectively from almost directly above, for example when parking or garaging the vehicle C.
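All six cut-out images A1 to A6 are sub-ranges of the single fisheye frame A0. The sketch below illustrates the cut-out step with plain nested lists; every crop rectangle is a placeholder assumption, since the patent defines the ranges only qualitatively.

```python
# Placeholder crop rectangles (x, y, w, h) inside an assumed 1280x960 fisheye
# frame; the real ranges of A1-A6 are not given numerically in the text.
CUTOUTS = {
    "A1_mirror_view":  (800, 300, 400, 300),  # rear side B, door-mirror range
    "A2_under_mirror": (800, 600, 400, 300),  # rear-tire vicinity
    "A3_pillar_view":  (100, 250, 400, 300),  # A-pillar blind spot, oblique front
    "A4_lateral_view": (450, 300, 400, 350),  # directly beside the vehicle
    "A5_side_view":    (100, 600, 400, 300),  # front-tire vicinity
    "A6_around_view":  (350, 550, 600, 400),  # overhead / around view
}

def crop(frame, name):
    """Cut out one of the A1-A6 regions from the full-range frame A0."""
    x, y, w, h = CUTOUTS[name]
    return [row[x:x + w] for row in frame[y:y + h]]
```

Here a frame is any 2-D array-like of pixel rows; in a real system the cut-out would then be passed to the fish-eye-to-normal-image correction.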
  • the image processing apparatus 3 includes a determination unit (determination circuit) 7 and at least one, in this example, three image correction units (image correction circuits) 11, 12, and 13.
  • the determination unit 7 selects at least one of the six cutout images A1 to A6 based on the vehicle information from the detection device 2.
  • the determination unit 7 always selects the first cut-out image A1 when the first cut-out image essential condition applies, for example when the outside mirror device 6 is not mounted on the vehicle C. In addition, when vehicle information cannot be obtained from the detection device 2, the determination unit 7 selects the first cut-out image A1 in preference to the other cut-out images A2 to A6 as a fail-safe.
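The two rules just described for the determination unit 7, the mandatory first cut-out image and the fail-safe fallback, can be sketched as follows; the function name and argument forms are illustrative assumptions, not from the patent.

```python
def select_cutouts(vehicle_info, mirror_mounted=True):
    """Sketch of determination unit 7's mandatory/fail-safe rules."""
    # Fail-safe: if no vehicle information arrives from the detection
    # device 2, choose A1 in preference to A2-A6.
    if vehicle_info is None:
        return ["A1"]
    selected = []
    # Essential condition: without an outside mirror device 6, A1 must
    # always be displayed.
    if not mirror_mounted:
        selected.append("A1")
    # ...selection among A2-A6 based on vehicle_info would follow here...
    return selected or ["A1"]
```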
  • the three image correction units 11, 12, and 13 correct the at least one cut-out image selected by the determination unit 7, from among the plurality of images A1 to A6 cut out in the ranges to be used from the entire-range image A0 of the vehicle periphery captured by the imaging device 1, from a fish-eye image (see FIGS. 4 and 5), which is distorted, into a normal image (not shown). That is, the three image correction units 11, 12, and 13 correct the fish-eye image into a normal image by normal-image conversion of coordinates, rotation angle, and the like.
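To illustrate what such a fish-eye-to-normal-image conversion does, the sketch below assumes an equidistant fisheye projection (r = f·θ); the patent does not name the lens model, so this choice is an assumption. The function maps a radius in the fisheye image to the radius the same ray would have in a rectilinear (normal) image.

```python
import math

def fisheye_to_rectilinear(r_px, focal_px):
    """Radius remap under an assumed equidistant fisheye model.

    Equidistant fisheye:  r_fish = f * theta
    Rectilinear (normal): r_rect = f * tan(theta)
    """
    theta = r_px / focal_px        # incidence angle of the ray (radians)
    return focal_px * math.tan(theta)
```

Near the optical axis Z the two radii nearly coincide, while toward the boundary of the imaging range the rectilinear radius grows much faster; this stretching is why the boundary region is the most distorted and why the margin discussed above helps.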
  • the first image correction unit 11 always corrects the first cut-out image A1 exclusively.
  • the second image correction unit 12 corrects the main cut-out image using the one cut-out image selected by the determination unit 7 as a main cut-out image.
  • the third image correction unit 13 takes, as an auxiliary cut-out image, one cut-out image from the remaining cut-out images other than the main cut-out image selected by the determination unit 7, and corrects that auxiliary cut-out image.
  • the display device 4 displays at least one cut-out image selected by the image processing device 3.
  • the display device 4 includes a first display unit 8 and a second display unit 9.
  • the display area of the first display unit 8 is smaller than the display area of the second display unit 9.
  • the first display unit 8 always displays the first cut-out image A1 corrected from a fisheye image from the first image correction unit 11 to a normal image exclusively.
  • the second display unit 9 mainly displays the main cut-out image corrected from a fish-eye image to a normal image by the second image correction unit 12, and auxiliarily displays the auxiliary cut-out image corrected from a fish-eye image to a normal image by the third image correction unit 13.
  • the display device 4 is provided, for example, in the center of an instrument panel in a driver's cab, and displays the cut images from the left and right imaging devices 1 together (simultaneously).
  • the display device 4 may be provided on the left and right front pillars 10 in the driver's cab, for example, and may be distributed to the left and right according to the left and right imaging devices 1. That is, the cut-out image from the imaging device 1 on the left side L is displayed on the display device 4 on the left side L, and the cut-out image from the imaging device 1 on the right side R is displayed on the display device 4 on the right side R. To display.
  • the display device 4 uses, for example, an organic EL display, a plasma display, a liquid crystal display, or a laser projector or a head-up display that displays on the front window glass of the vehicle C.
  • the display device 4 is provided integrally or separately from a display device (not shown) having other functions.
  • the UI 5 is used by the user (driver) to select, for example by a button operation, at least one cut-out image from among the plurality of cut-out images.
  • the UI 5 selects at least one cut-out image from among the six cut-out images A1 to A6 by a user operation.
  • the selected cut image is corrected from a fish-eye image to a normal image in the second image correction unit 12, or the second image correction unit 12 and the third image correction unit 13, and the display is performed. It is displayed on the second display unit 9 of the device 4.
  • the UI 5 can, according to a user operation, combine the first display unit 8 and the second display unit 9 of the display device 4 into one display unit, or divide them into three or more display units.
  • the UI 5 can change the display area of the first display unit 8 and the display area of the second display unit 9 of the display device 4 according to a user operation.
  • the UI 5 can switch, by a user operation, between the first cut-out image A1 displayed on the first display unit 8 of the display device 4 and the other cut-out images A2 to A6 displayed on the second display unit 9.
  • the UI 5 can adjust the cutout ranges of the six cutout images A1 to A6 by user operation.
  • by a user operation, the UI 5 can cancel the first cut-out image essential condition, that is, the function of always displaying the first cut-out image A1 on the first display unit 8, and can also cancel the functions of the vehicle periphery visual recognition apparatus according to the first embodiment.
  • the UI 5 can change a vehicle speed threshold value and the like by a user operation.
  • the vehicle periphery visual recognition apparatus according to the first embodiment is configured as described above, and the operation thereof will be described below with reference to the flowchart of FIG.
  • the first cut-out image A1 is selected by the determination unit 7 of the image processing device 3, corrected from a fish-eye image to a normal image by the first image correction unit 11 of the image processing device 3, and always displayed on the first display unit 8 of the display device 4.
  • the far rear side B of the vehicle C can thus be visually recognized by means of the first cut-out image A1, which can contribute to traffic safety.
  • the determination unit 7 of the image processing device 3 inputs vehicle information from the gear position detection unit of the detection device 2 and determines whether or not the gear position is reverse (gear position reverse? S1).
  • the determination unit 7 inputs vehicle information from the gear position detection unit and determines whether the gear position is parking (P) or neutral (N) (gear position P or N? S2).
  • if the vehicle C is stopped, the determination unit 7 selects the first cut-out image A1.
  • the first cut-out image A1 is corrected from a fisheye image to a normal image by the second image correction unit 12 and displayed on the second display unit 9 of the display device 4 (A1 display S3).
  • as with the outside mirror device 6, the far rear side B of the vehicle C can be visually recognized by means of the first cut-out image A1 displayed on the second display unit 9, whose display area is larger than that of the first display unit 8, which can contribute to traffic safety.
  • the determination unit 7 inputs vehicle information from the vehicle speed sensor of the detection device 2 and determines whether or not the vehicle speed is equal to or higher than a threshold value (for example, 30 km/h) (vehicle speed at or above threshold? S4).
  • the determination unit 7 inputs vehicle information from the direction indication detection unit of the detection device 2 and determines whether or not the left or right turn signal switch is ON (turn signal ON signal? S5).
  • when neither turn signal switch is ON, the determination unit 7 selects the first cut-out image A1, and the first cut-out image A1 is displayed on the second display unit 9 (A1 display S3). The process then returns to (S1).
  • the determination unit 7 selects the third cut-out image A3 and the fourth cut-out image A4.
  • the third cutout image A3 is corrected from a fisheye image to a normal image by the second image correction unit 12, and is displayed on the second display unit 9 of the display device 4 in place of the first cutout image A1.
  • the fourth cut-out image A4 is corrected from a fisheye image to a normal image by the third image correction unit 13 and, in place of the first cut-out image A1, is displayed on the display device 4 simultaneously with the third cut-out image A3 (A3, A4 simultaneous display S6). The process then returns to (S1).
  • the range that becomes the driver's blind spot due to the front pillar 10 of the vehicle C can thus be visually recognized, and a pedestrian P or a bicycle approaching the vehicle C from the blind spot can be seen at an early stage, which contributes to traffic safety.
  • by operating the UI 5, the user can display on the second display unit 9 the third cut-out image A3 alone, the fourth cut-out image A4 alone, the first and third cut-out images A1 and A3, the first and fourth cut-out images A1 and A4, or the first, third, and fourth cut-out images A1, A3, and A4.
  • the vehicle information from the detection device 2 consists of the gear position from the gear position detection unit, the vehicle speed from the vehicle speed sensor, and the direction instruction from the direction indication detection unit. That is, it can be determined from the gear position that the vehicle C is moving forward, from the vehicle speed that the vehicle is traveling at a predetermined speed (for example, 30 km/h or more), and from the direction instruction that the vehicle C has entered an intersection or the like and is turning left or right.
  • vehicle information such as the steering angle, steering direction, and angular velocity from the steering angle sensor, and the vehicle position from the vehicle position detection unit, may be added to the vehicle information.
  • from the vehicle information from the steering angle sensor, it can be determined that the vehicle C is traveling in the blind-spot direction of the front pillar 10 of the vehicle C, and in particular the third cut-out image A3 can be displayed preferentially. Further, from the vehicle information from the vehicle position detection unit, it can be determined that the vehicle C is located at an intersection or the like. For this reason, if the third cut-out image A3 and the fourth cut-out image A4 are displayed preferentially at an intersection where the vehicle position detection unit has set the destination route and the vehicle turns left or right, the left-turn or right-turn intersection can be clearly shown to the driver.
  • the determination unit 7 inputs user operation information from the UI 5 and determines whether or not the user is operating the UI 5 (UI input? (button etc.) S7).
  • the determination unit 7 selects the fifth cut-out image A5.
  • the fifth cut-out image A5 is corrected from a fisheye image to a normal image by the second image correction unit 12 and is displayed on the second display unit 9 of the display device 4 in place of the third cut-out image A3 and the fourth cut-out image A4 (A5 display S8). The process then returns to (S1).
  • the vehicle information from the detection device 2 includes the gear position from the gear position detection unit and the vehicle speed from the vehicle speed sensor. That is, it can be determined from the gear position that the vehicle C is about to move forward from a stopped state, and from the vehicle speed that the vehicle C is moving forward from a stopped state or is traveling at low speed, for example while edging sideways toward the road edge.
  • the vehicle information from the ultrasonic sensor may be added to the vehicle information. Based on the vehicle information from the ultrasonic sensor, it can be determined that an object (a person or an object) is present around the vehicle C, and in particular, the fifth cut-out image A5 can be preferentially displayed.
  • the determination unit 7 inputs the user's operation information from the UI 5 and determines whether or not the user is operating the UI 5 (UI input? (button etc.) S9).
  • the determination unit 7 selects the sixth cut-out image A6.
  • the sixth cut-out image A6 is corrected from a fisheye image to a normal image by the second image correction unit 12 and is displayed on the second display unit 9 of the display device 4 in place of the fifth cut-out image A5 (A6 display S10). The process then returns to (S1).
  • the sixth cut-out image A6 displayed on the second display unit 9 enables the host vehicle to be viewed objectively from directly above when the vehicle C is being parked in a garage, so the vehicle can be parked in the garage safely.
  • the vehicle information from the detection device 2 is the gear position from the gear position detection unit. That is, it can be determined from the vehicle information from the gear position detection unit that the vehicle C is going backwards for parking in the garage.
  • the determination unit 7 selects the second cut-out image A2.
  • the second cut-out image A2 is corrected from a fisheye image to a normal image by the second image correction unit 12 and is displayed on the second display unit 9 of the display device 4 in place of the sixth cut-out image A6 (A2 display S11). The process then returns to (S1).
  • when the vehicle C starts moving (forward or in reverse), the second cut-out image A2 displayed on the second display unit 9 makes visible the range from the rear side B near the vehicle C to the lower side D, which would otherwise be a driver's blind spot, and this contributes to traffic safety.
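The selection steps S1 to S11 above form a decision flow. As a non-authoritative sketch only — the patent describes the flow as a flowchart, the exact branch order here is one plausible reading of it, and all function and field names are hypothetical — the determination unit's logic can be expressed as:

```python
def select_cutout_images(info):
    """Return the cut-out image(s) for the second display unit, given a dict
    of vehicle information from the detection device (illustrative names)."""
    if info["gear"] == "R":                  # S1: gear position reverse?
        return ["A2"]                        # S11: near rear side B to lower side D
    if info["gear"] in ("P", "N"):           # S2: park or neutral (vehicle stopped)?
        return ["A1"]                        # S3: distant rear side, as with the mirror
    if info["speed_kmh"] >= 30:              # S4: speed at or above the threshold?
        if info["turn_signal_on"]:           # S5: left or right turn signal ON?
            return ["A3", "A4"]              # S6: front-pillar blind-spot views
        return ["A1"]                        # S3: fall back to the mirror-like view
    # Low-speed forward travel: S7/S9 let the user toggle via the UI 5
    # between the wide forward view A5 and the top-down view A6.
    return ["A6"] if info["ui_pressed"] else ["A5"]  # S10 / S8
```

For example, forward travel at 40 km/h with a turn signal on yields the pillar blind-spot pair A3 and A4.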
  • the vehicle periphery visual recognition device uses the image processing device 3 to select, based on the vehicle information from the detection device 2, at least one of the six cut-out images A1 to A6 of the vehicle periphery captured by the imaging device 1 and displays it on the display device 4, so at least one cut-out image corresponding to the vehicle situation is obtained. As a result, the vehicle periphery visual recognition device of Embodiment 1 displays, based on the vehicle information, the cut-out image required by the driver (at least one of the six cut-out images A1 to A6) as visual information; the surroundings of the vehicle C can thus be reliably seen, which contributes to traffic safety.
  • since the imaging device 1 is a fisheye camera using a wide-angle lens, for example a fisheye lens, the periphery of the vehicle C can be imaged at a wide angle, and visual information over a wide range can be obtained, which contributes to traffic safety.
  • since the vehicle periphery visual recognition device can correct the fisheye image from the imaging device 1 of the fisheye camera to a normal image using the image correction units 11, 12, and 13, it can provide the driver with accurate visual information and contribute to traffic safety.
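The patent does not specify how the image correction units map a fisheye image to a normal (rectilinear) image. Purely as an illustration, assuming an equidistant-projection fisheye lens (image radius r = f·θ) and hypothetical focal-length parameters, the per-pixel lookup from the corrected image back into the fisheye image could be computed as:

```python
import math

def rectilinear_to_fisheye(x_out, y_out, f_out, f_fish, cx, cy):
    """Map an offset (x_out, y_out) from the centre of the corrected image to
    source coordinates in an equidistant fisheye image centred at (cx, cy).
    f_out: focal length of the virtual rectilinear camera, in pixels.
    f_fish: focal length of the fisheye lens, in pixels (both hypothetical)."""
    r_out = math.hypot(x_out, y_out)
    if r_out == 0.0:
        return (float(cx), float(cy))     # the optical axis maps to the image centre
    theta = math.atan(r_out / f_out)      # angle of the incoming ray to the axis
    r_fish = f_fish * theta               # equidistant model: r = f * theta
    scale = r_fish / r_out                # shrink the radius, keep the direction
    return (cx + x_out * scale, cy + y_out * scale)
```

A full correction unit would evaluate this mapping for every output pixel and resample (for example bilinearly) from the fisheye frame.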
  • the vehicle periphery visual recognition device places the image correction units 11, 12, and 13 after the determination unit 7 in the image processing device 3, so the number of image correction units can be kept to the minimum necessary, and the manufacturing cost can be reduced accordingly.
  • since the first cut-out image A1 is always displayed on the first display unit 8 of the display device 4, the vehicle periphery visual recognition device allows the rear side B distant from the vehicle C to be visually recognized in the same manner as with the outside mirror device 6, which contributes to traffic safety.
  • when it is determined from the vehicle information from the detection device 2 that the vehicle is traveling in the blind-spot direction of the front pillar 10, the vehicle periphery visual recognition device preferentially displays the third cut-out image A3 on the display device 4. For this reason, the range that becomes the driver's blind spot due to the front pillar 10 of the vehicle C can be visually recognized in the third cut-out image A3, which contributes to traffic safety.
  • since any cut-out image displayed on the first display unit 8 and the second display unit 9 of the display device 4 can be selected arbitrarily by user operation, the visual information required by the user can be obtained, which contributes to traffic safety.
  • FIG. 8 shows Embodiment 2 of the vehicle periphery visual recognition apparatus according to the present invention.
  • the vehicle periphery visual recognition device of Embodiment 2 is described below.
  • the same reference numerals as those in FIGS. 1 to 7 denote the same components.
  • the configuration of the image processing device 30 of the vehicle periphery visual recognition device of Embodiment 2 differs from that of the image processing device 3 of Embodiment 1. That is, the image processing device 30 of Embodiment 2 cuts out the six images A1 to A6 in the ranges to be used from the entire-range image A0 around the vehicle C captured by the imaging device 1; six image correction units 21, 22, 23, 24, 25, and 26 each correct one of the six fisheye images from the imaging device 1 to a normal image; and a determination unit 7 selects at least one of the six normal images corrected by the image correction units 21 to 26 based on the vehicle information from the detection device 2.
  • since the vehicle periphery visual recognition device of Embodiment 2 is configured as described above, it achieves substantially the same operational effects as the vehicle periphery visual recognition device of Embodiment 1.
  • in the vehicle periphery visual recognition device of Embodiment 2, the determination unit 7 is placed after the six image correction units 21 to 26, one for each of the six cut-out images A1 to A6, so the determination unit 7 only has to select the cut-out image to be displayed in order for an already-corrected image to appear on the display device 4. As a result, there is no time lag, unlike the case where the cut-out image selected by the determination unit 7 must first be corrected and then displayed on the display device 4.
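The architectural difference between the two embodiments — Embodiment 1 selects a cut-out first and then corrects only that image, while Embodiment 2 corrects all six cut-outs up front and then merely selects — can be contrasted in a short sketch. The function names here are illustrative stand-ins, not from the patent:

```python
def correct(img):
    """Stand-in for an image correction unit (fisheye -> normal image)."""
    return f"normal({img})"

def embodiment1_pipeline(cutouts, chosen):
    # Embodiment 1: select first, then correct. Only as many correction
    # units as displayed images are needed, keeping part count and cost down,
    # at the price of correction latency when the selection changes.
    return [correct(cutouts[name]) for name in chosen]

def embodiment2_pipeline(cutouts, chosen):
    # Embodiment 2: correct all six cut-outs up front, then select.
    # Display becomes a plain lookup with no correction-time lag.
    corrected = {name: correct(img) for name, img in cutouts.items()}
    return [corrected[name] for name in chosen]
```

Both orderings yield the same displayed images; they trade hardware count (Embodiment 1) against display latency (Embodiment 2).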
  • in the embodiments above, the outside mirror device 6 is mounted on the vehicle C; however, the device may also be used on a vehicle C on which the outside mirror device 6 is not mounted.
  • the imaging device 1 is mounted on the left and right sides of the vehicle C, for example on the mirror base of the outside mirror device 6 mounted on the left and right doors of the vehicle C, or on the vehicle body, that is, near the base of the front pillar 10 of the vehicle C.
  • the position where the imaging device 1 is mounted is not particularly limited.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a vehicle periphery visual recognition device comprising an imaging device (1), a detection device (2), an image processing device (3), and a display device (4). The imaging device (1) captures images of the periphery of a vehicle (C). The detection device (2) detects vehicle information. The image processing device (3) cuts out six images, as the image ranges to be used, from an image (A0) of the entire range of the periphery of the vehicle (C) captured by the imaging device (1), and selects at least one of the six cut-out images (A1-A6) based on the vehicle information from the detection device (2). The display device (4) displays the at least one cut-out image selected by the image processing device (3). The present invention thereby obtains an image range that corresponds to the state of the vehicle (C).
PCT/JP2013/066332 2012-06-19 2013-06-13 Dispositif de visualisation de voisinage de véhicule WO2013191079A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-137739 2012-06-19
JP2012137739A JP6127391B2 (ja) 2012-06-19 2012-06-19 車両周辺視認装置

Publications (1)

Publication Number Publication Date
WO2013191079A1 true WO2013191079A1 (fr) 2013-12-27

Family

ID=49768676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/066332 WO2013191079A1 (fr) 2012-06-19 2013-06-13 Dispositif de visualisation de voisinage de véhicule

Country Status (2)

Country Link
JP (1) JP6127391B2 (fr)
WO (1) WO2013191079A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018062222A (ja) * 2016-10-12 2018-04-19 日産自動車株式会社 車両用周辺監視方法及び車両用周辺監視装置
CN108340836A (zh) * 2018-04-13 2018-07-31 华域视觉科技(上海)有限公司 一种汽车a柱显示***

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019116220A (ja) * 2017-12-27 2019-07-18 株式会社東海理化電機製作所 車両用視認装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000118298A (ja) * 1998-10-14 2000-04-25 Daihatsu Motor Co Ltd 車両用モニタ装置
JP2004159186A (ja) * 2002-11-07 2004-06-03 Sony Corp 監視用車載撮像システム
JP2004304415A (ja) * 2003-03-31 2004-10-28 Mazda Motor Corp 車両用監視装置
JP2005057536A (ja) * 2003-08-05 2005-03-03 Nissan Motor Co Ltd 映像提示装置
JP2007223338A (ja) * 2004-07-12 2007-09-06 Sharp Corp 表示システム、移動体、表示方法、表示プログラムおよびその記録媒体
WO2009157446A1 (fr) * 2008-06-24 2009-12-30 トヨタ自動車株式会社 Dispositif d’affichage d’angle mort et dispositif de support

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11136584A (ja) * 1997-10-30 1999-05-21 Toshiba Tec Corp パノラマ撮像システム


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018062222A (ja) * 2016-10-12 2018-04-19 日産自動車株式会社 車両用周辺監視方法及び車両用周辺監視装置
CN108340836A (zh) * 2018-04-13 2018-07-31 华域视觉科技(上海)有限公司 一种汽车a柱显示***
CN108340836B (zh) * 2018-04-13 2024-07-16 华域视觉科技(上海)有限公司 一种汽车a柱显示***

Also Published As

Publication number Publication date
JP2014003482A (ja) 2014-01-09
JP6127391B2 (ja) 2017-05-17

Similar Documents

Publication Publication Date Title
JP5836490B2 (ja) 運転支援装置
EP3361721B1 (fr) Dispositif et procédé d'assistance d'affichage
US10029621B2 (en) Rear view camera system using rear view mirror location
JP6346614B2 (ja) 情報表示システム
US20190315275A1 (en) Display device and operating method thereof
JP5117003B2 (ja) 運転支援装置
JP4766841B2 (ja) 車両に搭載されるカメラ装置及び車両周辺監視装置
US10166922B2 (en) On-vehicle image display device, on-vehicle image display method for vehicle, and on-vehicle image setting device
US10118566B2 (en) Image processing device, image processing method, and image display system
JP5136950B2 (ja) 車載機器の操作装置
WO2019008764A1 (fr) Procédé et dispositif d'aide au stationnement
JP5549235B2 (ja) 運転支援装置
US8768575B2 (en) Vehicle periphery monitoring device
JP4772409B2 (ja) 画像表示システム
JP2008222153A (ja) 合流支援装置
US20220086400A1 (en) Vehicular display system
CN108725320A (zh) 图像显示装置
WO2014156166A1 (fr) Système d'aide au stationnement et procédé d'aide au stationnement
JP2011193485A (ja) 車両に搭載されるカメラ装置及び車両周辺監視装置
JP6448714B2 (ja) 情報表示システム
JP6127391B2 (ja) 車両周辺視認装置
JP2004051063A (ja) 車両周辺視認装置
KR20190117831A (ko) 차량용 접이식 사이드 미러 장치 및 이를 이용한 차량의 전방 및 측방 화면 표시 시스템
JP2005205983A (ja) 自車周辺視認装置
JP2005081860A (ja) 車両用表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13806893

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13806893

Country of ref document: EP

Kind code of ref document: A1