WO2006095689A1 - Driving support device, driving support method, and driving support program - Google Patents

Driving support device, driving support method, and driving support program

Info

Publication number
WO2006095689A1
WO2006095689A1 · PCT/JP2006/304282 · JP2006304282W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
video
image data
driving support
information
Prior art date
Application number
PCT/JP2006/304282
Other languages
English (en)
Japanese (ja)
Inventor
Goro Kobayashi
Koji Koga
Takeshi Sato
Original Assignee
Pioneer Corporation
Priority date
Filing date
Publication date
Application filed by Pioneer Corporation
Publication of WO2006095689A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/168: Driving aids for parking, e.g. acoustic or visual feedback on parking space

Definitions

  • Driving support device, driving support method, and driving support program
  • the present invention relates to a driving support device, a driving support method, and a driving support program.
  • use of the present invention is not limited to the above-described driving support device, driving support method, and driving support program.
  • Such a driving support device divides a monitor screen that displays an image of the area behind the vehicle, taken for example by an in-vehicle camera, into a main screen and a sub-screen.
  • The captured video is displayed on the main screen as a wide-area view of the area behind the vehicle, with guidelines superimposed on it, while the area immediately behind the vehicle in the captured video is converted into an image looking down from a virtual viewpoint and displayed on the sub-screen.
  • Guidelines whose viewpoint has been converted in the same way are superimposed on the sub-screen.
  • The center axes of the main screen and the sub-screen are made to coincide, and the sub-screen is arranged below the main screen (see, for example, Patent Document 1).
  • Another such driving support device is provided with, for example, three in-vehicle cameras that photograph at least the rear and the left and right rear sides of the vehicle, and displays the three images at the same time when parking. In particular, the left and right rear-side cameras are given an angle of view such that part of the rear tire is visible, and a pole-shaped guide standing perpendicular to the ground at the outermost side of the vehicle near the rear tire, together with an arrow indicating the direction in which the vehicle is heading, is superimposed on the video (see, for example, Patent Document 2).
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2003-104145
  • Patent Document 2: Japanese Patent Laid-Open No. 2003-199093
  • Disclosure of the Invention
  • The driving support apparatus includes image data receiving means for receiving image data of a video of a vehicle, behavior information receiving means for receiving behavior information of the vehicle, video generating means for generating a video viewed from outside the vehicle based on the image data received by the image data receiving means and the behavior information received by the behavior information receiving means, and image data transmitting means for transmitting the image data of the video generated by the video generating means to the vehicle.
  • Another driving support apparatus includes image data receiving means for receiving image data of a video of a vehicle, behavior information acquiring means for acquiring behavior information of the vehicle, video generating means for generating a video viewed from outside the vehicle based on the image data received by the image data receiving means and the behavior information acquired by the behavior information acquiring means, and display means for displaying the video generated by the video generating means.
  • The driving support method includes an image data receiving step of receiving image data of a video of a vehicle, a behavior information receiving step of receiving behavior information of the vehicle, a video generating step of generating a video viewed from outside the vehicle based on the image data received in the image data receiving step and the behavior information received in the behavior information receiving step, and an image data transmitting step of transmitting the image data of the video generated in the video generating step to the vehicle.
  • Another driving support method includes an image data receiving step of receiving image data of a video of a vehicle, a behavior information acquiring step of acquiring behavior information of the vehicle, a video generating step of generating a video viewed from outside the vehicle based on the image data received in the image data receiving step and the behavior information acquired in the behavior information acquiring step, and a display step of displaying the video generated in the video generating step.
  • the driving support program according to the invention of claim 8 causes a computer to execute the driving support method according to claim 6 or 7.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of a driving support system to which the driving support apparatus according to the first embodiment is applied.
  • FIG. 2 is a flowchart illustrating an example of a driving support processing procedure of the driving support system to which the driving support device according to the first embodiment is applied.
  • FIG. 3 is a block diagram of an example of a functional configuration of the driving support apparatus according to the second embodiment.
  • FIG. 4 is a flowchart showing an example of a driving support processing procedure of the driving support apparatus according to the second embodiment.
  • FIG. 5 is a side view showing an example of an arrangement configuration of a photographing unit of a driving support system to which the driving support device according to the first embodiment is applied.
  • FIG. 6 is a top view showing an example of an arrangement configuration of a photographing unit of a driving support system to which the driving support device according to the first embodiment is applied.
  • FIG. 7 is a block diagram illustrating an example of a hardware configuration of the driving support apparatus according to the first embodiment.
  • FIG. 8 is a flowchart illustrating an example of a driving support processing procedure of the driving support apparatus according to the first embodiment.
  • FIG. 9 is an explanatory diagram of an example of an image generated by the driving support apparatus according to the first embodiment.
  • FIG. 10 is an explanatory diagram of an example of an image generated by the driving support apparatus according to the first embodiment.
  • FIG. 11 is a flowchart illustrating an example of a driving support processing procedure of the driving support apparatus according to the second embodiment.
  • FIG. 12 is a block diagram of an example of a hardware configuration of the driving support apparatus according to the third embodiment.
  • FIG. 13 is a flowchart of an example of a driving assistance processing procedure of the driving assistance apparatus according to the third embodiment.
  • FIG. 14 is a flowchart of an example of a driving assistance processing procedure of the driving assistance apparatus according to the fourth embodiment.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a driving support system to which the driving support device according to the first embodiment of the present invention is applied.
  • the driving support system 190 includes a driving support device 100, a photographing unit 110, and a vehicle 150.
  • the driving support device 100 includes an image data receiving unit 101, a behavior information receiving unit 102, a video generating unit 103, and an image data transmitting unit 104.
  • Vehicle 150 includes a reception unit 151, a display unit 152, and a behavior information acquisition unit 153.
  • the image data receiving unit 101 of the driving support device 100 receives image data of a video of the vehicle 150.
  • the image data is received by wireless communication, for example.
  • An image of the vehicle 150 is captured by the imaging unit 110, for example.
  • the image of the vehicle 150 includes the images of the vehicle 150 and the surroundings of the vehicle 150.
  • The periphery of the vehicle 150 means, for example, the periphery of the entire vehicle 150 in the up/down, left/right, and front/rear directions.
  • Specific examples of the image data of the video received by the image data receiving unit 101 include image data of video captured by the imaging unit 110 from the front, rear, sides, and above the vehicle 150.
  • the behavior information receiving unit 102 receives behavior information related to the behavior of the vehicle 150.
  • the behavior information is received by, for example, wireless communication.
  • the behavior information is acquired by the behavior information acquisition unit 153 of the vehicle 150.
  • The behavior information is information indicating the moving or stopped state of the vehicle 150, for example, information on the steering (steering wheel) and the transmission (gear) of the vehicle 150, information on motion (speed information, acceleration information, angular velocity information), tilt angle information, lateral G (gravity) information, current location information, and information associated with operation input.
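  • As an illustration only (a sketch with hypothetical field names, not taken from the patent text), the behavior information listed above could be modeled as a simple record:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the behavior information described above;
# the field names are illustrative and do not appear in the patent.
@dataclass
class BehaviorInfo:
    steering_angle_deg: float           # steering (steering wheel) state
    gear: str                           # transmission state, e.g. "R", "D", "P"
    speed_mps: float                    # speed information
    acceleration_mps2: float            # acceleration information
    yaw_rate_dps: float                 # angular velocity information
    tilt_angle_deg: float               # tilt angle information
    lateral_g: float                    # lateral G (gravity) information
    latitude: Optional[float] = None    # current location information
    longitude: Optional[float] = None
```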
  • the video generation unit 103 generates a video from the outside of the vehicle 150 based on the image data received by the image data reception unit 101 and the behavior information received by the behavior information reception unit 102.
  • the image from the outside of the vehicle 150 is, for example, an image of a viewpoint facing the vehicle 150 from around the vehicle 150.
  • the video generation unit 103 generates a video on which a guide image indicating the approaching state of the vehicle 150 is superimposed and displayed as necessary.
  • The video generation unit 103 generates, for example, a video from a viewpoint opposed to the direction in which the vehicle 150 moves; for example, when the vehicle 150 is moving backward, it generates a video in which the rear portion of the vehicle 150 is viewed from its backing destination.
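  • A minimal sketch of choosing such a viewpoint, assuming the BehaviorInfo record sketched above and hypothetical camera identifiers for cameras placed around the vehicle:

```python
def select_viewpoint(behavior: "BehaviorInfo") -> str:
    """Pick a camera facing the vehicle from the side it is moving toward.

    Hypothetical sketch: the camera names are placeholders for cameras
    arranged around the vehicle, not identifiers used in the patent.
    """
    if behavior.gear == "R":
        # Reversing: view the rear of the vehicle from its backing destination.
        if behavior.steering_angle_deg > 10:
            return "camera_rear_right"
        if behavior.steering_angle_deg < -10:
            return "camera_rear_left"
        return "camera_rear_center"
    # Moving forward: view the front of the vehicle from ahead of it.
    return "camera_front_center"
```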
  • the image data transmission unit 104 transmits the video image data generated by the video generation unit 103 to the vehicle 150.
  • the image data is transmitted by wireless communication, for example.
  • the photographing unit 110 photographs the vehicle 150 and the surroundings of the vehicle 150.
  • This photographing unit 110 is provided on so-called external structures such as the walls and ceilings of parking facilities, members straddling roads, members covering the driving lane, bridges, road surfaces (road pavement), tunnels, traffic lights, utility poles, and signs.
  • the photographing unit 110 includes a transmission unit (not shown) for transmitting image data of a photographed video to the outside, for example.
  • the receiving unit 151 of the vehicle 150 receives image data transmitted from, for example, the image data transmitting unit 104 of the driving support device 100.
  • The display unit 152 displays the video formed by the image data received by the receiving unit 151 on, for example, a display screen provided inside the vehicle 150.
  • the behavior information acquisition unit 153 acquires behavior information of the vehicle 150. The behavior information acquired by the behavior information acquisition unit 153 is transmitted to the behavior information reception unit 102 of the driving support device 100.
  • In this way, the driving support device 100 can generate a video viewed from outside the vehicle 150 based on the image data of the video from the imaging unit 110 and the behavior information from the vehicle 150, and can provide the image data of the generated video to the vehicle 150.
  • For example, a video including the rear blind spot of the vehicle 150 generated by the driving support device 100 can be confirmed on the display screen of the display unit 152 as an objective video seen from the viewpoint of a person guiding the vehicle.
  • the driver can objectively visually recognize the situation around the vehicle 150 in detail and instantaneously, so that it can be expected to support driving operation and contribute to safe driving.
  • The transmission and reception of the various types of information exchanged between the driving support device 100, the photographing unit 110, and the vehicle 150 may be performed directly by the wireless communication described above, or indirectly, for example via a server on the Internet.
  • FIG. 2 is a flowchart showing an example of a driving support processing procedure of the driving support system to which the driving support device according to the first embodiment of the present invention is applied.
  • the description will be made with reference to FIG. 1, but the same parts as those already described are denoted by the same reference numerals and the description thereof is omitted.
  • First, the image data receiving unit 101 of the driving support device 100 receives the image data of the video of the vehicle 150 captured by the imaging unit 110 (step S201).
  • Next, the behavior information acquired by the behavior information acquisition unit 153 of the vehicle 150 is received by the behavior information receiving unit 102 of the driving support device 100 (step S202).
  • The driving support device 100 determines what kind of information the received behavior information is (for example, information on the steering and transmission of the vehicle 150) (step S203). Then, the video generation unit 103 of the driving support device 100 generates a video viewed from outside the vehicle 150 based on the image data received by the image data receiving unit 101 and the determined behavior information (step S204).
  • the video generation unit 103 may generate a video on which a guide image indicating the approaching state of the vehicle 150 is superimposed as necessary.
  • the image data transmission unit 104 of the driving support apparatus 100 transmits the generated video image data to the vehicle 150 (step S205).
  • In the vehicle 150 that has received the transmitted image data via the reception unit 151, the received image data is provided to the display unit 152, and the display unit 152 displays the video on its display screen (step S206).
  • Thus, the series of driving support processing according to this flowchart is completed.
  • In the vehicle 150, driving assistance by guidance voice or the like (voice guidance) may also be performed together with the displayed video.
  • By performing the driving support processing in this manner, the driving support system 190 can display, on the vehicle 150 side, a video viewed from outside the vehicle 150 generated by the driving support device 100. This makes it possible to supplement the driver's field of view, including blind spots, and to support driving operations in the vehicle 150 so as to improve driving safety.
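  • Purely as an illustration of the flow of steps S201 to S206 (the function and parameter names below are hypothetical and not taken from the patent), one cycle of this driving support processing could be sketched as follows:

```python
def classify_behavior(behavior):
    """Hypothetical stand-in for step S203: decide which aspects of the
    behavior information (e.g. steering and transmission state) drive the view."""
    return {"gear": behavior.gear, "steering": behavior.steering_angle_deg}

def driving_support_cycle(image_rx, behavior_rx, video_gen, image_tx):
    """One cycle of the embodiment-1 flow (steps S201-S206), as a sketch.

    The four arguments stand in for the image data receiving unit 101,
    behavior information receiving unit 102, video generation unit 103,
    and image data transmitting unit 104; their interfaces are assumed.
    """
    frames = image_rx.receive()                 # S201: image data from the imaging unit 110
    behavior = behavior_rx.receive()            # S202: behavior information from the vehicle 150
    state = classify_behavior(behavior)         # S203: determine what the information indicates
    video = video_gen.generate(frames, state)   # S204: video viewed from outside the vehicle
    image_tx.send_to_vehicle(video)             # S205: transmit the video's image data to the vehicle
    # S206 (display on the display unit 152) takes place on the vehicle side.
```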
  • As described above, according to the driving support device of the first embodiment, a video viewed from outside the vehicle can be generated based on the received image data of the video of the vehicle, the received behavior information, and the like, and the image data of the generated video can be transmitted to the vehicle. The vehicle can therefore perform driving operations while displaying the video formed by the transmitted image data and visually confirming the situation around the vehicle. As a result, driving operations can be supported and driving safety improved.
  • In the first embodiment described above, the driving support device is separate from the vehicle and exchanges information with the vehicle by communication.
  • In the second embodiment, the driving support device is instead mounted on the vehicle.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of the driving support apparatus according to the second embodiment of the present invention.
  • The driving support device 300 includes an image data receiving unit 301, a behavior information acquisition unit 302, a video generation unit 303, a display unit 304, and an image data transmission unit 305, and is mounted on a vehicle 350.
  • the image data receiving unit 301 of the driving support device 300 receives image data of a video of the vehicle 350.
  • the image data is received by wireless communication, for example.
  • An image of the vehicle 350 is captured by the imaging unit 310, for example.
  • the image of the vehicle 350 includes images of the vehicle 350 and the surroundings of the vehicle 350.
  • the periphery of the vehicle 350 means the periphery of the entire vehicle 350, for example, the top, bottom, left, and right.
  • Specific examples of the video image data received by the image data receiving unit 301 include video image data obtained by photographing the front and rear, the side, and the upper side of the vehicle 350 by the photographing unit 310, for example.
  • the behavior information acquisition unit 302 acquires behavior information related to the behavior of the vehicle 350 detected by the detection unit 351 of the vehicle 350.
  • The behavior information is information indicating the moving or stopped state of the vehicle 350, for example, information on the steering (steering wheel) and the transmission (gear) of the vehicle 350, information on motion (speed information, acceleration information, angular velocity information), tilt angle information, lateral G (gravity) information, current location information, and information associated with operation input.
  • the video generation unit 303 generates a video from the outside of the vehicle 350 based on the image data received by the image data reception unit 301 and the behavior information acquired by the behavior information acquisition unit 302.
  • The video from outside the vehicle 350 is, for example, a video from a viewpoint around the vehicle 350 directed toward the vehicle 350.
  • the video generation unit 303 generates a video on which a guide image indicating the approaching state of the vehicle 350 is superimposed and displayed as necessary.
  • The video generation unit 303 generates, for example, a video from a viewpoint opposed to the direction in which the vehicle 350 moves; for example, when the vehicle 350 is moving backward, it generates a video in which the rear portion of the vehicle 350 is viewed from its backing destination.
  • the display unit 304 displays the video generated by the video generation unit 303 on, for example, a display screen provided inside the vehicle 350.
  • the image data transmission unit 305 transmits the image data of the video generated by the video generation unit 303 to the outside. The transmission of the image data is performed by wireless communication, for example.
  • the photographing unit 310 photographs the vehicle 350 and the surroundings of the vehicle 350.
  • The photographing unit 310 is provided on so-called external structures such as the walls and ceilings of parking facilities, members straddling roads, members covering the driving lane, bridges, road surfaces (road pavement), tunnels, traffic lights, utility poles, and signs.
  • the photographing unit 310 includes a transmission unit (not shown) for transmitting image data of a photographed video to the outside, for example. Note that the photographing unit 310 may be remotely operated by the driving support device 300, for example.
  • The detection unit 351 of the vehicle 350 includes, for example, a speed sensor that detects the speed of the traveling vehicle 350, an inclination angle sensor that detects the inclination angle of the vehicle 350, an acceleration sensor that detects the acceleration of the vehicle 350, an angular velocity sensor that detects the angular velocity of the vehicle 350 during cornering, a lateral G sensor that detects the lateral G, an outward force (gravity) generated by centrifugal force during cornering, a GPS receiver that detects the current location information (latitude/longitude information) of the vehicle 350 by receiving radio waves from artificial satellites, a gyro sensor that detects the traveling direction of the vehicle 350, and the like.
  • In this way, the driving support device 300 can generate a video viewed from outside the vehicle 350 based on the image data of the video from the imaging unit 310 and the behavior information from the detection unit 351 of the vehicle 350, and can display the generated video on the display unit 304 of the vehicle 350.
  • For this reason, even if the driver of the vehicle 350 is inexperienced in driving, when parking the vehicle 350, for example, a video including the rear blind spot of the vehicle 350 generated by the driving support device 300 can be displayed on the display screen of the display unit 304 as an objective video seen from the viewpoint of a person guiding the vehicle.
  • the driver can objectively visually recognize the situation around the vehicle 350 in detail and instantaneously, so that it can be expected to support driving operation and contribute to safe driving.
  • The transmission and reception of various information between the driving support device 300 and the photographing unit 310, or between the driving support device 300 and the outside, may be performed directly by the wireless communication described above, or indirectly, for example via a server on the Internet.
  • FIG. 4 is a flowchart showing an example of a driving support processing procedure of the driving support apparatus according to the second embodiment of the present invention.
  • the description will be made with reference to FIG. 3, but the same portions as those already described are denoted by the same reference numerals and the description thereof is omitted.
  • the image data receiving unit 301 of the driving support device 300 receives the image data of the video of the vehicle 350 captured by the imaging unit 310 (step S401).
  • the driving assistance device 300 determines whether or not the behavior information acquisition unit 302 has acquired the behavior information detected by the detection unit 351 of the vehicle 350 (step S402).
  • When it is determined that the behavior information has been acquired (step S402: Yes), the driving support device 300 determines what kind of information the behavior information is (for example, information regarding the steering and transmission of the vehicle 350) (step S403).
  • When it is determined that the behavior information has not been acquired (step S402: No), the driving support device 300 repeats step S402 until the behavior information is acquired by the behavior information acquisition unit 302. Following step S403, the video generation unit 303 of the driving support device 300 generates a video viewed from outside the vehicle 350 based on the image data received by the image data receiving unit 301 and the determined behavior information (step S404). During the video generation in step S404, the video generation unit 303 may, as necessary, generate a video on which a guide image indicating the approaching state of the vehicle 350 is superimposed.
  • After the video is generated by the video generation unit 303, the display unit 304 displays the video on the display screen based on the image data of the generated video (step S405).
  • After step S405, driving assistance by guidance voice or the like (voice guidance) may also be performed in the vehicle 350 together with the displayed video. A specific example is guidance voice notifying the driver of the approach between the vehicle 350 and an obstacle.
  • As described above, according to the driving support device of the second embodiment of the present invention, a video viewed from outside the vehicle can be generated based on the received image data of the video of the vehicle and the acquired behavior information, and the generated video can be displayed. The vehicle can therefore be operated while the video from outside the vehicle is displayed inside the vehicle and the situation around the vehicle is viewed objectively. This makes it possible to support driving operations and improve driving safety.
  • Examples 1 to 4 according to the first embodiment and the second embodiment will be described in detail.
  • Examples 1 and 2 are examples corresponding to the first embodiment, and examples 3 and 4 are examples corresponding to the second embodiment.
  • FIG. 5 is a side view showing an example of an arrangement configuration of the photographing unit of the driving support system to which the driving support device according to the first embodiment of the present invention is applied.
  • FIG. 6 is a top view showing an example of the arrangement configuration of the photographing unit of the driving assistance system to which the driving assistance apparatus according to the first embodiment of the present invention is applied.
  • As shown in FIG. 5, cameras 511 to 515, which constitute the imaging unit 110 in FIG. 1, are arranged around the vehicle 150 in a parking facility, for example, on external structures such as the walls and ceiling of the parking facility.
  • the camera 511 captures the surroundings of the vehicle 150 and the vehicle 150 from obliquely above the rear of the vehicle 150
  • the camera 512 captures the surroundings of the vehicle 150 and the vehicle 150 from above the rear of the vehicle 150
  • the camera 513 captures the vehicle 150 and the surroundings of the vehicle 150 from obliquely above the front of the vehicle 150
  • the camera 514 captures the surroundings of the vehicle 150 and the vehicle 150 from behind the vehicle 150
  • the camera 515 captures the surroundings of the vehicle 150 and the vehicle 150 from the front of the vehicle 150.
  • cameras 516 to 519 that are the imaging unit 110 in FIG. 1 are arranged around the vehicle 150 together with the cameras 514 and 515.
  • the camera 516 photographs the surroundings of the vehicle 150 and the vehicle 150 from the left rear side of the vehicle 150
  • the camera 517 photographs the surroundings of the vehicle 150 and the vehicle 150 from the rear right side of the vehicle 150.
  • the camera 518 images the surroundings of the vehicle 150 and the vehicle 150 from the front left side of the vehicle 150
  • the camera 519 images the surroundings of the vehicle 150 and the vehicle 150 from the front right side of the vehicle 150.
  • The photographing unit 110 in FIG. 1 may be configured as a plurality of cameras, such as the cameras 511 to 519, or as a single camera.
  • FIG. 7 is a block diagram illustrating an example of a hardware configuration of the driving support apparatus according to the first embodiment of the present invention.
  • The driving support device 100 includes a CPU 701, a ROM 702, a RAM 703, an HDD (hard disk drive) 704, an HD (hard disk) 705, a CD/DVD drive 706, a CD/DVD 707 as an example of a removable recording medium, an input I/F (interface) 708, input keys 709, a remote control 710, a video I/F (interface) 711, a display (or touch panel) 712, an audio I/F (interface) 713, a speaker 714, a microphone 715, and a communication I/F (interface) 716.
  • Each component 701 to 716 is connected by a bus 720.
  • The CPU 701 governs the overall control of the driving support device 100 and, by executing various arithmetic processes in accordance with a control program, controls the respective components 701 to 716 of the driving support device 100 in an integrated manner.
  • the ROM 702 stores so-called fixed data such as a boot program. The data stored in ROM 702 can be rewritten by, for example, user operation.
  • the RAM 703 stores so-called variable data in a rewritable manner, and is used as a work area for the CPU 701.
  • The RAM 703 may be a volatile memory whose stored data is erased when the power is turned off, or a memory that is made non-volatile by being backed up by a battery or the like.
  • the HDD 704 controls data read / write to the HD 705 according to the control of the CPU 701.
  • the HD705 stores data written under the control of the HDD704.
  • The CD/DVD drive 706 controls the reading and writing of data to and from the CD/DVD 707 under the control of the CPU 701.
  • The CD/DVD 707 is a removable recording medium from which recorded data is read under the control of the CD/DVD drive 706.
  • A writable recording medium can also be used as the CD/DVD 707.
  • this removable recording medium may be a CD-ROM (CD-R, CD-RW), MO, memory card, or the like.
  • The input I/F 708 receives data transmitted from the input keys 709, which are used for inputting characters, numerical values, various instructions, and the like, or from the remote control 710, which has some or all of the input keys 709.
  • An output I/F may be provided as necessary, and a scanner that optically reads characters and images or a printer that prints characters and images can be connected via the output I/F.
  • The video I/F 711 is connected to the display (or touch panel) 712 for video display.
  • The video I/F 711 is composed of, for example, a graphics controller that controls the entire display (or touch panel) 712, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC that controls the display (or touch panel) 712 based on the image data output from the graphics controller.
  • Display (or touch panel) 712 displays icons, cursors, menus, windows, or various information such as characters and images.
  • a CRT, a TFT liquid crystal display, a plasma display, or the like can be adopted as the display 712.
  • When used as a touch panel, the touch panel 712 is connected to the input I/F 708, detects whether it is pressed or not pressed, and inputs data to the input I/F 708.
  • The audio I/F 713 is connected to the speaker 714 for audio output and the microphone 715 for audio input.
  • The audio I/F 713 includes, for example, a D/A converter that performs D/A conversion of digital audio data and an amplifier that amplifies the analog audio signal output from the D/A converter.
  • The speaker 714 outputs sound, and the microphone 715 inputs sound from outside.
  • The communication I/F 716 receives the image data of the video from the photographing unit (camera) 110 and inputs and outputs various data, such as image data and behavior information, to and from the vehicle 150.
  • The communication I/F 716 also functions as a communication controller that communicates with external servers on the Internet and the like.
  • The functions of the image data receiving unit 101, the behavior information receiving unit 102, and the image data transmitting unit 104 in FIG. 1 are specifically realized by, for example, the CPU 701, the ROM 702, the RAM 703, and the communication I/F 716.
  • the video generation unit 103 in FIG. 1 specifically realizes its function by, for example, the CPU 701, the ROM 702, the RAM 703, and the video I / F 711.
  • FIG. 8 is a flowchart showing an example of a driving support processing procedure of the driving support device according to the first embodiment of the present invention. Specifically, the processing shown in FIG. 8 is realized by the CPU 701 executing a program stored (recorded) in the ROM 702, RAM 703, HD 705, CD/DVD 707, or the like shown in FIG. 7.
  • the description will be made with reference to FIG. 7, but the same parts as those already described will be denoted by the same reference numerals and the description thereof will be omitted.
  • First, the driving support device 100 receives, through the communication I/F 716, the image data of the video of the vehicle 150 captured by the imaging unit 110 (step S801). After receiving the image data, the driving support device 100 receives the information on the steering wheel and gear, which is the behavior information of the vehicle 150 acquired by the behavior information acquisition unit 153 of the vehicle 150 (see FIG. 1; the same applies hereinafter) (step S802).
  • After receiving the behavior information, the CPU 701 determines the steering direction and gear of the vehicle 150 (step S803), thereby identifying the operating state of the vehicle 150 (for example, the operation of moving backward with the steering wheel held straight).
  • Based on this determination, the driving support device 100 uses the CPU 701 to generate, as the video from outside the vehicle 150, a video from a viewpoint opposed to the direction in which the vehicle 150 moves (step S804).
  • After generating the video, the driving support device 100 determines, according to input instructions received through the input I/F 708 from the input keys 709, the remote control 710, the touch panel 712, or the like, whether or not a guide image is to be superimposed on the generated video (step S805). Examples of the guide image include guidelines virtually drawn around the vehicle 150. If it is determined that a guide image is to be superimposed (step S805: Yes), the CPU 701 adds the guide image to the generated video (step S806).
  • The driving support device 100 then transmits the image data of the generated video to the vehicle 150 through the communication I/F 716 (step S807). If it is determined in step S805 that no guide image is to be superimposed (step S805: No), the process proceeds directly to step S807 and the image data is transmitted. Thus, the series of driving support processing according to this flowchart is completed. In the vehicle 150 that has received the image data, the video formed by the image data can be displayed on the display screen of the display unit 152 (see FIG. 1; the same applies hereinafter).
  • Next, the video displayed on the display unit 152 of the vehicle 150 after the image data transmission in step S807 will be briefly described.
  • Here, the case is described in which the driving support device 100 generates a video from a viewpoint looking at the rear of the vehicle 150, the steering direction determined in step S803 being straight and the gear being in reverse.
  • FIGS. 9 and 10 are explanatory diagrams showing examples of images generated by the driving support device according to Example 1 of the present invention.
  • On the display screen of the display unit 152 of the vehicle 150 (see FIG. 1; the same applies hereinafter), a video viewing the rear portion of the vehicle 150 is displayed, together with the parking lines 1002 provided on the left and right sides of the vehicle 150.
  • As the vehicle moves backward, the rear portion of the vehicle 150 is displayed on the display screen of the display unit 152.
  • A guide image such as the virtual stop guideline 1001 is also displayed on the display screen of the display unit 152, for example.
  • the guide image may also be displayed as an animated image, for example.
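  • As an illustration only (a hypothetical sketch using NumPy, not the patent's method), superimposing a guide image such as a stop guideline onto a generated video frame might look like this:

```python
import numpy as np

def overlay_stop_guideline(frame: np.ndarray, y: int,
                           color=(0, 255, 255), thickness: int = 3) -> np.ndarray:
    """Draw a horizontal stop guideline across row `y` of an RGB frame.

    Sketch only: a real system would project the guideline from the
    camera calibration and the vehicle's predicted path, which is not
    shown here.
    """
    out = frame.copy()
    y0 = max(0, y - thickness // 2)
    y1 = min(out.shape[0], y + thickness // 2 + 1)
    out[y0:y1, :, :] = color  # paint the guideline band over the frame
    return out
```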
  • As described above, according to the driving support device of Example 1, a video from a viewpoint opposed to the vehicle's direction of movement can be generated based on the received image data of the video of the vehicle and the received information on the steering direction and gear of the vehicle, and the image data of the generated video can be sent to the vehicle. The vehicle can therefore be driven while the transmitted video is displayed as an objective video, improving safety during driving.
  • In Example 1 described above, the driving support device was configured to generate a video from a viewpoint opposed to the vehicle's direction of movement based on the received image data of the video of the vehicle and the information on the steering direction and gear of the vehicle, which is the received behavior information, and to send the image data of the generated video to the vehicle.
  • In Example 2, a video from a viewpoint opposed to the vehicle's direction of movement is generated based additionally on the current location information of the vehicle, which is included in the received behavior information, and the image data of the generated video can be transmitted to the vehicle. Since the hardware configuration of Example 2 is the same as that of Example 1, the same reference numerals as those in FIG. 7 are used in the following description, and illustration and description thereof are omitted.
  • FIG. 11 is a flowchart showing an example of a driving support processing procedure of the driving support device according to Example 2 of the present invention. Specifically, the processing shown in FIG. 11 is realized by the CPU 701 executing a program stored (recorded) in the ROM 702, RAM 703, HD 705, CD/DVD 707, or the like shown in FIG. 7.
  • First, the driving support device 100 receives, through the communication I/F 716, the image data of the video of the vehicle 150 captured by the imaging unit 110 (step S1101). After receiving the image data, the driving support device 100 receives the current location information of the vehicle 150, which is the behavior information of the vehicle 150 acquired by the behavior information acquisition unit 153 of the vehicle 150 (see FIG. 1; the same applies hereinafter) (step S1102).
  • After receiving the current location information, the CPU 701 reads the map information stored (recorded) in the HD 705 or the CD/DVD 707, refers to the read map information, and determines whether the received current location information indicates a specific point, for example a parking facility (step S1103). When it is determined that the current location information indicates a parking facility (step S1103: Yes), the driving support device 100 receives the information regarding the steering wheel and gear of the vehicle 150 acquired by the behavior information acquisition unit 153 of the vehicle 150 (step S1104). If it is determined in step S1103 that the current location information does not indicate a parking facility (step S1103: No), the process returns to step S1102 to receive current location information again.
  • After receiving the behavior information, the CPU 701 determines the steering direction and gear of the vehicle 150 (step S1105), thereby identifying the operating state of the vehicle 150 (for example, the operation of moving backward with the steering wheel held straight).
  • Based on this determination, the driving support device 100 uses the CPU 701 to generate, as the video from outside the vehicle 150, a video from a viewpoint opposed to the direction in which the vehicle 150 moves (step S1106).
  • After generating the video, the driving support device 100 determines, according to input instructions received through the input I/F 708 from the input keys 709, the remote control 710, the touch panel 712, or the like, whether or not a guide image is to be superimposed on the generated video (step S1107).
  • An example of the guide image is a guideline virtually drawn around the vehicle 150. If it is determined that a guide image is to be superimposed (step S1107: Yes), the CPU 701 adds the guide image to the generated video (step S1108).
  • The driving support device 100 then transmits the image data of the generated video to the vehicle 150 through the communication I/F 716 (step S1109). If it is determined in step S1107 that no guide image is to be superimposed (step S1107: No), the process proceeds directly to step S1109 and the image data is transmitted. Thus, the series of driving support processing according to this flowchart is completed. In the vehicle 150 that has received the image data, the video formed by the image data can be displayed on the display screen of the display unit 152 (see FIG. 1; the same applies hereinafter).
  • As described above, according to the driving support device of Example 2, a video from a viewpoint opposed to the vehicle's direction of movement can be generated based on the received image data of the video of the vehicle, the received current location information of the vehicle, and the received information on the steering direction and gear of the vehicle, and the image data of the generated video can be transmitted to the vehicle. The vehicle can therefore display the transmitted video at a specific point such as a parking facility as a video from an objective viewpoint, making it possible to improve driving safety.
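  • A minimal sketch of the kind of check performed in the parking-facility determination of step S1103, assuming hypothetical map entries and a simple distance threshold rather than the device's actual map matching:

```python
import math

# Hypothetical map entries; in the device these would come from the map
# information read from the HD 705 or the CD/DVD 707.
PARKING_FACILITIES = [
    {"name": "facility_a", "lat": 35.6595, "lon": 139.7005},
    {"name": "facility_b", "lat": 35.6612, "lon": 139.7031},
]

def is_at_parking_facility(lat: float, lon: float, radius_m: float = 50.0) -> bool:
    """Return True if the current location lies within radius_m of a
    registered parking facility (an illustrative stand-in for step S1103)."""
    for poi in PARKING_FACILITIES:
        # Equirectangular approximation; adequate for distances of tens of meters.
        dy = (lat - poi["lat"]) * 111_320.0
        dx = (lon - poi["lon"]) * 111_320.0 * math.cos(math.radians(lat))
        if math.hypot(dx, dy) <= radius_m:
            return True
    return False
```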
  • The driving support devices according to Example 1 and Example 2 were configured separately from the vehicle.
  • In Example 3, the driving support device is applied to a navigation device mounted on a vehicle.
  • FIG. 12 is a block diagram illustrating an example of a hardware configuration of the driving support device according to Example 3 of the present invention.
  • The driving support device 300 includes a navigation control unit 1200, a user operation unit (remote control and touch panel) 1201, a display screen (display) 1202, an acquisition unit (GPS and various sensors) 1203, and the further components described below, including a recording medium 1204, a recording medium decoding unit 1205, a guidance sound output unit 1206, a communication unit 1207, a route search unit 1208, a route guidance unit 1209, a guidance sound generation unit 1210, a speaker 1211, and a video processing unit 1212.
  • the navigation control unit 1200 controls the entire driving support device 300, for example, and comprehensively controls each unit included in the driving support device 300 by executing various arithmetic processes according to a control program or the like.
  • The navigation control unit 1200 can be realized by, for example, a microcomputer including a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, a RAM (Random Access Memory) that functions as a work area for the CPU, and the like.
  • During route guidance of the vehicle 350 (see FIG. 3; the same applies hereinafter), for example, the navigation control unit 1200 calculates which position on the map the vehicle 350 is traveling at, based on the current location information of the vehicle 350 acquired by the acquisition unit 1203 described later and the map information obtained from the recording medium 1204 through the recording medium decoding unit 1205, and outputs the calculation result to the display screen 1202.
  • The navigation control unit 1200 also exchanges information concerning route guidance with the route search unit 1208, the route guidance unit 1209, and the guidance sound generation unit 1210, which will be described later, and outputs the information obtained as a result of this route guidance to the display screen 1202 and the guidance sound output unit 1206.
  • In the driving support processing, for example, the navigation control unit 1200 displays, on the display screen 1202 inside the vehicle 350, the video processed by the video processing unit 1212 based on the image data of the video of the vehicle 350 taken by the external photographing unit (camera) 310 and the behavior information of the vehicle 350 obtained by the acquisition unit 1203.
  • User operation unit 1201 outputs information input by the user, such as characters, numerical values, and various instructions, to navigation control unit 1200.
  • As the user operation unit 1201, various known forms such as a push-button switch that detects physical pressing or non-pressing, a touch panel, a keyboard, and a joystick can be employed.
  • The user operation unit 1201 may also take a form in which input operations are performed by voice, using, for example, a microphone that inputs sound from outside.
  • The user operation unit 1201 may be provided integrally with the driving support device 300, or may be operated from a position separated from the driving support device 300, like a remote control. The user operation unit 1201 may be configured in any one of the various forms described above, or in a combination of several of them. A user such as the driver of the vehicle 350 inputs information by performing input operations appropriate to the form of the user operation unit 1201.
  • The information input by operating the user operation unit 1201 includes, for example, destination information for navigation: specifically, when the driving support device 300 is provided in an in-vehicle navigation device or the like, the point that the passenger of the vehicle 350 wants to reach is set. The input information also includes, for example, display form information for the video shown on the display screen 1202: specifically, the form (image size, display position, etc.) and timing in which the passenger of the vehicle 350 wishes the video to be displayed are set.
  • When a touch panel is adopted as the form of the user operation unit 1201, the touch panel is used by being stacked on the display screen 1202, for example.
  • the information input by the input operation is recognized by managing the display timing of the image on the display screen 1202, the operation timing of the user on the touch panel (user operation unit 1201), and the position coordinates thereof.
  • By adopting a touch panel stacked on the display screen 1202, a large amount of information can be input without increasing the size of the user operation unit 1201.
  • As this touch panel, various known types such as resistive-film and pressure-sensitive touch panels can be adopted.
  • Display screen 1202 includes, for example, a CRT (Cathode Ray Tube), a TFT (Thin Film Transistor) liquid crystal display, an organic EL (Electroluminescence) display, a plasma display, and the like.
  • The display screen 1202 can be configured by, for example, a video I/F (not shown) and a display device for video display connected to the video I/F.
  • The video I/F is composed of, for example, a graphics controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM) that temporarily stores image data that can be displayed immediately, and a control IC or GPU (Graphics Processing Unit) that controls the display of the display device based on the image data output from the graphics controller.
  • This display screen 1202 displays, for example, icons, cursors, menus, windows, or various information such as characters and images.
  • a video processed by a video processing unit 1212 described later, a map image based on map information from the recording medium 1204, and the like are displayed.
  • The acquisition unit 1203 acquires the current location information (latitude/longitude information) of the vehicle 350 on which the driving support device 300 is mounted, for example by receiving radio waves from artificial satellites.
  • The current location information is obtained, as described above, by receiving radio waves from artificial satellites and determining the geometric position relative to the satellites, and can be measured anywhere on the earth.
  • the acquisition unit 1203 includes a GPS antenna (not shown).
  • the acquisition unit 1203 acquires behavior information of the vehicle 350.
  • The behavior information of the vehicle 350 includes, for example, information on the steering (steering wheel) and transmission (gear) of the vehicle 350, speed information, acceleration information, angular velocity information, inclination angle information, lateral G (gravity) information, current location information, and information associated with operation input.
  • The GPS (Global Positioning System) receiver of the acquisition unit 1203 can be configured by, for example, a tuner that demodulates the radio waves received from the artificial satellites, an arithmetic circuit that calculates the current position based on the demodulated information, and the like.
  • As the radio waves from the artificial satellites, the L1 wave, a 1.57542 GHz carrier carrying the C/A (Coarse and Access) code and the navigation message, is used, and thereby the current location (latitude and longitude) of the vehicle 350 on which the driving support device 300 is mounted is detected.
  • In detecting the current location of the vehicle 350 and acquiring the other behavior information of the vehicle 350, information collected by various sensors such as a vehicle speed sensor and a gyro sensor may also be taken into account.
  • the vehicle speed sensor detects the vehicle speed from the output shaft of the transmission of the vehicle 350 on which the driving support device 300 is mounted, for example.
  • the angular velocity sensor detects, for example, the angular velocity when the vehicle 350 rotates and outputs angular velocity information and relative orientation information.
  • The mileage sensor counts the pulses of a pulse signal of a predetermined period that is output as the wheel rotates, and calculates travel distance information based on the number of pulses per wheel rotation.
  • the inclination angle sensor detects, for example, the inclination angle of the road surface and outputs inclination angle information.
  • the lateral G sensor detects lateral G, which is an outward force (gravity) generated by centrifugal force when cornering the vehicle 350, for example, and outputs lateral G information.
  • The current location information of the vehicle 350 acquired by the acquisition unit 1203 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, mileage sensor, inclination angle sensor, and lateral G sensor are output to the navigation control unit 1200.
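  • As an illustration only (the wheel parameters below are assumptions, not values from the patent), the travel distance derived from the mileage sensor's pulse count can be combined with a heading from the gyro sensor into a simple dead-reckoning update:

```python
import math

PULSES_PER_REV = 40           # assumed pulses per wheel rotation
WHEEL_CIRCUMFERENCE_M = 1.9   # assumed wheel circumference in meters

def dead_reckon(x: float, y: float, heading_deg: float, pulse_count: int):
    """Update a local (x, y) position from one mileage-sensor reading.

    Sketch only: a real system would fuse this with GPS and the other
    sensors listed above rather than relying on wheel pulses alone.
    """
    distance_m = (pulse_count / PULSES_PER_REV) * WHEEL_CIRCUMFERENCE_M
    heading_rad = math.radians(heading_deg)
    return (x + distance_m * math.cos(heading_rad),
            y + distance_m * math.sin(heading_rad))
```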
  • the recording medium 1204 records various control programs and various information in a state that can be read by a computer.
  • the recording medium 1204 accepts writing of information by the recording medium decoding unit 1205 and records the written information in a nonvolatile manner.
  • The recording medium 1204 can be realized by, for example, an HD (Hard Disk).
  • The recording medium 1204 is not limited to an HD; instead of or in addition to the HD, removable and portable media that can be attached to and detached from the recording medium decoding unit 1205, such as various types of DVD (Digital Versatile Disk) and CD (Compact Disk), may be used as the recording medium 1204.
  • The recording medium 1204 is also not limited to DVD and CD; other removable and portable media that can be attached to and detached from the recording medium decoding unit 1205, such as a CD-ROM (CD-R, CD-RW), an MO (Magneto-Optical disk), and a memory card, can also be used.
  • The recording medium 1204 records a driving support program for realizing the present invention, a navigation program, various image and audio data, map information, and the like.
  • The image data refers to, for example, two-dimensional array values representing a video of the vehicle 350 viewed from outside, and the audio data refers to data for reproducing sound such as music.
  • The map information includes, for example, background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shapes of roads, and is drawn in 2D or 3D on the display screen 1202.
  • the background information includes background shape information representing the shape of the background and background type information representing the type of the background.
  • the background shape information includes information indicating, for example, representative points of features, polylines, polygons, coordinates of features, and the like.
  • The background type information includes, for example, text information indicating the name, address, and telephone number of a feature, and type information indicating the type of the feature, such as a building or a river.
  • the road shape information is information related to a road network having a plurality of nodes and links.
  • the node is information indicating an intersection where a plurality of roads such as a three-way intersection, a crossroad, and a five-way intersection intersect.
  • a link is information indicating a road connecting nodes. Some of these links have shape interpolation points that allow the expression of curved roads.
  • the road shape information includes traffic condition information. This traffic condition information is information indicating the characteristics of the intersection, the length (distance) of each link, the vehicle width, the traveling direction, the traffic prohibition, the road type, and the like.
  • Examples of characteristic intersections include intersections with complicated shapes such as three- and five-way intersections, intersections where roads branch at shallow angles, intersections near the destination, expressway entrances and junctions, and intersections with a high route deviation rate.
  • The route deviation rate can be calculated, for example, from past driving history.
  • road types include expressways, toll roads, and general roads.
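  • Purely as an illustration of the node-and-link structure described above (the field names are hypothetical), the road network portion of the map information could be represented as a small graph:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Node:
    node_id: int
    lat: float
    lon: float                 # an intersection where two or more roads meet

@dataclass
class Link:
    from_node: int
    to_node: int
    length_m: float            # length (distance) of the link
    road_type: str             # e.g. "expressway", "toll", "general"
    shape_points: List[tuple] = field(default_factory=list)  # interpolation points for curved roads

@dataclass
class RoadNetwork:
    nodes: Dict[int, Node]
    links: List[Link]
```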
  • Although the map information and the like are described here as being recorded on the recording medium 1204, the configuration is not limited to this.
  • The map information may be recorded not only in a medium provided integrally with the hardware of the driving support device 300, such as a memory in the navigation control unit 1200, but also outside the driving support device 300. When provided outside, the driving support device 300 may acquire the map information by, for example, performing data communication with the external storage device via the communication unit 1207 described later.
  • The recording medium decoding unit 1205 controls the reading of information from and the writing of information to the recording medium 1204.
  • When an HD is used as the recording medium 1204, the recording medium decoding unit 1205 is an HDD (Hard Disk Drive); when a DVD or a CD is used, it is a drive device such as a DVD drive or a CD drive.
  • For writable removable recording media such as a CD-ROM (CD-R, CD-RW), an MO, or a memory card, a dedicated drive device capable of reading and writing information on the corresponding medium may be used as the recording medium decoding unit 1205 as appropriate.
  • the guidance sound output unit 1206 reproduces the navigation guidance sound by controlling the output to the connected speaker 1211.
  • The guidance sound output unit 1206 can be realized by, for example, an audio I/F (not shown) connected to the speaker 1211 for audio output.
  • The audio I/F can be composed of, for example, a D/A converter that performs D/A conversion of digital audio signals, an amplifier that amplifies the analog audio signals output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
  • the communication unit 1207 communicates with, for example, another driving support device, communicates with the photographing unit 310, or communicates with an external server.
  • the communication unit 1207 transmits and receives image data of the video of the vehicle 350 captured by the imaging unit 310 between the imaging unit 310 and the navigation control unit 1200.
  • The communication unit 1207 may be a communication module, such as a mobile phone, that communicates with a communication server via a base station (not shown), or a communication module that communicates wirelessly and directly with other driving support devices.
  • the wireless communication by the communication unit 1207 means that a wire line serving as a communication medium is not used. This refers to communications performed using radio waves, infrared rays, ultrasonic waves, and the like. Examples of standards that enable wireless communication include wireless LAN (Local Area Network) and IrDA (Infrared
  • a wireless LAN is a preferable example in terms of information transfer speed and the like.
  • the communication unit 1207 may receive road traffic information, such as traffic congestion and traffic restrictions, regularly (or irregularly). Specifically, the reception of road traffic information by the communication unit 1207 may be performed at the timing at which the road traffic information is distributed from the VICS (Vehicle Information and Communication System) center, or by regularly requesting the VICS center to distribute road traffic information.
  • Communication unit 1207 can be realized as an AM / FM tuner, a TV tuner, a VICS / beacon receiver, and other communication devices, for example.
  • "VICS" is an information communication system that transmits road traffic information, such as traffic congestion and traffic regulations, edited and processed at the VICS center to in-vehicle devices such as navigation devices in real time and displays it as text and graphics. Methods of transmitting the road traffic information (VICS information) edited and processed at the VICS center to the navigation device include the use of "beacons" installed along each road and "FM multiplex broadcasting".
  • beacons include “radio wave beacons” mainly used on expressways and “optical beacons” used on major general roads.
  • with FM multiplex broadcasting, road traffic information over a wide area can be received.
  • with beacons, it is possible to receive the road traffic information needed at the position where the vehicle (vehicle 350) is located, such as detailed information on the nearest roads based on the vehicle's position. When the communication method used with other driving support devices differs from the communication method used for receiving the map information and the road traffic information, the communication unit 1207 may include a plurality of communication means corresponding to the respective communication methods.
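The patent does not define how the image data travels over the wireless LAN link between the external photographing unit and the communication unit 1207. Purely as an illustrative assumption, a length-prefixed JPEG stream over a TCP socket could be received as in the sketch below; the host, port, and framing are all hypothetical.

```python
import socket
import struct

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket (raise if the peer closes early)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("camera connection closed")
        buf += chunk
    return buf

def receive_frames(host: str = "192.168.0.10", port: int = 5000):
    """Yield JPEG-encoded frames sent by an external camera unit over wireless LAN.
    The 4-byte length prefix used here is an assumption, not part of the patent."""
    with socket.create_connection((host, port)) as sock:
        while True:
            (length,) = struct.unpack(">I", _recv_exact(sock, 4))
            yield _recv_exact(sock, length)   # one frame showing the vehicle
```

On the device side, each yielded frame would then be handed to the video processing unit for decoding and further processing.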
  • the route search unit 1208 calculates, for example, the optimum route from the current position to the destination based on the current position information of the vehicle 350 acquired by the acquisition unit 1203 and the destination information input by the user (a shortest-path sketch follows this group of items).
  • the route guidance unit 1209 generates real-time route guidance information based on the information on the route (guidance route) searched by the route search unit 1208 or the route information received by the communication unit 1207, the current position information acquired by the acquisition unit 1203, and the map information obtained from the recording medium 1204 via the recording medium decoding unit 1205.
  • the route guidance information generated by the route guidance unit 1209 is output to the display screen 1202 via the navigation control unit 1200.
  • the guidance sound generation unit 1210 generates tone and voice information corresponding to various patterns. That is, based on the route guidance information generated by the route guidance unit 1209, it sets a virtual sound source corresponding to the guidance point, generates voice guidance information, and outputs it to the guidance sound output unit 1206 via the navigation control unit 1200. In addition, the guidance sound generation unit 1210 may generate voice guidance information that guides the vehicle 350 in accordance with the video displayed on the display screen 1202, for example, during driving support.
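The patent does not state which search algorithm the route search unit 1208 uses. As one conventional possibility (an assumption, not the patented method), a shortest-path search such as Dijkstra's algorithm over the node/link map data could compute the optimum route by link length:

```python
import heapq
from typing import Dict, List, Tuple

def shortest_route(graph: Dict[int, List[Tuple[int, float]]],
                   start: int, goal: int) -> List[int]:
    """Dijkstra's shortest-path search over a node/link graph.
    graph maps a node id to a list of (neighbor node id, link length) pairs."""
    dist = {start: 0.0}
    prev: Dict[int, int] = {}
    queue: List[Tuple[float, int]] = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                                   # stale queue entry
        for neighbor, length in graph.get(node, []):
            nd = d + length
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(queue, (nd, neighbor))
    if goal != start and goal not in prev:
        return []                                      # no route found
    route, node = [goal], goal
    while node != start:                               # walk back to reconstruct the route
        node = prev[node]
        route.append(node)
    return list(reversed(route))
```

In practice the link cost could also weight road type, traffic condition information, or received VICS congestion data rather than distance alone.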
  • the speaker 1211 reproduces (outputs) the navigation guidance sound output from the guidance sound output unit 1206 and the sound output from the audio processing unit 1214 described later.
  • the speaker 1211 may also include headphones or the like, and the output form of the guidance sound and voice may be changed as appropriate so that the guidance sound and voice are not output as a sound field over the entire vehicle 350.
  • the video processing unit 1212 performs overall video processing in the driving support device 300. Specifically, the video processing unit 1212 performs video processing of, for example, the image data of the video of the vehicle 350 acquired from the communication unit 1207 through the video input/output I/F 1213. The video processing unit 1212 may also perform video (image) processing of image data acquired from the communication unit 1207 through the video input/output I/F 1213 or image data recorded in the recording medium 1204. Specifically, the video processing unit 1212 is configured by a GPU, for example.
  • examples of the video processing by the video processing unit 1212 include, for the video of the vehicle 350 acquired through the video input/output I/F 1213, horizontal flip processing, vertical flip processing, composition processing, and superimposed display processing of guide images such as guidelines (a minimal sketch follows this group of items). Note that the video processing unit 1212 performs the video processing in accordance with, for example, a control command from the navigation control unit 1200.
  • the video processing unit 1212 may be configured to have a DSP (Digital Signal Processor) function, for example.
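As an illustration of the flip and guideline-superimposition operations mentioned above, the sketch below manipulates frames as NumPy arrays; the array representation and the rectangular guide are assumptions chosen for clarity, not the GPU/DSP implementation used by the device.

```python
import numpy as np

def flip_horizontal(frame: np.ndarray) -> np.ndarray:
    """Mirror the frame left-right (e.g. to present a rear view like a mirror)."""
    return frame[:, ::-1]

def flip_vertical(frame: np.ndarray) -> np.ndarray:
    """Mirror the frame top-bottom."""
    return frame[::-1, :]

def superimpose_guideline(frame: np.ndarray, x0: int, y0: int, x1: int, y1: int,
                          color=(0, 255, 0)) -> np.ndarray:
    """Draw a rectangular guide (e.g. an expected parking frame) onto a copy of the frame."""
    out = frame.copy()
    out[y0:y1, x0] = color   # left edge
    out[y0:y1, x1] = color   # right edge
    out[y0, x0:x1] = color   # top edge
    out[y1, x0:x1] = color   # bottom edge
    return out
```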
  • the video input/output I/F 1213 performs input/output processing of images (image data) exchanged between the video processing unit 1212 and the outside, for example.
  • examples of the image data input through the video input/output I/F 1213 include image data from the recording medium 1204 storing video shot by a DSC or DVC, image data of video stored in a DSC or DVC and input from the communication unit 1207 via USB (Universal Serial Bus), IEEE 1394 (Institute of Electrical and Electronics Engineers 1394), infrared communication, or the like, and image data of the video of the vehicle 350 captured by the imaging unit 310 and received by the communication unit 1207.
  • the image data output from the video processing unit 1212 is output to the recording medium 1204, the communication unit 1207, and the like.
  • the video input/output I/F 1213 may have a controller function for controlling read/write of the recording medium 1204 when inputting/outputting image data to/from the recording medium 1204, for example.
  • likewise, the video input/output I/F 1213 may have a communication controller function for controlling communication with the communication unit 1207 when inputting/outputting image data to/from the communication unit 1207.
  • the audio processing unit 1214 selects audio data obtained from the recording medium 1204 through the recording medium decoding unit 1205 or audio data obtained from the communication unit 1207 through the navigation control unit 1200, and performs reproduction processing of the selected audio data.
  • the audio processing unit 1214 may perform reproduction processing of audio data stored in a storage device such as an audio database (hereinafter referred to as “audio DB”) (not shown).
  • the audio data reproduced by the audio processing unit 1214 includes, for example, audio data constituting a musical composition and sound effects. The reproduction processing by the audio processing unit 1214 also includes, for example, control of the sound field formed by the sound output from the speaker 1211.
  • the audio processing unit 1214 may be configured to reproduce radio or television audio, for example.
  • the audio processing unit 1214 also controls the audio output from the speaker 1211, for example. More specifically, it performs, for example, volume adjustment, tone processing, and sound image localization processing based on the selected and reproduced audio data, and controls the audio output state (a small panning sketch follows this group of items).
  • the audio output control by the audio processing unit 1214 is performed by, for example, an input operation from the user operation unit 1201 or a control by the navigation control unit 1200.
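As a small illustration of the volume adjustment and sound image localization mentioned above, gains applied to the left and right channels can shift the perceived position of a guidance sound. The constant-power panning law used here is a common technique and an assumption for illustration, not the device's actual processing.

```python
import math
from typing import List, Tuple

def pan_stereo(samples: List[float], pan: float) -> Tuple[List[float], List[float]]:
    """Apply constant-power panning to mono samples.
    pan ranges from -1.0 (fully left) to +1.0 (fully right);
    returns (left, right) channel sample lists."""
    angle = (pan + 1.0) * math.pi / 4.0            # map [-1, 1] -> [0, pi/2]
    left_gain, right_gain = math.cos(angle), math.sin(angle)
    left = [s * left_gain for s in samples]
    right = [s * right_gain for s in samples]
    return left, right
```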
  • the external photographing unit 310 is provided in a so-called external structure such as a wall part or a ceiling part of a parking facility, for example.
  • the photographing unit 310 is arranged as in the cameras 511 to 519 in FIGS. 5 and 6, for example.
  • the photographing unit 310 has a photoelectric conversion element such as a C-MOS or CCD sensor, may be configured by the above-described photographing devices such as a DSC or DVC, and captures video of the vehicle 350.
  • the external photographing unit 310 is wirelessly connected to, for example, the driving support device 300 or an external server, and captures video of the vehicle 350 either constantly or by remote operation from the navigation control unit 1200.
  • the image data of the video captured by the external photographing unit 310 is received by the communication unit 1207 and output to the video processing unit 1212 via the video input/output I/F 1213.
  • the image data receiving unit 301 in FIG. 3 specifically realizes its function by the communication unit 1207, for example, and the behavior information acquisition unit 302 realizes its function by the acquisition unit 1203, for example.
  • the video generation unit 303 in FIG. 3 specifically realizes its function by, for example, the navigation control unit 1200 and the video processing unit 1212, and the display unit 304 realizes its function by, for example, the display screen 1202 .
  • FIG. 13 is a flowchart showing an example of a driving support processing procedure of the driving support device according to the third embodiment of the present invention. The flowchart of FIG. 13 is described with reference to FIG. 12; parts that have already been described are denoted by the same reference numerals, and their description is omitted.
  • the navigation control unit 1200 of the driving support device 300 receives image data of the video of the vehicle 350 captured by the imaging unit 310 via the communication unit 1207 (step S1301).
  • the received image data is input from the communication unit 1207 to the video processing unit 1212 through the video input/output I/F 1213.
  • the navigation control unit 1200 then determines whether or not the display screen 1202 serving as the display unit 304 (see FIG. 3; the same applies hereinafter) of the driving support device 300 is ON (step S1302).
  • when it is determined that the display screen 1202 is ON (step S1302: Yes), the navigation control unit 1200 determines whether or not the acquisition unit 1203 has acquired information on the steering wheel and gear of the vehicle 350 (step S1303). If it is determined that the information on the steering wheel and gear has been acquired (step S1303: Yes), the navigation control unit 1200 determines the steering direction and gear of the vehicle 350 (step S1304). By the determination processing in step S1304, the operation state of the vehicle 350 (for example, an operation such as reversing with the steering wheel turned to the left) is determined.
  • if it is determined in step S1302 that the display screen 1202 is not ON (step S1302: No), or in step S1303 that the information on the steering wheel and gear has not been acquired (step S1303: No), the navigation control unit 1200 repeats the corresponding determination processing: whether the display screen 1202 is ON (step S1302) or whether the information on the steering wheel and gear has been acquired (step S1303).
  • when the operation state of the vehicle 350 is determined in step S1304, the navigation control unit 1200 outputs a video processing command to the video processing unit 1212.
  • based on the input image data and the determination result, the video processing unit 1212 generates video from the direction opposite to the direction in which the vehicle 350 moves (step S1305).
  • during the video generation processing of step S1305, the driving support device 300 determines, by the navigation control unit 1200, whether or not a guide image is to be superimposed on the generated video, for example, in response to an input operation from the user operation unit 1201 (step S1306).
  • when it is determined that there is a superimposed display (step S1306: Yes), the video processing unit 1212 adds a guide image to the generated video (step S1307). Then, the navigation control unit 1200 displays the generated video on the display screen 1202 based on the image data of the video generated by the video processing unit 1212 (step S1308). After displaying the video, the navigation control unit 1200 determines whether or not the display screen 1202 is OFF (step S1309). If it is determined that the display screen 1202 is OFF (step S1309: Yes), the driving support device 300 ends the series of driving support processing according to this flowchart.
  • when the navigation control unit 1200 determines that the display screen 1202 is not OFF (step S1309: No), the process returns to step S1308 and the display processing of the generated video is continued. If it is determined in step S1306 that there is no superimposed display (step S1306: No), the process proceeds to step S1308 to display the generated video.
  • as described above, the driving support apparatus according to the third embodiment can generate video from the direction opposite to the direction in which the vehicle moves, based on the received image data of the video of the vehicle and the acquired information on the steering direction and gear of the vehicle, and can display the generated video. For this reason, the driver can drive while viewing, on the display screen, video from outside the vehicle as an objective-viewpoint video, and driving safety can be improved.
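For readability, the flow of FIG. 13 (steps S1301 to S1309) can be loosely restated as the sketch below. Every object and method name (comm, display, acquisition, video_proc, ui and their methods) is a hypothetical placeholder rather than an API defined by the patent, and the flowchart's repeat-until loops are collapsed into simple continue statements.

```python
def driving_support_loop(comm, display, acquisition, video_proc, ui):
    """Rough sketch of the Embodiment-3 flow: receive video of the vehicle,
    judge the steering/gear state, generate a view from the direction opposite
    to the vehicle's motion, optionally superimpose a guide image, display it."""
    while True:
        image_data = comm.receive_image_data()                         # step S1301
        if not display.is_on():                                        # step S1302
            continue
        behavior = acquisition.get_steering_and_gear()                 # step S1303
        if behavior is None:
            continue
        state = acquisition.judge_operation_state(behavior)            # step S1304
        video = video_proc.generate_opposite_view(image_data, state)   # step S1305
        if ui.superimpose_requested():                                 # step S1306
            video = video_proc.add_guide_image(video)                  # step S1307
        display.show(video)                                            # step S1308
        if display.is_off():                                           # step S1309
            break
```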
  • Embodiment 4
  • the driving assistance apparatus according to Embodiment 4 of the present invention will be described.
  • in the third embodiment described above, the driving support device is configured to be able to generate video from the direction opposite to the direction in which the vehicle moves, and to display the generated video, based on the received image data of the video of the vehicle and the information on the steering direction and gear of the vehicle that is the acquired behavior information.
  • in the fourth embodiment, the driving support device is configured to be able to generate video from the direction opposite to the direction in which the vehicle moves, and to display the generated video, based on the current position information of the vehicle in addition to the received image data of the video of the vehicle and the information on the steering direction and gear of the vehicle that is the acquired behavior information.
  • FIG. 14 is a flowchart showing an example of a driving assistance processing procedure of the driving assistance apparatus according to Embodiment 4 of the present invention.
  • the navigation control unit 1200 of the driving assistance device 300 receives image data of the video of the vehicle 350 captured by the imaging unit 310 via the communication unit 1207 (step S1401).
  • the received image data is input from the communication unit 1207 to the video processing unit 1212 through the video input/output I/F 1213.
  • the navigation control unit 1200 then determines whether or not the display screen 1202 serving as the display unit 304 (see FIG. 3; the same applies hereinafter) of the driving support device 300 is ON (step S1402).
  • when it is determined that the display screen 1202 is ON (step S1402: Yes), the navigation control unit 1200 determines whether or not the acquisition unit 1203 has acquired the current position information of the vehicle 350 (step S1403).
  • if it is determined that the current position information has been acquired (step S1403: Yes), the navigation control unit 1200 reads the map information from the recording medium 1204 and, referring to the read map information, determines whether the acquired current position information is point information indicating a specific point, for example, whether it indicates a parking facility (step S1404).
  • if it is determined that the point information indicates a parking facility (step S1404: Yes), the navigation control unit 1200 determines whether or not the acquisition unit 1203 has acquired information on the steering wheel and gear of the vehicle 350 (step S1405). If it is determined that the information on the steering wheel and gear has been acquired (step S1405: Yes), the navigation control unit 1200 determines the steering direction and gear of the vehicle 350 (step S1406). By the determination processing in step S1406, the operation state of the vehicle 350 (for example, an operation of reversing with the steering wheel turned to the right) is determined.
  • if it is determined in step S1402 that the display screen 1202 is not ON (step S1402: No), the navigation control unit 1200 repeats the determination processing of step S1402 as to whether or not the display screen 1202 is ON.
  • if it is determined in step S1403 that the current position information has not been acquired (step S1403: No), or if it is determined in step S1404 that the point information does not indicate a parking facility (step S1404: No), the navigation control unit 1200 repeats the determination processing of step S1403 as to whether or not the current position information has been acquired.
  • if it is determined in step S1405 that the information on the steering wheel and gear has not been acquired (step S1405: No), the navigation control unit 1200 repeats the determination processing of step S1405 as to whether or not the information on the steering wheel and gear has been acquired.
  • when the operation state of the vehicle 350 is determined in step S1406, the navigation control unit 1200 outputs a video processing command to the video processing unit 1212.
  • based on the input image data, the point information, and the determination result, the video processing unit 1212 generates video from the direction opposite to the direction in which the vehicle 350 moves (step S1407).
  • during the video generation processing in step S1407, the driving support device 300 determines, by the navigation control unit 1200, whether or not a guide image is to be superimposed on the generated video, for example, in response to an input operation from the user operation unit 1201 (step S1408).
  • when it is determined that there is a superimposed display (step S1408: Yes), the video processing unit 1212 adds a guide image to the generated video (step S1409). Then, the navigation control unit 1200 displays the generated video on the display screen 1202 based on the image data of the video generated by the video processing unit 1212 (step S1410). After displaying the video, the navigation control unit 1200 determines whether or not the display screen 1202 is OFF (step S1411). If it is determined that the display screen 1202 is OFF (step S1411: Yes), the driving support device 300 ends the series of driving support processing according to this flowchart.
  • when the navigation control unit 1200 determines that the display screen 1202 is not OFF (step S1411: No), the process returns to step S1410 and the display processing of the generated video is continued. If it is determined in step S1408 that there is no superimposed display (step S1408: No), the process proceeds to step S1410 to display the generated video.
  • as described above, the driving support apparatus according to the fourth embodiment can generate video from the direction opposite to the direction in which the vehicle moves, based on the received image data of the video of the vehicle, the acquired current position information of the vehicle, and the acquired information on the steering direction and gear of the vehicle, and can display the generated video. For this reason, at a specific point such as a parking facility, the driver can drive while viewing, on the display screen, video from outside the vehicle as an objective-viewpoint video, and driving safety can be improved.
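Embodiment 4 adds to the Embodiment 3 flow a check that the acquired current position corresponds to a specific point such as a parking facility (steps S1403 and S1404). A minimal sketch of just that check is shown below; the map-lookup helper, its parameters, and the search radius are assumptions for illustration only.

```python
def is_at_specific_point(current_position, map_info,
                         point_kind: str = "parking_facility") -> bool:
    """Return True when the acquired current position falls on point information
    of the given kind (e.g. a parking facility) in the map information.
    map_info.points_near() is a hypothetical lookup, not an API from the patent."""
    for point in map_info.points_near(current_position, radius_m=50.0):
        if point.kind == point_kind:
            return True
    return False

# Inside the Embodiment-4 loop this gate would sit between the display check
# (step S1402) and the steering/gear check (step S1405):
#
#     position = acquisition.get_current_position()          # step S1403
#     if position is None or not is_at_specific_point(position, map_info):
#         continue                                           # steps S1403/S1404: No
```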
  • as described above, the driving support device, the driving support method, and the driving support program according to the present invention can generate video of the vehicle as seen from outside the vehicle, based on the received image data of the video of the vehicle and the behavior information of the vehicle. For this reason, the situation around the vehicle can be viewed objectively using the generated video.
  • the driving support method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, and a DVD, and is executed by being read by the computer.
  • this program may be a transmission medium that can be distributed via a network such as the Internet.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention concerns a driving support device having an image data reception section (101) for receiving image data of video; a behavior information reception section (102) for receiving information on the behavior of a vehicle (150); a video creation section (103) for creating video of the vehicle (150) as seen from outside the vehicle, based on the image data received by the image data reception section (101) and on the behavior information received by the behavior information reception section (102); and an image data transmission section (104) for transmitting to the vehicle (150) image data of the video created by the video creation section (103).
PCT/JP2006/304282 2005-03-11 2006-03-06 Dispositif d’assistance à la conduite, méthode d’assistance à la conduite et programme d’assistance à la conduite WO2006095689A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005069779 2005-03-11
JP2005-069779 2005-03-11

Publications (1)

Publication Number Publication Date
WO2006095689A1 true WO2006095689A1 (fr) 2006-09-14

Family

ID=36953283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/304282 WO2006095689A1 (fr) 2005-03-11 2006-03-06 Dispositif d’assistance à la conduite, méthode d’assistance à la conduite et programme d’assistance à la conduite

Country Status (1)

Country Link
WO (1) WO2006095689A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013212723A (ja) * 2012-03-30 2013-10-17 Fujitsu Ten Ltd 駐車支援装置、及び駐車支援方法
JP2015074321A (ja) * 2013-10-08 2015-04-20 本田技研工業株式会社 駐車支援システム
CN112770147A (zh) * 2021-01-21 2021-05-07 日照职业技术学院 基于云安全认证的无人驾驶透视盒子及其实现方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002019556A (ja) * 2000-07-04 2002-01-23 Matsushita Electric Ind Co Ltd 監視システム
JP2002054320A (ja) * 2000-08-10 2002-02-20 Yazaki Corp 駐車補助装置
JP2004064696A (ja) * 2002-07-31 2004-02-26 Nissan Motor Co Ltd 駐車支援装置
JP2004351977A (ja) * 2003-05-27 2004-12-16 Matsushita Electric Ind Co Ltd 車外映像表示装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002019556A (ja) * 2000-07-04 2002-01-23 Matsushita Electric Ind Co Ltd 監視システム
JP2002054320A (ja) * 2000-08-10 2002-02-20 Yazaki Corp 駐車補助装置
JP2004064696A (ja) * 2002-07-31 2004-02-26 Nissan Motor Co Ltd 駐車支援装置
JP2004351977A (ja) * 2003-05-27 2004-12-16 Matsushita Electric Ind Co Ltd 車外映像表示装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013212723A (ja) * 2012-03-30 2013-10-17 Fujitsu Ten Ltd 駐車支援装置、及び駐車支援方法
JP2015074321A (ja) * 2013-10-08 2015-04-20 本田技研工業株式会社 駐車支援システム
CN112770147A (zh) * 2021-01-21 2021-05-07 日照职业技术学院 基于云安全认证的无人驾驶透视盒子及其实现方法

Similar Documents

Publication Publication Date Title
JP6717742B2 (ja) ナビゲーション命令を表示する装置及び方法
JP4516111B2 (ja) 画像編集装置、画像編集方法、画像編集プログラムおよびコンピュータに読み取り可能な記録媒体
CN101573590A (zh) 导航装置及用于显示导航信息的方法
JP2001082969A (ja) ナビゲーション装置
JP2006084208A (ja) ナビゲーション装置及び進行方向案内方法
WO2006101012A1 (fr) Dispositif, procede et programme de mise a jour d'informations cartographiques et support d'enregistrement lisible par ordinateur
JP4652099B2 (ja) 画像表示装置、画像表示方法、画像表示プログラム、および記録媒体
WO2006103955A1 (fr) Dispositif d’affichage publicitaire, procédé d’affichage publicitaire et programme d’affichage publicitaire
WO2006095689A1 (fr) Dispositif d’assistance à la conduite, méthode d’assistance à la conduite et programme d’assistance à la conduite
JP4078923B2 (ja) ナビゲーションシステム及びプログラム
JPWO2008056401A1 (ja) 地図表示装置、地図表示方法、地図表示プログラム、および記録媒体
JP2008107223A (ja) 経路誘導装置、経路誘導方法、経路誘導プログラムおよび記録媒体
JP2001083872A (ja) ナビゲーション装置
JP4254553B2 (ja) 地図表示装置
JP2008160447A (ja) 放送番組受信装置、放送番組受信計画装置、放送番組受信方法、放送番組受信計画方法、プログラム、および記録媒体
WO2006109469A1 (fr) Dispositif de support de composition musicale, procede de support de composition musicale, programme de support de composition musicale et moyen d'enregistrement
JP2004108937A (ja) 情報送信装置、ナビゲーション装置、システム、方法及びプログラム
JP2007263580A (ja) 経路探索装置、経路探索方法、経路探索プログラムおよび記録媒体
JP4628796B2 (ja) ナビゲーション装置
JP2011033403A (ja) 情報処理装置、情報処理方法、情報処理プログラムおよび記録媒体
JP3964099B2 (ja) 地図表示装置、記録媒体及び地図表示方法
JP2008160445A (ja) 放送波情報表示装置、放送波情報表示方法、放送波情報表示プログラム、および記録媒体
JP2010203969A (ja) ナビゲーション装置、表示制御方法、表示制御プログラムおよび記録媒体
JP2000321974A (ja) 地図表示装置
JP4023259B2 (ja) ナビゲーションシステム及び地図表示方法のプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06715302

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP