WO2018167815A1 - Display control device and display control method - Google Patents


Info

Publication number
WO2018167815A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
vehicle
display control
control device
obstacle
Prior art date
Application number
PCT/JP2017/009932
Other languages
English (en)
Japanese (ja)
Inventor
下谷 光生
龍太 久良木
井崎 公彦
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2019505313A (JP6727400B2)
Priority to PCT/JP2017/009932 (WO2018167815A1)
Publication of WO2018167815A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K 35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/16: Anti-collision systems

Definitions

  • The present invention relates to a display control device and a display control method for controlling a display unit.
  • Patent Documents 1 and 2 propose a technique for displaying a corner pole as a virtual image or hologram on a HUD (head-up display), so that hardware cost can be reduced.
  • However, the driver may want to visually check the positional relationship between an obstacle and a part of the vehicle other than the part where the corner pole is provided.
  • Since the corner pole is fixed, such visual confirmation cannot be performed.
  • To address this, installing a plurality of corner poles, or a mechanism that mechanically moves a single corner pole, is conceivable.
  • However, this deteriorates the vehicle's appearance and increases hardware cost.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a technique capable of virtually moving a corner pole.
  • The display control device according to the present invention controls a display unit that can display one or more display objects so that they are visible from the driver's seat of the vehicle, superimposed on the scenery outside the vehicle.
  • The display control device includes an information acquisition unit that acquires information, and a control unit that displays a first display object on the display unit and, based on the information acquired by the information acquisition unit, moves the first display object within a predetermined range corresponding to the front end of the vehicle body.
  • With such a configuration, the first display object is moved within a predetermined range corresponding to the front end of the vehicle body, so the corner pole can be virtually moved.
  • A block diagram showing the configuration of the display control device according to Embodiment 1.
  • A block diagram showing the configuration of the display control device according to Embodiment 2.
  • A flowchart showing the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Embodiment 2.
  • A flowchart showing the operation of the display control device according to Modification 2 of Embodiment 2.
  • A flowchart showing the operation of the display control device according to Modification 4 of Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Modification 4 of Embodiment 2.
  • A diagram for explaining the operation of the display control device according to Modification 4 of Embodiment 2.
  • A block diagram showing the configuration of the display control device according to Embodiment 3.
  • A flowchart showing the operation of the display control device according to Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 2 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 3 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 3 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 3.
  • A diagram for explaining the operation of the display control device according to Embodiment 4.
  • A diagram for explaining the operation of the display control device according to Embodiment 4.
  • A flowchart showing the operation of the display control device according to Embodiment 4.
  • A block diagram showing the configuration of the display control device according to Embodiment 5.
  • A flowchart showing the operation of the display control device according to Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Embodiment 5.
  • A diagram for explaining the operation of the display control device according to a modification of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to a modification of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 1 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 1 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 2 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 2 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 2 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 2 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 3 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 3 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 3 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 4 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 4 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 4 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 5 of Embodiment 5.
  • A diagram for explaining the operation of the display control device according to Modification 6 of Embodiment 5.
  • A block diagram showing the configuration of the display control device according to Embodiment 6.
  • A flowchart showing the operation of the display control device according to Embodiment 6.
  • A diagram for explaining the operation of the display control device according to Embodiment 6.
  • A diagram for explaining the operation of the display control device according to Modification 1 of Embodiment 6.
  • A block diagram showing the hardware configuration of the display control device according to another modification.
  • A block diagram showing the hardware configuration of the display control device according to another modification.
  • A block diagram showing the configuration of the server according to another modification.
  • A block diagram showing the configuration of the communication terminal according to another modification.
  • Embodiment 1. The display control device according to Embodiment 1 of the present invention will be described as being mounted on a vehicle.
  • In the following, the vehicle on which the display control device is mounted and on which attention is focused is referred to as the "host vehicle".
  • FIG. 1 is a block diagram showing a configuration of the display control apparatus 1 according to the first embodiment.
  • The display control device 1 in FIG. 1 controls a virtual display unit 21, which is a display unit.
  • The virtual display unit 21 can display one or more display objects visible from the driver's seat of the host vehicle, superimposed on the scenery outside the host vehicle. That is, the virtual display unit 21 can display a virtual display object so that it appears to the driver of the host vehicle as if it actually existed in the three-dimensional space of the real world.
  • For example, a HUD that displays a virtual image or a hologram, or an autostereoscopic display device, is applied as the virtual display unit 21.
  • The display control device 1 in FIG. 1 includes an information acquisition unit 11 and a control unit 12.
  • The information acquisition unit 11 acquires information that can be used to move a display object.
  • This display object includes a first display object representing a pole that lets the driver visually check the positional relationship between the vehicle and an obstacle, that is, a pole corresponding to a corner pole.
  • Hereinafter, the first display object is referred to as the "pole object".
  • For example, a user operation for moving the pole object is used as this information.
  • The control unit 12 displays the pole object on the virtual display unit 21 and, based on the information acquired by the information acquisition unit 11, moves the pole object within a predetermined range corresponding to the front end of the body of the host vehicle.
  • The predetermined range includes at least one of a range overlapping the front end of the body of the host vehicle and a range around that front end. For example, the front bumper of the host vehicle, or the fenders and front bumper of the host vehicle, corresponds to the front end of the body.
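As a rough illustration of this Embodiment 1 arrangement, the following minimal Python sketch models a control unit that clamps the pole object's position to a predetermined range along the front end of the body. All class names, coordinates, and range values here are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class PoleObject:
    x: float  # lateral position in metres (0 = driver-side edge of the body)
    y: float  # longitudinal offset in metres (0 = front bumper line)

class DisplayController:
    """Keeps the pole object inside the predetermined range at the front end."""

    def __init__(self, x_range=(0.0, 1.7), y_range=(-0.2, 0.2)):
        self.x_range, self.y_range = x_range, y_range
        # Initial position: the passenger-side front corner, like a corner pole.
        self.pole = PoleObject(x=x_range[1], y=0.0)

    def move_pole(self, x, y):
        """Move the pole object, clamped to the predetermined range."""
        self.pole.x = min(max(x, self.x_range[0]), self.x_range[1])
        self.pole.y = min(max(y, self.y_range[0]), self.y_range[1])
        return self.pole

ctrl = DisplayController()
ctrl.move_pole(5.0, 1.0)  # a request outside the range is clamped to the edge
```

A request beyond the front edge is simply pulled back to the boundary of the range, mirroring the idea that the pole can only move on or around the front end of the body.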
  • FIG. 2 is a block diagram showing the configuration of the display control device 1 according to Embodiment 2 of the present invention.
  • In the following, constituent elements that are the same as or similar to those described above are assigned the same reference numerals, and the description focuses mainly on the differing elements.
  • The virtual display unit 21 is the same as that in the first embodiment.
  • The operation input unit 22 receives various operations, such as an operation to move the pole object, from the driver, and outputs an operation signal indicating the received operation to the display control device 1.
  • The operation input unit 22 is described below as a touch panel.
  • Instead of a touch panel, the operation input unit 22 may be, for example, at least one of a switch that receives the driver's push operation, a voice operation input device that receives the driver's voice operation, a gesture operation input device including a camera or the like that receives the driver's gesture toward space as an operation, and a line-of-sight operation input device including a camera or the like that receives the movement of the driver's line of sight as an operation. Any device capable of inputting the driver's intention may be used. Some of these will be described in detail in the modifications below.
  • The display control device 1 includes an operation signal input unit 11a, a virtual display object generation unit 12a, a display position control unit 12b, and a display control unit 12c.
  • The operation signal input unit 11a is included in the concept of the information acquisition unit 11 in FIG. 1 of the first embodiment.
  • The virtual display object generation unit 12a, the display position control unit 12b, and the display control unit 12c are included in the concept of the control unit 12 in FIG. 1 of the first embodiment, as indicated by the broken line in FIG. 2.
  • When the operation input unit 22 receives an operation, the operation signal input unit 11a acquires an operation signal indicating that operation from the operation input unit 22. Accordingly, when the operation input unit 22 receives an operation to move the pole object, the operation signal input unit 11a acquires information on that operation from the operation input unit 22.
  • Hereinafter, an operation for moving the pole object is referred to as a "moving operation".
  • The virtual display object generation unit 12a generates a display object to be displayed on the virtual display unit 21 based on the operation acquired by the operation signal input unit 11a.
  • In the second embodiment, the virtual display object generation unit 12a generates the pole object based on the moving operation acquired by the operation signal input unit 11a.
  • Note that the virtual display object generation unit 12a may instead generate the pole object based on a signal indicating that the accessory power supply of the host vehicle has been turned on.
  • The display position control unit 12b determines the display position of the display object generated by the virtual display object generation unit 12a based on the operation acquired by the operation signal input unit 11a.
  • The display position control unit 12b determines the position of the virtual image as the display position.
  • The position of the virtual image is a position in a three-dimensional coordinate space whose reference, for example its origin, is a specific position of the host vehicle such as the driver's seat or the windshield.
  • When the three-dimensional coordinate space is a polar coordinate space, the position of the virtual image is defined, for example, by the virtual image direction, which is the direction toward the virtual image, and the virtual image distance, which is the distance to the virtual image.
  • When the coordinate space is an orthogonal coordinate space, the position of the virtual image is defined, for example, by coordinates on three orthogonal axes along the front-rear, left-right, and up-down directions of the vehicle.
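The two representations can be related by an ordinary coordinate conversion. The sketch below is an illustrative Python example (axis conventions and function names are assumptions, not from the specification): it converts a (virtual image direction, virtual image distance) pair into coordinates on the vehicle's front-rear, left-right, and up-down axes, and back.

```python
import math

def polar_to_cartesian(azimuth_deg, elevation_deg, distance):
    """(virtual image direction, virtual image distance) -> vehicle-axis coords.
    Azimuth is measured in the horizontal plane from straight ahead;
    elevation is measured upward from the horizontal plane."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)  # front-rear axis
    y = distance * math.cos(el) * math.sin(az)  # left-right axis
    z = distance * math.sin(el)                 # up-down axis
    return x, y, z

def cartesian_to_polar(x, y, z):
    """Vehicle-axis coordinates -> (azimuth, elevation, distance)."""
    distance = math.sqrt(x * x + y * y + z * z)
    azimuth = math.degrees(math.atan2(y, x))
    elevation = math.degrees(math.asin(z / distance))
    return azimuth, elevation, distance

# Round trip: a virtual image 2.5 m away, 30 degrees to the side, 10 degrees up.
x, y, z = polar_to_cartesian(30.0, 10.0, 2.5)
az, el, d = cartesian_to_polar(x, y, z)
```

Either representation can therefore serve as the display position; the choice only affects how the display position control unit 12b stores and updates it.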
  • In the second embodiment, the display position control unit 12b determines the display position of the pole object to be a position overlapping the fenders and front bumper of the host vehicle, based on the moving operation acquired by the operation signal input unit 11a.
  • The display control unit 12c controls the virtual display unit 21 so that the display object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b.
  • In the second embodiment, the display control unit 12c controls the virtual display unit 21 so that the pole object generated by the virtual display object generation unit 12a is displayed at the display position determined by the display position control unit 12b.
  • FIG. 3 is a flowchart showing the operation of the display control apparatus 1 according to the second embodiment.
  • In step S1, the virtual display object generation unit 12a generates the pole object based on the operation acquired by the operation signal input unit 11a.
  • The display position control unit 12b determines the display position of the pole object to be the initial position.
  • The display control unit 12c controls the virtual display unit 21 to display the pole object at the initial position.
  • Here, like the position of a typical corner pole, the initial position is assumed to be the front end of the body of the host vehicle on the passenger-seat side.
  • FIG. 4 is a view showing the scene visible through the windshield 31 from the interior of the host vehicle when step S1 is performed.
  • FIG. 4 shows the hood 32 and rearview mirror 33 of the host vehicle, and the pole object 34 displayed at the same initial position as a corner pole. Note that the hood 32 may not be visible depending on the vehicle type of the host vehicle.
  • FIG. 5 schematically shows the state that can be seen through the windshield 31 from the interior of the vehicle.
  • In FIG. 5, the reference sign 2 of the host vehicle is attached to the hood 32.
  • In step S2, the display position control unit 12b determines whether the operation signal input unit 11a has acquired a moving operation. If it is determined that a moving operation has been acquired, the process proceeds to step S3; if not, the process of step S2 is performed again.
  • FIG. 6 is a diagram showing an example of the positional relationship between the host vehicle 2 and another vehicle 51a that is an obstacle.
  • FIG. 6 shows not only the pole object 34 displayed at the initial position 2a in step S1, but also the positions 2b and 52 of the respective parts that form the shortest distance between the host vehicle 2 and the other vehicle 51a.
  • A driver who intends to drive the host vehicle 2 near the other vehicle 51a presumably gauges the distance between positions 2b and 52 to judge whether the host vehicle 2 will collide with the other vehicle 51a. At this time, if the pole object 34 is displayed at position 2b instead of the initial position 2a, this collision judgment can be expected to become easier. Therefore, in the case of FIG. 6, the driver performs the moving operation in step S2 of FIG. 3, and the process proceeds to step S3.
  • In step S3, the display position control unit 12b determines the display position of the pole object 34 based on the moving operation acquired by the operation signal input unit 11a.
  • The display control unit 12c controls the virtual display unit 21 so that the pole object 34 is displayed at the display position determined by the display position control unit 12b. The pole object 34 thereby moves. Thereafter, the process returns to step S2.
  • FIG. 7 is a view showing the scene visible through the windshield 31 from the interior of the host vehicle when steps S2 and S3 are performed. As shown in FIG. 7, steps S2 and S3 are repeated, so that the pole object 34 moves from the initial position to a position intended by the driver.
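The loop of steps S1 to S3 can be sketched as follows. The event list and display callback below are hypothetical stand-ins for the operation signal input unit 11a and the virtual display unit 21; names and position values are illustrative only.

```python
INITIAL_POSITION = "passenger_side_front_corner"

def run_display_loop(operations, display):
    """S1: show the pole object at the initial position, then poll:
    S2: was a moving operation acquired this cycle? (None = no operation)
    S3: if so, determine the new display position and redraw the pole."""
    position = INITIAL_POSITION
    display(position)              # S1: initial display
    for op in operations:
        if op is not None:         # S2: a moving operation was acquired
            position = op          # S3: display position from the operation
            display(position)      # ...and redraw the pole object there
    return position

shown = []
final = run_display_loop([None, "front_left_corner", None], shown.append)
```

Each pass through the loop corresponds to one polling cycle of step S2; the pole is only redrawn when a moving operation actually arrives.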
  • When the virtual display unit 21 is a HUD that displays a virtual image and the virtual image distance is defined with the driver's seat as the reference, the virtual image distance of the pole object 34 becomes longer as the pole object 34 moves from the driver's-seat side toward the passenger-seat side.
  • FIG. 8 is a diagram showing a state in which the pole object 34 has moved from the initial position 2a to the position 2b in the arrangement relationship of FIG.
  • Accordingly, the driver of the host vehicle 2 can drive so that the host vehicle 2 does not collide with the other vehicle 51a while gauging the distance between positions 2b and 52 using the pole object 34 as an index.
  • As described above, in the second embodiment, the pole object 34 can be moved based on an operation to move the pole object 34. The user can therefore move the pole object 34 to an intended position.
  • Further, since the initial position of the pole object 34 is the front end of the body of the host vehicle on the passenger-seat side, the pole object 34 can be used in the same way as a conventional corner pole.
  • <Modification 1 of Embodiment 2> In the second embodiment described above, the display control unit 12c changes both the virtual image distance and the virtual image direction of the pole object 34 based on the moving operation when the virtual display unit 21 is a HUD. However, the present invention is not limited to this; the display control unit 12c may change the virtual image direction of the pole object 34 based on the moving operation without changing its virtual image distance. With such a configuration, a HUD that does not change the virtual image distance can be used as the virtual display unit 21.
  • Note that the initial position of the pole object may be calibrated as appropriate.
  • The calibration may be performed automatically based on a detection result or estimation result of the driver's eye position, or may be performed manually by the driver setting the pole object to an appropriate display position by operating a menu screen.
  • <Modification 2 of Embodiment 2> In this modification, the display control device 1 includes a vehicle information acquisition unit included in the concept of the information acquisition unit 11 of FIG. 1.
  • FIG. 9 is a flowchart showing the operation of the display control apparatus 1 according to this modification.
  • The operation in FIG. 9 is the same as that of FIG. 3, except that step S11 is added between step S1 and step S2.
  • In step S11, the vehicle information acquisition unit acquires the speed of the host vehicle from an ECU (Electronic Control Unit) of the host vehicle.
  • The display position control unit 12b determines whether the speed of the host vehicle acquired by the vehicle information acquisition unit is lower than a predetermined speed (for example, 10 km/h). If the speed of the host vehicle is determined to be lower than the predetermined speed, the process proceeds to step S2; if it is determined to be equal to or higher than the predetermined speed, the process returns to step S1.
  • If it is determined in step S2 that a moving operation has been acquired, the process proceeds to step S3; if not, the process returns to step S11. After the process of step S3 is performed, the process also returns to step S11.
  • Thus, while the host vehicle is traveling at or above the predetermined speed, the position of the pole object 34 is fixed at the initial position. With such a configuration, the pole object 34 can be prevented from interfering with the driver's view while the host vehicle is traveling.
  • Note that the vehicle information acquisition unit described above may acquire, instead of the speed of the host vehicle, automatic driving information indicating whether the host vehicle is driving autonomously from the ECU of the host vehicle. In that case, in step S11 of FIG. 9, the display position control unit 12b may determine whether the automatic driving information acquired by the vehicle information acquisition unit indicates that automatic driving is being performed in the host vehicle, proceeding to step S2 if it does not and returning to step S1 if it does.
  • The display position control unit 12b configured in this manner fixes the position of the pole object 34 at the initial position while the automatic driving information indicates that automatic driving is being performed in the host vehicle. With such a configuration, similarly to the above, the pole object 34 can be prevented from interfering with the driver's view during automatic driving of the host vehicle.
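The gating of step S11 in both variants can be summarized by a small helper. The 10 km/h threshold is the example value from the text; the function name and signature are illustrative assumptions.

```python
SPEED_THRESHOLD_KMH = 10.0  # example threshold given in the text

def effective_pole_position(requested, initial, speed_kmh, autonomous=False):
    """Honor the moving operation only when the step S11 check passes;
    otherwise the pole object stays fixed at its initial position."""
    if speed_kmh >= SPEED_THRESHOLD_KMH or autonomous:
        return initial       # S11 failed: pole stays at the initial position
    return requested         # S11 passed: S2/S3 may move the pole
```

Either condition (speed at or above the threshold, or autonomous driving in progress) locks the pole, matching the two variants of this modification.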
  • <Modification 3 of Embodiment 2> In this modification, the display control device 1 includes a surrounding brightness acquisition unit included in the concept of the information acquisition unit 11 of FIG. 1.
  • The surrounding brightness acquisition unit acquires the brightness around the host vehicle measured by a brightness sensor, or acquires an illumination signal when the lights of the host vehicle are turned on.
  • When the brightness around the host vehicle acquired by the surrounding brightness acquisition unit is equal to or less than a threshold value, or when the illumination signal is acquired, the control unit 12 changes the color and brightness of the pole object 34 to a color and brightness that are conspicuous even in a dark environment. With such a configuration, the pole object 34 is easy to see even in a dark environment.
  • Note that "color" in this specification includes not only a single color but also a color scheme such as a combination of a plurality of colors.
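A minimal sketch of this styling decision follows. The lux threshold and the specific colors are assumed values for illustration; the specification only says that a threshold is compared and that a conspicuous color and brightness are chosen.

```python
DARKNESS_THRESHOLD_LUX = 50.0  # assumed value; the text only says "threshold"

def pole_style(ambient_lux=None, illumination_signal=False):
    """Return a conspicuous style for the pole object when it is dark."""
    dark = illumination_signal or (
        ambient_lux is not None and ambient_lux <= DARKNESS_THRESHOLD_LUX
    )
    if dark:
        return {"color": "amber", "brightness": 1.0}  # conspicuous at night
    return {"color": "red", "brightness": 0.6}        # daytime default
```

The illumination signal acts as an override: even if no brightness reading is available, turning on the vehicle's lights switches the pole to the dark-environment style.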
  • <Modification 4 of Embodiment 2> The operation input unit 22 in FIG. 2 has been described as a touch panel, but the present invention is not limited to this.
  • For example, a gesture operation input device that detects the pointing direction indicated by the driver's finger may be used as the operation input unit 22. When the pointing direction is detected by the gesture operation input device, the display control device 1 may move the pole object 34 to a position along that pointing direction.
  • Alternatively, after determining that the pole object 34 exists in the pointing direction detected by the gesture operation input device, the display control device 1 may move the pole object 34 to a position along another pointing direction when that other pointing direction is detected.
  • Similarly, a line-of-sight operation input device that detects the driver's line-of-sight direction may be used as the operation input unit 22. After determining that the pole object 34 exists in the line-of-sight direction detected by the line-of-sight operation input device, the display control device 1 may move the pole object 34 to a position along another line-of-sight direction when that other direction is detected.
  • Alternatively, both a gesture operation input device and a voice operation input device may be used as the operation input unit 22.
  • FIG. 10 is a flowchart showing the operation of the display control apparatus 1 according to this modification when a gesture operation input device and a voice operation input device are used as the operation input unit 22.
  • The operation in FIG. 10 is the same as that of FIG. 3, except that step S21 is added before step S1 and step S2 of FIG. 3 is replaced with steps S22 to S25.
  • In step S21, the control unit 12 determines whether the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of a display command for displaying the pole object 34, for example the voice "display corner pole". If it is determined that the voice of the display command has been acquired, the process proceeds to step S1; otherwise, the process of step S21 is performed again.
  • In step S1, the control unit 12 controls the virtual display unit 21 to display the pole object 34 at the initial position.
  • In step S22, the control unit 12 determines whether the operation input unit 22, and in turn the operation signal input unit 11a, has acquired the voice of an erasure command for erasing the display of the pole object 34, for example the voice "erase corner pole". If it is determined that the voice of the erasure command has been acquired, the process proceeds to step S23; otherwise, the process proceeds to step S24.
  • In step S23, the control unit 12 controls the virtual display unit 21 to erase the display of the pole object 34. The pole object 34 is thereby erased. Thereafter, the process returns to step S21.
  • step S24 the control unit 12 acquires the voice of the movement command for moving the pole object 34, for example, the voice of “movement of the corner pole” by the operation input unit 22 and the operation signal input unit 11a. It is determined whether or not a gesture operation pointing to the object 34 has been acquired. That is, the control unit 12 determines whether the driver has issued a movement command while pointing in the same direction as the pole object 34 as shown in FIG. If it is determined that the voice of the movement command and the gesture operation indicating the designated direction have been acquired, the process proceeds to step S25, and if not, the process returns to step S22. When the process proceeds to step S25, the control unit 12 may cause the tip of the pole object 34 to blink.
  • step S25 the control unit 12 obtains a voice of a determination command for the operation input unit 22 and thus the operation signal input unit 11a to determine the display position of the pole object 34, for example, a voice “decide a corner pole”, In addition, it is determined whether or not a gesture operation indicating an instruction direction different from the pole object 34 has been acquired. That is, the control unit 12 determines whether the driver has issued a determination command while pointing in a different direction from the pole object 34 as shown in FIG. If it is determined that the voice of the determination command and the gesture operation pointing to the other instruction direction have been acquired, the process proceeds to step S3. If not, the process of step S25 is performed again. If it is determined that the voice of the determination command and the gesture operation indicating another instruction direction are not acquired even if the process of step S25 is performed a certain number of times, the process may return to step S22.
• In step S3, the control unit 12 controls the virtual display unit 21 so that the pole object 34 is displayed in the different instruction direction described above. Thereafter, the process returns to step S22.
• When a gesture operation input device is used as the operation input unit 22, the same operation as described above can also be realized by a configuration that acquires a specific gesture operation, such as a swing of the driver's finger, instead of acquiring the various commands such as the display command from voice.
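• The pairing of voice command and gesture direction in steps S24, S25, and S3 can be sketched as a small decision function. This is only an illustrative sketch: the function name, the command strings, and the boolean gesture flag are assumptions, not part of the patent.

```python
# Hypothetical sketch of the step S24/S25 logic: a voice command is only
# accepted when it arrives together with the matching gesture direction.
MOVE_COMMANDS = {"movement of the corner pole", "move the corner pole"}
DECIDE_COMMANDS = {"decide a corner pole"}

def interpret_command(voice: str, points_at_pole: bool):
    """Return 'move', 'decide', or None for one (voice, gesture) pair.

    points_at_pole -- True when the driver's pointing direction matches the
    pole object 34 (step S24); a determination command instead requires
    pointing in a *different* direction (step S25).
    """
    voice = voice.strip().lower()
    if voice in MOVE_COMMANDS and points_at_pole:
        return "move"    # proceed to step S25
    if voice in DECIDE_COMMANDS and not points_at_pole:
        return "decide"  # proceed to step S3 (fix the display position)
    return None          # ignored; keep waiting
```

A mismatched pair (for example, a determination command while still pointing at the pole) is deliberately ignored, mirroring the loop back to the waiting state in the flowchart.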
  • FIG. 13 is a block diagram showing the configuration of the display control apparatus 1 according to Embodiment 3 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
• Before describing the internal configuration of the display control device 1, the periphery detection unit 23 will be described.
  • the virtual display unit 21 and the operation input unit 22 are the same as those in the second embodiment.
• The periphery detection unit 23 acquires information related to obstacles, such as other vehicles, around the host vehicle. This information includes the relative position between the obstacle and the host vehicle. For example, at least one sensing device among an ultrasonic sensor, an image recognition device, a laser radar, a millimeter-wave radar, an acoustic recognition device, and a night vision camera may be used as the periphery detection unit 23.
• The display control device 1 in FIG. 13 has the same configuration as the display control device 1 in FIG. 2, except that a vehicle exterior information input unit 11b and a relative position acquisition unit 11c are added.
  • the vehicle outside information input unit 11b and the relative position acquisition unit 11c are included in the concept of the information acquisition unit 11 in FIG.
• The vehicle exterior information input unit 11b acquires information related to obstacles around the host vehicle from the periphery detection unit 23. This information includes the relative position between the obstacle and the host vehicle.
• The relative position acquisition unit 11c stores in advance a host-vehicle internal position, which is an arbitrary position within the host vehicle necessary for using the relative position acquired by the vehicle exterior information input unit 11b.
• The relative position acquisition unit 11c acquires the relative position between the obstacle and the pole object 34 based on the relative position between the obstacle and the host vehicle acquired by the vehicle exterior information input unit 11b, the host-vehicle internal position stored in advance, and the position of the pole object 34 determined by the display position control unit 12b.
  • the control unit 12 changes the display mode of the pole object 34 based on the relative position of the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c.
• Specifically, the control unit 12 obtains the pole distance, which is the distance between the obstacle and the pole object 34, based on the relative position acquired by the vehicle exterior information input unit 11b, and changes the color of the pole object 34 based on the pole distance.
  • FIG. 14 is a flowchart showing the operation of the display control apparatus 1 according to the third embodiment. The operation in FIG. 14 is the same as that in which step S31 is added after step S3 in FIG.
• In step S31, the control unit 12 obtains the above-described pole distance based on the relative position acquired by the vehicle exterior information input unit 11b, and changes the color of the pole object 34 based on the pole distance. For example, as shown in FIG. 15, when the pole distance dp between the obstacle 51 and the pole object 34 is larger than a predetermined first distance (for example, 40 cm), the control unit 12 changes the color of the pole object 34 to a safety color such as light blue. As shown in FIG. 16, when the pole distance dp is equal to or smaller than the first distance and larger than a predetermined second distance (for example, 20 cm), the control unit 12 changes the color of the pole object 34 to an attention color such as yellow. As shown in FIG. 17, when the pole distance dp is equal to or smaller than the second distance, the control unit 12 changes the color of the pole object 34 to an alarm color such as red. After step S31, the process returns to step S2.
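• The color selection in step S31 can be sketched as a simple threshold mapping. The thresholds (40 cm and 20 cm) are the example values from the text; the color names are placeholders.

```python
# Sketch of the step S31 color selection, using the example thresholds
# from the description (first distance 40 cm, second distance 20 cm).
FIRST_DISTANCE_M = 0.40   # safety threshold
SECOND_DISTANCE_M = 0.20  # attention threshold

def pole_color(pole_distance_m: float) -> str:
    """Map the obstacle-to-pole distance dp to a display color."""
    if pole_distance_m > FIRST_DISTANCE_M:
        return "light_blue"  # safety color (FIG. 15)
    if pole_distance_m > SECOND_DISTANCE_M:
        return "yellow"      # attention color (FIG. 16)
    return "red"             # alarm color (FIG. 17)
```

Note that a distance exactly equal to the first distance falls into the attention band, matching "equal to or smaller than the first distance" in the text.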
• As described above, in the third embodiment, the display mode of the pole object 34 is changed based on the relative position between the obstacle around the host vehicle and the pole object 34. According to such a configuration, when moving the pole object 34, the driver can know the relative positional relationship, such as the pole distance, from the change in a display mode such as the color of the pole object 34.
• In the above description, the control unit 12 changes the color of the pole object 34 based on the pole distance, but the present invention is not limited to this. For example, as shown in FIG. 18, the control unit 12 may change the color of the pole object 34 based on whether or not the portion 53 of the obstacle 51 that is closest to the pole object 34 is located within a fan-shaped range 35 extending forward from the pole object 34.
• In the above description, the display mode changed by the control unit 12 is the color of the pole object 34; however, the present invention is not limited to this, and the display mode changed by the control unit 12 may be at least one of the presence or absence of animation such as blinking, the color, the shape, and the pattern of the pole object 34.
• In the above description, the control unit 12 changes the display mode of the pole object 34 based on the relative position between the obstacle and the pole object 34 acquired by the relative position acquisition unit 11c, but the present invention is not limited to this. For example, the control unit 12 may change the display mode of the pole object 34 based on the relative position between the obstacle and the host vehicle acquired by the vehicle exterior information input unit 11b.
• For example, the control unit 12 may change the display mode of the pole object 34 based on the distance between the obstacle and the bumper of the host vehicle. More specifically, the control unit 12 may change the display mode of the pole object 34 based on whether or not the distance between the obstacle and the bumper of the host vehicle is equal to or smaller than a distance (for example, 5 cm) at which the obstacle and the bumper are likely to come into contact.
  • the periphery detection unit 23 in FIG. 13 may further detect the color of an obstacle around the host vehicle, and the vehicle outside information input unit 11b may further acquire the color of the obstacle detected by the periphery detection unit 23.
• Then, the control unit 12 may change the color of the pole object 34 based on the color of the obstacle acquired by the vehicle exterior information input unit 11b.
• For example, the control unit 12 may change the color of the pole object 34 to a color that is the same as or similar to the color of the obstacle acquired by the vehicle exterior information input unit 11b.
• Note that the color here includes not only a single color but also a color tone, such as a pattern in which a plurality of colors are combined.
• The periphery detection unit 23 may detect, for each of a plurality of obstacles around the host vehicle, the relative position between the obstacle and the pole object 34 and the color of the obstacle, and the vehicle exterior information input unit 11b may acquire the information detected by the periphery detection unit 23.
• Then, the control unit 12 may change the color of the pole object 34 based on the relative positions and colors acquired by the vehicle exterior information input unit 11b for the plurality of obstacles.
  • the control unit 12 may change the color of the pole object 34 based on the color of the obstacle closest to the pole object 34 among the plurality of obstacles 51.
• According to the example of FIG. 19, the driver can determine which part of the host vehicle is close to which obstacle, and which obstacle affects the traveling of the host vehicle.
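• The multi-obstacle coloring rule (taking the color of the obstacle nearest to the pole object 34) can be sketched as follows. The function name and the (distance, color) pair representation are assumptions for illustration only.

```python
# Hypothetical sketch: the pole object takes the color of whichever obstacle
# is currently closest to it. Each obstacle is a (distance_to_pole_m, color)
# pair as supplied by the periphery detection unit.
def nearest_obstacle_color(obstacles, default="light_blue"):
    """Return the color of the obstacle nearest to the pole object 34."""
    if not obstacles:
        return default  # nothing detected: keep a safety color
    distance, color = min(obstacles, key=lambda oc: oc[0])
    return color
```

With several obstacles detected, only the nearest one drives the pole color, which is what lets the driver identify the obstacle that matters most.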
• In the above description, the control unit 12 changes the color of all parts of the pole object 34 based on the color of the obstacle acquired by the vehicle exterior information input unit 11b; however, the present invention is not limited to this.
  • the control unit 12 may change the color of a part of the pole object 34 (for example, the tip) based on the color of the obstacle acquired by the outside information input unit 11b.
  • the control unit 12 may change the color of the remaining part (for example, the side surface) of the pole object 34 based on the relative position between the obstacle and the pole object 34.
• The control unit 12 can also control a display unit (not shown) that is different from the virtual display unit 21 and can perform display within the host vehicle.
• The periphery detection unit 23 includes a camera.
• The periphery detection unit 23 may capture an image of the periphery of the host vehicle with the camera, and the vehicle exterior information input unit 11b may acquire the image from the periphery detection unit 23.
• Then, the control unit 12 may display the image acquired by the vehicle exterior information input unit 11b on the other display unit capable of performing display within the host vehicle, and superimpose a display object corresponding to the pole object 34 on the image.
  • the display object corresponding to the pole object 34 is a display object that is the same as or similar to the pole object 34.
  • FIG. 20 is a diagram illustrating an example of an arrangement relationship between the host vehicle 2 and another vehicle 51a that is an obstacle.
• FIG. 21 is a diagram showing a state in which, in the arrangement relationship of FIG. 20, the corresponding pole object 37, which is a display object corresponding to the pole object 34, is superimposed on the image 36 acquired by the vehicle exterior information input unit 11b and displayed on the other display unit. Since the front portion of the other vehicle 51a in FIG. 20 is inclined to the left, strictly speaking, the front portion of the other vehicle 51a should also appear inclined to the left in the image; however, as this is not an important point, the front portion of the other vehicle is shown without the inclination in FIG. 21.
• According to such a configuration, by means of the pole object 34 and the corresponding pole object 37, the driver can easily associate the scenery seen directly with the scenery seen through the camera.
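• Superimposing the corresponding pole object 37 at the right place in the camera image requires projecting the pole's position in the vehicle frame into pixel coordinates. As a rough illustration only (the function name, focal length, and image center are made-up calibration values, not from the patent), this can be sketched with a pinhole-camera model:

```python
# Hypothetical sketch: project a point in front of the camera into a pixel
# column using a simple pinhole model. focal_px and cx are assumed
# calibration values for a 1280-pixel-wide image.
def project_to_image(x_right_m, z_forward_m, focal_px=800.0, cx=640.0):
    """Return the horizontal pixel column of a point at (x, z); z is the
    forward distance from the camera and must be positive."""
    if z_forward_m <= 0:
        raise ValueError("point is behind the camera")
    return cx + focal_px * (x_right_m / z_forward_m)
```

A point straight ahead lands at the image center, and points to the right map to larger pixel columns, so the superimposed pole stays aligned with the directly seen pole as the arrangement in FIG. 20 changes.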
• Here, a general display device is used as the other display unit; however, the present invention is not limited to this, and a display device capable of displaying a virtual display object using stereoscopic vision may be used as the other display unit. In that case, a virtual display object may be applied to the corresponding pole object 37.
• In the third embodiment, the control unit 12 moves the pole object 34 based on the driver's moving operation; however, the present invention is not limited to this, and the control unit 12 may automatically move the pole object 34 from one of the passenger-seat side and the driver-seat side to the other within the range overlapping the fender and the front bumper of the host vehicle.
• According to such a configuration, a driver who tries to know the relative positional relationship between the obstacle and the host vehicle from the change in the display mode does not need to perform the moving operation of the pole object 34, so the burden on the driver can be reduced.
• In the above description, the control unit 12 moves the pole object 34 within the range overlapping the fender and the front bumper of the host vehicle based on the moving operation, as indicated by the two-dot chain lines in FIGS. 22 and 23.
• However, the present invention is not limited to this; as indicated by the solid lines in FIGS. 22 and 23, the control unit 12 may move the pole object 34 forward, away from the fender and the front bumper of the host vehicle 2, based on the moving operation. For example, a distance of about 10 cm is used as the maximum separation distance between the pole object 34 and the fender and front bumper of the host vehicle 2.
• It should be noted that the control unit 12 can perform the operation described in the third embodiment even while the pole object 34 is moving away from the fender and the front bumper of the host vehicle 2. That is, even in this case, the control unit 12 can change the display mode of the pole object 34 based on the pole distance between the obstacle around the host vehicle and the pole object 34. Therefore, by moving the pole object 34 toward the obstacle, the driver can confirm how far the host vehicle is from the obstacle.
• When the pole object 34 touches the obstacle, the control unit 12 changes the display mode of the pole object 34 to a special display mode indicating the contact. At this time, a display object such as a character or a graphic indicating the contact may be further displayed on the virtual display unit 21.
  • the block configuration of the display control apparatus 1 according to the fourth embodiment of the present invention is the same as the block configuration (FIG. 13) of the display control apparatus 1 according to the third embodiment.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
• In the fourth embodiment, the control unit 12 causes the virtual display unit 21 to further display a second display object.
• As the second display object, a display object that corresponds to a corner pole and, like an actual corner pole, indicates a fixed pole is used.
• Hereinafter, the first display object, described as the pole object in the above description, is referred to as the "auxiliary pole object", and the second display object is referred to as the "main pole object".
• FIGS. 24 and 25 are views showing the state seen through the windshield 31 from the interior of the host vehicle.
• The control unit 12 moves the auxiliary pole object 38 within a predetermined range corresponding to the front-side end portion of the body of the host vehicle, based on the moving operation acquired by the information acquisition unit 11, in the same manner as the pole object described so far.
• On the other hand, the control unit 12 fixes the main pole object 39 at the position of the front-side end portion on the passenger-seat side of the body of the host vehicle, like the position of a general corner pole.
• The display mode of the auxiliary pole object 38 and the display mode of the main pole object 39 are different from each other but similar.
  • FIG. 26 is a flowchart showing the operation of the display control apparatus 1 according to the fourth embodiment.
• In step S41, the virtual display object generation unit 12a generates the main pole object 39 and the auxiliary pole object 38 based on the moving operation acquired by the operation signal input unit 11a.
  • the display position control unit 12b determines the display positions of the main pole object 39 and the auxiliary pole object 38 as initial positions.
  • the display control unit 12c controls the virtual display unit 21 so that the main pole object 39 and the auxiliary pole object 38 are displayed at the initial positions. As a result, the state shown in FIG. 24 is obtained.
• In step S42 of FIG. 26, the display position control unit 12b determines whether or not the operation signal input unit 11a has acquired a moving operation. If it is determined that the moving operation has been acquired, the process proceeds to step S43; if not, the process of step S42 is performed again.
• In step S43, the display position control unit 12b determines the display position of the auxiliary pole object 38 based on the moving operation acquired by the operation signal input unit 11a.
  • the display control unit 12c controls the virtual display unit 21 so that the auxiliary pole object 38 is displayed at the display position determined by the display position control unit 12b. As a result, the auxiliary pole object 38 moves without moving the main pole object 39. Thereafter, the process returns to step S42.
• As described above, in the fourth embodiment, the auxiliary pole object 38 is moved based on the moving operation without moving the main pole object 39. Therefore, even after moving the auxiliary pole object 38, the driver can use the main pole object 39 in the same manner as a conventional corner pole.
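• The fourth-embodiment behavior of FIG. 26 (the moving operation updates only the auxiliary pole, while the main pole stays fixed) can be sketched as follows. The class name, the movable range of 0.0 to 1.7 m along the body front edge, and the position representation are all assumptions for illustration.

```python
# Hypothetical sketch of FIG. 26: move_aux (step S43) changes only the
# auxiliary pole object 38; the main pole object 39 never moves.
class PoleDisplay:
    def __init__(self, main_pos=0.0, aux_range=(0.0, 1.7)):
        self.main_pos = main_pos                 # fixed, like a real corner pole
        self.aux_min, self.aux_max = aux_range   # assumed body front-edge range (m)
        self.aux_pos = main_pos                  # step S41: initial position

    def move_aux(self, new_pos):
        """Step S43: move the auxiliary pole, clamped to the allowed range."""
        self.aux_pos = max(self.aux_min, min(self.aux_max, new_pos))
        return self.aux_pos
```

Clamping the auxiliary position to the predetermined range mirrors the restriction that the pole only moves along the front-side end portion of the body.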
  • FIG. 27 is a block diagram showing a configuration of display control apparatus 1 according to Embodiment 5 of the present invention.
  • constituent elements that are the same as or similar to the constituent elements described above are assigned the same reference numerals, and different constituent elements are mainly described.
  • the display control device 1 in FIG. 27 has the same configuration as that obtained by removing the operation signal input unit 11a and the relative position acquisition unit 11c from the display control device 1 in FIG.
  • the virtual display unit 21 and the periphery detection unit 23 are the same as those in the third embodiment.
  • the vehicle exterior information input unit 11b acquires the relative position of the obstacle around the host vehicle and the host vehicle from the periphery detection unit 23.
  • the display position control unit 12b determines the display position of the display object based on the relative position acquired by the vehicle exterior information input unit 11b.
• The control unit 12 configured as described above can automatically move the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the vehicle exterior information input unit 11b, even when there is no moving operation from the driver.
• In the fifth embodiment, the control unit 12 calculates the host vehicle distance, which is the distance between the obstacle and the host vehicle, based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the vehicle exterior information input unit 11b. When the host vehicle distance is larger than a predetermined threshold value (for example, 40 cm), the control unit 12 fixes the position of the pole object 34 at the initial position. On the other hand, when the host vehicle distance is equal to or smaller than the threshold value, the control unit 12 brings the pole object 34 closer to the obstacle based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the vehicle exterior information input unit 11b. Here, the control unit 12 moves the pole object 34 to the position where the host vehicle distance is the shortest within the movable range of the pole object 34, that is, to the portion of the host vehicle whose distance to the obstacle is the shortest.
  • FIG. 28 is a flowchart showing the operation of the display control apparatus 1 according to the fifth embodiment.
• In step S51, the virtual display object generation unit 12a generates the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle.
  • the display position control unit 12b determines the display position of the pole object 34 as the initial position.
  • the display control unit 12c controls the virtual display unit 21 so as to display the pole object 34 at the initial position.
• In step S52, the vehicle exterior information input unit 11b acquires the relative position between the obstacle around the host vehicle and the host vehicle.
• In step S53, the control unit 12 obtains the above-described host vehicle distance based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the vehicle exterior information input unit 11b. Then, the control unit 12 determines whether the host vehicle distance is equal to or smaller than a predetermined threshold value (for example, 40 cm). If it is determined that the host vehicle distance is equal to or smaller than the threshold value, the process proceeds to step S54; if it is determined that the host vehicle distance is larger than the threshold value, the process returns to step S51.
• In step S54, the control unit 12 moves the pole object 34 to the position where the host vehicle distance is the shortest. Thereafter, the process returns to step S52.
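• The loop body of steps S53 and S54 can be sketched as follows. The 2-D point representation, the candidate-position list standing in for the pole's movable range, and the function name are simplifying assumptions; only the 40 cm threshold comes from the text.

```python
# Sketch of the FIG. 28 decision: keep the initial position while the host
# vehicle distance dv exceeds the threshold, otherwise move the pole to the
# candidate position closest to the obstacle (step S54).
import math

THRESHOLD_M = 0.40  # example threshold from the description

def update_pole_position(candidates, obstacle_xy, initial_pos):
    """Return the pole position chosen by steps S53/S54."""
    def dist(p):
        return math.hypot(p[0] - obstacle_xy[0], p[1] - obstacle_xy[1])
    dv = min(dist(p) for p in candidates)  # shortest host-vehicle distance
    if dv > THRESHOLD_M:
        return initial_pos                 # step S53 "no": stay at initial position
    return min(candidates, key=dist)       # step S54: move to the closest point
```

Scanning candidate positions along the body front edge approximates "the portion of the host vehicle whose distance to the obstacle is the shortest".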
• FIGS. 29 and 30 are diagrams showing examples of the operation result of FIG. 28.
• FIG. 29 shows a case where the host vehicle distance dv, which is the distance between the obstacle 51 and the host vehicle 2, is larger than the threshold value. In this case, the pole object 34 is fixed at the initial position.
• FIG. 30 shows a case where the host vehicle distance dv is equal to or smaller than the threshold value. In this case, the pole object 34 moves to the position where the host vehicle distance dv is the shortest.
• As described above, in the fifth embodiment, the pole object 34 is automatically moved based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle 2. According to such a configuration, the driver does not need to perform the moving operation of the pole object 34, so the burden on the driver can be reduced.
• In addition, the pole object 34 is brought closer to the obstacle based on the relative position between the obstacle around the host vehicle 2 and the host vehicle. According to such a configuration, the driver can know, from the position of the pole object 34, which part of the host vehicle 2 is closest to the obstacle 51.
• In the fifth embodiment, the control unit 12 changes both the virtual image distance and the virtual image direction of the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle; instead, the control unit 12 may change the virtual image direction of the pole object 34 without changing the virtual image distance of the pole object 34.
  • each modification of the third embodiment may be appropriately combined with the fifth embodiment, or each modification of the fifth embodiment may be appropriately combined with the third embodiment.
• For example, the control unit 12 according to the fifth embodiment may change the display mode of the pole object 34 based on the relative position between the obstacle around the host vehicle and the host vehicle acquired by the vehicle exterior information input unit 11b, as in the first modification of the third embodiment.
• For example, the control unit 12 according to the fifth embodiment may change the display mode of the pole object 34 based on the host vehicle distance dv, which is the distance between the obstacle and the host vehicle obtained from the relative position.
  • the display mode changed by the control unit 12 may be at least one of presence / absence of animation such as blinking of the pole object 34, color, shape, and pattern.
• In the examples described above, the color of the pole object 34 is changed from the color of the pole object 34 shown in FIG. However, the change of the display mode is not limited to this.
  • the control unit 12 may increase or decrease the length of the pole object 34 as the host vehicle distance dv decreases.
  • the control unit 12 may blink the pole object 34 or add a pattern to the pole object 34 as the host vehicle distance dv decreases.
• The control unit 12 may change the display mode of the pole object 34 stepwise based on the host vehicle distance dv, or may change it continuously.
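• The stepwise and continuous variants can be contrasted with a small sketch. Everything here (the function names, the 0.2 m and 0.4 m band edges, and the use of a pole-length factor as the display mode) is an illustrative assumption, not from the patent.

```python
# Hypothetical sketch: map the host vehicle distance dv either to discrete
# steps or to a continuously interpolated value (a pole length factor
# between 1.0 at/beyond 0.4 m and 2.0 at/below 0.2 m).
def length_factor_stepwise(dv_m):
    return 2.0 if dv_m <= 0.2 else (1.5 if dv_m <= 0.4 else 1.0)

def length_factor_continuous(dv_m, near=0.2, far=0.4):
    t = (far - min(max(dv_m, near), far)) / (far - near)  # 0..1 as dv shrinks
    return 1.0 + t  # grows smoothly from 1.0 (far) to 2.0 (near)
```

The continuous form avoids abrupt jumps in the display as the obstacle approaches, while the stepwise form gives unambiguous bands.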
• The fourth embodiment may be combined with the fifth embodiment. That is, as shown in FIGS. 33, 34, and 35, the control unit 12 may automatically move the auxiliary pole object 38, without moving the main pole object 39, based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle 2. Even in this case, the same effect as in the fifth embodiment can be obtained.
• The periphery detection unit 23 in FIG. 27 detects information about obstacles around the host vehicle 2, and the vehicle exterior information input unit 11b acquires the information detected by the periphery detection unit 23.
• The information about the obstacle around the host vehicle 2 may include at least one of the relative position between the obstacle and the host vehicle 2, the attribute of the obstacle, the height of the obstacle, and the color of the obstacle.
• The control unit 12 may cause the virtual display unit 21 to further display, together with the pole object 34, a third display object, which is a display object indicating the information about the obstacle acquired by the vehicle exterior information input unit 11b.
• Hereinafter, the third display object is referred to as an "additional object".
• For example, when the vehicle exterior information input unit 11b acquires the relative position between the obstacle and the host vehicle as the information about the obstacle, the control unit 12 may attach the additional objects 40a, 40b, and 40c, which indicate the host vehicle distance dv based on the relative position, to the pole object 34 and display them on the virtual display unit 21.
• A character display object indicating the value of the host vehicle distance dv is shown as the additional object 40a.
• An arrow display object indicating the value of the host vehicle distance dv by its length is shown as the additional object 40b.
• A display object of two fingers indicating the value of the host vehicle distance dv is shown as the additional object 40c.
  • the driver can know information on obstacles around the host vehicle 2.
• <Modification 3 of Embodiment 5> The periphery detection unit 23 in FIG. 27 may detect the attribute of an obstacle around the host vehicle, and the vehicle exterior information input unit 11b may acquire the attribute of the obstacle detected by the periphery detection unit 23.
  • the obstacle attribute mentioned here corresponds to any of a stationary object, a vehicle, a person, and an animal other than a person.
• Then, the control unit 12 may change the display mode of the pole object 34 based on the attribute of the obstacle acquired by the vehicle exterior information input unit 11b.
• FIGS. 39, 40, and 41 are diagrams illustrating an example in which the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle.
  • FIG. 39 shows a pole object 34 indicating a stationary object when the attribute of the obstacle is the stationary object 51d.
  • FIG. 40 shows a pole object 34 indicating a vehicle when the attribute of the obstacle is the vehicle 51e.
  • FIG. 41 shows a pole object 34 indicating a person when the attribute of the obstacle is a person 51f.
  • the driver can know the attributes of the obstacles around the host vehicle.
• Here, the control unit 12 changes the display mode of the pole object 34 based on the attribute of the obstacle acquired by the vehicle exterior information input unit 11b; however, the present invention is not limited to this.
  • the control unit 12 may attach an additional object indicating the attribute of the obstacle to the pole object 34 as in the second modification of the fifth embodiment.
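• Modification 3 amounts to a lookup from the detected attribute to a pole decoration. The icon names below are placeholders for the marks shown in FIGS. 39 to 41; the function and table names are assumptions.

```python
# Sketch of Modification 3: switch the pole object's mark by the obstacle
# attribute reported by the periphery detection unit. Icon names are
# placeholders for the marks of FIGS. 39-41.
ATTRIBUTE_ICONS = {
    "stationary": "block_mark",  # FIG. 39: stationary object 51d
    "vehicle": "car_mark",       # FIG. 40: vehicle 51e
    "person": "person_mark",     # FIG. 41: person 51f
    "animal": "animal_mark",     # animal other than a person
}

def pole_icon(attribute: str) -> str:
    """Return the mark to draw on the pole object 34 for an attribute."""
    return ATTRIBUTE_ICONS.get(attribute, "plain")  # unknown: undecorated pole
```

Falling back to an undecorated pole for unrecognized attributes keeps the display usable when classification fails.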
• <Modification 4 of Embodiment 5> The periphery detection unit 23 in FIG. 27 may detect the height of an obstacle around the host vehicle, and the vehicle exterior information input unit 11b may acquire the height of the obstacle detected by the periphery detection unit 23. Then, the control unit 12 may change the display mode of the pole object 34 based on the height of the obstacle acquired by the vehicle exterior information input unit 11b.
• FIGS. 42 and 43 are diagrams illustrating an example in which the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle.
• FIG. 42 shows the pole object 34 indicating, by the position of its mark, that the height of the obstacle is relatively low.
• FIG. 43 shows the pole object 34 indicating, by the position of its mark, that the height of the obstacle is relatively high.
• Note that the height of the obstacle may instead be indicated by the length of the pole object 34, or by a scale mark (not shown) on the pole object 34.
  • the driver can know the height of the obstacle around the host vehicle.
• Here, the control unit 12 changes the display mode of the pole object 34 based on the height of the obstacle acquired by the vehicle exterior information input unit 11b; however, the present invention is not limited to this.
  • the control unit 12 may attach an additional object indicating the height of the obstacle to the pole object 34 as in the second modification of the fifth embodiment.
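• Placing the mark on the pole in proportion to the obstacle height, as in FIGS. 42 and 43, can be sketched as a normalization. The pole height of 1.0 m and the function name are assumed values for illustration, not from the patent.

```python
# Hypothetical sketch of Modification 4: the obstacle height determines
# where the mark sits on the pole object, as a 0.0-1.0 fraction of the
# pole's drawable span (1.0 m assumed).
POLE_HEIGHT_M = 1.0

def mark_position(obstacle_height_m: float) -> float:
    """Return the mark's position on the pole as a 0.0-1.0 fraction,
    clamped so very tall obstacles pin the mark to the pole tip."""
    return max(0.0, min(1.0, obstacle_height_m / POLE_HEIGHT_M))
```

Clamping keeps the mark on the pole even for obstacles taller than the pole itself, where FIG. 43's "relatively high" rendering would apply.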
• The pole object 34 may also be moved forward, away from the host vehicle. For example, the control unit 12 may first automatically move the pole object 34 to the position where the host vehicle distance is the shortest within the movable range of the pole object 34, and then automatically move the pole object 34 toward the obstacle.
• FIGS. 44, 45, and 46 are diagrams sequentially illustrating a state in which the pole object 34, having been moved to the position where the host vehicle distance is the shortest, moves away from the fender and the front bumper of the host vehicle 2.
• In this case as well, the control unit 12 may change the display mode of the pole object 34 based on the pole distance between the obstacle 51 around the host vehicle 2 and the pole object 34.
• When the pole object 34 touches the obstacle, the control unit 12 may change the display mode of the pole object 34 to a special display mode indicating the contact, and a display object such as a character or a figure indicating the contact may be further displayed on the virtual display unit 21. According to such a configuration, the driver can know the relative position between the obstacle 51 and the host vehicle 2.
• In the above description, the control unit 12 first automatically moves the pole object 34 to the position where the host vehicle distance is the shortest within the movable range of the pole object 34, and then automatically moves the pole object 34 toward the obstacle.
  • the control unit 12 may combine the automatic movement of the pole object 34 with the movement of the pole object 34 based on the driver's movement operation.
• For example, the control unit 12 may automatically move the pole object 34 to the position where the host vehicle distance is the shortest within the movable range of the pole object 34, based on the relative position between the obstacle 51 around the host vehicle 2 and the host vehicle, and may then move the pole object 34 toward the obstacle based on the driver's moving operation.
• In the above description, the control unit 12 displays one pole object 34 on the virtual display unit 21; however, the present invention is not limited to this, and the control unit 12 may display a plurality of pole objects 34 on the virtual display unit 21.
• The periphery detection unit 23 may detect, for each of a plurality of obstacles, the relative position between the obstacle around the host vehicle and the host vehicle, and the vehicle exterior information input unit 11b may acquire from the periphery detection unit 23 the relative positions for the plurality of obstacles.
• Then, the control unit 12 may display one pole object 34 on the virtual display unit 21 at the position where the host vehicle distance is the shortest for each of the plurality of obstacles.
• Similarly, the control unit 12 may cause the virtual display unit 21 to display one auxiliary pole object 38 at the position where the host vehicle distance is the shortest for each of the plurality of obstacles 51.
  • FIG. 48 is a block diagram showing a configuration of display control apparatus 1 according to Embodiment 6 of the present invention.
  • constituent elements that are the same as or similar to those described above are given the same reference numerals, and the description below focuses mainly on the differing elements.
  • before describing the internal configuration of the display control device 1, the in-vehicle LAN (Local Area Network) device 24 will be described.
  • the virtual display unit 21 and the periphery detection unit 23 are the same as those in the third embodiment.
  • the in-vehicle LAN device 24 constitutes a CAN (Controller Area Network) or the like and exchanges various information and control commands between devices in the host vehicle. The in-vehicle LAN device 24 thereby acquires from the host vehicle its position information, control information during traveling, information on its body, and its unique information.
  • the periphery detection unit 23 may be connected to the in-vehicle LAN device 24; in this case, the vehicle-exterior information input unit 11b acquires information on obstacles around the host vehicle via the in-vehicle LAN device 24.
  • the display control device 1 in FIG. 48 has the same configuration as that in which the in-vehicle information input unit 11d is added to the display control device 1 in FIG.
  • the in-vehicle information input unit 11d is included in the concept of the information acquisition unit 11 in FIG.
  • the in-vehicle information input unit 11d obtains a first movement locus, which is the future movement locus of the host vehicle, based on the control information and the like acquired by the in-vehicle LAN device 24.
  • hereinafter, the first movement locus is referred to as the “own-vehicle movement locus”.
  • based on the relative position of the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b and the own-vehicle movement locus obtained by the in-vehicle information input unit 11d, the control unit 12, such as the display position control unit 12b, obtains the portion of the host vehicle that will contact the obstacle when traveling along the own-vehicle movement locus.
  • the control unit 12 then moves the pole object 34 to the obtained portion.
  • FIG. 49 is a flowchart showing the operation of the display control apparatus 1 according to the sixth embodiment. The operation in FIG. 49 is the same as that obtained by replacing step S54 in FIG. 28 with steps S61 and S62.
  • in step S61, the control unit 12 obtains, based on the relative position of the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b and the own-vehicle movement locus acquired by the in-vehicle information input unit 11d, the portion of the host vehicle that will contact the obstacle.
  • the host vehicle 2 when traveling along the own-vehicle movement locus is indicated by a two-dot chain line.
  • the control unit 12 predicts that the portion 2c of the host vehicle 2 indicated by the two-dot chain line will come into contact with the portion 54 of the other vehicle 51a.
  • in step S62, the control unit 12 moves the pole object 34 to the obtained portion.
  • the control unit 12 moves the pole object 34 to the portion 2d, indicated by a solid line, corresponding to the portion 2c of the host vehicle 2. Thereafter, the process returns to step S52.
  • the pole object 34 is thus moved to the portion of the host vehicle 2 that will contact an obstacle in the future. Accordingly, the driver can change the movement trajectory of the host vehicle 2, referring to the display of the pole object 34, so that the host vehicle 2 does not contact the obstacle.
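Steps S61 and S62 can be sketched as follows — a hedged illustration under stated assumptions: the own-vehicle movement locus is a list of 2D translations applied to the body outline, the obstacle is a single point, and "contact" is a distance threshold (all names, thresholds, and coordinates are hypothetical):

```python
import math

def predict_contact_point(body_outline, locus, obstacle, contact_dist=0.5):
    """Sweep the body outline along the future locus (step S61); return the
    body point (in the current vehicle frame) predicted to touch the obstacle,
    or None if no contact is predicted."""
    for dx, dy in locus:
        moved = [(px + dx, py + dy) for px, py in body_outline]
        dists = [math.dist(p, obstacle) for p in moved]
        i = min(range(len(dists)), key=dists.__getitem__)
        if dists[i] <= contact_dist:
            return body_outline[i]  # step S62 would move the pole object here
    return None

edge = [(-0.8, 2.0), (0.0, 2.0), (0.8, 2.0)]
locus = [(0.0, d / 2.0) for d in range(1, 7)]   # driving straight ahead
print(predict_contact_point(edge, locus, obstacle=(0.8, 4.0)))
```

If no contact is predicted, a variant of this sketch could instead return the body point of closest approach, matching the modification described below.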
  • based on the relative position acquired by the vehicle-exterior information input unit 11b and the own-vehicle movement locus acquired by the in-vehicle information input unit 11d, the control unit 12 moves the pole object 34 to the portion of the host vehicle 2 that will contact the obstacle when the host vehicle 2 travels along the own-vehicle movement locus.
  • alternatively, based on the relative position acquired by the vehicle-exterior information input unit 11b and the own-vehicle movement locus acquired by the in-vehicle information input unit 11d, the control unit 12 may move the pole object 34 to the portion of the host vehicle 2 that does not contact the obstacle but comes closest to it.
  • the control unit 12 may also change the display mode of the pole object 34 depending on whether the host vehicle will contact the obstacle. With such a configuration, the driver can change the movement trajectory of the host vehicle 2, referring to the display of the pole object 34, so that the host vehicle 2 does not contact the obstacle.
  • the control unit 12 may move the main pole object 39 to the portion 2e of the host vehicle 2 currently closest to the other vehicle 51a, and move the auxiliary pole object 38 to the portion 2f of the host vehicle 2 that will contact the obstacle when the host vehicle 2 travels along the own-vehicle movement locus.
  • alternatively, the control unit 12 may move one auxiliary pole object 38 to the currently closest portion 2e of the host vehicle 2, and move another auxiliary pole object 38 to the portion 2f of the host vehicle 2 that will contact the obstacle when the host vehicle 2 travels along the own-vehicle movement locus. In this case, the control unit 12 may fix the main pole object 39 at its initial position.
  • the vehicle-exterior information input unit 11b obtains a second movement locus, which is the future movement locus of the obstacle, based on the information about obstacles around the host vehicle detected by the periphery detection unit 23.
  • hereinafter, the second movement locus is referred to as the “obstacle movement locus”.
  • for example, when the obstacle is another vehicle, the vehicle-exterior information input unit 11b obtains the movement locus of the other vehicle as the obstacle movement locus based on the tire direction of the other vehicle shown in the captured image.
  • alternatively, the vehicle-exterior information input unit 11b may obtain the movement locus of the other vehicle as the obstacle movement locus based on information on the automatic driving of the other vehicle acquired by inter-vehicle communication or the like.
  • based on the relative position of the obstacle and the host vehicle acquired by the vehicle-exterior information input unit 11b, the own-vehicle movement locus obtained by the in-vehicle information input unit 11d, and the obstacle movement locus obtained by the vehicle-exterior information input unit 11b, the control unit 12, such as the display position control unit 12b, obtains at least one of the portion of the host vehicle closest to the obstacle and the portion in contact with the obstacle when the host vehicle moves along the own-vehicle movement locus and the obstacle moves along the obstacle movement locus.
  • the control unit 12 then moves the pole object 34 to at least one of the obtained portions.
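The combined prediction above, with both the host vehicle and the obstacle moving, can be illustrated by stepping the two loci in lockstep and tracking the closest approach — a minimal sketch assuming both loci are sampled at the same time steps as 2D positions (names and coordinates are hypothetical):

```python
import math

def closest_approach(body_outline, own_locus, obstacle_locus):
    """Step both loci in lockstep; return (body_point, min_distance) for the
    body point that comes closest to the moving obstacle."""
    best_point, best_dist = None, math.inf
    for (vx, vy), obs in zip(own_locus, obstacle_locus):
        for px, py in body_outline:
            d = math.dist((px + vx, py + vy), obs)
            if d < best_dist:
                best_point, best_dist = (px, py), d
    return best_point, best_dist

edge = [(-0.8, 2.0), (0.8, 2.0)]
own = [(0.0, t) for t in range(4)]          # host vehicle moving forward
other = [(2.8, 5.0 - t) for t in range(4)]  # other vehicle approaching
print(closest_approach(edge, own, other))
```

Comparing `min_distance` against a contact threshold would distinguish the "closest portion" case from the "contact portion" case described in the text.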
  • the operation signal input unit 11a may acquire a switching operation from the driver via the operation input unit 22.
  • based on the switching operation acquired by the operation signal input unit 11a, the control unit 12 may selectively perform the display described in Embodiment 5, the display described in Embodiment 6, and the displays described in Modifications 1 and 2 of Embodiment 6.
  • the information acquisition unit 11 and the control unit 12 in the display control device 1 described above are hereinafter referred to as the “information acquisition unit 11 and the like”.
  • the information acquisition unit 11 and the like are realized by the processing circuit 81 illustrated in FIG. That is, the processing circuit 81 includes the information acquisition unit 11 that acquires information, and the control unit 12 that displays the first display object, which is a display object, on the display unit and, based on the information acquired by the information acquisition unit 11, moves the first display object within a predetermined range corresponding to the front end of the vehicle body.
  • the processing circuit 81 may be dedicated hardware, or a processor that executes a program stored in memory.
  • here, the processor corresponds to, for example, a central processing unit (CPU), a processing unit, an arithmetic unit, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • when the processing circuit 81 is dedicated hardware, it corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • each function of the units such as the information acquisition unit 11 may be realized by distributed processing circuits, or all the functions may be realized by a single processing circuit.
  • when the processing circuit 81 is a processor, the functions of the information acquisition unit 11 and the like are realized in combination with software or the like.
  • software or the like corresponds to, for example, software, firmware, or software and firmware.
  • software or the like is described as a program and stored in the memory 83.
  • the processor 82 applied as the processing circuit 81 reads out and executes the program stored in the memory 83, thereby realizing the function of each unit. That is, the display control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in execution of a step of acquiring information, and a step of displaying the first display object, which is a display object, on the display unit and performing control to move the first display object within a predetermined range corresponding to the front end of the vehicle body based on the acquired information.
  • in other words, this program causes a computer to execute the procedures and methods of the information acquisition unit 11 and the like.
  • here, the memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable Read Only Memory), or EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive), magnetic disk, flexible disk, optical disc, compact disc, mini disc, or DVD (Digital Versatile Disc), and its drive device; or any storage medium to be used in the future.
  • the above description assumes that each function of the information acquisition unit 11 and the like is realized by either hardware or software.
  • however, the present invention is not limited to this; a part of the information acquisition unit 11 and the like may be realized by dedicated hardware while another part is realized by software or the like.
  • for example, the function of the information acquisition unit 11 can be realized by the processing circuit 81 as dedicated hardware, and the functions of the other parts can be realized by the processing circuit 81 as the processor 82 reading out and executing the program stored in the memory 83.
  • as described above, the processing circuit 81 can realize each of the functions described above by hardware, software, or the like, or a combination thereof.
  • the display control device 1 described above can also be applied to a display control system constructed by appropriately combining a navigation device such as a PND (Portable Navigation Device), a communication terminal including a mobile terminal such as a mobile phone, smartphone, or tablet, the functions of applications installed on these devices, and servers.
  • in this case, each function or each component of the display control device 1 described above may be distributed among the devices constructing the system, or may be concentrated in any one of the devices.
  • the display control apparatus may further include a virtual display unit 21 in FIG.
  • FIG. 54 is a block diagram showing a configuration of the server 91 according to this modification.
  • the server 91 in FIG. 54 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with a display control device 93 realized by, for example, a navigation device of the host vehicle 92.
  • the communication unit 91a which is an information acquisition unit, receives information acquired by the display control device 93 by performing wireless communication with the display control device 93.
  • the control unit 91b has a function similar to that of the control unit 12 in FIG. 1, realized by a processor (not illustrated) of the server 91 executing a program stored in a memory (not illustrated) of the server 91. That is, the control unit 91b generates, based on the information received by the communication unit 91a, a control signal for moving the first display object within a predetermined range corresponding to the front end of the vehicle body, and the communication unit 91a transmits this control signal to the display control device 93.
  • the display control device 93 moves the pole object displayed on the virtual display unit 21 based on the control signal transmitted from the communication unit 91a.
  • with the server 91 configured in this way, the same effect as that of the display control device 1 described in Embodiment 1 can be obtained.
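The server-side flow above — receive information from the vehicle, generate a control signal, send it back — can be sketched as a simple request/response pair. This is a hypothetical illustration only: a real system would use the actual wireless link and the vehicle's own message format, and all field names and ranges here are invented:

```python
def make_control_signal(info):
    """Control unit 91b (sketch): decide where the first display object should
    move, based on information received from the vehicle (hypothetical format)."""
    obstacle = info["nearest_obstacle"]       # (x, y) in the vehicle frame
    x = max(-0.8, min(0.8, obstacle[0]))      # clamp to an assumed front-edge range
    return {"move_pole_to": (x, 2.0)}

def server_step(received_info):
    """Communication unit 91a (sketch): receive info, reply with a control signal."""
    return make_control_signal(received_info)

print(server_step({"nearest_obstacle": (1.4, 3.0)}))
```

The display control device 93 would then apply the returned signal to move the pole object on the virtual display unit 21.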
  • FIG. 55 is a block diagram showing a configuration of the communication terminal 96 according to the present modification.
  • the communication terminal 96 in FIG. 55 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with the display control device 98 of the host vehicle 97.
  • a mobile terminal such as a mobile phone, smartphone, or tablet carried by the driver of the host vehicle 97 is applied as the communication terminal 96.
  • with the communication terminal 96 configured in this way, the same effect as that of the display control device 1 described in Embodiment 1 can be obtained.
  • within the scope of the invention, the embodiments and modifications of the present invention may be freely combined, and each embodiment and each modification may be appropriately modified or omitted.
  • SYMBOLS 1 Display control apparatus, 2 Own vehicle, 11 Information acquisition part, 12 Control part, 21 Virtual display part, 34 Pole object, 36 image, 37 Corresponding pole object, 38 Auxiliary pole object, 39 Main pole object, 40a, 40b, 40c Additional objects, 51 obstacles.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
  • Traffic Control Systems (AREA)

Abstract

The purpose of the present invention is to provide a technique capable of virtually moving a corner pole. This display control device is a display control device for controlling a display unit. The display unit can display one or more display objects that appear to overlap the scenery outside a vehicle as viewed from the driver's seat of the vehicle. The display control device comprises: an information acquisition unit for acquiring information; and a control unit that causes a first display object, which is the display object, to be displayed on the display unit and moves the first display object, based on the information acquired from the information acquisition unit, within a predetermined range corresponding to a front-side end portion of a vehicle body.
PCT/JP2017/009932 2017-03-13 2017-03-13 Dispositif de commande d'affichage et procédé de commande d'affichage WO2018167815A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019505313A JP6727400B2 (ja) 2017-03-13 2017-03-13 表示制御装置及び表示制御方法
PCT/JP2017/009932 WO2018167815A1 (fr) 2017-03-13 2017-03-13 Dispositif de commande d'affichage et procédé de commande d'affichage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/009932 WO2018167815A1 (fr) 2017-03-13 2017-03-13 Dispositif de commande d'affichage et procédé de commande d'affichage

Publications (1)

Publication Number Publication Date
WO2018167815A1 true WO2018167815A1 (fr) 2018-09-20

Family

ID=63521889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/009932 WO2018167815A1 (fr) 2017-03-13 2017-03-13 Dispositif de commande d'affichage et procédé de commande d'affichage

Country Status (2)

Country Link
JP (1) JP6727400B2 (fr)
WO (1) WO2018167815A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026747A1 (fr) * 2017-07-31 2019-02-07 日本精機株式会社 Dispositif d'affichage d'image réelle augmentée pour véhicule

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07140918A (ja) * 1993-11-16 1995-06-02 Toyohiko Hatada 車両用表示装置
JPH08295175A (ja) * 1995-04-25 1996-11-12 Mitsubishi Motors Corp 虚像コーナマーカ
JP2010039508A (ja) * 2008-07-31 2010-02-18 Honda Motor Co Ltd 車両用情報報知装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005138755A (ja) * 2003-11-07 2005-06-02 Denso Corp 虚像表示装置およびプログラム
JP2010064713A (ja) * 2008-09-12 2010-03-25 Toshiba Corp 画像照射システム、画像照射方法
JP2010250057A (ja) * 2009-04-15 2010-11-04 Stanley Electric Co Ltd 虚像式標識表示装置
JP2012116400A (ja) * 2010-12-02 2012-06-21 Jvc Kenwood Corp コーナーポール投影装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07140918A (ja) * 1993-11-16 1995-06-02 Toyohiko Hatada 車両用表示装置
JPH08295175A (ja) * 1995-04-25 1996-11-12 Mitsubishi Motors Corp 虚像コーナマーカ
JP2010039508A (ja) * 2008-07-31 2010-02-18 Honda Motor Co Ltd 車両用情報報知装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019026747A1 (fr) * 2017-07-31 2019-02-07 日本精機株式会社 Dispositif d'affichage d'image réelle augmentée pour véhicule
JPWO2019026747A1 (ja) * 2017-07-31 2020-05-28 日本精機株式会社 車両用拡張現実画像表示装置

Also Published As

Publication number Publication date
JP6727400B2 (ja) 2020-07-22
JPWO2018167815A1 (ja) 2019-06-27

Similar Documents

Publication Publication Date Title
US10591723B2 (en) In-vehicle projection display system with dynamic display area
JP6413207B2 (ja) 車両用表示装置
JP6340969B2 (ja) 周辺監視装置、及びプログラム
CA3069114C (fr) Procede et dispositif d'aide au stationnement
EP3118047B1 (fr) Dispositif de commande d'affichage, dispositif d'affichage, programme de commande d'affichage, procédé de commande d'affichage et support d'enregistrement
EP2487906B1 (fr) Dispositif de commande et dispositif de surveillance des abords d'un véhicule
JP4475308B2 (ja) 表示装置
JP5999032B2 (ja) 車載表示装置およびプログラム
JP6045796B2 (ja) 映像処理装置、映像処理方法、および映像表示システム
CN108349503B (zh) 驾驶辅助装置
JPWO2015037117A1 (ja) 情報表示システム及び情報表示装置
JP2009196630A (ja) 表示装置
WO2014156614A1 (fr) Dispositif d'affichage de véhicule
JP6805974B2 (ja) 走行支援装置及びコンピュータプログラム
US11181909B2 (en) Remote vehicle control device, remote vehicle control system, and remote vehicle control method
JP6720729B2 (ja) 表示制御装置
JP2017186008A (ja) 情報表示システム
US20190286118A1 (en) Remote vehicle control device and remote vehicle control method
US20240001763A1 (en) Vehicle display system, vehicle display method, and computer-readable non-transitory storage medium storing vehicle display program
WO2018167815A1 (fr) Dispositif de commande d'affichage et procédé de commande d'affichage
JP2019069717A (ja) 駐車支援装置
CN110114809A (zh) 用于在具有变化的输出功能的光信号设备处提醒驾驶员起动的方法和装置
JP2019119357A (ja) 表示システム
JP7051263B2 (ja) 運転計画変更指示装置および運転計画変更指示方法
JP5287827B2 (ja) 表示制御装置及び車載表示システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900715

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019505313

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17900715

Country of ref document: EP

Kind code of ref document: A1