WO2024018497A1 - Projection control device and projection control method - Google Patents


Info

Publication number
WO2024018497A1
WO2024018497A1 (application PCT/JP2022/027962)
Authority
WO
WIPO (PCT)
Prior art keywords
guide image
vehicle
projection
road surface
branch
Prior art date
Application number
PCT/JP2022/027962
Other languages
French (fr)
Japanese (ja)
Inventor
Yuko Yamamoto (山本 祐子)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2022/027962
Publication of WO2024018497A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers

Definitions

  • the present disclosure relates to a projection control device and a projection control method for vehicles.
  • Patent Document 1 discloses a vehicular projection device that, when the own vehicle approaches a branch point on the route closely enough for the occupant to see it, projects an arrow image, which is a route guidance image that guides the own vehicle in the branching direction based on route information, onto the road surface in front of the own vehicle.
  • the vehicular projection device disclosed in Patent Document 1 controls a branch indicating portion, which is included in the arrow image and extends along the branch direction at the branch point, so that it lengthens as the own vehicle approaches the branch point.
  • the present disclosure has been made to solve the above-mentioned problems, and an object of the present disclosure is to provide a projection control device that, in route guidance using image projection onto the road surface, allows vehicle occupants to grasp the distance to a branch point and the branching direction without the branch point leaving their field of vision.
  • the projection control device according to the present disclosure includes: a guide route information acquisition unit that acquires guide route information regarding the guide route of a vehicle; a branch determining unit that determines, based on the guide route information acquired by the guide route information acquisition unit, whether the vehicle is approaching a branch point on the guide route; a guide image generating section that, when the branch determining unit determines that the vehicle is approaching the branch point, generates a first guide image indicating the presence of the branch point and the branching direction based on the guidance route information; and a projection control section that causes a projection device to project the first guide image generated by the guide image generating section onto a road surface in front of the vehicle.
  • the guide image generating section generates the first guide image based on the distance from the vehicle to the branch point, the position of a first road surface point on the road surface onto which a first end of the first guide image is projected, and the position of a second road surface point on the road surface onto which a second end of the first guide image, on the side opposite to the first end, is projected, the second road surface point being located farther from the first road surface point in the branching direction with respect to the branch point.
  • the guide image generating section generates the first guide image such that the second end approaches the first end as the vehicle approaches the branch point.
  • FIG. 1 is a diagram illustrating a configuration example of a projection control device according to Embodiment 1.
  • FIG. 3 is a flowchart for explaining an example of the operation of the projection control device according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of how the projection control unit of the projection control device causes the projection device to project a first guide image onto the road surface in front of the vehicle in the first embodiment.
  • FIG. 7 is a diagram for explaining another example of how the projection control unit of the projection control device causes the projection device to project the first guide image onto the road surface in front of the vehicle in the first embodiment.
  • FIGS. 5A and 5B are diagrams illustrating an example of the hardware configuration of the projection control device according to the first embodiment.
  • FIG. 6 is a diagram illustrating a configuration example of a projection control device according to the second embodiment.
  • FIG. 7 is a flowchart for explaining an example of the operation of the projection control device according to the second embodiment.
  • FIG. 8 is a diagram for explaining an example of how the projection control unit of the projection control device causes the projection device to project the first guide image and the second guide image onto the road surface in front of the vehicle in the second embodiment.
  • FIG. 9 is a flowchart for explaining another example of the operation of the projection control device according to the second embodiment.
  • FIG. 10 is a diagram for explaining an example of a state in which the projection control unit of the projection control device causes the projection device to stop projecting the first guide image onto the road surface in front of the vehicle and to project the second guide image onto the road surface in front of the vehicle in the second embodiment.
  • FIG. 11 is a flowchart for explaining another example of the operation of the projection control device according to the second embodiment.
  • FIG. 1 is a diagram showing a configuration example of a projection control device 1 according to the first embodiment.
  • the projection control device 1 is mounted on a vehicle 100.
  • the projection control device 1 is connected to a navigation device 2 and a projection device 3.
  • the navigation device 2 is a general navigation device that is mounted on the vehicle 100 and provides route guidance for the vehicle 100.
  • the projection device 3 is mounted on the vehicle 100 and, under the control of the projection control device 1, projects an image for guiding the route of the vehicle 100 (hereinafter referred to as a "guidance image") onto the road surface in front of the vehicle 100.
  • the projection device 3 is, for example, a lamp provided near the headlights of the vehicle 100 and dedicated to projecting an image onto the road surface. Note that this is just an example, and the projection device 3 may be a device that has other functions, such as a headlamp.
  • the projection control device 1 acquires information regarding the guidance route of the vehicle 100 (hereinafter referred to as "guidance route information") from the navigation device 2, and, when it determines based on the guidance route information that the vehicle 100 is approaching a branch point on the guidance route, causes the projection device 3 to project a guidance image indicating the existence of the branch point and the branching direction (hereinafter referred to as the "first guide image") onto the road surface in front of the vehicle 100. Details of the projection control device 1, the first guide image, and the method by which the projection control device 1 projects the first guide image will be described later.
  • a "branch point” refers to a point where a road or the like branches, such as a crossroads, a three-way intersection (Y-junction), and a T-junction (T-junction).
  • a branch point on the guide route of vehicle 100 is also simply referred to as a branch point.
  • the "branching direction” refers to the course direction of vehicle 100 after branching.
  • the first guide image is an arrow image.
  • in Embodiment 1, it is assumed that the projection control device 1 causes the projection device 3 to project the first guide image, in other words the arrow image, onto the lane after the branch in front of the vehicle 100, as an image extending in the left-right direction as viewed from the vehicle 100 and parallel to the lane after the branch. Note that in the first embodiment, "parallel" is not limited to strictly parallel, but also includes substantially parallel.
  • the projection control device 1 includes a guide route information acquisition section 11, a branch determination section 12, a guide image generation section 13, and a projection control section 14.
  • the guide route information acquisition unit 11 acquires guide route information from the navigation device 2 .
  • the guide route information includes information regarding the route to the destination of the vehicle 100, information regarding the current position of the vehicle 100, and map information.
  • Information regarding the current position of the vehicle 100 is acquired by the navigation device 2 from a GPS (Global Positioning System, not shown) installed in the vehicle 100, for example. Note that this is just an example; for example, the guidance route information acquisition unit 11 may directly acquire information regarding the current position of the vehicle 100 from the GPS and include it in the guidance route information acquired from the navigation device 2.
  • the map information includes, for example, information regarding the position of roads, the position of lanes (here, "lane" means a so-called traffic lane), the shape of roads, the width of roads, the width of lanes, the position of branch points, and the road type.
  • the "position of a branch point" is represented by the intersection of straight lines passing through the centers of intersecting lanes (the lanes herein are so-called lanes) in the vehicle width direction.
  • the coordinate system in the map information is a so-called "geographical coordinate system” that represents a position on the earth.
  • a map coordinate system is generally expressed in two dimensions by latitude and longitude, and in three dimensions, elevation is added to these.
  • the guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
  • the branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11. Specifically, for example, the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point by comparing the distance from the vehicle 100 to the branch point with a preset threshold (hereinafter referred to as the "approach determination threshold"). The branch determining unit 12 determines that the vehicle 100 is approaching the branch point when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold. The distance from the vehicle 100 to the branch point is, for example, the distance from the current position of the vehicle 100 to the position of the branch point.
  • the branch determining unit 12 can determine the current position of the vehicle 100 and the position of the branch point based on the guide route information.
  • the approach determination threshold is appropriately set by an administrator or the like.
  • the approach determination threshold is preferably set to a distance within a range in which the projection device 3 can project the first guide image.
  • Note that the method described above, in which the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point based on whether the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold, is just one example.
  • the branch determination unit 12 may determine whether the vehicle 100 is approaching the branch point by considering not only the distance from the vehicle 100 to the branch point but also the vehicle speed of the vehicle 100.
  • For example, when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold and the vehicle speed of the vehicle 100 is greater than or equal to a preset threshold (hereinafter referred to as the "vehicle speed determination threshold"), the branch determination unit 12 may determine that the vehicle 100 is approaching the branch point.
  • the vehicle speed may be acquired from the navigation device 2 as guidance route information.
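  • The following is a minimal sketch, not taken from the patent itself, of how the approach determination described above might look in code; the threshold values and the function name are illustrative assumptions, and the distance from the vehicle 100 to the branch point is assumed to have already been computed from the guidance route information.

```python
from typing import Optional

def is_approaching_branch(distance_to_branch_m: float,
                          vehicle_speed_kmh: Optional[float] = None,
                          approach_threshold_m: float = 100.0,
                          speed_threshold_kmh: float = 20.0) -> bool:
    """Judge whether the vehicle is approaching the branch point.

    The basic rule compares the remaining distance with the approach
    determination threshold; optionally the vehicle speed must also be at
    or above the vehicle speed determination threshold (illustrative values).
    """
    if distance_to_branch_m > approach_threshold_m:
        return False
    if vehicle_speed_kmh is not None and vehicle_speed_kmh < speed_threshold_kmh:
        return False
    return True
```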
  • When it determines that the vehicle 100 is approaching the branch point, the branch determination unit 12 outputs the determination result that the vehicle 100 is approaching the branch point (hereinafter referred to as the "branch determination result") to the guide image generation unit 13.
  • the branch determining unit 12 also determines whether the vehicle 100 has reached the branch point. For example, the branch determining unit 12 may determine that the vehicle 100 has reached the branch point when the distance from the vehicle 100 to the branch point becomes "0". If the branch determining unit 12 determines that the vehicle 100 has reached the branch point, it outputs information indicating that the vehicle 100 has reached the branch point (hereinafter referred to as "branch point arrival information") to the projection control unit 14.
  • the branch determination unit 12 may output the branch point arrival information to the projection control unit 14 via the guide image generation unit 13 or directly to the projection control unit 14. In addition, in FIG. 1, illustration of an arrow from the branch determination unit 12 to the projection control unit 14 is omitted.
  • If the branch determining unit 12 determines that the vehicle 100 is approaching a branch point, the guide image generating unit 13 generates a first guide image based on the guide route information. Note that the guidance image generation section 13 may acquire the guidance route information acquired by the guidance route information acquisition section 11 via the branch determination section 12.
  • Specifically, the guidance image generation unit 13 generates the first guide image based on the position (more specifically, the current position) of the vehicle 100, the position of the branch point, the position of the point on the road surface onto which one end of the first guide image (hereinafter referred to as the "first end") is projected (hereinafter referred to as the "first road surface point"), and the position of the point on the road surface onto which the end of the first guide image opposite to the first end (hereinafter referred to as the "second end") is projected, this point being located farther from the first road surface point in the branching direction with respect to the branch point (hereinafter referred to as the "second road surface point"). That is, in the first embodiment, the first end is the starting point of the arrow, and the second end is the ending point of the arrow.
  • in Embodiment 1, the position of the first road surface point is the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to the lane after the branch.
  • the guide image generation unit 13 first calculates the coordinates of the position of the first road surface point.
  • the guide image generation unit 13 can calculate the coordinates of the position of the first road surface point from the guide route information.
  • the guide image generation unit 13 determines the position of the second road surface point based on the coordinates of the position of the first road surface point and the distance from the vehicle 100 to the branch point, and calculates the coordinates of the second road surface point. calculate.
  • the guide image generation unit 13 determines the position of the second road surface point in accordance with preset and internally held conditions (hereinafter referred to as "end point determination conditions").
  • the end point determination conditions define, when the branch determining unit 12 first determines that the vehicle 100 is approaching a certain branch point, in other words when the distance from the vehicle 100 to the branch point reaches the approach determination threshold, how far from the first road surface point (hereinafter referred to as the "reference distance") the point to be set as the second road surface point should be.
  • the end point determination conditions also define how much closer to the first road surface point the position of the second road surface point should be moved depending on how much closer the vehicle 100 has come to the branch point after the distance from the vehicle 100 to the branch point reached the approach determination threshold.
  • the guide image generation unit 13 can calculate the distance from the vehicle 100 to the branch point from the guide route information. An administrator or the like sets conditions for determining the end point in advance and stores them in the guide image generation section 13.
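  • One possible reading of the end point determination conditions is a simple proportional rule: the second road surface point starts the reference distance away when the vehicle first reaches the approach determination threshold and moves linearly toward the first road surface point as the remaining distance shrinks. The sketch below assumes that interpretation and flat 2-D map coordinates; the names and the linear rule are illustrative, not taken from the patent.

```python
import math

def second_road_surface_point(first_point: tuple,
                              branch_direction: tuple,
                              distance_to_branch_m: float,
                              approach_threshold_m: float,
                              reference_distance_m: float) -> tuple:
    """Place the second road surface point along the branching direction.

    The offset equals the reference distance when the vehicle is exactly at the
    approach determination threshold and shrinks linearly to zero as the vehicle
    reaches the branch point (one example of an end point determination condition).
    """
    ratio = max(0.0, min(1.0, distance_to_branch_m / approach_threshold_m))
    offset = reference_distance_m * ratio
    norm = math.hypot(branch_direction[0], branch_direction[1]) or 1.0
    ux, uy = branch_direction[0] / norm, branch_direction[1] / norm
    return (first_point[0] + offset * ux, first_point[1] + offset * uy)
```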
  • the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the first road surface point into a trained model (hereinafter referred to as the "first machine learning model") to obtain the coordinates of the first end of the first guide image. Further, the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the second road surface point into the first machine learning model to obtain the coordinates of the second end of the first guide image.
  • the coordinate system of the first guide image projected by the projection device 3 is referred to as the display means coordinate system, and the area in front of the vehicle 100 on the road surface in real space where the projection device 3 can project the first guide image (hereinafter referred to as The coordinate system of the "projectable area") is called the target area coordinate system.
  • the target area coordinate system is a so-called “geographical coordinate system” similar to the coordinate system in map information.
  • the first machine learning model is a model that, based on the correspondence between positions on the first guide image and positions on the road surface and taking into account the distance from the vehicle 100 to the branch point, converts points in the target area coordinate system into points in the display means coordinate system.
  • the first machine learning model is generated by an administrator or the like in advance, for example, before shipping the product of the projection control device 1, and is stored in a location that can be referenced by the projection control device 1.
  • the administrator or the like takes the vehicle 100 for a test run and experimentally projects the first guide image from the projection device 3 near a junction.
  • the branch point does not need to be an actual branch point; for example, an administrator or the like may set a point on a road with no branches to be considered as a branch point.
  • When the projection device 3 projects the first guide image onto the road surface in front of the vehicle 100, the administrator or the like measures the coordinates of a plurality of measurement points on the first guide image projected onto the road surface.
  • the first guide image projected onto the road surface is also referred to as a "first projected image.”
  • the administrator or the like also obtains the position of the vehicle 100 when the first guide image is projected onto the road surface in front of the vehicle 100 and the distance from the vehicle 100 to the branch point.
  • the administrator or the like performs this experiment multiple times by changing the distance from the vehicle 100 to the branch point.
  • the administrator or the like then causes a learning device to perform learning in which the distance from the vehicle 100 to the branch point and the coordinates of the measurement points in the first projection image are inputs and the coordinates of the corresponding measurement points in the first guide image are outputs, thereby generating the first machine learning model.
  • the coordinates of the measurement point in the first projection image are expressed in the target area coordinate system
  • the coordinates of the measurement point in the first guide image are expressed in the display means coordinate system.
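  • The text above only specifies the inputs (the distance to the branch point and the target-area coordinates of the measurement points) and the outputs (display-means coordinates) of the first machine learning model, not its form. As one hedged illustration, such a mapping could be fitted by ordinary least squares from the measured correspondences; the sample values below are invented.

```python
import numpy as np

# Training samples: (distance_to_branch_m, x_road, y_road) -> (u_image, v_image),
# collected from test-run measurements as described above (values illustrative).
X = np.array([[30.0, 5.2, 12.0],
              [30.0, 6.8, 14.5],
              [20.0, 4.9, 10.1],
              [10.0, 4.1,  6.3]])
Y = np.array([[412.0, 118.0],
              [455.0, 131.0],
              [398.0,  96.0],
              [371.0,  64.0]])

# Fit an affine map [distance, x, y, 1] -> [u, v] as a stand-in for the
# "first machine learning model" (the document does not fix the model type).
A = np.hstack([X, np.ones((len(X), 1))])
W, *_ = np.linalg.lstsq(A, Y, rcond=None)

def road_to_display(distance_m, x_road, y_road):
    """Convert a target-area (road surface) point to display-means coordinates."""
    return np.array([distance_m, x_road, y_road, 1.0]) @ W
```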
  • Upon obtaining the coordinates of the first end of the first guide image and the coordinates of the second end of the first guide image, the guide image generation unit 13 generates a first guide image based on these coordinates in accordance with rules stored in a storage unit (not shown) (hereinafter referred to as "first guide image rules").
  • the storage unit stores the first guide image rules, which are set in advance by an administrator or the like and are used when generating the first guide image to be projected by the projection device 3.
  • the guide image generation unit 13 generates the first guide image, in accordance with the first guide image rules stored in the storage unit, so that a first projection image showing an arrow that starts near the branch point and extends in the branching direction is projected onto the road surface in front of the vehicle 100.
  • the first guide image rules include, for example, a rule that the bar of the arrow is 70 cm wide, the base of the triangle of the arrow is 1 m wide, and the arrow is filled in blue and oriented along the lane after the branch.
  • in this case, the guide image generation unit 13 generates a first guide image such that a first projection image showing a blue arrow with a bar width of 70 cm and a triangular base width of 1 m, starting at the first road surface point, ending at the second road surface point, and extending in the branching direction, is projected onto the road surface of the lane after the branch.
  • Note that the first guide image rules, such as the color, pattern, and width of the arrow, only need to be set such that, when the generated first guide image is projected onto the road surface in front of the vehicle 100, the first projected image allows the occupant of the vehicle 100 to recognize the existence of the branch point and the branching direction. Further, although it is assumed here that the first guide image rules are set in advance by the administrator or the like, this is only an example. For example, the occupant of the vehicle 100 or the like may be able to set the first guide image rules as appropriate.
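  • For illustration only, the first guide image rules quoted above (a 70 cm wide bar, a 1 m wide triangle base, blue fill) could be held as a small parameter record such as the following; the field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FirstGuideImageRule:
    """Drawing parameters for the arrow-shaped first guide image (example values)."""
    bar_width_m: float = 0.7        # width of the arrow shaft
    head_base_width_m: float = 1.0  # width of the base of the arrow-head triangle
    fill_color: str = "blue"        # fill color of the projected arrow

rule = FirstGuideImageRule()  # an occupant-editable variant could override these values
```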
  • After generating the first guide image, the guide image generation unit 13 outputs the generated first guide image to the projection control unit 14.
  • the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 onto the road surface in front of the vehicle 100.
  • In the above, an example was described in which the guide image generation unit 13 first calculates the coordinates of the first road surface point and the coordinates of the second road surface point, then uses the first machine learning model to obtain the coordinates of the first end and the coordinates of the second end of the first guide image in the display means coordinate system, and generates the first guide image in accordance with these coordinates and the first guide image rules, and the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 and expressed in coordinates of the display means coordinate system.
  • the projection control device 1 may generate the first guide image and control the projection of the first guide image onto the projection device 3 using other methods.
  • For example, the guide image generation unit 13 may store an initial image (in this case, an arrow image) in advance and generate the first guide image by enlarging or reducing the initial image based on the calculated distance between the first road surface point and the second road surface point.
  • The projection control unit 14 may then cause the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 by changing the projection angle of the light emitted from the lamp of the projection device 3 so that the first end of the first guide image generated by the guide image generation unit 13 is projected onto the first road surface point calculated by the guide image generation unit 13 and the second end is projected onto the second road surface point calculated by the guide image generation unit 13.
  • FIG. 2 is a flowchart for explaining an example of the operation of the projection control device 1 according to the first embodiment. For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts traveling, the projection control device 1 repeatedly performs the operation shown in the flowchart of FIG. 2 until the power of the vehicle 100 is turned off.
  • the guide route information acquisition unit 11 acquires guide route information from the navigation device 2 (step ST1).
  • the guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
  • the branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11 in step ST1 (step ST2).
  • If the branch determining unit 12 determines that the vehicle 100 is not approaching a branch point ("NO" in step ST2), the operation of the projection control device 1 returns to the processing of step ST1.
  • If the branch determination unit 12 determines that the vehicle 100 is approaching a branch point ("YES" in step ST2), the branch determination unit 12 outputs the branch determination result indicating that the vehicle 100 is approaching the branch point to the guidance image generation unit 13.
  • the branch determining unit 12 determines that the vehicle 100 is approaching a branch point for the first time after the power of the vehicle 100 is turned on and the vehicle 100 starts traveling. In this case, the branch determination unit 12 outputs a branch determination result indicating that the vehicle 100 is approaching a branch point to the guide image generation unit 13.
  • the guide image generation unit 13 generates a first guide image based on the guide route information acquired by the guide route information acquisition unit 11 in step ST1 (step ST3).
  • When the guide image generation unit 13 generates the first guide image, it outputs the generated first guide image to the projection control unit 14.
  • the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
  • the road surface in front of the vehicle 100 is in a state where the first projection image is projected, for example, as shown in FIG.
  • FIG. 3 is for explaining an example of how the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 in the first embodiment.
  • FIG. 3 is an overhead view of a branch point and the vehicle 100 approaching the branch point.
  • the branch point is indicated by "BP”
  • the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1".
  • the first road surface point is indicated by "E1"
  • the second road surface point is indicated by "E2". Since the direction from the first road surface point toward the second road surface point is the branching direction, it can be seen in FIG. 3 that the guide route for the vehicle 100 is a route that turns right at the branch point. In addition, in FIG. 3, it is assumed that the position of the first road surface point overlaps with the position of the branch point. Here, it is assumed that the branch determining unit 12 has determined for the first time, after the power of the vehicle 100 is turned on and the vehicle 100 starts traveling, that the vehicle 100 is approaching a certain branch point; that is, the distance from the vehicle 100 to the branch point equals the approach determination threshold. A first projection image in which the distance between the first road surface point and the second road surface point is the reference distance is therefore projected onto the road surface in front of the vehicle 100.
  • In step ST4, when the projection control unit 14 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100, a control unit (not shown) of the projection control device 1 determines whether the power of the vehicle 100 has been turned off. If the power of the vehicle 100 has not been turned off, the operation of the projection control device 1 returns to the process of step ST1 and performs the processing from step ST1 again. Here, it is assumed that the power of the vehicle 100 is not turned off.
  • If, in step ST2, the branch determining unit 12 again determines that the vehicle 100 is approaching the branch point ("YES" in step ST2), the guide image generation unit 13 generates a first guide image based on the guide route information acquired by the guide route information acquisition unit 11 in step ST1 (step ST3), and the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
  • the road surface in front of the vehicle 100 is in a state where the first projection image is projected, for example, as shown in FIG. 4 .
  • FIG. 4 illustrates another example of how the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 in the first embodiment.
  • FIG. 4 is an overhead view of a branch point and the vehicle 100 approaching the branch point.
  • Specifically, FIG. 4 is an overhead view of the branch point and the vehicle 100 when the vehicle 100 has approached the branch point further than in the state shown in FIG. 3.
  • In this case, the guide image generation unit 13 determines, as the second road surface point, a position closer to the first road surface point than the position shown in FIG. 3 by an amount corresponding to the distance by which the vehicle 100 has approached the branch point. That is, the guide image generation unit 13 generates a first guide image in which the second end is brought closer to the first end by an amount corresponding to the distance by which the vehicle 100 has come closer to the branch point than in FIG. 3.
  • the projection control unit 14 causes the projection device 3 to project a first guide image generated by the guide image generation unit 13 with the second end close to the first end on the road surface in front of the vehicle 100. .
  • the first projected image shown in FIG. 4 becomes shorter toward the first road surface point than the first projected image shown in FIG. 3.
  • That is, in the arrow that is generated by the guide image generation unit 13 and projected onto the road surface in front of the vehicle 100 by the projection control unit 14, the end point of the arrow shown in FIG. 4 is closer to the starting point than in the arrow shown in FIG. 3.
  • FIG. 4 for ease of understanding, the difference between the first projection image shown in FIG. 3 and the first projection image shown in FIG. 4 is shown by a dotted line.
  • For example, the guide image generation unit 13 stores the branch determination result, as a flag (hereinafter referred to as the "branch approach flag"), in a location that the projection control device 1 can refer to.
  • When the branch approach flag has its initial value of "0", the guide image generation unit 13 determines that it has been determined for the first time that the vehicle 100 is approaching a certain branch point.
  • When the guide image generation unit 13 determines, in accordance with the end point determination conditions, a point that is the reference distance away from the first road surface point as the second road surface point, it sets the branch approach flag to "1".
  • In addition, the guide image generation unit 13 stores the determined coordinates of the first road surface point and of the second road surface point in a storage unit provided at a location that the projection control device 1 can refer to.
  • On the other hand, when the guide image generation unit 13 refers to the branch approach flag and the branch approach flag is set to "1", it determines that it has already been determined that the vehicle 100 is approaching a certain branch point, in other words, that the first projection image is already being projected onto the road surface in front of the vehicle 100.
  • In that case, the guide image generation unit 13 determines, as the new second road surface point, a point obtained by moving the position of the second road surface point stored in the storage unit closer to the first road surface point in accordance with the end point determination conditions, according to the distance from the vehicle 100 to the branch point. Then, the guide image generation unit 13 updates the coordinates of the second road surface point stored in the storage unit. As a result, the projection control device 1 generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and can project, in front of the vehicle 100, a first projected image in which the second road surface point approaches the first road surface point, in other words, an arrow whose end point approaches its starting point.
  • Note that, in step ST2, the branch determining unit 12 also determines whether the vehicle 100 has reached the branch point. If it is determined that the vehicle 100 has reached the branch point, the branch determination section 12 outputs the branch point arrival information to the projection control section 14. In this case, the projection control unit 14 ends the projection of the first guide image, the operation of the projection control device 1 skips the processing of steps ST3 and ST4, and returns to the processing of step ST1. At this time, the control section of the projection control device 1 clears the branch approach flag and the coordinates of the first road surface point and the second road surface point stored in the storage section. The control unit also clears the branch approach flag and the coordinates of the first road surface point and the second road surface point stored in the storage unit when the power of the vehicle 100 is turned off or turned on.
  • the projection control device 1 repeats the above operations until, for example, the power of the vehicle 100 is turned off.
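  • Purely as an orientation aid, the repeated operation of steps ST1 to ST4 described above might be sketched as the loop below; the object interfaces (navigation, branch_judge, image_gen, projector, vehicle) are hypothetical stand-ins for the units of FIG. 1, not an API defined by the patent.

```python
def projection_control_loop(navigation, branch_judge, image_gen, projector, vehicle):
    """One possible rendering of the ST1-ST4 loop (hypothetical interfaces)."""
    while vehicle.power_is_on():
        route_info = navigation.get_guidance_route_info()               # ST1: acquire
        if branch_judge.vehicle_reached_branch(route_info):
            projector.stop_projection()                                 # end projection
            continue
        if not branch_judge.vehicle_approaching_branch(route_info):     # ST2: judge
            continue
        guide_image = image_gen.generate_first_guide_image(route_info)  # ST3: generate
        projector.project(guide_image)                                  # ST4: project
```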
  • As described above, when the projection control device 1 determines, based on the guide route information acquired from the navigation device 2, that the vehicle 100 is approaching a branch point on the guide route, it generates a first guide image indicating the existence of the branch point and the branching direction, and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100.
  • At that time, the projection control device 1 generates the first guide image based on the position of the vehicle 100, the position of the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface onto which the second end of the first guide image, on the side opposite to the first end, is projected, such that the second end approaches the first end as the vehicle 100 approaches the branch point.
  • Thereby, the projection control device 1 can allow the occupants of the vehicle 100 to grasp the distance to the branch point and the branching direction without the branch point leaving their field of vision.
  • As the vehicle 100 approaches the branch point, the length of the first guide image projected onto the road surface, that is, of the first projected image, becomes shorter toward the branch point.
  • Therefore, the left and right ends of the first projected image, in other words the start and end points of the arrow projected onto the road surface, do not fall out of the occupant's field of vision.
  • As a result, the occupant of the vehicle 100 can grasp the distance to the branch point and the branching direction without having to move his or her line of sight in a direction in which the branch point falls out of sight.
  • the first guide image is an arrow image, but this is only an example.
  • the first guide image may be an image that starts near the branch point and extends in the branch direction to indicate the branch direction, and is an image that indicates the presence of the branch point and the branch direction.
  • In Embodiment 1, the projection control device 1 generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100. It is sufficient that the projection device 3 projects the first guide image such that, as viewed from the occupant of the vehicle 100, the first projected image becomes shorter toward the branch point as the vehicle 100 approaches the branch point.
  • Further, in Embodiment 1, the position of the first road surface point is the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to the lane after the branch, but this is only an example.
  • For example, the projection control device 1 may set, as the first road surface point, a point within a predetermined range from the branch point. Note that the projection control device 1 can calculate the coordinates of the position of a point within a predetermined range from the branch point from the guide route information.
  • However, the position of the intersection of the straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling and the straight line that passes through the widthwise center of the lane after the branch and is parallel to the lane after the branch is a location that, more than the position of any other point, makes it easy for the occupants of the vehicle 100 to recognize the existence of the branch point and to gauge the distance to the branch point. This is because it is easier for the occupants of the vehicle 100 to confirm the starting point of the first projected image, that is, the vicinity of the branch point, if the starting point of the first projected image is in the forward direction.
  • Further, in Embodiment 1, the first projection image is an image on the road surface of the lane after the branch in front of the vehicle 100 (see FIGS. 3 and 4), but this is only an example.
  • The projection control device 1 may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch. It is sufficient that the first road surface point is located near the branch point.
  • Further, in Embodiment 1, the projection control device 1 is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13, the projection control section 14, and the control section (not shown) are assumed to be included in the in-vehicle device.
  • The present invention is not limited to this; some of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, and the other components may be provided in a server connected to the in-vehicle device via a network.
  • Alternatively, the guide route information acquisition section 11, the branch determination section 12, the guide image generation section 13, the projection control section 14, and the control section may all be included in the server.
  • FIGS. 5A and 5B are diagrams showing examples of the hardware configuration of the projection control device 1 according to the first embodiment.
  • The functions of the guide route information acquisition section 11, the branch determination section 12, the guide image generation section 13, the projection control section 14, and the control section are realized by a processing circuit 1001. That is, the projection control device 1 includes the processing circuit 1001 for performing control to cause the projection device 3 to project the first guide image, generated based on the guide route information acquired from the navigation device 2, onto the road surface in front of the vehicle 100.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 5A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 5B.
  • When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 is, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • When the processing circuit is the processor 1004, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in memory 1005.
  • The processor 1004 reads and executes the program stored in the memory 1005 to realize the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown). That is, the projection control device 1 includes the memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST4 in FIG. 2 described above.
  • It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown).
  • Here, the memory 1005 is, for example, a non-volatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • Note that some of the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit may be realized by dedicated hardware, and some may be realized by software or firmware.
  • For example, the function of the guidance route information acquisition unit 11 can be realized by the processing circuit 1001 as dedicated hardware, and the functions of the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) can be realized by the processor 1004 reading and executing a program stored in the memory 1005.
  • the projection control device 1 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2 or the projection device 3.
  • a storage unit (not shown) includes a memory 1005 and the like.
  • As described above, the projection control device 1 according to Embodiment 1 includes: the guide route information acquisition unit 11 that acquires guide route information regarding the guide route of the vehicle 100; the branch determining unit 12 that determines, based on the guide route information acquired by the guide route information acquisition unit 11, whether the vehicle 100 is approaching a branch point on the guide route; the guide image generating unit 13 that, when the branch determining unit 12 determines that the vehicle 100 is approaching the branch point, generates a first guide image indicating the existence of the branch point and the branching direction based on the guide route information; and the projection control unit 14 that causes the projection device 3 to project the first guide image generated by the guide image generating unit 13 onto the road surface in front of the vehicle 100.
  • The guide image generating unit 13 is configured to generate the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface onto which the second end, on the side opposite to the first end, is projected, the second road surface point being located farther from the first road surface point in the branching direction with respect to the branch point, and to generate the first guide image such that the second end approaches the first end as the vehicle 100 approaches the branch point. Therefore, in route guidance by projecting images onto the road surface, the projection control device 1 can allow the occupants of the vehicle 100 to grasp the distance to the branch point and the branching direction without the branch point leaving their field of vision.
  • the projection control device causes the projection device to project a first guide image indicating the presence of a branch point and the branch direction on the road surface in front of the vehicle.
  • In Embodiment 2, the projection control device determines whether there is a shielding object that blocks the region, within the projectable region of the projection device, onto which the projection device projects the first guide image (hereinafter referred to as the "first guide image projection region"), and, if it is determined that there is a shielding object, causes the projection device to project a guidance image indicating the direction of the branch point as seen from the vehicle (hereinafter referred to as the "second guidance image").
  • FIG. 6 is a diagram showing a configuration example of a projection control device 1a according to the second embodiment.
  • the projection control device 1a is mounted on the vehicle 100.
  • As shown in FIG. 6, the projection control device 1a according to the second embodiment is connected to a sensor 4 in addition to the navigation device 2 and the projection device 3.
  • the sensor 4 is mounted on the vehicle 100 and detects objects present in front of the vehicle 100.
  • the sensor 4 outputs information regarding an object detected in front of the vehicle 100 (hereinafter referred to as "sensor information") to the projection control device 1a.
  • the sensor 4 is assumed to be an imaging device that captures an image in front of the vehicle 100.
  • the sensor information is a captured image of the front of the vehicle 100.
  • In the second embodiment, the sensor 4 will be described as an imaging device, and the sensor information will be described as a captured image. Note that this is just an example, and the sensor 4 is not limited to an imaging device.
  • The sensor 4 may be, for example, a distance sensor such as a LiDAR that can detect the distance to an object present in front of the vehicle 100.
  • The projection control device 1a according to the second embodiment differs from the projection control device 1 according to the first embodiment in that it includes a sensor information acquisition section 15 and a shielding object detection section 16. Further, the specific operations of the guide image generation unit 13a and the projection control unit 14a in the projection control device 1a according to the second embodiment differ from the specific operations of the guide image generation unit 13 and the projection control unit 14 in the projection control device 1 according to the first embodiment, respectively.
  • the sensor information acquisition unit 15 acquires a captured image captured by the image capture device from the image capture device. Details of the guide image generation unit 13a will be described later.
  • the sensor information acquisition unit 15 outputs the acquired captured image to the obstructing object detection unit 16 together with information regarding the first guide image projection area output from the guide image generation unit 13a.
  • the shielding object detection unit 16 detects whether or not there is a shielding object that blocks the first guide image projection area, based on the captured image acquired by the sensor information acquisition unit 15. Note that since the installation position and detection range of the sensor 4, in this case the installation position and viewing angle of the imaging device, are known in advance, the shielding object detection unit 16 uses the coordinate system of the captured image and the first guide image projection area. The relative positional relationship with the coordinate system can be grasped.
  • the coordinate system of the first guide image projection area is a target area coordinate system, and is a so-called "geographical coordinate system.”
  • The obstructing object detection unit 16 detects whether or not there is an obstructing object that obstructs the first guide image projection area by, for example, performing a known image recognition process or pattern matching on the captured image.
  • the shielding object detection section 16 outputs a detection result (hereinafter referred to as "obstruction detection result") as to whether or not there is a shielding object that blocks the first guide image projection area to the guide image generation section 13a.
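  • As a simple illustration of the check described here, assuming the detector already yields each detected object's footprint as a bounding box in the same target area coordinate system as the first guide image projection area (an assumption, since the text only says image recognition or pattern matching is used), an axis-aligned overlap test could look like this:

```python
def blocks_projection_area(object_box, projection_box) -> bool:
    """Axis-aligned overlap test between a detected object's footprint and the
    first guide image projection area, both given as (x_min, y_min, x_max, y_max)."""
    ox1, oy1, ox2, oy2 = object_box
    px1, py1, px2, py2 = projection_box
    return not (ox2 < px1 or px2 < ox1 or oy2 < py1 or py2 < oy1)
```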
  • the guide image generating unit 13a calculates the coordinates of the first road point and the coordinates of the second road point.
  • As in Embodiment 1, the position of the first road surface point is, for example, the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to the lane after the branch.
  • Based on the coordinates of the position of the first road surface point, the guide image generating section 13a calculates, in accordance with the end point determination conditions, the coordinates of the second road surface point for the case where the branch determining section 12 first determines that the vehicle 100 is approaching a certain branch point.
  • The guide image generating unit 13a may calculate the coordinates of the position of the second road surface point using the same method as the method by which the guide image generating unit 13 calculates the coordinates of the position of the second road surface point, which has already been explained in Embodiment 1.
  • the guide image generation unit 13a calculates a first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point.
  • the guide image generation unit 13a outputs information regarding the first guide image projection area to the sensor information acquisition unit 15.
  • the information regarding the first guide image projection area may be, for example, information indicating the coordinates of the entire circumference of the first projection image, or may be the coordinates of the four corners of the minimum rectangle surrounding the first projection image.
  • the guide image generating section 13a generates a first guide image or a first guide image and a second guide image based on the shielding object detection result output from the shielding object detecting section 16. Specifically, when the shielding object detection unit 16 outputs a shielding object detection result indicating that there is a shielding object that blocks the first guiding image projection area, the guiding image generating unit 13a generates a first guiding image and a second guiding image. 2. Generate a guide image. The guide image generation unit 13a generates a first guide image when the shield detection unit 16 outputs a shield detection result indicating that there is no shield that blocks the first guide image projection area. In this case, the guide image generation unit 13a does not generate the second guide image.
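  • The branching behavior described above (generate only the first guide image when nothing blocks the projection area, generate both images when something does) can be summarized in a short sketch; the method names are placeholders, not part of the patent.

```python
def generate_guide_images(guide_image_generator, obstruction_detected: bool):
    """Generate the first guide image, and additionally the second guide image
    only when the first guide image projection area is blocked by an obstruction."""
    first = guide_image_generator.generate_first_guide_image()
    second = guide_image_generator.generate_second_guide_image() if obstruction_detected else None
    return first, second
```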
  • Note that the guide image generating unit 13a may generate the first guide image using the same method as the method by which the guide image generating unit 13 generates the first guide image, which has already been explained in Embodiment 1, so a detailed description thereof is omitted.
  • In the second embodiment, it is assumed that the second guide image is an image showing a rectangle.
  • Specifically, it is assumed that the second guide image is projected as an image of a preset width, in this case an image representing a rectangle, extending between a point located a predetermined distance in front of the vehicle 100 (hereinafter referred to as the "vehicle front point") and the first road surface point.
  • the second guide image projected onto the road surface is also referred to as a "second projected image.” Note that the predetermined distance in front of the vehicle 100 is appropriately set by an administrator or the like.
  • An administrator or the like sets the predetermined distance in front of the vehicle 100 to a distance smaller than at least the approach determination threshold value, and stores it in a location that can be referenced by the projection control device 1a.
  • the end of the second guide image corresponding to the first road surface point is referred to as a third end.
  • the end of the second guide image corresponding to the point in front of the vehicle is referred to as a fourth end.
  • the guide image generation unit 13a calculates the coordinates of a point in front of the vehicle from the position of the vehicle 100. Then, the guide image generation unit 13a inputs the distance from the vehicle 100 to the branch point and the coordinates of the first road surface point to the first machine learning model, and obtains the coordinates of the third end of the second guide image. Note that here, the coordinates of the third end of the second guide image are the same as the coordinates of the first end of the first guide image. Further, the guide image generation unit 13a converts the coordinates of the point in front of the vehicle into the coordinates of the display means coordinate system based on the information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system, and generates a second guide image.
  • the information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system is, for example, a coordinate conversion parameter that can convert the coordinates of the target area coordinate system to the coordinates of the display means coordinate system.
  • For example, before shipping the product of the projection control device 1a, the administrator or the like projects the second guide image from the projection device 3 in advance, obtains a plurality of measurement points of the second projection image on the projection surface onto which the second guide image is projected (for example, the road surface) and a plurality of corresponding measurement points of the second guide image, and derives a positional correspondence relationship between the second projection image and the second guide image.
  • The administrator or the like then calculates, based on the correspondence between positions on the second projection image and positions on the second guide image, coordinate transformation parameters that can convert the coordinates of the target area coordinate system into the coordinates of the display means coordinate system, and stores the calculated coordinate transformation parameters in a location where the projection control device 1a can refer to them.
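  • A minimal sketch of how such coordinate transformation parameters could be derived from the measured correspondences is shown below; it assumes the relationship can be modelled as a planar homography estimated with the direct linear transform, which is one plausible realisation rather than the method prescribed by the embodiment.
```python
import numpy as np

def estimate_homography(road_pts, image_pts):
    """Estimate a 3x3 homography mapping road-surface (target area) coordinates to
    display (projector image) coordinates from at least four point correspondences,
    using the direct linear transform (DLT)."""
    assert len(road_pts) == len(image_pts) >= 4
    rows = []
    for (x, y), (u, v) in zip(road_pts, image_pts):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    # The homography is the null vector of A: the right singular vector
    # associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def road_to_display(H, x, y):
    """Apply the homography to a road-surface point and return display coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```
With at least four well-spread correspondences measured on the road surface, the resulting parameters could be stored once and reused at run time, which matches the idea of preparing them before shipping.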
  • Upon obtaining the coordinates of the first end of the first guide image and the coordinates of the second end of the first guide image, the guide image generation unit 13a generates, based on these coordinates, a second guide image in accordance with rules stored in a storage unit (not shown) (hereinafter referred to as the "second guide image rules").
  • The storage unit stores, together with the first guide image rules, the rules for creating the second guide image to be projected by the projection device 3, which are set in advance by an administrator or the like.
  • the guide image generation unit 13a generates a second guide image indicating the direction of the branch point as seen from the vehicle 100, in accordance with the second guide image rules stored in the storage unit.
  • The second guide image rules include, for example, a rule that the second guide image is painted blue with a width of 70 cm, extending in the direction along the lane in which the vehicle 100 is traveling.
  • In this case, the guide image generation unit 13a generates a second guide image that, when projected, appears on the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called lane) as a rectangle with a width of 70 cm extending from the fourth end toward the third end, in other words, from the vehicle front point toward the branch point.
  • The second guide image rules specify, for example, the color, pattern, width, and the like; it is sufficient that the rules are set such that, when the generated second guide image is projected onto the road surface in front of the vehicle 100, the resulting second projected image allows the occupant of the vehicle 100 to recognize the direction of the branch point as seen from the vehicle 100.
  • the rules for the second guide image are set in advance by the administrator or the like, but this is only an example. For example, the occupant of the vehicle 100 or the like may be able to set the rules for the second guide image as appropriate.
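  • Purely as an illustration, the second guide image rules could be held in a small configuration record such as the following; the field names and the fallback values (blue, 70 cm) echo the example above, and everything else is an assumption.
```python
from dataclasses import dataclass

@dataclass
class SecondGuideImageRules:
    """Assumed container for the second guide image rules read from the storage unit."""
    color: str = "blue"      # fill colour of the rectangle
    pattern: str = "solid"   # fill pattern
    width_m: float = 0.70    # width of the rectangle in metres (70 cm in the example)

def load_second_guide_image_rules(storage: dict) -> SecondGuideImageRules:
    """Read the rules set in advance by an administrator or the like; fall back to
    the example values when a field is missing (assumed behaviour)."""
    return SecondGuideImageRules(
        color=storage.get("color", "blue"),
        pattern=storage.get("pattern", "solid"),
        width_m=storage.get("width_m", 0.70),
    )
```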
  • After generating the first guide image, or the first guide image and the second guide image, the guide image generation unit 13a outputs the generated image or images to the projection control section 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
  • The projection control section 14a causes the projection device 3 to project the first guide image generated by the guide image generation section 13a onto the road surface in front of the vehicle 100.
  • When a shielding object is detected on the road surface in front of the vehicle 100, the projection control section 14a also causes the projection device 3 to project the second guide image generated by the guide image generation section 13a.
  • FIG. 7 is a flowchart for explaining an example of the operation of the projection control device 1a according to the second embodiment. For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts running, the projection control device 1a repeatedly performs the operation shown in the flowchart of FIG. 7 until the power of the vehicle 100 is turned off.
  • When the branch determining unit 12 determines in step ST12 that the vehicle 100 is approaching a branch point ("YES" in step ST12), the guide image generating unit 13a calculates the coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point. For example, assume that the branch determining unit 12 determines that the vehicle 100 is approaching a branch point for the first time after the power of the vehicle 100 is turned on and the vehicle 100 starts traveling. The guide image generation unit 13a calculates a first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point (step ST13). The guide image generation unit 13a outputs information regarding the first guide image projection area to the sensor information acquisition unit 15.
  • The sensor information acquisition unit 15 acquires the captured image captured by the imaging device (step ST14). It is assumed here that this is the first time, since the power of the vehicle 100 was turned on and the vehicle 100 started traveling, that the branch determining unit 12 has determined that the vehicle 100 is approaching a branch point. Therefore, the projection control device 1a is in a state where the projection device 3 has not yet projected the guide images (the first guide image and the second guide image) regarding the branch point onto the road surface in front of the vehicle 100.
  • the captured image acquired by the sensor information acquisition unit 15 is a captured image of the front of the vehicle 100 on which the first guide image and the second guide image are not projected (hereinafter referred to as "pre-projection captured image”).
  • the sensor information acquisition unit 15 outputs the acquired pre-projection captured image to the shielding object detection unit 16 together with the information regarding the first guide image projection area output from the guide image generation unit 13a.
  • The shielding object detection unit 16 detects whether or not there is a shielding object that blocks the first guide image projection area based on the pre-projection captured image acquired by the sensor information acquisition unit 15 in step ST14 (step ST15).
  • the shielding object detection section 16 outputs the shielding object detection result to the guide image generation section 13a.
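  • The embodiment does not fix how the shielding object detection unit 16 decides that an object blocks the first guide image projection area; the sketch below assumes the detector yields axis-aligned bounding boxes in the same road-surface coordinate system as the projection area and simply tests for overlap.
```python
def rects_overlap(a, b) -> bool:
    """Axis-aligned rectangle overlap test; a and b are (xmin, ymin, xmax, ymax)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def shield_blocks_projection_area(detected_boxes, projection_area_box) -> bool:
    """Return True when any detected object (for example, another vehicle) overlaps
    the first guide image projection area (assumed rectangular for simplicity)."""
    return any(rects_overlap(box, projection_area_box) for box in detected_boxes)
```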
  • When the shielding object detection section 16 detects in step ST15 that there is a shielding object that blocks the first guide image projection area ("YES" in step ST15), the guide image generating section 13a generates a first guide image and a second guide image (step ST16).
  • the guide image generation unit 13a outputs the generated first guide image and second guide image to the projection control unit 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
  • The projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image generated by the guide image generation unit 13a in step ST16 onto the road surface in front of the vehicle 100 (step ST17).
  • As a result, the first projection image and the second projection image are projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 8.
  • FIG. 8 is a diagram for explaining an example of a state in which, in the second embodiment, the projection control unit 14a of the projection control device 1a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100. Note that FIG. 8 is an overhead view of a branch point and the vehicle 100 approaching the branch point. In FIG. 8, the branch point is indicated by "BP", the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1", the first road surface point is indicated by "E1", and the second road surface point is indicated by "E2". Since the direction of the second road surface point is the branching direction, it can be seen in FIG. 8 that the guide route for the vehicle 100 is a route that turns left at the branch point.
  • the position of the first road surface point overlaps with the position of the branch point.
  • In FIG. 8, another vehicle exists in front of the vehicle 100 and is indicated by "V". A building exists on the side of the vehicle 100 and is indicated by "BLDG".
  • In FIG. 8, the first projection image, which is supposed to be projected on the lane after the branch (the lane referred to here is a so-called lane), is not projected onto that lane.
  • the third road surface point is indicated by "E3”, and the point in front of the vehicle is indicated by "E4". The branch point, the first road surface point, and the third road surface point are hidden by other vehicles.
  • That is, the first projected image is blocked by the other vehicle and the building and is not projected onto the lane after the branch, but the second projected image, which extends from the vehicle front point toward the branch point, is projected without being blocked by the other vehicle or the building. Even if the occupant of the vehicle 100 cannot see the first projected image, he or she can recognize that there is a branch point ahead by looking at the second projected image.
  • On the other hand, when the shielding object detection section 16 detects in step ST15 that there is no shielding object that blocks the first guide image projection area ("NO" in step ST15), the guide image generating unit 13a generates a first guide image (step ST18).
  • the guide image generation unit 13a outputs the generated first guide image to the projection control unit 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
  • the projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a in step ST18 onto the road surface in front of the vehicle 100 (step ST19).
  • After performing the process of step ST17 or the process of step ST19, the operation of the projection control device 1a returns to step ST11 and the subsequent processes are performed again.
  • In step ST14, which is performed again, since the projection device 3 has already projected the first guide image, or the first guide image and the second guide image, the sensor information acquisition unit 15 acquires a captured image in which the front of the vehicle 100, onto which the first projection image, or the first projection image and the second projection image, is projected, is imaged (hereinafter referred to as "post-projection captured image").
  • the shielding object detection unit 16 detects whether or not there is a shielding object that shields the first guide image projection area based on the captured image after projection.
  • When the shielding object detection unit 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generating unit 13a generates a first guide image and a second guide image (step ST16), and the projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100 (step ST17).
  • If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a switches the control over the projection device 3 from projecting only the first guide image to projecting the first guide image and the second guide image. If the projection device 3 was previously projecting the first guide image and the second guide image, the projection control unit 14a continues to control the projection device 3 to project the first guide image and the second guide image.
  • When the shielding object detection unit 16 detects that there is no shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19). If the projection device 3 was previously projecting the first guide image and the second guide image, the projection control unit 14a switches the control over the projection device 3 from projecting the first guide image and the second guide image to projecting only the first guide image. If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a continues to control the projection device 3 to project only the first guide image.
  • the projection control device 1a repeats the above operations until, for example, the power of the vehicle 100 is turned off.
  • Note that, in the above, the processing in step ST16 and the processing in step ST18 are assumed to be performed after the processing in step ST15, but this is only an example.
  • the processing in step ST16 and the processing in step ST18 may be performed together in step ST12.
  • That is, the guide image generation unit 13a may generate the first guide image and the second guide image in advance when it calculates the first guide image projection area, before the obstruction detection unit 16 detects the presence or absence of an obstruction.
  • In that case, the projection control unit 14a may control, based on the obstruction detection result, whether the projection device 3 projects the first guide image and the second guide image or projects only the first guide image.
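  • The control decision described above can be summarised by a sketch such as the following, in which the function name and arguments are assumptions used only for illustration.
```python
def select_images_to_project(shield_detected: bool, first_image, second_image):
    """Choose which pre-generated guide images the projection device is asked to
    project, based on the obstruction detection result (FIG. 7 variant: the first
    guide image is always projected, the second only when a shield is detected)."""
    if shield_detected:
        return [first_image, second_image]
    return [first_image]
```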
  • As described above, when the projection control device 1a detects that there is a blocking object that blocks the area on the road surface onto which the first guide image is projected (the first guide image projection area), the projection control device 1a generates, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, and causes the projection device 3 to project the second guide image, in addition to the first guide image, onto the road surface in front of the vehicle 100.
  • Therefore, even when the projection control device 1a cannot present to the occupant of the vehicle 100, as intended, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction, because of the presence of a shielding object on the road surface in front of the vehicle 100, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image.
  • the projection control device 1a can make the occupants of the vehicle 100 aware that they are about to reach a fork in the road.
  • In addition, when there is no obstruction on the road surface in front of the vehicle 100 and a first projected image that allows the occupant of the vehicle 100 to recognize the presence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. As a result, the projection device 3 can suppress the power consumption that would otherwise be spent on unnecessary projection of the second guide image.
  • In the above description, when the shielding object detection section 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generation section 13a generates the first guide image and the second guide image, and the projection control section 14a causes the projection device 3 to project the first guide image and the second guide image generated by the guide image generation section 13a onto the road surface in front of the vehicle 100. In this case, the first projection image and the second projection image are projected onto the road surface in front of the vehicle 100, as shown in FIG. 8, for example, and the first projection image is blocked by the other vehicle, the building, or the like and is not projected onto the lane after the branch.
  • Alternatively, when the projection control device 1a detects that there is a shielding object that blocks the first guide image projection area, the projection control device 1a may stop the control for causing the projection device 3 to project the first guide image so that the first guide image is not projected from the projection device 3. Specifically, in the projection control device 1a, when the blocking object detection unit 16 detects that there is a blocking object that blocks the first guide image projection area, the projection control unit 14a causes the projection device 3 to project the second guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100, and stops the projection of the first guide image.
  • a configuration example of the projection control device 1a in this case is as shown in FIG. 6.
  • When the guide image generation section 13a generates the first guide image and the second guide image in step ST16, the projection control section 14a stops the projection of the first guide image if it has been causing the projection device 3 to project the first guide image (step ST171).
  • If the projection control unit 14a has not been causing the projection device 3 to project the first guide image, it maintains the state in which the first guide image is not projected. Furthermore, the projection control unit 14a causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100 (step ST172).
  • FIG. 10 is a diagram for explaining an example of a state in which, in the second embodiment, the projection control unit 14a of the projection control device 1a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 and to project the second guide image onto the road surface in front of the vehicle 100.
  • FIG. 10 is an overhead view of a branch point and the vehicle 100 approaching the branch point. The state shown in FIG. 10 differs from the state shown in FIG. 8 in that the projection device 3 does not project the first guide image, in other words, in that the first projection image is not projected onto the road surface in front of the vehicle 100. Note that in FIG. 10, the first projection image shown in FIG. 8 is drawn with a dotted line for ease of understanding.
  • After performing the process of step ST172 or the process of step ST19, the operation of the projection control device 1a returns to step ST11 and the subsequent processes are performed again.
  • In step ST14, which is performed again, the sensor information acquisition unit 15 acquires a post-projection captured image.
  • the shielding object detection unit 16 detects whether or not there is a shielding object that shields the first guide image projection area based on the captured image after projection.
  • When the shielding object detection unit 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generating unit 13a generates a first guide image and a second guide image (step ST16), and the projection control unit 14a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 (step ST171) and to project the second guide image onto the road surface in front of the vehicle 100 (step ST172). If the projection device 3 was previously projecting the first guide image, the projection control unit 14a stops the projection of the first guide image. If the projection device 3 was previously projecting only the second guide image, the projection control unit 14a continues to control the projection device 3 to project only the second guide image.
  • When the shielding object detection unit 16 detects that there is no shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19).
  • If the projection device 3 was previously projecting only the second guide image, the projection control unit 14a switches the control over the projection device 3 from projecting only the second guide image to projecting only the first guide image. If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a continues to control the projection device 3 to project only the first guide image.
  • the projection control device 1a repeats the above operations until, for example, the power of the vehicle 100 is turned off.
  • the processing is performed in the order of step ST171 and step ST172, but this is only an example.
  • the order of the processing in step ST171 and the processing in step ST172 may be reversed, or the processing in step ST171 and the processing in step ST172 may be performed in parallel.
  • the guide image generation unit 13a generates the first guide image in step ST16, but this is only an example.
  • the guide image generation unit 13a does not need to generate the first guide image in step ST16.
  • the guide image generation unit 13a can reduce the processing load by not generating the first guide image.
  • As described above, when the projection control device 1a detects that there is a blocking object that blocks the area on the road surface onto which the first guide image is projected (the first guide image projection area), the projection control device 1a may generate a second guide image indicating the direction of the branch point as seen from the vehicle 100, cause the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100, and cause the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100.
  • In this case as well, when the projection control device 1a cannot present to the occupant of the vehicle 100, as intended, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction, because of the presence of a shielding object on the road surface in front of the vehicle 100, stopping the projection of the first guide image allows the projection control device 1a to avoid causing confusion for the occupant of the vehicle 100.
  • a configuration example of the projection control device 1a in this case is as shown in FIG. 6.
  • When the projection control unit 14a switches the projection control for the projection device 3, the projection control unit 14a sets a projection switching flag to "1".
  • the projection switching flag is stored in a location that can be referenced by the projection control device 1a.
  • In addition, the projection control section 14a stores the control details for the projection device 3 after switching in the storage section. Specifically, when the projection control unit 14a switches from control for causing the projection device 3 to project the first guide image to control for newly projecting the second guide image, the projection control unit 14a sets the projection switching flag to "1".
  • the projection control section 14a stores in the storage section the fact that the projection device 3 is caused to project the first guide image and the second guide image.
  • Similarly, when the projection control unit 14a switches from control for causing the projection device 3 to project the first guide image and the second guide image to control for projecting only the first guide image, the projection control unit 14a sets the projection switching flag to "1" and stores in the storage section the fact that the projection device 3 is caused to project only the first guide image.
  • The guide image generation unit 13a refers to the projection switching flag. When the projection switching flag is set to "1", the guide image generation unit 13a generates the first guide image, or the first guide image and the second guide image, in accordance with the control details for the projection device 3 stored in the storage unit.
  • For example, if the storage unit stores that the projection device 3 is to project the first guide image and the second guide image, the guide image generation unit 13a generates the first guide image and the second guide image. If the storage unit stores that the projection device 3 is to project only the first guide image, the guide image generation unit 13a generates the first guide image.
  • the projection switching flag and the control details for the projection device 3 after switching which are stored in the storage unit, are cleared at the same timing as the branch approach flag. If the storage unit does not store the control details for the projection device 3 after switching, the projection control unit 14a determines that the projection control for the projection device 3 has not been switched yet.
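  • A compact sketch of this bookkeeping is given below; the class, the attribute names, and the clear() helper are assumptions, while the flag semantics follow the description above.
```python
class ProjectionSwitchState:
    """Assumed storage for the projection switching flag and the control details
    for the projection device after switching."""
    def __init__(self):
        self.switch_flag = 0               # set to 1 once the projection control has been switched
        self.control_after_switch = None   # e.g. "first_and_second" or "first_only"

    def record_switch(self, control_after_switch: str):
        self.switch_flag = 1
        self.control_after_switch = control_after_switch

    def clear(self):
        """Cleared at the same timing as the branch approach flag."""
        self.switch_flag = 0
        self.control_after_switch = None

def images_to_generate(state: ProjectionSwitchState) -> str:
    """Once switched, keep generating according to the stored control details,
    regardless of the latest obstruction detection result."""
    if state.switch_flag == 1 and state.control_after_switch is not None:
        return state.control_after_switch
    return "undecided"  # fall back to the obstruction-detection based flow
```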
  • In this case, the operation of the projection control device 1a is, for example, as shown in the flowchart of FIG. 11 instead of the operation shown in the flowchart of FIG. 7.
  • the specific contents of steps ST101 to ST103, steps ST105 to ST107, and step ST110 in FIG. 11 are the same as the specific contents of steps ST11 to ST16 and step ST18 in FIG. 7, respectively. Therefore, duplicate explanations will be omitted.
  • In step ST104, the guide image generation unit 13a determines whether or not the projection control for the projection device 3 has been switched (step ST104). Specifically, the guide image generation unit 13a determines whether the projection switching flag is set to "1".
  • If it is determined in step ST104 that the projection control for the projection device 3 has not been switched, that is, if the projection switching flag is not set to "1" ("NO" in step ST104), the guide image generation unit 13a outputs the information regarding the first guide image projection area calculated in step ST103 to the sensor information acquisition section 15. The operation of the projection control device 1a then proceeds to step ST105.
  • On the other hand, if it is determined that the projection control for the projection device 3 has been switched ("YES" in step ST104), the guide image generation unit 13a determines whether the projection control unit 14a has switched to control for causing the projection device 3 to project the second guide image (step ST109).
  • the guide image generation unit 13a can determine whether the projection control unit 14a has switched to controlling the projection device 3 to project the second guide image by referring to the storage unit.
  • When the guide image generation unit 13a determines in step ST109 that the projection control unit 14a has switched to control for causing the projection device 3 to project the second guide image ("YES" in step ST109), the operation of the projection control device 1a proceeds to step ST107. On the other hand, if the guide image generation unit 13a determines that the projection control unit 14a has not switched to control for causing the projection device 3 to project the second guide image ("NO" in step ST109), in other words, if it determines that the control has been switched to causing the projection device 3 to project only the first guide image, the operation of the projection control device 1a proceeds to step ST110.
  • In this way, once the projection control device 1a switches the projection control over the projection device 3 from a state in which the first guide image is projected to a state in which the second guide image is newly projected, or from a state in which the second guide image is projected to a state in which the first guide image is projected, the projection control device 1a maintains the projection control over the projection device 3 after switching, regardless of the presence or absence of a shielding object.
  • Thereby, the projection control device 1a can prevent the occupant of the vehicle 100 from being bothered by frequent changes in the projection state of the first projection image or the second projection image projected in front of the vehicle 100.
  • the projection control device 1a may perform the processing of step ST171 and step ST172 of the flowchart of FIG. 9 instead of the processing of step ST107.
  • In the above description, the detection of whether or not there is a blocking object that blocks the first guide image projection area, performed for the first time after it is determined that the vehicle 100 is approaching a branch point, is performed before the first guide image is projected onto the road surface in front of the vehicle 100, but this is only an example.
  • When detecting for the first time, after determining that the vehicle 100 is approaching a branch point, whether or not there is a shielding object that blocks the first guide image projection area, the projection control device 1a may first project the first guide image once, and then detect whether or not there is a blocking object that blocks the first guide image projection area based on a post-projection captured image in which the front of the vehicle 100, onto which the first guide image is projected, is imaged.
  • In this case, for example, the projection control device 1a generates a first guide image in step ST13 of the flowchart in FIG. 7, and causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100. In step ST13, the guide image generation unit 13a also calculates the first guide image projection area.
  • Similarly, when the projection control device 1a performs the operation shown in the flowchart of FIG. 9, the projection control device 1a generates a first guide image in step ST13 of FIG. 9 and causes the generated first guide image to be projected onto the road surface in front of the vehicle 100. Further, when the projection control device 1a performs the operation shown in the flowchart of FIG. 11, the projection control device 1a generates a first guide image in step ST103 of FIG. 11 and causes the generated first guide image to be projected onto the road surface in front of the vehicle 100.
  • In this case as well, when the projection control device 1a cannot present to the occupant of the vehicle 100, as intended, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction, because of the presence of a shielding object on the road surface in front of the vehicle 100, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. Thereby, the projection control device 1a can make the occupant of the vehicle 100 aware that the vehicle is about to reach a branch point. In addition, when there is no obstruction on the road surface in front of the vehicle 100 and a first projected image that allows the occupant of the vehicle 100 to recognize the presence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. As a result, the projection device 3 can suppress the power consumption that would otherwise be spent on unnecessary projection of the second guide image.
  • the projection control device 1a can accurately detect that the shielding object is blocking the first guide image projection area each time, for example, even if the blocking object is a moving object such as another vehicle.
  • the first guide image is an arrow image as an example, but this is only an example.
  • The first guide image may be any image that indicates the presence of the branch point and the branch direction by starting near the branch point and extending in the branch direction.
  • That is, the projection control device 1a only needs to generate a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and to cause the projection device 3 to project the first guide image in front of the vehicle 100 so that, as seen from the occupant of the vehicle 100, the first projected image becomes shorter in the branch direction as the vehicle 100 approaches the branch point.
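  • One simple way to realise this shortening is to interpolate the second end toward the first end as the remaining distance to the branch point decreases, as in the sketch below; the linear law is an assumption, since the embodiments leave the exact behaviour to the end point determination conditions.
```python
def second_end_position(first_end, initial_second_end, dist_to_branch, initial_dist):
    """Linearly move the second end toward the first end as the vehicle closes in
    on the branch point (assumed law; any monotonic shrink would satisfy the text).
    Points are (x, y) tuples in the target area coordinate system."""
    if initial_dist <= 0:
        return first_end
    t = max(0.0, min(1.0, dist_to_branch / initial_dist))  # 1 when far away, 0 at the branch
    return (first_end[0] + t * (initial_second_end[0] - first_end[0]),
            first_end[1] + t * (initial_second_end[1] - first_end[1]))
```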
  • the second guide image is an image showing a rectangle as an example, but this is only an example.
  • the second guide image may be an image that shows the direction of the branch point as seen from the vehicle 100.
  • the second guide image may be an arrow image or an image indicating a message such as "There is a fork ahead.”
  • In Embodiment 2 above, the projection control device 1a generates the first guide image and the second guide image so that the first road surface point of the first projection image and the third road surface point of the second projection image are the same point, in other words, so that the first end of the first guide image and the third end of the second guide image correspond to the same point, but this is also only an example. It is not essential that the first guide image and the second guide image are connected.
  • In the above embodiments, the position of the first road surface point is the intersection point between a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (here, the lane is a so-called lane) and a straight line that passes through the widthwise center of the lane after the branch (the lane referred to here is a so-called lane) and is parallel to the lane after the branch, but this is only an example.
  • the projection control device 1a may set the first road surface point to a point within a predetermined range from the branch point. Note that the projection control device 1a can calculate the coordinates of the position of a point within a predetermined range from the branch point from the guide route information.
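  • For reference, the intersection described above can be computed as in the sketch below, assuming each straight line is given by a point and a direction vector in a planar road-surface coordinate system; this is a generic geometric construction, not text taken from the embodiments.
```python
def line_intersection(p0, d0, p1, d1):
    """Intersect the line through p0 with direction d0 (through the vehicle position,
    parallel to the current lane) and the line through p1 with direction d1 (through
    the widthwise centre of the lane after the branch, parallel to that lane).
    Points and directions are (x, y) tuples; returns None for parallel lines."""
    cross = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(cross) < 1e-9:
        return None
    # Solve p0 + t*d0 = p1 + s*d1 for t.
    t = ((p1[0] - p0[0]) * d1[1] - (p1[1] - p0[1]) * d1[0]) / cross
    return (p0[0] + t * d0[0], p0[1] + t * d0[1])
```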
  • In the above embodiments, the first projection image is an image on the road surface of the lane after the branch ahead of the vehicle 100 (here, the lane is a so-called lane; see FIGS. 3 and 4), but this is only an example.
  • the projection control device 1a may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch (here, the lane is a so-called lane). It is sufficient that the first road surface point is located near the branch point.
  • In the above description, the projection control device 1a is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and a control section (not shown) are included in the in-vehicle device. However, the present invention is not limited to this; some of the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section (not shown) may be provided in the in-vehicle device, and the rest may be provided in a server connected to the in-vehicle device via a network. Alternatively, all of the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section may be provided in the server.
  • the hardware configuration of the projection control device 1a according to the second embodiment is the same as the hardware configuration of the projection control device 1 described using FIGS. 5A and 5B in the first embodiment, and therefore illustration thereof is omitted.
  • The functions of the guide route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the obstruction detection unit 16, and the control unit (not shown) are realized by a processing circuit 1001. That is, the projection control device 1a includes the processing circuit 1001 for performing control that causes the projection device 3 to project the first guide image or the second guide image, generated based on the guide route information acquired from the navigation device 2, onto the road surface in front of the vehicle 100.
  • The processing circuit 1001 reads out and executes a program stored in a memory 1005, thereby realizing the functions of the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section (not shown). That is, the projection control device 1a includes the memory 1005 for storing a program that, when executed by the processing circuit 1001, results in steps ST11 to ST19 in FIG. 7 described above, steps ST11 to ST19 in FIG. 9 described above, or steps ST101 to ST111 in FIG. 11 described above being executed.
  • It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section (not shown).
  • the projection control device 1a includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2, the projection device 3, or the sensor 4.
  • a storage unit (not shown) includes a memory 1005 and the like.
  • As described above, according to Embodiment 2, the projection control device 1a is configured to include, in addition to the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, and the projection control section 14a, the sensor information acquisition unit 15 that acquires sensor information regarding objects detected in front of the vehicle 100, and the shielding object detection unit 16 that detects, based on the sensor information acquired by the sensor information acquisition unit 15, whether or not there is a shielding object that blocks the area on the road surface onto which the first guide image is projected.
  • The guide image generation unit 13a generates the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface onto which the second end, opposite to the first end of the first guide image, is projected, the second road surface point being located farther from the branch point in the branch direction than the first road surface point, and generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point.
  • When the obstruction detection unit 16 detects that an obstruction exists, the guide image generation unit 13a generates, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, and the projection control unit 14a causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100.
  • Therefore, even when the projection control device 1a cannot present to the occupant of the vehicle 100, as intended, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction, because of the presence of a shielding object on the road surface in front of the vehicle 100, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image.
  • the projection control device 1a can make the occupants of the vehicle 100 aware that they are about to reach a fork in the road.
  • In addition, when there is no obstruction on the road surface in front of the vehicle 100 and a first projected image that allows the occupant of the vehicle 100 to recognize the presence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. As a result, the projection device 3 can suppress the power consumption that would otherwise be spent on unnecessary projection of the second guide image.
  • As described above, in route guidance by projecting images onto the road surface, the projection control allows the vehicle occupant to grasp the distance to a branch point and the branch direction without removing the branch point from his or her field of vision.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)

Abstract

A projection control device equipped with a junction determination unit (12) for determining whether or not a vehicle (100) is approaching a junction, a guidance image generation unit (13, 13a) for generating a first guidance image on the basis of guidance route information when it is determined that the vehicle (100) is approaching a junction, and a projection control unit (14, 14a) for causing a projection device 3 to project the first guidance image on a road surface to the front of the vehicle (100), wherein: the guidance image generation unit (13, 13a) generates a first guidance image on the basis of the position of the vehicle (100), the distance from the vehicle (100) to the junction, the position on the road surface of a first road surface point on which a first end section of the first guidance image is projected, and the position on the road surface of a second road surface point, which is farther from the junction in the direction of the junction than is the first road surface point and on which a second end section of the first guidance image which is opposite the first end section thereof is projected; and said guidance image generation unit (13, 13a) generates first guidance images in which the second end section is nearer to the first end section as the vehicle approaches the junction.

Description

投影制御装置および投影制御方法Projection control device and projection control method
 本開示は、車両用の投影制御装置および投影制御方法に関する。 The present disclosure relates to a projection control device and a projection control method for vehicles.
 従来、車両が案内経路上の分岐点に近づくと、路面上に案内経路に関する情報を投影することで、当該車両の経路案内を行う技術が知られている。
 例えば、特許文献1には、自車両が経路上の分岐点に乗員が視認可能な程度まで近づいたとき、経路情報に基づいて、自車両を分岐方向へと誘導する経路案内画像である矢印画像を、自車両の前方の路面上に投影する車両用投影装置が開示されている。特許文献1開示されている車両用投影装置は、分岐点に近づくに従い、矢印画像が有する、分岐点において分岐方向に沿って延びる分岐指示部、を長くするように制御する。
2. Description of the Related Art Conventionally, a technique is known in which when a vehicle approaches a branch point on a guide route, information regarding the guide route is projected onto the road surface to guide the vehicle along the route.
For example, Patent Document 1 describes an arrow image that is a route guidance image that guides the own vehicle in the branching direction based on route information when the own vehicle approaches a branch point on the route to the extent that the occupant can see it. A vehicle projection device has been disclosed that projects the image on the road surface in front of the own vehicle. The vehicular projection device disclosed in Patent Document 1 controls a branch indicating portion, which is included in an arrow image and extends along a branch direction at a branch point, to lengthen as the arrow image approaches the branch point.
特開2012-247369号公報Japanese Patent Application Publication No. 2012-247369
 特許文献1に開示されているような従来技術では、車両の乗員は、分岐点までの距離および分岐方向を把握するためには、分岐点に近づくに従って分岐方向に沿って延びる矢印画像の分岐指示部の先端、言い換えれば、路面に投影されている矢印の先端、に視線を向けなければならない。すなわち、車両の乗員は、分岐点に近づくに従って、当該分岐点から遠ざかる方向へ視線を向けなければならない。
 従来技術では、車両の乗員は、分岐点までの距離および分岐方向を把握しようとすると、分岐点が視界からはずれる方向へ視線を移動させなければならず、分岐点が把握できなくなるおそれがあるという課題があった。
In the conventional technology as disclosed in Patent Document 1, in order to grasp the distance to the branch point and the branch direction, the vehicle occupant needs to follow the branch instructions of an arrow image extending along the branch direction as he approaches the branch point. In other words, you must direct your gaze to the tip of the arrow projected on the road surface. That is, as the occupant of the vehicle approaches a branch point, he or she must direct his/her line of sight in a direction that moves away from the branch point.
With conventional technology, when vehicle occupants try to grasp the distance to a branch point and the direction of the branch, they have to move their line of sight in a direction where the branch point is out of their field of vision, and there is a risk that they may not be able to grasp the branch point. There was an issue.
 本開示は上記のような課題を解決するためになされたもので、路面への画像投影による経路案内において、車両の乗員に対して、視界から分岐点をはずすことなく、分岐点までの距離および分岐方向を把握させることができる投影制御装置を提供することを目的とする。 The present disclosure has been made to solve the above-mentioned problems, and in route guidance using image projection on the road surface, vehicle occupants can be informed of the distance to a junction and the distance to the junction without removing the junction from their field of vision. It is an object of the present invention to provide a projection control device that can grasp a branching direction.
 本開示に係る投影制御装置は、車両の案内経路に関する案内経路情報を取得する案内経路情報取得部と、案内経路情報取得部が取得した案内経路情報に基づき、車両は案内経路上の分岐点に近づいているか否かを判定する分岐判定部と、分岐判定部が、車両は分岐点に近づいていると判定した場合、案内経路情報に基づき、分岐点の存在および分岐方向を示す第1案内画像を生成する案内画像生成部と、投影装置に対して、車両の前方の路面に、案内画像生成部が生成した第1案内画像を投影させる投影制御部とを備え、案内画像生成部は、車両の位置と、車両から分岐点までの距離と、第1案内画像の第1端部が投影される路面上の第1路面点の位置と、第1案内画像の第1端部とは反対側の第2端部が投影される、分岐点に対し分岐方向に向かって第1路面点よりも遠くに位置する路面上の第2路面点の位置とに基づき第1案内画像を生成し、車両が分岐点に近づくに従って、第2端部を第1端部に近づけた第1案内画像を生成することを特徴とする。 The projection control device according to the present disclosure includes a guide route information acquisition unit that acquires guide route information regarding the guide route of the vehicle, and a guide route information acquisition unit that allows the vehicle to reach a branch point on the guide route based on the guide route information acquired by the guide route information acquisition unit. A branch determining unit that determines whether the vehicle is approaching a branch point; and when the branch determining unit determines that the vehicle is approaching a branch point, a first guide image that indicates the presence of the branch point and the branch direction based on the guidance route information; a guide image generating section that generates a first guide image generated by the guide image generating section, and a projection control section that causes a projection device to project a first guide image generated by the guide image generating section onto a road surface in front of the vehicle, the guide image generating section , the distance from the vehicle to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the side opposite to the first end of the first guide image. A first guide image is generated based on the position of a second road surface point on the road surface, which is located further away from the first road surface point in the direction of the branch with respect to the branch point, on which the second end of the vehicle is projected. The first guide image is characterized in that as the second end approaches the first end, the first guide image is generated as the second end approaches the first end.
 本開示によれば、路面への画像投影による経路案内において、車両の乗員に対して、視界から分岐点をはずすことなく、分岐点までの距離および分岐方向を把握させることができる。 According to the present disclosure, in route guidance by projecting an image onto the road surface, it is possible to make the vehicle occupant grasp the distance to the branch point and the branch direction without removing the branch point from the driver's field of vision.
FIG. 1 is a diagram showing a configuration example of the projection control device according to Embodiment 1.
FIG. 2 is a flowchart for explaining an example of the operation of the projection control device according to Embodiment 1.
FIG. 3 is a diagram for explaining an example of how the projection control unit of the projection control device causes the projection device to project the first guide image onto the road surface in front of the vehicle in Embodiment 1.
FIG. 4 is a diagram for explaining another example of how the projection control unit of the projection control device causes the projection device to project the first guide image onto the road surface in front of the vehicle in Embodiment 1.
FIGS. 5A and 5B are diagrams showing an example of the hardware configuration of the projection control device according to Embodiment 1.
FIG. 6 is a diagram showing a configuration example of the projection control device according to Embodiment 2.
FIG. 7 is a flowchart for explaining an example of the operation of the projection control device according to Embodiment 2.
FIG. 8 is a diagram for explaining an example of how the projection control unit of the projection control device causes the projection device to project the first guide image and the second guide image onto the road surface in front of the vehicle in Embodiment 2.
FIG. 9 is a flowchart for explaining another example of the operation of the projection control device according to Embodiment 2.
FIG. 10 is a diagram for explaining an example of how the projection control unit of the projection control device causes the projection device to stop projecting the first guide image onto the road surface in front of the vehicle and to project the second guide image onto the road surface in front of the vehicle in Embodiment 2.
FIG. 11 is a flowchart for explaining another example of the operation of the projection control device according to Embodiment 2.
Embodiments of the present disclosure will be described in detail below with reference to the drawings.
Embodiment 1.
FIG. 1 is a diagram showing a configuration example of a projection control device 1 according to the first embodiment.
In the first embodiment, it is assumed that the projection control device 1 is mounted on a vehicle 100.
The projection control device 1 is connected to a navigation device 2 and a projection device 3.
The navigation device 2 is a general navigation device that is mounted on the vehicle 100 and provides route guidance for the vehicle 100.
The projection device 3 is mounted on the vehicle 100 and projects an image for guiding the route of the vehicle 100 (hereinafter referred to as a "guidance image") onto the road surface in front of the vehicle 100 under the control of the projection control device 1. The projection device 3 is, for example, a lamp provided near the headlights of the vehicle 100 and dedicated to projecting an image onto the road surface. Note that this is just an example, and the projection device 3 may be a device that has other functions, such as a headlamp.
The projection control device 1 acquires information regarding the guidance route of the vehicle 100 (hereinafter referred to as "guidance route information") from the navigation device 2, and, when it determines based on the guidance route information that the vehicle 100 is approaching a branch point on the guidance route, causes the projection device 3 to project a guidance image indicating the existence of the branch point and the branch direction (hereinafter referred to as the "first guide image") onto the road surface in front of the vehicle 100. Details of the projection control device 1, the first guide image, and the method by which the projection control device 1 projects the first guide image will be described later.
Note that in Embodiment 1, a "branch point" refers to a point at which a road or the like divides, such as a crossroads, a three-way (Y-shaped) intersection, or a T-junction. In Embodiment 1 below, a branch point on the guidance route of the vehicle 100 is also simply referred to as a branch point. Furthermore, in Embodiment 1, the "branch direction" refers to the course direction of the vehicle 100 after the branch.
In Embodiment 1 below, as an example, the first guide image is an arrow image. When the first guide image is projected onto the road surface in front of the vehicle 100, it is presented to the occupant of the vehicle 100 as an arrow that starts near the branch point, extends in the branch direction, and indicates the branch direction.
In Embodiment 1 below, as an example, the projection control device 1 causes the projection device 3 to project the first guide image, in other words, the arrow image, onto the lane after the branch in front of the vehicle 100, as an image that extends in the left-right direction as viewed from the vehicle 100 and is parallel to the lane after the branch. Note that in Embodiment 1, "parallel" is not limited to being strictly parallel and also includes being substantially parallel.
As shown in FIG. 1, the projection control device 1 includes a guidance route information acquisition unit 11, a branch determination unit 12, a guide image generation unit 13, and a projection control unit 14.
The guide route information acquisition unit 11 acquires guide route information from the navigation device 2 .
The guide route information includes information regarding the route to the destination of the vehicle 100, information regarding the current position of the vehicle 100, and map information.
Information regarding the current position of the vehicle 100 is acquired by the navigation device 2 from a GPS (Global Positioning System, not shown) installed in the vehicle 100, for example. Note that this is just an example; for example, the guidance route information acquisition unit 11 may directly acquire information regarding the current position of the vehicle 100 from the GPS and include it in the guidance route information acquired from the navigation device 2.
The map information includes, for example, information regarding road positions, lane positions (a lane here means a traffic lane), road shapes, road widths, lane widths, branch point positions, and road types. In Embodiment 1, the "position of a branch point" is represented by the intersection of straight lines each passing through the center, in the vehicle width direction, of one of the intersecting lanes. Note that the coordinate system of the map information is a so-called "geographic coordinate system", which represents positions on the earth; a geographic coordinate system is generally expressed by latitude and longitude in two dimensions, with elevation added in three dimensions.
The guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
The branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11.
Specifically, for example, the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point by comparing the distance from the vehicle 100 to the branch point with a preset threshold (hereinafter referred to as the "approach determination threshold"). The branch determination unit 12 determines that the vehicle 100 is approaching the branch point when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold.
Specifically, the distance from vehicle 100 to the branch point is, for example, the distance from the current position of vehicle 100 to the position of the branch point. Since the guide route information includes the current position of the vehicle 100 and map information, the branch determining unit 12 can determine the current position of the vehicle 100 and the position of the branch point based on the guide route information.
The approach determination threshold is appropriately set by an administrator or the like. The approach determination threshold is preferably set to a distance within a range in which the projection device 3 can project the first guide image.
Note that in Embodiment 1, the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point based on whether the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold; however, this is only an example. For example, the branch determination unit 12 may determine whether the vehicle 100 is approaching the branch point by considering not only the distance from the vehicle 100 to the branch point but also the vehicle speed of the vehicle 100. For example, the branch determination unit 12 may determine that the vehicle 100 is approaching the branch point when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold and the vehicle speed of the vehicle 100 is greater than or equal to a preset threshold (hereinafter referred to as the "vehicle speed determination threshold"). Note that the vehicle speed may be acquired from the navigation device 2 as part of the guidance route information.
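As an illustrative aid only (not part of the original disclosure), the approach determination described above can be sketched as follows; the function name, parameter names, and the 100 m / 10 km/h threshold values are hypothetical placeholders for the administrator-set thresholds.

```python
from typing import Optional

def is_approaching_branch(distance_to_branch_m: float,
                          vehicle_speed_kmh: Optional[float] = None,
                          approach_threshold_m: float = 100.0,
                          speed_threshold_kmh: float = 10.0) -> bool:
    """Judge whether the vehicle is approaching the branch point."""
    if distance_to_branch_m <= 0:
        # A distance of "0" means the branch point has been reached,
        # which is handled separately (projection is ended).
        return False
    if distance_to_branch_m > approach_threshold_m:
        return False
    if vehicle_speed_kmh is not None and vehicle_speed_kmh < speed_threshold_kmh:
        # Optional variant that also checks the vehicle speed.
        return False
    return True
```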
When the branch determination unit 12 determines that the vehicle 100 is approaching the branch point, it outputs the determination result indicating that the vehicle 100 is approaching the branch point (hereinafter referred to as the "branch determination result") to the guide image generation unit 13.
Note that the branch determining unit 12 also determines whether the vehicle 100 has reached a branch point. For example, the branch determining unit 12 may determine that the vehicle 100 has reached the branch point when the distance from the vehicle 100 to the branch point becomes "0".
If the branch determination unit 12 determines that the vehicle 100 has reached the branch point, it outputs information indicating that the vehicle 100 has reached the branch point (hereinafter referred to as "branch point arrival information") to the projection control unit 14. The branch determination unit 12 may output the branch point arrival information to the projection control unit 14 via the guide image generation unit 13, or may output it directly to the projection control unit 14. Note that in FIG. 1, the arrow from the branch determination unit 12 to the projection control unit 14 is omitted.
If the branch determination unit 12 determines that the vehicle 100 is approaching the branch point, the guide image generation unit 13 generates the first guide image based on the guidance route information. Note that the guide image generation unit 13 may acquire the guidance route information acquired by the guidance route information acquisition unit 11 via the branch determination unit 12.
An example of a method for generating the first guide image by the guide image generation unit 13 will be described.
For example, the guide image generation unit 13 generates the first guide image based on the position of the vehicle 100 (more specifically, the current position), the position of the branch point, the position of the point on the road surface onto which one end of the first guide image (hereinafter referred to as the "first end") is projected (hereinafter referred to as the "first road surface point"), and the position of the point on the road surface onto which the end of the first guide image opposite to the first end (hereinafter referred to as the "second end") is projected, that point being located farther than the first road surface point from the branch point in the branch direction (hereinafter referred to as the "second road surface point"). That is, in Embodiment 1, the first end is the starting point of the arrow, and the second end is the end point of the arrow.
It is assumed that which point on the road surface in front of the vehicle 100 is used as the first road surface point is determined in advance. In Embodiment 1, the position of the first road surface point is the position of the intersection between a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to that lane.
The guide image generation unit 13 first calculates the coordinates of the position of the first road surface point. The guide image generation unit 13 can calculate the coordinates of the position of the first road surface point from the guide route information.
Next, the guide image generation unit 13 determines the position of the second road surface point based on the coordinates of the position of the first road surface point and the distance from the vehicle 100 to the branch point, and calculates the coordinates of the second road surface point. For example, the guide image generation unit 13 determines the position of the second road surface point in accordance with preset, internally held conditions (hereinafter referred to as the "end point determination conditions"). The end point determination conditions define how far from the first road surface point (hereinafter referred to as the "reference distance") the second road surface point is to be set when the branch determination unit 12 first determines that the vehicle 100 is approaching a certain branch point, in other words, when the distance from the vehicle 100 to the branch point becomes equal to the approach determination threshold. The end point determination conditions also define how much closer to the first road surface point the second road surface point is to be moved depending on how much closer the vehicle 100 has come to the branch point after the distance from the vehicle 100 to the branch point became equal to the approach determination threshold. Note that the guide image generation unit 13 can calculate the distance from the vehicle 100 to the branch point from the guidance route information.
An administrator or the like sets the end point determination conditions in advance and stores them in the guide image generation unit 13.
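As a minimal sketch of one possible end point determination condition (the linear shrinking rule, the 8 m reference distance, and all names are illustrative assumptions, not values given in the disclosure), the second road surface point could be placed as follows.

```python
import math

def second_road_point(first_point_xy, branch_dir_rad, dist_to_branch_m,
                      approach_threshold_m=100.0, reference_distance_m=8.0):
    """Place the second road surface point along the branch direction.

    When the distance to the branch point equals the approach determination
    threshold, the second point lies at the reference distance from the first
    point; as the remaining distance shrinks, the second point is moved
    proportionally closer to the first point.
    """
    ratio = max(0.0, min(1.0, dist_to_branch_m / approach_threshold_m))
    length_m = reference_distance_m * ratio
    x0, y0 = first_point_xy
    return (x0 + length_m * math.cos(branch_dir_rad),
            y0 + length_m * math.sin(branch_dir_rad))
```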
Then, the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the first road surface point into a trained model (hereinafter referred to as the "first machine learning model") to obtain the coordinates of the first end of the first guide image. Similarly, the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the second road surface point into the first machine learning model to obtain the coordinates of the second end of the first guide image.
Here, the coordinate system of the first guide image projected by the projection device 3 is referred to as the display means coordinate system, and the coordinate system of the area on the road surface in real space in front of the vehicle 100 onto which the projection device 3 can project the first guide image (hereinafter referred to as the "projectable area") is referred to as the target area coordinate system. Note that the projectable area is determined in advance according to the installation position of the projection device 3 and the like. The target area coordinate system is a so-called "geographic coordinate system", similar to the coordinate system of the map information.
The first machine learning model is a model for converting points in the target area coordinate system into points in the display means coordinate system, based on the correspondence between positions on the first guide image and positions on the road surface, taking into account the distance from the vehicle 100 to the branch point.
The first machine learning model is generated by an administrator or the like in advance, for example, before shipping the product of the projection control device 1, and is stored in a location that can be referenced by the projection control device 1.
For example, the administrator or the like takes the vehicle 100 on a test run and experimentally causes the projection device 3 to project the first guide image near a branch point. Note that the branch point does not need to be an actual branch point; for example, the administrator or the like may set a point on a road with no branches and treat it as a branch point. When the projection device 3 projects the first guide image in front of the vehicle 100, the administrator or the like acquires a plurality of pairs of a measurement point in the first guide image and the corresponding measurement point of the first guide image projected on the road surface. In the following description of Embodiment 1, the first guide image projected onto the road surface is also referred to as the "first projected image". The administrator or the like also acquires the position of the vehicle 100 at the time the first guide image is projected onto the road surface in front of the vehicle 100 and the distance from the vehicle 100 to the branch point. The administrator or the like performs this experiment multiple times while changing the distance from the vehicle 100 to the branch point.
The administrator or the like then trains a learning device to generate the first machine learning model, which takes as input the distance from the vehicle 100 to the branch point and the coordinates of a measurement point in the first projected image, and outputs the coordinates of the corresponding measurement point in the first guide image.
Note that the coordinates of the measurement point in the first projection image are expressed in the target area coordinate system, and the coordinates of the measurement point in the first guide image are expressed in the display means coordinate system.
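Purely as an illustration of how such a trained model might be queried at run time, the following sketch assumes a regressor with a scikit-learn style predict() method mapping [distance to branch point, point x, point y] to [display x, display y]; this interface is an assumption, not the disclosed implementation.

```python
import numpy as np

def to_display_coords(model, dist_to_branch_m, road_point_xy):
    """Convert a road-surface point given in the target area (geographic)
    coordinate system into the display means coordinate system."""
    features = np.array([[dist_to_branch_m, road_point_xy[0], road_point_xy[1]]])
    display_xy = model.predict(features)[0]
    return float(display_xy[0]), float(display_xy[1])

# Usage sketch: obtain both ends of the arrow in display coordinates.
# first_end  = to_display_coords(model, dist, first_road_point_xy)
# second_end = to_display_coords(model, dist, second_road_point_xy)
```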
Upon obtaining the coordinates of the first end and the second end of the first guide image, the guide image generation unit 13 generates the first guide image based on these coordinates, in accordance with rules stored in a storage unit (not shown) (hereinafter referred to as the "first guide image rules").
The storage unit stores the first guide image rules, set in advance by an administrator or the like, for generating the first guide image to be projected by the projection device 3. The guide image generation unit 13 generates the first guide image, in accordance with the first guide image rules stored in the storage unit, so that a first projected image showing an arrow that starts near the branch point and extends in the branch direction is projected onto the road surface in front of the vehicle 100.
The first guide image rules include, for example, a rule specifying that the arrow is drawn along the lane after the branch with a shaft 70 cm wide and a triangular head whose base is 1 m wide, filled in blue. In this case, the guide image generation unit 13 generates a first guide image that causes a first projected image to be projected showing a blue arrow with a 70 cm wide shaft and a 1 m wide triangular base, starting at the first road surface point, ending at the second road surface point, and extending in the branch direction on the road surface of the lane after the branch.
Note that the above is merely one example; it is sufficient that the first guide image rules specify, for example, the color, pattern, width, and the like of the first guide image such that, when the first guide image generated in accordance with the rules is projected onto the road surface in front of the vehicle 100, the resulting first projected image allows the occupant of the vehicle 100 to recognize the existence of the branch point and the branch direction.
Further, here, it is assumed that the rules for the first guide image are set in advance by the administrator or the like, but this is only an example. For example, the occupant of the vehicle 100 or the like may be able to set the rules for the first guide image as appropriate.
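As a small sketch of how such rules and the two end coordinates might be combined into an arrow description (the data structure and field names are illustrative assumptions; the actual rendering is not specified at this level of detail in the text):

```python
from dataclasses import dataclass

@dataclass
class FirstGuideImageRules:
    shaft_width_m: float = 0.7      # arrow shaft width (70 cm in the example rule)
    head_base_m: float = 1.0        # width of the base of the arrow head (1 m)
    color_rgb: tuple = (0, 0, 255)  # blue fill

def build_arrow(first_end, second_end, rules: FirstGuideImageRules) -> dict:
    """Describe the arrow from the first end (start, near the branch point)
    to the second end (tip, in the branch direction)."""
    return {
        "start": first_end,
        "tip": second_end,
        "shaft_width_m": rules.shaft_width_m,
        "head_base_m": rules.head_base_m,
        "color_rgb": rules.color_rgb,
    }
```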
After generating the first guide image, the guide image generation unit 13 outputs the generated first guide image to the projection control unit 14.
The projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 onto the road surface in front of the vehicle 100.
Note that the above is one example, in which, in the projection control device 1, the guide image generation unit 13 first calculates the coordinates of the first road surface point, then calculates the coordinates of the second road surface point, then uses the first machine learning model to obtain the coordinates of the first end and the second end of the first guide image in the display means coordinate system, and generates the first guide image in accordance with these coordinates and the first guide image rules. The projection control unit 14 then causes the projection device 3 to project the first guide image, expressed in coordinates of the display means coordinate system, generated by the guide image generation unit 13.
However, this is only an example, and the projection control device 1 may generate the first guide image and control the projection of the first guide image by the projection device 3 using other methods.
For example, in the projection control device 1, the guide image generation unit 13 may hold an initial image (here, an arrow image) in advance and generate the first guide image by enlarging or reducing the initial image based on the calculated distance between the first road surface point and the second road surface point. The projection control unit 14 may then cause the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 by changing the projection angle of the light emitted from the lamp of the projection device 3 so that the first end of the first guide image generated by the guide image generation unit 13 is projected onto the first road surface point calculated by the guide image generation unit 13 and the second end is projected onto the second road surface point calculated by the guide image generation unit 13.
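A minimal sketch of the scaling-based alternative, assuming the initial arrow is held as a raster image and Pillow is available (the library choice, the width-only scaling rule, and the 8 m reference distance are illustrative assumptions):

```python
from PIL import Image

def scale_initial_arrow(initial_arrow: Image.Image,
                        point_distance_m: float,
                        reference_distance_m: float = 8.0) -> Image.Image:
    """Enlarge or reduce a pre-stored arrow image in proportion to the distance
    between the first and second road surface points."""
    scale = point_distance_m / reference_distance_m
    new_width = max(1, int(initial_arrow.width * scale))
    return initial_arrow.resize((new_width, initial_arrow.height))
```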
The operation of the projection control device 1 according to the first embodiment will be explained.
FIG. 2 is a flowchart for explaining an example of the operation of the projection control device 1 according to the first embodiment.
For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts running, the projection control device 1 repeatedly performs the operation shown in the flowchart of FIG. 2 until the power of the vehicle 100 is turned off.
The guide route information acquisition unit 11 acquires guide route information from the navigation device 2 (step ST1).
The guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
The branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11 in step ST1 (step ST2).
If the branch determination unit 12 determines that the vehicle 100 is not approaching the branch point ("NO" in step ST2), the operation of the projection control device 1 returns to the processing of step ST1.
When the branch determination unit 12 determines that the vehicle 100 is approaching the branch point ("YES" in step ST2), it outputs the branch determination result indicating that the vehicle 100 is approaching the branch point to the guide image generation unit 13.
For example, assume that the branch determining unit 12 determines that the vehicle 100 is approaching a branch point for the first time after the power of the vehicle 100 is turned on and the vehicle 100 starts traveling. In this case, the branch determination unit 12 outputs a branch determination result indicating that the vehicle 100 is approaching a branch point to the guide image generation unit 13.
The guide image generation unit 13 generates a first guide image based on the guide route information acquired by the guide route information acquisition unit 11 in step ST1 (step ST3).
When the guide image generation unit 13 generates the first guide image, it outputs the generated first guide image to the projection control unit 14 .
The projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
As a result, the road surface in front of the vehicle 100 is in a state where the first projection image is projected, for example, as shown in FIG.
FIG. 3 is a diagram for explaining an example of how, in Embodiment 1, the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100. Note that FIG. 3 is an overhead view of the branch point and of the vehicle 100 approaching the branch point.
In FIG. 3, the branch point is indicated by "BP", and the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1". Further, the first road surface point is indicated by "E1", and the second road surface point is indicated by "E2". Since the direction of the second road surface point is the branching direction, it can be seen in FIG. 3 that the guide route for the vehicle 100 is a route that turns right at the branching point. In addition, in FIG. 3, it is assumed that the position of the first road surface point overlaps with the position of the branch point.
Here, it is assumed that the branch determination unit 12 has determined, for the first time since the power of the vehicle 100 was turned on and the vehicle 100 started traveling, that the vehicle 100 is approaching a certain branch point; therefore, in FIG. 3, the distance from the vehicle 100 to the branch point is equal to the approach determination threshold. A first projected image in which the distance between the first road surface point and the second road surface point is the reference distance is projected onto the road surface in front of the vehicle 100.
Returning to the explanation of the operation of the projection control device 1 according to the flowchart of FIG. 2.
In step ST4, when the projection control unit 14 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100, the control unit (not shown) of the projection control device 1 determines, for example, whether the power of the vehicle 100 has been turned off. If the power of the vehicle 100 has not been turned off, the operation of the projection control device 1 returns to the processing of step ST1, and the processing from step ST1 onward is performed again.
Here, it is assumed that the power of vehicle 100 is not turned off.
Here, it is assumed that the vehicle 100 has come even closer to the branch point from the position shown in FIG. 3. In this case, in step ST2, the branch determination unit 12 determines that the vehicle 100 is approaching the branch point ("YES" in step ST2).
The guide image generation unit 13 generates the first guide image based on the guidance route information acquired by the guidance route information acquisition unit 11 in step ST1 (step ST3), and the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
As a result, the road surface in front of the vehicle 100 is in a state where the first projection image is projected, for example, as shown in FIG. 4 .
FIG. 4 is a diagram for explaining another example of how, in Embodiment 1, the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100. Like FIG. 3, FIG. 4 is an overhead view of the branch point and of the vehicle 100 approaching the branch point.
FIG. 4 shows, as seen from above, the branch point and the vehicle 100 in a state where the vehicle 100 has come even closer to the branch point than in the state shown in FIG. 3.
In accordance with the end point determination conditions, the guide image generation unit 13 determines, as the position of the second road surface point, a position brought closer to the first road surface point by an amount corresponding to how much closer the vehicle 100 has come to the branch point than in the state shown in FIG. 3.
That is, the guide image generation unit 13 generates a first guide image in which the second end has been brought closer to the first end by an amount corresponding to how much closer the vehicle 100 has come to the branch point than in the state shown in FIG. 3.
The projection control unit 14 causes the projection device 3 to project, onto the road surface in front of the vehicle 100, the first guide image generated by the guide image generation unit 13, in which the second end has been brought closer to the first end.
As a result, the first projected image shown in FIG. 4 is shorter toward the first road surface point than the first projected image shown in FIG. 3. Specifically, compared with the arrow shown in FIG. 3, the arrow shown in FIG. 4, which is generated by the guide image generation unit 13 and projected onto the road surface in front of the vehicle 100 by the projection control unit 14, is an arrow whose end point has come closer to its starting point.
Note that in FIG. 4, for ease of understanding, the difference between the first projection image shown in FIG. 3 and the first projection image shown in FIG. 4 is shown by a dotted line.
For example, when the branch determination unit 12 outputs, in step ST2, a branch determination result indicating that the vehicle 100 is approaching the branch point, the guide image generation unit 13 refers to a branch approach flag stored in a location that the projection control device 1 can refer to. When the branch approach flag is at its initial value "0", the guide image generation unit 13 determines that it has been judged for the first time that the vehicle 100 is approaching a certain branch point. When the guide image generation unit 13 determines, in accordance with the end point determination conditions, a point separated from the first road surface point by the reference distance as the second road surface point, it sets the branch approach flag to "1". At this time, the guide image generation unit 13 stores the determined coordinates of the first road surface point and of the second road surface point in a storage unit provided in a location that the projection control device 1 can refer to.
When the processing of step ST2 is performed again in the projection control device 1, the guide image generation unit 13 refers to the branch approach flag; if the branch approach flag is set to "1", the guide image generation unit 13 determines that this is a state after it was first judged that the vehicle 100 is approaching a certain branch point, in other words, a state in which the first projected image is already being projected on the road surface in front of the vehicle 100. The guide image generation unit 13 then determines, as the new second road surface point, a point obtained by moving the position of the second road surface point stored in the storage unit closer to the first road surface point in accordance with the end point determination conditions, according to the distance from the vehicle 100 to the branch point, and updates the coordinates of the second road surface point stored in the storage unit.
As a result, the projection control device 1 generates the first guide image with the second end brought closer to the first end as the vehicle 100 approaches the branch point, and can therefore project, in front of the vehicle 100, a first projected image in which the second road surface point approaches the first road surface point as the vehicle 100 approaches the branch point, in other words, an arrow whose end point approaches its starting point.
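The flag-based bookkeeping described above can be pictured, purely as an illustrative sketch, with the following state holder; it reuses the hypothetical second_road_point() helper sketched earlier, and all names and defaults are assumptions.

```python
class FirstGuideImageUpdater:
    """Illustrative state holder for the branch approach flag logic."""

    def __init__(self, reference_distance_m=8.0, approach_threshold_m=100.0):
        self.reference_distance_m = reference_distance_m
        self.approach_threshold_m = approach_threshold_m
        self.branch_approach_flag = 0  # 0: initial value, 1: already projecting
        self.first_point = None
        self.second_point = None

    def on_branch_approach(self, first_point_xy, branch_dir_rad, dist_to_branch_m):
        if self.branch_approach_flag == 0:
            # First judgement for this branch point: place the second point at
            # the reference distance and raise the flag.
            self.first_point = first_point_xy
            self.second_point = second_road_point(
                first_point_xy, branch_dir_rad,
                self.approach_threshold_m,  # distance equals the threshold here
                self.approach_threshold_m, self.reference_distance_m)
            self.branch_approach_flag = 1
        else:
            # Already projecting: move the second point closer to the first
            # point according to the remaining distance to the branch point.
            self.second_point = second_road_point(
                self.first_point, branch_dir_rad, dist_to_branch_m,
                self.approach_threshold_m, self.reference_distance_m)
        return self.first_point, self.second_point

    def clear(self):
        # Called when the branch point is reached or the vehicle power is cycled.
        self.branch_approach_flag = 0
        self.first_point = None
        self.second_point = None
```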
Note that, in step ST2, the branch determining unit 12 also determines whether the vehicle 100 has reached the branch point.
If it is determined that the vehicle 100 has reached the branch point, the branch determination unit 12 outputs the branch point arrival information to the projection control unit 14. In this case, the projection control unit 14 ends the projection of the first guide image, and the operation of the projection control device 1 skips the processing of steps ST3 and ST4 and returns to the processing of step ST1.
At this time, the control section of the projection control device 1 clears the branch approach flag and the coordinates of the first road surface point and the second road surface point stored in the storage section.
Note that the control unit also clears the branch approach flag and the coordinates of the first road surface point and of the second road surface point stored in the storage unit when the power of the vehicle 100 is turned off or when the power of the vehicle 100 is turned on.
The projection control device 1 repeats the above operations until, for example, the power of the vehicle 100 is turned off.
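The overall flow of steps ST1 to ST4 might be pictured roughly as follows; this is only an illustrative sketch reusing the hypothetical helpers above, and `nav` and `projector` stand in for the navigation device 2 and the projection device 3 with made-up method names, not real device APIs.

```python
def projection_loop(nav, projector, updater):
    """Illustrative main loop corresponding to steps ST1 to ST4."""
    while nav.power_is_on():
        info = nav.get_guidance_route_info()                  # ST1
        dist = info.distance_to_branch_m
        if dist <= 0:
            # Branch point reached: end projection and clear the stored state.
            projector.stop_projection()
            updater.clear()
            continue
        if not is_approaching_branch(dist):                   # ST2
            continue
        first, second = updater.on_branch_approach(           # ST3
            info.first_road_point_xy, info.branch_direction_rad, dist)
        projector.project_arrow(first, second)                # ST4
```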
In this way, when the projection control device 1 determines, based on the guidance route information acquired from the navigation device 2, that the vehicle 100 is approaching a branch point on the guidance route, it generates the first guide image indicating the existence of the branch point and the branch direction, and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100. The projection control device 1 generates the first guide image based on the position of the vehicle 100, the position of the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface, located farther than the first road surface point from the branch point in the branch direction, onto which the second end of the first guide image, opposite to the first end, is projected; as the vehicle 100 approaches the branch point, it generates the first guide image with the second end brought closer to the first end.
Therefore, in route guidance by projecting an image onto the road surface, the projection control device 1 allows the occupant of the vehicle 100 to grasp the distance to the branch point and the branch direction without the branch point leaving the occupant's field of view.
From the perspective of the occupant of the vehicle 100, the first guide image projected on the road surface, that is, the first projected image, becomes shorter toward the branch point as the vehicle 100 approaches the branch point. Even when the vehicle 100 approaches the branch point, the left and right ends of the first projected image, in other words, the start point and the end point of the arrow projected on the road surface, do not leave the occupant's field of view. In other words, the occupant of the vehicle 100 does not need to move his or her line of sight in a direction in which the branch point would leave the field of view in order to grasp the distance to the branch point and the branch direction, and can grasp the distance to the branch point and the branch direction while keeping the branch point in view.
In addition, in the above-mentioned Embodiment 1, as an example, the first guide image is an arrow image, but this is only an example. The first guide image may be an image that starts near the branch point and extends in the branch direction to indicate the branch direction, and is an image that indicates the presence of the branch point and the branch direction.
It is sufficient that the projection control device 1 generates the first guide image with the second end brought closer to the first end as the vehicle 100 approaches the branch point, and causes the projection device 3 to project the first guide image such that the first projected image obtained by projecting it onto the road surface in front of the vehicle 100 becomes shorter toward the branch point, as viewed from the occupant of the vehicle 100, as the vehicle 100 approaches the branch point.
Further, in Embodiment 1 described above, the position of the first road surface point is the position of the intersection between a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to that lane; however, this is only an example. The projection control device 1 only needs to set the position of the first road surface point to the position of a point within a predetermined range from the branch point. Note that the projection control device 1 can calculate, from the guidance route information, the coordinates of the position of a point within a predetermined range from the branch point.
However, when the position of the first road surface point is the position of the intersection between a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch and is parallel to that lane, the occupant of the vehicle 100 can grasp the existence of the branch point and the distance to the branch point more easily than when any other point is used. This is because it is easier for the occupant of the vehicle 100 to see the starting point of the first projected image, that is, the vicinity of the branch point, when the starting point of the first projected image lies in the forward direction.
Further, in Embodiment 1 described above, as a result of the projection control device 1 causing the projection device 3 to project the first guide image, the first projected image is an image on the road surface of the lane after the branch in front of the vehicle 100 (see FIGS. 3 and 4); however, this is only an example.
For example, the projection control device 1 may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch. It is sufficient that the first road surface point is located near the branch point.
In Embodiment 1 described above, the projection control device 1 is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this; some of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100 while the others are provided in a server connected to the in-vehicle device via a network. Alternatively, all of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) may be provided in the server.
FIG. 5A and FIG. 5B are diagrams showing an example of the hardware configuration of the projection control device 1 according to Embodiment 1.
In Embodiment 1, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) are realized by a processing circuit 1001. That is, the projection control device 1 includes the processing circuit 1001 for performing control to cause the projection device 3 to project, onto the road surface in front of the vehicle 100, the first guide image generated based on the guidance route information acquired from the navigation device 2.
Processing circuit 1001 may be dedicated hardware as shown in FIG. 5A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 5B.
When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
When the processing circuit is the processor 1004, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) are realized by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 1005. The processor 1004 reads and executes the program stored in the memory 1005, thereby performing the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown). That is, the projection control device 1 includes the memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST4 of FIG. 2 described above. The program stored in the memory 1005 can also be said to cause a computer to execute the processing procedures or methods of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown). Here, the memory 1005 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
Note that some of the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) may be realized by dedicated hardware, and some may be realized by software or firmware. For example, the function of the guidance route information acquisition unit 11 can be realized by the processing circuit 1001 as dedicated hardware, and the functions of the branch determination unit 12, the guide image generation unit 13, the projection control unit 14, and the control unit (not shown) can be realized by the processor 1004 reading and executing the program stored in the memory 1005.
The projection control device 1 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2 or the projection device 3.
A storage unit (not shown) includes a memory 1005 and the like.
As described above, the projection control device 1 according to Embodiment 1 includes the guidance route information acquisition unit 11 that acquires guidance route information regarding the guidance route of the vehicle 100; the branch determination unit 12 that determines, based on the guidance route information acquired by the guidance route information acquisition unit 11, whether the vehicle 100 is approaching a branch point on the guidance route; the guide image generation unit 13 that, when the branch determination unit 12 determines that the vehicle 100 is approaching the branch point, generates the first guide image indicating the existence of the branch point and the branch direction based on the guidance route information; and the projection control unit 14 that causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 onto the road surface in front of the vehicle 100. The guide image generation unit 13 generates the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface, located farther than the first road surface point from the branch point in the branch direction, onto which the second end of the first guide image, opposite to the first end, is projected; and, as the vehicle 100 approaches the branch point, the guide image generation unit 13 generates the first guide image with the second end brought closer to the first end. Therefore, in route guidance by image projection onto the road surface, the projection control device 1 can allow the occupant of the vehicle 100 to grasp the distance to the branch point and the branch direction without the branch point leaving the occupant's field of view.
Embodiment 2.
In the first embodiment, the projection control device causes the projection device to project a first guide image indicating the presence of a branch point and the branch direction on the road surface in front of the vehicle.
In the second embodiment, the projection control device includes a shielding object that blocks a region on which the projection device projects the first guide image (hereinafter referred to as “first guide image projection region”) out of the projectable region of the projection device. It is determined whether or not there is an obstruction, and if it is determined that there is an obstruction, the projection device is caused to project a guidance image (hereinafter referred to as "second guidance image") indicating the direction of the branch point as seen from the vehicle. An embodiment will be described.
FIG. 6 is a diagram showing a configuration example of a projection control device 1a according to Embodiment 2.
In Embodiment 2, it is assumed that the projection control device 1a is mounted on the vehicle 100.
The projection control device 1a according to Embodiment 2 is connected to a sensor 4 in addition to the navigation device 2 and the projection device 3.
The sensor 4 is mounted on the vehicle 100 and detects objects present in front of the vehicle 100. The sensor 4 outputs information regarding objects detected in front of the vehicle 100 (hereinafter referred to as "sensor information") to the projection control device 1a. In Embodiment 2, the sensor 4 is assumed to be an imaging device that captures images of the area in front of the vehicle 100. When the sensor 4 is an imaging device, the sensor information is a captured image of the area in front of the vehicle 100. In the following description of Embodiment 2, the sensor 4 is described as an imaging device and the sensor information as a captured image. Note that this is merely an example, and the sensor 4 is not limited to an imaging device. The sensor 4 may be, for example, a lidar, a ranging sensor, or any other device capable of detecting the distance to an object present in front of the vehicle 100.
Regarding the configuration example of the projection control device 1a shown in FIG. 6, components identical to those of the projection control device 1 described in Embodiment 1 with reference to FIG. 1 are denoted by the same reference signs, and redundant description thereof is omitted.
The projection control device 1a according to Embodiment 2 differs from the projection control device 1 according to Embodiment 1 in that it includes a sensor information acquisition unit 15 and an obstruction detection unit 16.
In addition, the specific operations of the guide image generation unit 13a and the projection control unit 14a in the projection control device 1a according to Embodiment 2 differ from the specific operations of the guide image generation unit 13 and the projection control unit 14 in the projection control device 1 according to Embodiment 1, respectively.
When information regarding the first guide image projection area is output from the guide image generation unit 13a, the sensor information acquisition unit 15 acquires, from the imaging device, a captured image taken by the imaging device. Details of the guide image generation unit 13a will be described later.
The sensor information acquisition unit 15 outputs the acquired captured image to the obstruction detection unit 16 together with the information regarding the first guide image projection area output from the guide image generation unit 13a.
The obstruction detection unit 16 detects, based on the captured image acquired by the sensor information acquisition unit 15, whether there is an obstruction that blocks the first guide image projection area.
Since the installation position and detection range of the sensor 4, in this case the installation position and angle of view of the imaging device, are known in advance, the obstruction detection unit 16 can determine the relative positional relationship between the coordinate system of the captured image and the coordinate system of the first guide image projection area. Note that the coordinate system of the first guide image projection area is the target area coordinate system, that is, a so-called geographic coordinate system.
The obstruction detection unit 16 detects whether there is an obstruction that blocks the first guide image projection area by, for example, performing known image recognition processing or pattern matching on the captured image.
The obstruction detection unit 16 outputs the detection result indicating whether there is an obstruction that blocks the first guide image projection area (hereinafter referred to as the "obstruction detection result") to the guide image generation unit 13a.
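As an illustration only, and not as part of the embodiment, the overlap test underlying such an obstruction detection result can be sketched as follows. The sketch assumes the detected objects have already been localized as footprints in the target area coordinate system; the image recognition or pattern matching step itself is omitted, and all names and values are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Rect:
    """Axis-aligned rectangle in the target area (road surface) coordinate system, in metres."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def overlaps(self, other: "Rect") -> bool:
        # Two axis-aligned rectangles overlap unless one lies entirely to the
        # left/right of the other or entirely in front of/behind the other.
        return not (self.x_max < other.x_min or other.x_max < self.x_min or
                    self.y_max < other.y_min or other.y_max < self.y_min)

def obstruction_exists(projection_area: Rect, detected_objects: List[Rect]) -> bool:
    """Return True when any detected object footprint intersects the
    first guide image projection area (the obstruction detection result)."""
    return any(projection_area.overlaps(obj) for obj in detected_objects)

# Hypothetical values: a projection area ahead of the vehicle and one other vehicle.
area = Rect(-1.5, 20.0, 1.5, 45.0)
other_vehicle = Rect(-1.0, 25.0, 1.0, 30.0)
print(obstruction_exists(area, [other_vehicle]))  # True -> an obstruction exists
```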
When the branch determination unit 12 determines that the vehicle 100 is approaching a branch point, the guide image generation unit 13a calculates the coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point.
In Embodiment 2, as in Embodiment 1, the position of the first road surface point is, as an example, the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (a lane here meaning a so-called traffic lane) and a straight line that passes through the widthwise center of the lane after the branch (likewise a so-called traffic lane) and is parallel to that lane.
Based on the coordinates of the position of the first road surface point, and in accordance with the end point determination conditions, the guide image generation unit 13a calculates the coordinates of the position of the second road surface point at the time when the branch determination unit 12 first determines that the vehicle 100 is approaching a certain branch point. The guide image generation unit 13a may calculate the coordinates of the position of the second road surface point by the same method as that used by the guide image generation unit 13, which has already been described in Embodiment 1.
The guide image generation unit 13a calculates the first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point. Note that, for example, the shape and size of the first projection image at the time when the branch determination unit 12 first determines that the vehicle 100 is approaching a certain branch point are assumed to be determined in advance.
The guide image generation unit 13a outputs information regarding the first guide image projection area to the sensor information acquisition unit 15. The information regarding the first guide image projection area may be, for example, information indicating the coordinates of the entire outline of the first projection image, or the coordinates of the four corners of the minimum rectangle enclosing the first projection image.
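As an illustration only, a minimal sketch of deriving the four corners of the minimum rectangle enclosing the first projection image from the first and second road surface points is given below. The margin value is a hypothetical stand-in for the predetermined shape and size of the first projection image.

```python
def projection_area_from_points(p1, p2, margin_m=0.35):
    """Return the four corner coordinates of the minimum rectangle enclosing the
    first guide image projection area, given the first road surface point
    p1 = (x, y) and the second road surface point p2 = (x, y) in the target
    area coordinate system.  margin_m is a hypothetical half-width of the image."""
    x_min = min(p1[0], p2[0]) - margin_m
    x_max = max(p1[0], p2[0]) + margin_m
    y_min = min(p1[1], p2[1]) - margin_m
    y_max = max(p1[1], p2[1]) + margin_m
    # Four corners of the minimum rectangle surrounding the first projection image.
    return [(x_min, y_min), (x_max, y_min), (x_max, y_max), (x_min, y_max)]

# Example with hypothetical road surface points (metres).
print(projection_area_from_points((0.0, 40.0), (-5.0, 40.0)))
```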
Further, the guide image generation unit 13a generates the first guide image, or the first guide image and the second guide image, based on the obstruction detection result output from the obstruction detection unit 16.
Specifically, when the obstruction detection unit 16 outputs an obstruction detection result indicating that there is an obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image and the second guide image.
When the obstruction detection unit 16 outputs an obstruction detection result indicating that there is no obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image. In this case, the guide image generation unit 13a does not generate the second guide image.
As for the method by which the guide image generation unit 13a generates the first guide image, the guide image generation unit 13a may generate the first guide image by the same method as that by which the guide image generation unit 13 generates the first guide image, which has already been described in Embodiment 1, and redundant description is therefore omitted.
A method by which the guide image generation unit 13a generates the second guide image will now be described.
In Embodiment 2, as an example, the second guide image is an image representing a rectangle. When the projection device 3 projects the second guide image, a second guide image having a preset width, here an image representing a rectangle, is projected onto the road surface in front of the vehicle 100 so as to connect a point located a predetermined distance ahead of the vehicle 100 (hereinafter referred to as the "vehicle front point") and the first road surface point. In the following description of Embodiment 2, the second guide image projected on the road surface is also referred to as the "second projection image". The predetermined distance ahead of the vehicle 100 is set appropriately by an administrator or the like. The administrator or the like sets, as the predetermined distance ahead of the vehicle 100, a distance at least smaller than the approach determination threshold, and stores it in a location that the projection control device 1a can refer to.
In Embodiment 2, the end of the second guide image corresponding to the first road surface point is referred to as the third end, and the end of the second guide image corresponding to the vehicle front point is referred to as the fourth end.
First, the guide image generation unit 13a calculates the coordinates of the vehicle front point from the position of the vehicle 100.
Then, the guide image generation unit 13a inputs the distance from the vehicle 100 to the branch point and the coordinates of the first road surface point into the first machine learning model, and obtains the coordinates of the third end of the second guide image. Note that, here, the coordinates of the third end of the second guide image are the same as the coordinates of the first end of the first guide image.
Further, the guide image generation unit 13a converts the coordinates of the vehicle front point into coordinates in the display means coordinate system based on information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system, and calculates the coordinates of the fourth end of the second guide image.
The information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system is, for example, a coordinate conversion parameter with which coordinates in the target area coordinate system can be converted into coordinates in the display means coordinate system. For example, an administrator or the like causes the projection device 3 to project the second guide image in advance, for example before shipment of the projection control device 1a, acquires a plurality of measurement points of the second projection image on the projection surface (for example, the road surface) onto which the second guide image is projected and a plurality of corresponding measurement points of the second guide image, and derives the positional correspondence between the second projection image and the second guide image. Specifically, the administrator or the like calculates, based on the correspondence between positions on the second projection image and positions on the second guide image, a coordinate conversion parameter for converting coordinates in the target area coordinate system into coordinates in the display means coordinate system. The administrator or the like then stores the calculated coordinate conversion parameter in a location that the projection control device 1a can refer to.
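As an illustration only, one common way such a coordinate conversion parameter could be realized is a planar homography applied to road surface coordinates. The matrix values below are placeholders, not calibration results of the embodiment, and the function name is hypothetical.

```python
import numpy as np

# Hypothetical 3x3 homography mapping target area (road surface) coordinates to
# display means (projector) coordinates; in practice it would be estimated in
# advance from measured correspondences between the second projection image and
# the second guide image and stored where the projection control device can read it.
H = np.array([[1.02, 0.00, 5.0],
              [0.01, 0.98, 3.0],
              [0.00, 0.00, 1.0]])

def to_display_coords(point_xy, homography=H):
    """Convert a point from the target area coordinate system to the
    display means coordinate system using a planar homography."""
    x, y = point_xy
    u, v, w = homography @ np.array([x, y, 1.0])
    return (u / w, v / w)

# For example, converting the vehicle front point (the fourth end of the second guide image).
print(to_display_coords((0.0, 5.0)))
```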
Upon obtaining the coordinates of the third end of the second guide image and the coordinates of the fourth end of the second guide image, the guide image generation unit 13a generates the second guide image based on these coordinates in accordance with a rule stored in the storage unit (not shown) (hereinafter referred to as the "second guide image rule").
In Embodiment 2, the storage unit stores, in addition to the first guide image rule, a second guide image rule, set in advance by an administrator or the like, for creating the second guide image to be projected by the projection device 3. The guide image generation unit 13a generates the second guide image indicating the direction of the branch point as seen from the vehicle 100 in accordance with the second guide image rule stored in the storage unit.
The second guide image rule stores, for example, a rule such as: a shape 70 cm wide, filled in blue, extending in the direction along the lane in which the vehicle 100 is traveling. In this case, the guide image generation unit 13a generates a second guide image for projecting a rectangle 70 cm wide extending from the fourth end toward the third end, that is, a rectangle extending from the vehicle front point toward the branch point on the lane in which the vehicle 100 is traveling (a lane here meaning a so-called traffic lane).
The above is merely an example; the second guide image rule only needs to specify, for example, the color, pattern, width, and the like of the second guide image such that, when the second guide image generated by the guide image generation unit 13a in accordance with the rule is projected onto the road surface in front of the vehicle 100, the resulting second projection image allows the occupant of the vehicle 100 to recognize the direction of the branch point as seen from the vehicle 100.
Further, although the second guide image rule is assumed here to be set in advance by an administrator or the like, this is merely an example. For example, an occupant of the vehicle 100 or the like may be allowed to set the second guide image rule as appropriate.
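As an illustration only, a sketch of constructing the rectangle described by such a second guide image rule (a strip of the rule-defined width running from the fourth end toward the third end) is given below; the function name and inputs are hypothetical, and applying the rule's color or pattern would happen when the shape is rasterized for the projector.

```python
import numpy as np

def second_guide_rectangle(front_pt, branch_pt, width_m=0.7):
    """Return the four corners, in the target area coordinate system, of the
    second guide image: a rectangle width_m wide running from the vehicle
    front point (fourth end) toward the first road surface point (third end)."""
    p0 = np.asarray(front_pt, dtype=float)
    p1 = np.asarray(branch_pt, dtype=float)
    direction = (p1 - p0) / np.linalg.norm(p1 - p0)   # unit vector along the lane
    normal = np.array([-direction[1], direction[0]])  # unit vector across the lane
    half = (width_m / 2.0) * normal
    return [tuple(p0 - half), tuple(p0 + half), tuple(p1 + half), tuple(p1 - half)]

# Example with hypothetical points: 5 m ahead of the vehicle up to the first road surface point.
print(second_guide_rectangle((0.0, 5.0), (0.0, 40.0)))
```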
After generating the first guide image, or the first guide image and the second guide image, the guide image generation unit 13a outputs the generated first guide image, or the generated first guide image and second guide image, to the projection control unit 14a.
The guide image generation unit 13a also outputs the obstruction detection result output from the obstruction detection unit 16 to the projection control unit 14a.
The projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100.
Further, when the obstruction detection unit 16 detects that there is an obstruction blocking the first guide image projection area, the projection control unit 14a causes the projection device 3 to project the second guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100.
The operation of the projection control device 1a according to Embodiment 2 will now be described.
FIG. 7 is a flowchart for explaining an example of the operation of the projection control device 1a according to Embodiment 2.
For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts traveling, the projection control device 1a repeatedly performs the operation shown in the flowchart of FIG. 7 until the power of the vehicle 100 is turned off.
In FIG. 7, the specific contents of the processing in steps ST11 and ST12 are the same as the specific contents of the processing in steps ST1 and ST2 of the flowchart of FIG. 2, which have already been described in Embodiment 1, and redundant description is therefore omitted.
When the branch determination unit 12 determines in step ST12 that the vehicle 100 is approaching a branch point ("YES" in step ST12), the guide image generation unit 13a calculates the coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point.
For example, assume that the branch determination unit 12 has determined that the vehicle 100 is approaching a branch point for the first time since the power of the vehicle 100 was turned on and the vehicle 100 started traveling.
The guide image generation unit 13a calculates the first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point (step ST13).
The guide image generation unit 13a outputs information regarding the first guide image projection area to the sensor information acquisition unit 15.
When the information regarding the first guide image projection area is output from the guide image generation unit 13a in step ST13, the sensor information acquisition unit 15 acquires the captured image taken by the imaging device (step ST14).
Here, the branch determination unit 12 is assumed to have determined that the vehicle 100 is approaching a branch point for the first time since the power of the vehicle 100 was turned on and the vehicle 100 started traveling. Therefore, the projection control device 1a has not yet caused the projection device 3 to project guide images (the first guide image and the second guide image) regarding the branch point onto the road surface in front of the vehicle 100. That is, the captured image acquired by the sensor information acquisition unit 15 is a captured image of the area in front of the vehicle 100 in a state in which neither the first guide image nor the second guide image is projected (hereinafter referred to as the "pre-projection captured image").
The sensor information acquisition unit 15 outputs the acquired pre-projection captured image to the obstruction detection unit 16 together with the information regarding the first guide image projection area output from the guide image generation unit 13a.
The obstruction detection unit 16 detects, based on the pre-projection captured image acquired by the sensor information acquisition unit 15 in step ST14, whether there is an obstruction blocking the first guide image projection area (step ST15).
The obstruction detection unit 16 outputs the obstruction detection result to the guide image generation unit 13a.
When the obstruction detection unit 16 detects in step ST15 that there is an obstruction blocking the first guide image projection area ("YES" in step ST15), the guide image generation unit 13a generates the first guide image and the second guide image (step ST16).
The guide image generation unit 13a outputs the generated first guide image and second guide image to the projection control unit 14a, together with the obstruction detection result output from the obstruction detection unit 16.
The projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image generated by the guide image generation unit 13a in step ST16 onto the road surface in front of the vehicle 100 (step ST17).
As a result, the first projection image and the second projection image are projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 8.
FIG. 8 is a diagram for explaining an example of how, in Embodiment 2, the projection control unit 14a of the projection control device 1a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100. FIG. 8 is an overhead view of a branch point and the vehicle 100 approaching the branch point.
In FIG. 8, the branch point is indicated by "BP", and the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1". The first road surface point is indicated by "E1", and the second road surface point is indicated by "E2". Since the direction toward the second road surface point is the branch direction, it can be seen in FIG. 8 that the guidance route of the vehicle 100 is a route that turns left at the branch point. In FIG. 8, the position of the first road surface point is assumed to coincide with the position of the branch point.
In FIG. 8, however, another vehicle is present in front of the vehicle 100. In FIG. 8, the other vehicle is indicated by "V". In addition, a building is present to the side of the vehicle 100. In FIG. 8, the building is indicated by "BLDG".
The other vehicle and the building are obstructions that block the first guide image projection area; in reality, the first projection image would be blocked by the other vehicle or the building and would not be projected onto the lane after the branch (a lane here meaning a so-called traffic lane). In FIG. 8, for convenience, the first projection image is nevertheless shown as projected onto the lane after the branch.
In FIG. 8, the third road surface point is indicated by "E3", and the vehicle front point is indicated by "E4". The branch point, the first road surface point, and the third road surface point are hidden by the other vehicle.
As described above, the first projection image is blocked by the other vehicle or the building and is not projected onto the lane after the branch, but the second projection image, which extends from the vehicle front point toward the branch point, is projected without being blocked by the other vehicle or the building.
Even if the occupant of the vehicle 100 cannot see the first projection image, the occupant can recognize from the second projection image that there is a branch point ahead.
On the other hand, when the obstruction detection unit 16 detects in step ST15 that there is no obstruction blocking the first guide image projection area ("NO" in step ST15), the guide image generation unit 13a generates the first guide image (step ST18).
The guide image generation unit 13a outputs the generated first guide image to the projection control unit 14a, together with the obstruction detection result output from the obstruction detection unit 16.
The projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a in step ST18 onto the road surface in front of the vehicle 100 (step ST19).
After the processing of step ST17 or step ST19 is performed, the operation of the projection control device 1a returns to step ST11 and proceeds to the subsequent processing again.
In step ST14 performed again, the sensor information acquisition unit 15 acquires a captured image of the area in front of the vehicle 100 in a state in which the projection device 3 has already projected the first guide image, or the first guide image and the second guide image, in other words, a state in which the first projection image, or the first projection image and the second projection image, is projected (hereinafter referred to as the "post-projection captured image").
In step ST15, the obstruction detection unit 16 detects, based on the post-projection captured image, whether there is an obstruction blocking the first guide image projection area.
When the obstruction detection unit 16 detects that there is an obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image and the second guide image (step ST16), and the projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100 (step ST17). If the projection device 3 was previously made to project only the first guide image, the projection control unit 14a switches the control of the projection device 3 from projecting only the first guide image to projecting the first guide image and the second guide image. If the projection device 3 was previously made to project the first guide image and the second guide image, the projection control unit 14a continues the control of causing the projection device 3 to project the first guide image and the second guide image.
When the obstruction detection unit 16 detects that there is no obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19). If the projection device 3 was previously made to project the first guide image and the second guide image, the projection control unit 14a switches the control of the projection device 3 from projecting the first guide image and the second guide image to projecting only the first guide image. If the projection device 3 was previously made to project only the first guide image, the projection control unit 14a continues the control of causing the projection device 3 to project only the first guide image.
The projection control device 1a repeats the above operation until, for example, the power of the vehicle 100 is turned off.
Note that, in the flowchart shown in FIG. 7, the processing of step ST16 and the processing of step ST18 are performed after the processing of step ST15, but this is merely an example. For example, the processing of step ST16 and the processing of step ST18 may be performed together at step ST12. That is, the guide image generation unit 13a may generate the first guide image and the second guide image at the time of calculating the first guide image projection area, before the obstruction detection unit 16 detects the presence or absence of an obstruction. In this case, for example, the projection control unit 14a controls, based on the obstruction detection result, whether the projection device 3 projects the first guide image and the second guide image or projects only the first guide image.
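As an illustration only, the flow of FIG. 7 (steps ST11 to ST19) can be restated as the following sketch. The objects and method names stand in for the respective units and devices and are assumptions, not the actual interfaces of the embodiment.

```python
def guidance_loop(nav, sensor, detector, generator, projector):
    """Illustrative restatement of the FIG. 7 flow (steps ST11 to ST19).
    nav, sensor, detector, generator and projector stand in for the guidance
    route information acquisition unit, sensor information acquisition unit,
    obstruction detection unit, guide image generation unit 13a and projection
    device 3, and are assumed to expose the methods used below."""
    while nav.vehicle_power_on():
        route_info = nav.get_guidance_route_info()                        # ST11
        if not nav.approaching_branch(route_info):                        # ST12
            continue
        area = generator.first_guide_image_projection_area(route_info)    # ST13
        image = sensor.capture()                                           # ST14
        if detector.obstruction_exists(image, area):                       # ST15
            first, second = generator.first_and_second_guide_images(route_info)  # ST16
            projector.project(first, second)                               # ST17
        else:
            first = generator.first_guide_image(route_info)                # ST18
            projector.project(first)                                       # ST19
```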
In this way, when the projection control device 1a detects, based on the sensor information (here, the captured image) regarding objects detected in front of the vehicle 100, that there is an obstruction blocking the area on the road surface onto which the first guide image is projected (the first guide image projection area), it generates, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, and causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100 in addition to the first guide image.
When an obstruction on the road surface in front of the vehicle 100 prevents the projection control device 1a from presenting, as intended, a first projection image from which the occupant of the vehicle 100 can recognize the presence of the branch point and the branch direction, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. The projection control device 1a can thereby make the occupant of the vehicle 100 aware that the vehicle will soon reach the branch point.
When there is no obstruction on the road surface in front of the vehicle 100 and a first projection image from which the occupant can recognize the presence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. The projection device 3 can thus avoid the power consumption required for unnecessarily projecting the second guide image.
In Embodiment 2 as described above, when the obstruction detection unit 16 detects that there is an obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image and the second guide image, and the projection control unit 14a causes the projection device 3 to project the generated first guide image and second guide image onto the road surface in front of the vehicle 100.
As a result, the first projection image and the second projection image are projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 8. However, as described above, in reality, the first projection image is blocked by the other vehicle, the building, or the like, and is not projected onto the lane after the branch.
Therefore, in Embodiment 2 described above, when the projection control device 1a detects that there is an obstruction blocking the first guide image projection area, it may stop the control of projecting the first guide image on the projection device 3 so that the first guide image is not projected from the projection device 3.
Specifically, in the projection control device 1a, when the obstruction detection unit 16 detects that there is an obstruction blocking the first guide image projection area, the projection control unit 14a causes the projection device 3 to project the second guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100 and to stop projecting the first guide image.
In this case, the configuration example of the projection control device 1a is as shown in FIG. 6.
The operation of the projection control device 1a is then as shown in the flowchart of FIG. 9 instead of the operation shown in the flowchart of FIG. 7.
The specific contents of the processing in steps ST11 to ST16 and steps ST18 to ST19 of FIG. 9 are the same as those of steps ST11 to ST16 and steps ST18 to ST19 of FIG. 7, respectively, and redundant description is therefore omitted.
In this case, when the guide image generation unit 13a generates the first guide image and the second guide image in step ST16, the projection control unit 14a causes the projection device 3 to stop projecting the first guide image if it had been projecting the first guide image up to that point (step ST171). If the projection device 3 has not been made to project the first guide image, the projection control unit 14a maintains the state in which the first guide image is not projected.
The projection control unit 14a also causes the second guide image to be projected onto the road surface in front of the vehicle 100 (step ST172).
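As an illustration only, the alternative handling of steps ST171 and ST172 can be sketched as follows, using the same hypothetical interfaces as the earlier loop sketch.

```python
def on_obstruction_detected(generator, projector, route_info):
    """Illustrative handler for the FIG. 9 variant: when an obstruction is
    detected, stop projecting the first guide image and project only the
    second guide image (the names below are assumptions, not the embodiment's API)."""
    first, second = generator.first_and_second_guide_images(route_info)  # ST16
    projector.stop_projecting_first_guide_image()                        # ST171
    projector.project(second)                                            # ST172
```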
In this case, when the processing of step ST172 is performed, the second projection image is projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 10.
FIG. 10 is a diagram for explaining an example of how, in Embodiment 2, the projection control unit 14a of the projection control device 1a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 and to project the second guide image onto the road surface in front of the vehicle 100. FIG. 10 is an overhead view of a branch point and the vehicle 100 approaching the branch point.
The state shown in FIG. 10 differs from the state shown in FIG. 8 in that the projection device 3 does not project the first guide image, in other words, the first projection image is not projected onto the road surface in front of the vehicle 100. In FIG. 10, for ease of understanding, the first projection image shown in FIG. 8 is indicated by a dotted line.
After the processing of step ST172 or step ST19 is performed, the operation of the projection control device 1a returns to step ST11 and proceeds to the subsequent processing again.
In step ST14 performed again, the sensor information acquisition unit 15 acquires a post-projection captured image.
In step ST15, the obstruction detection unit 16 detects, based on the post-projection captured image, whether there is an obstruction blocking the first guide image projection area.
When the obstruction detection unit 16 detects that there is an obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image and the second guide image (step ST16), and the projection control unit 14a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 (step ST171) and to project the second guide image onto the road surface in front of the vehicle 100 (step ST172). If the projection device 3 was previously made to project the first guide image, the projection control unit 14a stops the projection of the first guide image. If the projection device 3 was previously made to project only the second guide image, the projection control unit 14a continues the control of causing the projection device 3 to project only the second guide image.
When the obstruction detection unit 16 detects that there is no obstruction blocking the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19). If the projection device 3 was previously made to project only the second guide image, the projection control unit 14a switches the control of the projection device 3 from projecting only the second guide image to projecting only the first guide image. If the projection device 3 was previously made to project only the first guide image, the projection control unit 14a continues the control of causing the projection device 3 to project only the first guide image.
The projection control device 1a repeats the above operation until, for example, the power of the vehicle 100 is turned off.
In the flowchart of FIG. 9, the processing is performed in the order of step ST171 and step ST172, but this is merely an example. The order of step ST171 and step ST172 may be reversed, or step ST171 and step ST172 may be performed in parallel.
Further, in the flowchart of FIG. 9, the guide image generation unit 13a generates the first guide image in step ST16, but this is merely an example. The guide image generation unit 13a does not need to generate the first guide image in step ST16. By not generating the first guide image, the guide image generation unit 13a can reduce the processing load.
In this way, when the projection control device 1a detects, based on the sensor information (here, the captured image) regarding objects detected in front of the vehicle 100, that there is an obstruction blocking the area on the road surface onto which the first guide image is projected (the first guide image projection area), it may generate, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, cause the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100, and cause the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100.
When an obstruction on the road surface in front of the vehicle 100 prevents the projection control device 1a from presenting, as intended, a first projection image from which the occupant of the vehicle 100 can recognize the presence of the branch point and the branch direction, the projection control device 1a can further reduce unnecessary power consumption by causing the projection device 3 to stop projecting the first guide image.
In addition, a first projection image that is blocked by an obstruction and not presented correctly may confuse the occupant of the vehicle 100. By stopping the projection of the first projection image, the projection control device 1a can avoid causing such confusion for the occupant of the vehicle 100.
Further, in Embodiment 2 described above, once the control of projection on the projection device 3 has been switched, the control after the switch may be maintained thereafter.
In this case, the configuration example of the projection control device 1a is as shown in FIG. 6.
For example, when the projection control unit 14a switches the control of projection on the projection device 3, it sets a projection switching flag to "1". The projection switching flag is stored in a location that the projection control device 1a can refer to. At this time, the projection control unit 14a stores, in the storage unit, the content of the control of the projection device 3 after the switch.
Specifically, when the projection control unit 14a switches from the control of causing the projection device 3 to project the first guide image to the control of newly causing it to project the second guide image, the projection control unit 14a sets the projection switching flag to "1". At this time, the projection control unit 14a stores, in the storage unit, the fact that the projection device 3 is being made to project the first guide image and the second guide image.
Further, when the projection control unit 14a switches from the control of causing the projection device 3 to project the first guide image and the second guide image to the control of causing it to project only the first guide image, the projection control unit 14a sets the projection switching flag to "1". At this time, the projection control unit 14a stores, in the storage unit, the fact that the projection device 3 is being made to project only the first guide image.
For example, when the branch determination unit 12 determines that the vehicle 100 is approaching a branch point, the guide image generation unit 13a refers to the projection switching flag.
When the projection switching flag is set to "1", the guide image generation unit 13a generates the first guide image, or the first guide image and the second guide image, in accordance with the control content for the projection device 3 stored in the storage unit. For example, if the storage unit stores the fact that the projection device 3 is being made to project the first guide image and the second guide image, the guide image generation unit 13a generates the first guide image and the second guide image. If the storage unit stores the fact that the projection device 3 is being made to project only the first guide image, the guide image generation unit 13a generates the first guide image.
The projection switching flag and the control content for the projection device 3 after the switch stored in the storage unit are cleared at the same timing as the branch approach flag.
When the storage unit does not store the control content for the projection device 3 after a switch, the projection control unit 14a determines that the control of projection on the projection device 3 has not yet been switched.
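As an illustration only, the projection switching flag and the stored control content can be sketched as a small state holder as follows; the class and attribute names are hypothetical.

```python
class ProjectionSwitchState:
    """Minimal sketch of the 'keep the control after switching' behaviour.
    The flag and the stored control content correspond to the projection
    switching flag and the control details kept in the storage unit."""

    def __init__(self):
        self.switched = False   # projection switching flag ("1" when set)
        self.control = None     # "first_only" or "first_and_second"

    def record_switch(self, control: str) -> None:
        # Called when the projection control unit switches the projection control.
        self.switched = True
        self.control = control

    def control_to_apply(self, obstruction_detected: bool) -> str:
        # Once a switch has happened, keep the switched control regardless of
        # whether an obstruction is currently detected (FIG. 11, steps ST104/ST109).
        if self.switched:
            return self.control
        return "first_and_second" if obstruction_detected else "first_only"

    def clear(self) -> None:
        # Cleared at the same timing as the branch approach flag.
        self.switched = False
        self.control = None
```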
In this case, the operation of the projection control device 1a is, for example, as shown in the flowchart of FIG. 11 instead of the operation shown in the flowchart of FIG. 7.
The specific contents of the processing in steps ST101 to ST103, steps ST105 to ST107, and step ST110 of FIG. 11 are the same as the specific contents of the processing in steps ST11 to ST16 and step ST18 of FIG. 7, respectively, and redundant description is therefore omitted.
In step ST104, the guide image generation unit 13a determines whether the control of projection on the projection device 3 has already been switched (step ST104). Specifically, the guide image generation unit 13a determines whether the projection switching flag is set to "1".
When it determines that the control of projection on the projection device 3 has not been switched, that is, when the projection switching flag is not set to "1" ("NO" in step ST104), the guide image generation unit 13a outputs the information regarding the first guide image projection area calculated in step ST103 to the sensor information acquisition unit 15. The operation of the projection control device 1a then proceeds to step ST105.
On the other hand, when it determines that the control of projection on the projection device 3 has already been switched, that is, when the projection switching flag is set to "1" ("YES" in step ST104), the guide image generation unit 13a determines whether the projection control unit 14a has switched to the control of causing the projection device 3 to project the second guide image (step ST109). The guide image generation unit 13a can determine whether the projection control unit 14a has switched to the control of causing the projection device 3 to project the second guide image by referring to the storage unit.
When the guide image generation unit 13a determines that the projection control unit 14a has switched to the control of causing the projection device 3 to project the second guide image ("YES" in step ST109), the operation of the projection control device 1a proceeds to step ST107.
On the other hand, when the guide image generation unit 13a determines that the projection control unit 14a has not switched to the control of causing the projection device 3 to project the second guide image ("NO" in step ST109), in other words, when it determines that the projection control unit 14a has switched to the control of causing the projection device 3 to project only the first guide image, the operation of the projection control device 1a proceeds to step ST110.
In this way, when the projection control device 1a has switched the control of projection by the projection device 3 from the state in which the first guide image is projected to the state in which the second guide image is newly projected, or from the state in which the second guide image is projected to the state in which the first guide image is projected, it maintains the switched control of projection by the projection device 3 after the switch, regardless of whether a shielding object is present.
This allows the projection control device 1a to reduce the annoyance caused to the occupant of the vehicle 100 by frequent changes in the projection state of the first projected image or the second projected image projected in front of the vehicle 100.
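The switching and maintenance logic described above can be illustrated with a short sketch. This is only an illustration, not part of the disclosed embodiment: the function name, the flag encoding, and the step labels returned as strings are assumptions introduced here.

    # Illustrative sketch (Python) of the decision around steps ST104 and ST109.
    # All identifiers are hypothetical and do not appear in the embodiment.
    def next_step(projection_switch_flag: int, switched_to_second_image: bool) -> str:
        """Return the label of the step the projection control device 1a proceeds to."""
        if projection_switch_flag != 1:
            # Projection control has not been switched yet (ST104: NO);
            # hand over the first guide image projection area and go to ST105.
            return "ST105"
        # Projection control has already been switched (ST104: YES); keep the
        # switched state regardless of whether a shielding object is present,
        # so the projected image does not flicker between the two guide images.
        return "ST107" if switched_to_second_image else "ST110"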
Note that, when the projection control device 1a performs the operation shown in the flowchart of FIG. 11, it may perform the processing of steps ST171 and ST172 of the flowchart of FIG. 9 instead of the processing of step ST107.
In the second embodiment described above, the projection control device 1a performs the detection of whether a shielding object blocking the first guide image projection area exists, carried out for the first time after it determines that the vehicle 100 is approaching the branch point, before the first guide image is projected onto the road surface in front of the vehicle 100. However, this is merely an example.
When the projection control device 1a performs, for the first time after determining that the vehicle 100 is approaching the branch point, the detection of whether a shielding object blocking the first guide image projection area exists, it may first project the first guide image once and then detect whether such a shielding object exists based on a post-projection captured image in which the area in front of the vehicle 100 with the first guide image projected has been captured.
In this case, in step ST13 of the flowchart of FIG. 7, the projection control device 1a generates the first guide image and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100. Specifically, in step ST13 of the flowchart of FIG. 7, the guide image generation unit 13a generates the first guide image, and the projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100. The guide image generation unit 13a then calculates the first guide image projection area.
Note that, when the projection control device 1a performs the operation shown in the flowchart of FIG. 9, it generates the first guide image in step ST13 of FIG. 9 and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100. Likewise, when the projection control device 1a performs the operation shown in the flowchart of FIG. 11, it generates the first guide image in step ST103 of FIG. 11 and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100.
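A minimal sketch of this project-first-then-detect variant follows. It assumes hypothetical helper objects for the projection device and the front camera, and a caller-supplied detection function; none of these names are the embodiment's actual interfaces.

    # Illustrative sketch (Python) of detection based on a post-projection captured image.
    def detect_obstruction_after_projection(projector, camera, first_guide_image,
                                            projection_area, detect_obstruction_in):
        # Project the first guide image once onto the road surface ahead (cf. ST13/ST103).
        projector.project(first_guide_image)
        # Capture the area in front of the vehicle with the first guide image projected.
        post_projection_image = camera.capture_front_image()
        # Decide from the captured image whether a shielding object (for example a
        # preceding vehicle) overlaps the first guide image projection area.
        return detect_obstruction_in(post_projection_image, projection_area)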
Even with this configuration, when a shielding object on the road surface in front of the vehicle 100 prevents the projection control device 1a from presenting, as intended, the first projected image from which the occupant of the vehicle 100 can recognize the existence of the branch point and the branch direction, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. The projection control device 1a can thereby make the occupant of the vehicle 100 aware that the vehicle is about to reach the branch point.
When no shielding object is present on the road surface in front of the vehicle 100 and the first projected image from which the occupant of the vehicle 100 can recognize the existence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. The projection device 3 can thus reduce the power consumed by unnecessary projection of the second guide image.
Furthermore, even when the shielding object is a moving object such as another vehicle, the projection control device 1a can accurately detect, each time, that the shielding object is blocking the first guide image projection area.
Note that, in the second embodiment as well, the first guide image is an arrow image as an example, as in the first embodiment, but this is merely an example. The first guide image may be any image that starts near the branch point and extends in the branch direction to indicate the branch direction, that is, any image indicating the existence of the branch point and the branch direction.
It suffices that the projection control device 1a generates, as the vehicle 100 approaches the branch point, the first guide image with the second end brought closer to the first end, and causes the projection device 3 to project the first guide image such that, as seen from the occupant of the vehicle 100, the first projected image obtained by projecting the first guide image in front of the vehicle 100 becomes shorter toward the branch point as the vehicle 100 approaches it.
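The shortening behaviour can be sketched as follows. The linear interpolation rule, the coordinate representation, and all identifiers are assumptions introduced only to illustrate "bringing the second end closer to the first end as the vehicle approaches the branch point"; the embodiment does not prescribe this particular rule.

    # Illustrative sketch (Python): the first end stays near the branch point, while the
    # second end is moved toward it as the remaining distance to the branch point shrinks.
    def second_end_position(first_end, initial_second_end, distance_to_branch, start_distance):
        # start_distance: distance at which projection of the first guide image began (> 0).
        ratio = max(0.0, min(1.0, distance_to_branch / start_distance))
        x = first_end[0] + (initial_second_end[0] - first_end[0]) * ratio
        y = first_end[1] + (initial_second_end[1] - first_end[1]) * ratio
        return (x, y)  # coincides with first_end when the vehicle reaches the branch point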
In the second embodiment described above, the second guide image is an image showing a rectangle as an example, but this is merely an example. The second guide image may be any image indicating the direction of the branch point as seen from the vehicle 100. For example, the second guide image may be an arrow image, or an image showing a message such as "There is a fork ahead".
Also, in the second embodiment described above, the projection control device 1a generates the second guide image so that the first road surface point of the first projected image and the third road surface point of the second projected image are the same point, in other words, so that the first end of the first guide image and the third end of the second guide image coincide, but this too is merely an example. It is not essential that the first guide image and the second guide image be connected.
In the second embodiment described above, the position of the first road surface point is the position of the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (the lane here being a so-called lane) and a straight line that passes through the center in the width direction of the lane after the branch (again, a so-called lane) and is parallel to the lane after the branch, but this is merely an example. The projection control device 1a may set the position of the first road surface point to the position of any point within a predetermined range from the branch point. Note that the projection control device 1a can calculate the coordinates of the position of a point within the predetermined range from the branch point from the guidance route information.
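The intersection described above can be computed with elementary two-dimensional geometry. The following sketch assumes road-surface plane coordinates and lane direction vectors obtained from the guidance route information; it is an illustration only, and the function and parameter names are hypothetical.

    # Illustrative sketch (Python): first road surface point as the intersection of
    # (a) the line through the vehicle position parallel to the current lane, and
    # (b) the line through the width-direction center of the post-branch lane parallel to it.
    def first_road_surface_point(vehicle_pos, current_lane_dir, branch_lane_center, branch_lane_dir):
        px, py = vehicle_pos
        d1x, d1y = current_lane_dir
        qx, qy = branch_lane_center
        d2x, d2y = branch_lane_dir
        denom = d1x * d2y - d1y * d2x          # 2D cross product of the two directions
        if abs(denom) < 1e-9:
            return None                        # (nearly) parallel lines: no usable intersection
        t = ((qx - px) * d2y - (qy - py) * d2x) / denom
        return (px + t * d1x, py + t * d1y)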
In the second embodiment described above, as a result of the projection control device 1a causing the projection device 3 to project the first guide image, the first projected image is an image on the road surface of the lane after the branch (a so-called lane) in front of the vehicle 100 (see FIGS. 3 and 4), but this is merely an example.
For example, the projection control device 1a may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch. It suffices that the first road surface point is located near the branch point.
In the second embodiment described above, the projection control device 1a is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown) are provided in the in-vehicle device. The configuration is not limited to this: some of these units may be provided in the in-vehicle device of the vehicle 100 while the rest are provided in a server connected to the in-vehicle device via a network, or all of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown) may be provided in the server.
The hardware configuration of the projection control device 1a according to the second embodiment is the same as the hardware configuration of the projection control device 1 described with reference to FIGS. 5A and 5B in the first embodiment, and its illustration is therefore omitted.
In the second embodiment, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown) are realized by the processing circuit 1001. That is, the projection control device 1a includes the processing circuit 1001 for performing the control that causes the projection device 3 to project, onto the road surface in front of the vehicle 100, the first guide image or the second guide image generated based on the guidance route information acquired from the navigation device 2.
The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby carrying out the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown). In other words, the projection control device 1a includes the memory 1005 for storing a program which, when executed by the processing circuit 1001, results in the execution of steps ST11 to ST19 of FIG. 7 described above, steps ST11 to ST19 of FIG. 9 described above, or steps ST101 to ST111 of FIG. 11 described above. It can also be said that the program stored in the memory 1005 causes a computer to execute the procedures or methods of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown).
The projection control device 1a includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2, the projection device 3, or the sensor 4.
The storage unit (not shown) is constituted by the memory 1005 or the like.
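How the functional units might be wired together in the program stored in the memory 1005 and executed by the processing circuit 1001 can be sketched as below. The class name, method names, and dependency-injection style are assumptions made here for illustration, and the body is only a rough correspondence to steps ST101 to ST111 of FIG. 11, not the embodiment's actual implementation.

    # Illustrative sketch (Python) of composing units 11, 12, 13a, 14a, 15, and 16.
    class ProjectionControlDevice1a:
        def __init__(self, route_unit, branch_unit, image_unit, control_unit,
                     sensor_unit, obstruction_unit):
            self.route_unit = route_unit              # guidance route information acquisition unit 11
            self.branch_unit = branch_unit            # branch determination unit 12
            self.image_unit = image_unit              # guide image generation unit 13a
            self.control_unit = control_unit          # projection control unit 14a
            self.sensor_unit = sensor_unit            # sensor information acquisition unit 15
            self.obstruction_unit = obstruction_unit  # shielding object detection unit 16

        def step(self):
            route = self.route_unit.acquire()                      # guidance route information
            if not self.branch_unit.is_approaching_branch(route):
                return                                             # no branch point nearby
            first_image, area = self.image_unit.generate_first(route)
            if self.obstruction_unit.detect(self.sensor_unit.acquire(), area):
                # A shielding object blocks the projection area: project the second guide image.
                self.control_unit.project(self.image_unit.generate_second(route))
            else:
                self.control_unit.project(first_image)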
As described above, the projection control device 1a according to the second embodiment includes, in addition to the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, and the projection control unit 14a, the sensor information acquisition unit 15 that acquires sensor information regarding an object detected in front of the vehicle 100, and the shielding object detection unit 16 that detects, based on the sensor information acquired by the sensor information acquisition unit 15, whether a shielding object blocking the area on the road surface onto which the first guide image is projected exists. In the projection control device 1a, the guide image generation unit 13a generates the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface onto which the second end of the first guide image, opposite the first end, is projected, the second road surface point being located farther than the first road surface point in the branch direction with respect to the branch point; and, as the vehicle 100 approaches the branch point, it generates the first guide image with the second end brought closer to the first end. Furthermore, when the shielding object detection unit 16 detects that a shielding object exists, the guide image generation unit 13a generates, in addition to the first guide image, the second guide image indicating the direction of the branch point as seen from the vehicle 100, and the projection control unit 14a causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100.
With this configuration, when a shielding object on the road surface in front of the vehicle 100 prevents the projection control device 1a from presenting, as intended, the first projected image from which the occupant of the vehicle 100 can recognize the existence of the branch point and the branch direction, the projection control device 1a can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. The projection control device 1a can thereby make the occupant of the vehicle 100 aware that the vehicle is about to reach the branch point.
When no shielding object is present on the road surface in front of the vehicle 100 and the first projected image from which the occupant can recognize the existence of the branch point and the branch direction can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. The projection device 3 can thus reduce the power consumed by unnecessary projection of the second guide image.
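As a final illustration, the shielding-object check itself could be as simple as an overlap test between the footprints of detected objects and the first guide image projection area on the road surface. The axis-aligned-rectangle representation below is an assumption made only for illustration; the embodiment does not specify this representation, and all identifiers are hypothetical.

    # Illustrative sketch (Python) of a check a shielding object detection unit might perform.
    def rectangles_overlap(a, b):
        # a, b: (x_min, y_min, x_max, y_max) in road-surface coordinates.
        ax0, ay0, ax1, ay1 = a
        bx0, by0, bx1, by1 = b
        return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

    def shielding_object_exists(detected_object_boxes, projection_area_box):
        # True if any detected object (for example a preceding vehicle) overlaps
        # the area onto which the first guide image is to be projected.
        return any(rectangles_overlap(box, projection_area_box) for box in detected_object_boxes)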
Note that the embodiments can be freely combined, any component of each embodiment can be modified, and any component can be omitted from each embodiment.
The projection control according to the present disclosure enables, in route guidance by image projection onto the road surface, an occupant of a vehicle to grasp the distance to a branch point and the branch direction without removing the branch point from the occupant's field of vision.
1, 1a: projection control device; 11: guidance route information acquisition unit; 12: branch determination unit; 13: guide image generation unit; 14, 14a: projection control unit; 15: sensor information acquisition unit; 16: shielding object detection unit; 2: navigation device; 3: projection device; 4: sensor; 1001: processing circuit; 1002: input interface device; 1003: output interface device; 1004: processor; 1005: memory.

Claims (10)

1.  A projection control device comprising:
     a guidance route information acquisition unit to acquire guidance route information regarding a guidance route of a vehicle;
     a branch determination unit to determine, based on the guidance route information acquired by the guidance route information acquisition unit, whether the vehicle is approaching a branch point on the guidance route;
     a guide image generation unit to generate, when the branch determination unit determines that the vehicle is approaching the branch point, a first guide image indicating existence of the branch point and a branch direction based on the guidance route information; and
     a projection control unit to cause a projection device to project the first guide image generated by the guide image generation unit onto a road surface in front of the vehicle,
     wherein the guide image generation unit generates the first guide image based on a position of the vehicle, a distance from the vehicle to the branch point, a position of a first road surface point on the road surface onto which a first end of the first guide image is projected, and a position of a second road surface point on the road surface onto which a second end of the first guide image opposite to the first end is projected, the second road surface point being located farther than the first road surface point in the branch direction with respect to the branch point, and generates, as the vehicle approaches the branch point, the first guide image with the second end brought closer to the first end.
2.  The projection control device according to claim 1, wherein the branch determination unit determines that the vehicle is approaching the branch point when the distance from the vehicle to the branch point is equal to or less than an approach determination threshold.
3.  The projection control device according to claim 1 or claim 2, wherein the first guide image is an arrow image that extends in the branch direction, with the first end as a start point and the second end as an end point, and indicates the branch direction.
4.  The projection control device according to any one of claims 1 to 3, wherein the position of the first road surface point is a position of an intersection of a straight line that passes through a point indicating the position of the vehicle and is parallel to a lane in which the vehicle is traveling and a straight line that passes through a center in a width direction of the lane after the branch and is parallel to the lane after the branch.
5.  The projection control device according to any one of claims 1 to 4, further comprising:
     a sensor information acquisition unit to acquire sensor information regarding an object detected in front of the vehicle; and
     a shielding object detection unit to detect, based on the sensor information acquired by the sensor information acquisition unit, whether a shielding object blocking an area on the road surface onto which the first guide image is projected exists,
     wherein the guide image generation unit generates, when the shielding object detection unit detects that the shielding object exists, a second guide image indicating a direction of the branch point as seen from the vehicle, in addition to the first guide image, and
     the projection control unit causes the projection device to project the second guide image onto the road surface in front of the vehicle.
6.  The projection control device according to claim 5, wherein the projection control unit causes the projection device to project the second guide image onto the road surface in front of the vehicle and to stop projecting the first guide image.
7.  The projection control device according to claim 5 or claim 6, wherein, when the projection control unit has switched the control of projection by the projection device from a state in which the first guide image is projected to a state in which the second guide image is newly projected, or from a state in which the second guide image is projected to a state in which the first guide image is projected, the projection control unit maintains the switched control of projection by the projection device regardless of whether the shielding object exists.
8.  The projection control device according to any one of claims 5 to 7, wherein
     the sensor information is a captured image of an area in front of the vehicle,
     the sensor information acquisition unit acquires a pre-projection captured image in which the area in front of the vehicle is captured before the first guide image is projected, and
     the shielding object detection unit detects whether the shielding object exists based on the pre-projection captured image acquired by the sensor information acquisition unit.
9.  The projection control device according to any one of claims 5 to 7, wherein
     the sensor information is a captured image of an area in front of the vehicle,
     the sensor information acquisition unit acquires a post-projection captured image in which the area in front of the vehicle is captured after the first guide image is projected, and
     the shielding object detection unit detects whether the shielding object exists based on the post-projection captured image acquired by the sensor information acquisition unit.
10.  A projection control method comprising:
     acquiring, by a guidance route information acquisition unit, guidance route information regarding a guidance route of a vehicle;
     determining, by a branch determination unit, based on the guidance route information acquired by the guidance route information acquisition unit, whether the vehicle is approaching a branch point on the guidance route;
     generating, by a guide image generation unit, when the branch determination unit determines that the vehicle is approaching the branch point, a first guide image indicating existence of the branch point and a branch direction based on the guidance route information; and
     causing, by a projection control unit, a projection device to project the first guide image generated by the guide image generation unit onto a road surface in front of the vehicle,
     wherein the guide image generation unit generates the first guide image based on a position of the vehicle, a distance from the vehicle to the branch point, a position of a first road surface point on the road surface onto which a first end of the first guide image is projected, and a position of a second road surface point on the road surface onto which a second end of the first guide image opposite to the first end is projected, the second road surface point being located farther than the first road surface point in the branch direction with respect to the branch point, and generates, as the vehicle approaches the branch point, the first guide image with the second end brought closer to the first end.
PCT/JP2022/027962 2022-07-19 2022-07-19 Projection control device and projection control method WO2024018497A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027962 WO2024018497A1 (en) 2022-07-19 2022-07-19 Projection control device and projection control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027962 WO2024018497A1 (en) 2022-07-19 2022-07-19 Projection control device and projection control method

Publications (1)

Publication Number Publication Date
WO2024018497A1 true WO2024018497A1 (en) 2024-01-25

Family

ID=89617431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027962 WO2024018497A1 (en) 2022-07-19 2022-07-19 Projection control device and projection control method

Country Status (1)

Country Link
WO (1) WO2024018497A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012247369A (en) * 2011-05-30 2012-12-13 Honda Motor Co Ltd Projection device for vehicle
WO2016114048A1 (en) * 2015-01-13 2016-07-21 日立マクセル株式会社 Image projection device
WO2020208779A1 (en) * 2019-04-11 2020-10-15 三菱電機株式会社 Display control device, and display control method
JP2021079835A (en) * 2019-11-19 2021-05-27 株式会社小糸製作所 Road surface drawing device
JP2021079907A (en) * 2019-11-22 2021-05-27 株式会社小糸製作所 Vehicle drive support system

Similar Documents

Publication Publication Date Title
US11535155B2 (en) Superimposed-image display device and computer program
US10974765B2 (en) Vehicle capable of displaying information on road and control method therefor
US11511627B2 (en) Display device and computer program
JP7052786B2 (en) Display control device and display control program
WO2016092591A1 (en) Collision risk calculation device, collision risk display device, and vehicle body control device
JP7002246B2 (en) Vehicle display method and vehicle display device
JP6094337B2 (en) Operation control device
JP7251582B2 (en) Display controller and display control program
CN108859958B (en) Image conversion device
CN107923761B (en) Display control device, display device, and display control method
US20210042945A1 (en) Stereo camera device
JP2009255639A (en) Automatic lighting device for vehicle
JP7121361B2 (en) Autonomous mobile
JP7043765B2 (en) Vehicle driving control method and equipment
JP7377822B2 (en) Driving support method and driving support device
JP6968258B2 (en) Driving support device and driving support method
WO2021166410A1 (en) Travel assistance device
WO2024018497A1 (en) Projection control device and projection control method
JP2019172068A (en) Operation determination device, operation determination method, and program
JP7416114B2 (en) Display control device and display control program
US20210209947A1 (en) Traffic lane position information output device
EP3865815A1 (en) Vehicle-mounted system
JP7031748B2 (en) Self-position estimation method and self-position estimation device
JP2021101268A (en) Automatic operation vehicle
JP7228698B2 (en) vehicle controller

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22951888

Country of ref document: EP

Kind code of ref document: A1