WO2024018497A1 - Projection control device and projection control method - Google Patents

Projection control device and projection control method

Info

Publication number
WO2024018497A1
Authority
WO
WIPO (PCT)
Prior art keywords
guide image
vehicle
projection
road surface
branch
Prior art date
Application number
PCT/JP2022/027962
Other languages
English (en)
Japanese (ja)
Inventor
Yuko Yamamoto (祐子 山本)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to JP2024535590A priority Critical patent/JPWO2024018497A1/ja
Priority to PCT/JP2022/027962 priority patent/WO2024018497A1/fr
Publication of WO2024018497A1 publication Critical patent/WO2024018497A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers

Definitions

  • the present disclosure relates to a projection control device and a projection control method for vehicles.
  • Patent Document 1 discloses a vehicle projection device that, when the own vehicle approaches a branch point on the route closely enough for an occupant to see the branch point, projects onto the road surface in front of the own vehicle an arrow image, which is a route guidance image that guides the own vehicle in the branching direction based on route information.
  • the vehicle projection device disclosed in Patent Document 1 controls a branch indicating portion, which is included in the arrow image and extends along the branch direction at the branch point, so that it lengthens as the own vehicle approaches the branch point.
  • the present disclosure has been made to solve the above-mentioned problem, and an object of the present disclosure is to provide a projection control device that, in route guidance using image projection onto the road surface, allows a vehicle occupant to grasp the distance to a branch point and the branching direction without taking the branch point out of their field of vision.
  • the projection control device includes: a guide route information acquisition unit that acquires guide route information regarding the guide route of the vehicle; a branch determination unit that determines, based on the guide route information acquired by the guide route information acquisition unit, whether the vehicle is approaching a branch point on the guide route; a guide image generation unit that, when the branch determination unit determines that the vehicle is approaching the branch point, generates, based on the guide route information, a first guide image indicating the presence of the branch point and the branch direction; and a projection control unit that causes a projection device to project the first guide image generated by the guide image generation unit onto the road surface in front of the vehicle.
  • the guide image generation unit generates the first guide image based on the distance from the vehicle to the branch point, the position of a first road surface point on the road surface onto which a first end of the first guide image is projected, and the position of a second road surface point on the road surface onto which a second end on the side opposite to the first end is projected, the second road surface point being located farther than the first road surface point in the branch direction with respect to the branch point.
  • the guide image generation unit generates the first guide image such that the second end approaches the first end as the vehicle approaches the branch point.
  • FIG. 1 is a diagram illustrating a configuration example of a projection control device according to Embodiment 1.
  • FIG. 3 is a flowchart for explaining an example of the operation of the projection control device according to the first embodiment.
  • FIG. 2 is a diagram illustrating an example of how the projection control unit of the projection control device causes the projection device to project a first guide image onto the road surface in front of the vehicle in the first embodiment.
  • FIG. 7 is a diagram for explaining another example of how the projection control unit of the projection control device causes the projection device to project the first guide image onto the road surface in front of the vehicle in the first embodiment.
  • FIGS. 5A and 5B are diagrams illustrating an example of the hardware configuration of the projection control device according to the first embodiment.
  • FIG. 6 is a diagram illustrating a configuration example of a projection control device according to a second embodiment.
  • FIG. 7 is a flowchart for explaining an example of the operation of the projection control device according to the second embodiment.
  • FIG. 8 is a diagram for explaining an example of how the projection control unit of the projection control device causes the projection device to project the first guide image and the second guide image onto the road surface in front of the vehicle in the second embodiment.
  • FIG. 9 is a flowchart for explaining another example of the operation of the projection control device according to the second embodiment.
  • FIG. 10 is a diagram for explaining an example of a state in which the projection control unit of the projection control device causes the projection device to stop projecting the first guide image onto the road surface in front of the vehicle and to project the second guide image onto the road surface in front of the vehicle in the second embodiment.
  • FIG. 11 is a flowchart for explaining yet another example of the operation of the projection control device according to the second embodiment.
  • FIG. 1 is a diagram showing a configuration example of a projection control device 1 according to the first embodiment.
  • the projection control device 1 is mounted on a vehicle 100.
  • the projection control device 1 is connected to a navigation device 2 and a projection device 3.
  • the navigation device 2 is a general navigation device that is mounted on the vehicle 100 and provides route guidance for the vehicle 100.
  • the projection device 3 is mounted on the vehicle 100 and, under the control of the projection control device 1, projects an image for guiding the route of the vehicle 100 (hereinafter referred to as a "guidance image") onto the road surface in front of the vehicle 100.
  • the projection device 3 is, for example, a lamp provided near the headlights of the vehicle 100 and dedicated to projecting an image onto the road surface. Note that this is just an example, and the projection device 3 may be a device that has other functions, such as a headlamp.
  • the projection control device 1 acquires information regarding the guidance route of the vehicle 100 (hereinafter referred to as "guidance route information") from the navigation device 2, and, if it determines based on the guidance route information that the vehicle 100 is approaching a branch point on the guidance route, causes the projection device 3 to project a guidance image indicating the existence of the branch point and the branch direction (hereinafter referred to as the "first guide image") onto the road surface in front of the vehicle 100. Details of the projection control device 1, the first guide image, and the method by which the projection control device 1 projects the first guide image will be described later.
  • a "branch point" refers to a point where a road branches, such as a crossroads, a three-way junction (Y-junction), or a T-junction.
  • a branch point on the guide route of vehicle 100 is also simply referred to as a branch point.
  • the "branching direction” refers to the course direction of vehicle 100 after branching.
  • the first guide image is an arrow image.
  • the projection control device 1 causes the projection device 3 to project the first guide image, in other words the arrow image, onto the lane after the branch in front of the vehicle 100, as an image extending in the left-right direction as viewed from the vehicle 100 and parallel to the lane after the branch. Note that in the first embodiment, "parallel" is not limited to strictly parallel, but also includes substantially parallel.
  • the projection control device 1 includes a guide route information acquisition section 11, a branch determination section 12, a guide image generation section 13, and a projection control section 14.
  • the guide route information acquisition unit 11 acquires guide route information from the navigation device 2 .
  • the guide route information includes information regarding the route to the destination of the vehicle 100, information regarding the current position of the vehicle 100, and map information.
  • Information regarding the current position of the vehicle 100 is acquired by the navigation device 2 from a GPS (Global Positioning System, not shown) installed in the vehicle 100, for example. Note that this is just an example; for example, the guidance route information acquisition unit 11 may directly acquire information regarding the current position of the vehicle 100 from the GPS and include it in the guidance route information acquired from the navigation device 2.
  • the map information includes, for example, information regarding the positions of roads, the positions of lanes (the lanes referred to here are so-called traffic lanes), the shapes of roads, the widths of roads, the widths of lanes, the positions of branch points, and the types of roads.
  • the "position of a branch point" is represented by the intersection of straight lines each passing through the center, in the width direction, of an intersecting lane (the lanes referred to here are so-called traffic lanes).
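As a rough illustration of the description above, the position of a branch point can be computed as the intersection of two lane centerlines. The sketch below is not from the patent; the point-plus-direction line representation, the function name, and all coordinate values are assumptions for illustration.

```python
from typing import Tuple

Vec = Tuple[float, float]

def line_intersection(p1: Vec, d1: Vec, p2: Vec, d2: Vec) -> Vec:
    """Intersect two centerlines, each given as a point on the line and a
    direction vector, in 2-D map coordinates (solved via Cramer's rule)."""
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    if abs(det) < 1e-9:
        raise ValueError("centerlines are parallel; no intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (-rx * d2[1] + d2[0] * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# example: an east-west centerline through (0, 0) and a north-south
# centerline through (10, -5) intersect at the branch point (10, 0)
branch_point = line_intersection((0.0, 0.0), (1.0, 0.0), (10.0, -5.0), (0.0, 1.0))
```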
  • the coordinate system used in the map information is a so-called "geographic coordinate system" that represents positions on the earth.
  • a geographic coordinate system is generally expressed in two dimensions by latitude and longitude, or in three dimensions with elevation added to these.
  • the guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
  • the branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11. Specifically, for example, the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point by comparing the distance from the vehicle 100 to the branch point with a preset threshold (hereinafter referred to as the "approach determination threshold"). The branch determination unit 12 determines that the vehicle 100 is approaching the branch point when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold. The distance from the vehicle 100 to the branch point is, for example, the distance from the current position of the vehicle 100 to the position of the branch point.
  • the branch determining unit 12 can determine the current position of the vehicle 100 and the position of the branch point based on the guide route information.
  • the approach determination threshold is appropriately set by an administrator or the like.
  • the approach determination threshold is preferably set to a distance within a range in which the projection device 3 can project the first guide image.
  • in the above, the branch determination unit 12 determines whether the vehicle 100 is approaching the branch point based on whether the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold; however, this is just one example.
  • the branch determination unit 12 may determine whether the vehicle 100 is approaching the branch point by considering not only the distance from the vehicle 100 to the branch point but also the vehicle speed of the vehicle 100.
  • for example, the branch determination unit 12 may determine that the vehicle 100 is approaching the branch point when the distance from the vehicle 100 to the branch point is less than or equal to the approach determination threshold and the vehicle speed of the vehicle 100 is greater than or equal to a preset threshold (hereinafter referred to as the "vehicle speed determination threshold").
  • the vehicle speed may be acquired from the navigation device 2 as guidance route information.
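The determination logic described in the preceding paragraphs can be sketched as follows. This is a hedged illustration rather than the patented implementation; the function name and both threshold values are assumptions.

```python
from typing import Optional

APPROACH_THRESHOLD_M = 100.0  # assumed approach determination threshold
SPEED_THRESHOLD_KMH = 10.0    # assumed vehicle speed determination threshold

def is_approaching_branch(distance_to_branch_m: float,
                          vehicle_speed_kmh: Optional[float] = None) -> bool:
    """Return True when the vehicle is judged to be approaching the branch
    point: the distance must be at or below the approach determination
    threshold, and, when a speed is supplied, the speed must be at or
    above the vehicle speed determination threshold."""
    if distance_to_branch_m > APPROACH_THRESHOLD_M:
        return False
    if vehicle_speed_kmh is not None and vehicle_speed_kmh < SPEED_THRESHOLD_KMH:
        return False
    return True
```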
  • when the branch determination unit 12 determines that the vehicle 100 is approaching the branch point, it outputs the determination result (hereinafter referred to as the "branch determination result") to the guide image generation unit 13.
  • the branch determination unit 12 also determines whether the vehicle 100 has reached the branch point. For example, the branch determination unit 12 may determine that the vehicle 100 has reached the branch point when the distance from the vehicle 100 to the branch point becomes "0". If the branch determination unit 12 determines that the vehicle 100 has reached the branch point, it outputs information indicating that the vehicle 100 has reached the branch point (hereinafter referred to as "branch point arrival information") to the projection control unit 14.
  • the branch determination unit 12 may output the branch point arrival information to the projection control unit 14 via the guide image generation unit 13 or directly to the projection control unit 14. In addition, in FIG. 1, illustration of an arrow from the branch determination unit 12 to the projection control unit 14 is omitted.
  • if the branch determination unit 12 determines that the vehicle 100 is approaching a branch point, the guide image generation unit 13 generates a first guide image based on the guide route information. Note that the guide image generation unit 13 may acquire the guide route information acquired by the guide route information acquisition unit 11 via the branch determination unit 12.
  • the guide image generation unit 13 generates the first guide image based on the position of the vehicle 100 (more specifically, the current position), the position of the branch point, the position of the point on the road surface onto which one end of the first guide image (hereinafter referred to as the "first end") is projected (hereinafter referred to as the "first road surface point"), and the position of the point on the road surface onto which the end of the first guide image opposite to the first end (hereinafter referred to as the "second end") is projected, which is located farther than the first road surface point in the branch direction with respect to the branch point (hereinafter referred to as the "second road surface point").
  • that is, in the first embodiment, the first end is the starting point of the arrow, and the second end is the end point of the arrow.
  • the position of the first road surface point is the intersection of a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling, and a straight line that passes through the widthwise center of the lane after the branch (the lane referred to here is a so-called traffic lane) and is parallel to the lane after the branch.
  • the guide image generation unit 13 first calculates the coordinates of the position of the first road surface point.
  • the guide image generation unit 13 can calculate the coordinates of the position of the first road surface point from the guide route information.
  • the guide image generation unit 13 determines the position of the second road surface point based on the coordinates of the position of the first road surface point and the distance from the vehicle 100 to the branch point, and calculates the coordinates of the position of the second road surface point.
  • the guide image generation unit 13 determines the position of the second road surface point in accordance with preset, internally held conditions (hereinafter referred to as the "end point determination conditions").
  • the end point determination conditions define how far from the first road surface point (hereinafter referred to as the "reference distance") the second road surface point is set when the branch determination unit 12 first determines that the vehicle 100 is approaching a certain branch point, in other words, when the distance from the vehicle 100 to the branch point reaches the approach determination threshold.
  • the end point determination conditions also define how much closer to the position of the first road surface point the position of the second road surface point is moved depending on how much closer the vehicle 100 has come to the branch point after the distance from the vehicle 100 to the branch point reached the approach determination threshold.
  • the guide image generation unit 13 can calculate the distance from the vehicle 100 to the branch point from the guide route information. An administrator or the like sets the end point determination conditions in advance and stores them in the guide image generation unit 13.
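One plausible form of the end point determination conditions described above is a linear rule: the second road surface point sits at the reference distance from the first road surface point when the approach determination threshold is first reached, then moves proportionally closer. The rule, the function names, and all numeric values below are assumptions for illustration.

```python
def second_point_offset(distance_to_branch_m: float,
                        approach_threshold_m: float = 100.0,
                        reference_distance_m: float = 8.0) -> float:
    """Distance from the first road surface point to the second road
    surface point, shrinking linearly to zero as the vehicle reaches
    the branch point (an assumed end point determination condition)."""
    ratio = max(0.0, min(1.0, distance_to_branch_m / approach_threshold_m))
    return reference_distance_m * ratio

def second_road_surface_point(e1, branch_dir_unit, distance_to_branch_m):
    """Place the second road surface point at the offset from the first
    road surface point e1 along the unit vector of the branch direction."""
    off = second_point_offset(distance_to_branch_m)
    return (e1[0] + off * branch_dir_unit[0], e1[1] + off * branch_dir_unit[1])
```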
  • the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the first road surface point into a learned model (hereinafter referred to as the "first machine learning model") to obtain the coordinates of the first end of the first guide image. Further, the guide image generation unit 13 inputs the coordinates of the position of the vehicle 100, the distance from the vehicle 100 to the branch point, and the coordinates of the second road surface point into the first machine learning model to obtain the coordinates of the second end of the first guide image.
  • the coordinate system of the first guide image projected by the projection device 3 is referred to as the display means coordinate system, and the coordinate system of the area on the road surface in real space in front of the vehicle 100 onto which the projection device 3 can project the first guide image (hereinafter referred to as the "projectable area") is called the target area coordinate system.
  • the target area coordinate system is a so-called “geographical coordinate system” similar to the coordinate system in map information.
  • the first machine learning model is a model that, based on the correspondence between positions on the first guide image and positions on the road surface and taking into account the distance from the vehicle 100 to the branch point, converts points in the target area coordinate system into points in the display means coordinate system.
  • the first machine learning model is generated by an administrator or the like in advance, for example, before shipping the product of the projection control device 1, and is stored in a location that can be referenced by the projection control device 1.
  • the administrator or the like takes the vehicle 100 for a test run and experimentally projects the first guide image from the projection device 3 near a junction.
  • the branch point does not need to be an actual branch point; for example, an administrator or the like may set a point on a road with no branches to be considered as a branch point.
  • when the projection device 3 projects the first guide image in front of the vehicle 100, the administrator or the like measures the positions of measurement points on the first guide image projected onto the road surface.
  • hereinafter, the first guide image projected onto the road surface is also referred to as the "first projection image."
  • the administrator or the like also obtains the position of the vehicle 100 when the first guide image is projected onto the road surface in front of the vehicle 100 and the distance from the vehicle 100 to the branch point.
  • the administrator or the like performs this experiment multiple times by changing the distance from the vehicle 100 to the branch point.
  • the administrator or the like then causes a learning device to learn a model that takes as input the distance from the vehicle 100 to the branch point and the coordinates of the measurement points in the first projection image, and outputs the coordinates of the measurement points in the first guide image.
  • the coordinates of the measurement points in the first projection image are expressed in the target area coordinate system, and the coordinates of the measurement points in the first guide image are expressed in the display means coordinate system.
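In spirit, the trained mapping could behave like the following interpolating lookup over calibration samples gathered from test runs. This is only a stand-in sketch: the patent specifies a learned model, not this interpolation, and every name and value below is invented for illustration.

```python
import math

# each sample from a test run: (distance to branch point in metres,
# measurement point in the first projection image in the target area
# coordinate system, matching measurement point in the first guide image
# in the display means coordinate system) -- all values illustrative
CALIBRATION = [
    (100.0, (0.0, 100.0), (640.0, 200.0)),
    (100.0, (5.0, 100.0), (800.0, 200.0)),
    (50.0, (0.0, 50.0), (640.0, 420.0)),
    (50.0, (5.0, 50.0), (960.0, 420.0)),
]

def road_to_image(distance_m, road_xy):
    """Inverse-distance-weighted interpolation over the calibration
    samples, standing in for the first machine learning model."""
    num_u = num_v = den = 0.0
    for d, (rx, ry), (u, v) in CALIBRATION:
        gap = math.hypot(rx - road_xy[0], ry - road_xy[1]) + abs(d - distance_m)
        if gap < 1e-9:
            return (u, v)  # exact calibration hit
        w = 1.0 / gap
        num_u += w * u
        num_v += w * v
        den += w
    return (num_u / den, num_v / den)
```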
  • upon obtaining the coordinates of the first end of the first guide image and the coordinates of the second end of the first guide image, the guide image generation unit 13 generates the first guide image based on these coordinates in accordance with rules stored in a storage unit (not shown) (hereinafter referred to as the "first guide image rules").
  • the storage unit stores the first guide image rules, set in advance by an administrator or the like, which are used when generating the first guide image to be projected by the projection device 3.
  • in accordance with the first guide image rules stored in the storage unit, the guide image generation unit 13 generates the first guide image so that a first projection image showing an arrow that starts near the branch point and extends in the branch direction is projected onto the road surface in front of the vehicle 100.
  • the first guide image rules include, for example, a rule that the shaft of the arrow is 70 cm wide and the base of the triangle of the arrow is 1 m wide, and that the arrow is filled in blue and oriented along the lane after the branch.
  • in this case, the guide image generation unit 13 generates a first guide image that projects, onto the road surface of the lane after the branch, a first projection image showing a blue arrow with a 70 cm wide shaft and a 1 m wide triangular base, starting at the first road surface point, ending at the second road surface point, and extending in the branch direction.
  • the first guide image rules may specify, for example, the color, pattern, and width of the arrow.
  • it suffices that the rules are set such that, when the generated first guide image is projected onto the road surface in front of the vehicle 100, the resulting first projection image allows the occupant of the vehicle 100 to recognize the existence of the branch point and the branch direction. Further, here it is assumed that the first guide image rules are set in advance by the administrator or the like, but this is only an example. For example, an occupant of the vehicle 100 may be able to set the first guide image rules as appropriate.
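For illustration, the arrow geometry implied by the example rules above (a 70 cm wide shaft and a 1 m wide triangular head) might be built as a polygon between the two road surface points. The head length and the function name are assumptions not stated in the text.

```python
import math

def arrow_polygon(start, end, shaft_w=0.7, head_base_w=1.0, head_len=1.0):
    """Vertices (in road surface coordinates, metres) of an arrow whose
    shaft runs from the first road surface point (start) to the base of
    the head, with the tip at the second road surface point (end)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length  # unit vector along the arrow
    nx, ny = -uy, ux                   # unit normal to the arrow
    neck = (end[0] - head_len * ux, end[1] - head_len * uy)
    hs, hb = shaft_w / 2.0, head_base_w / 2.0
    return [
        (start[0] + hs * nx, start[1] + hs * ny),
        (neck[0] + hs * nx, neck[1] + hs * ny),
        (neck[0] + hb * nx, neck[1] + hb * ny),
        end,  # arrow tip at the second road surface point
        (neck[0] - hb * nx, neck[1] - hb * ny),
        (neck[0] - hs * nx, neck[1] - hs * ny),
        (start[0] - hs * nx, start[1] - hs * ny),
    ]
```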
  • after generating the first guide image, the guide image generation unit 13 outputs the generated first guide image to the projection control unit 14.
  • the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 onto the road surface in front of the vehicle 100.
  • in the above, an example was given in which the guide image generation unit 13 first calculates the coordinates of the first road surface point and the coordinates of the second road surface point, then uses the first machine learning model to obtain the coordinates of the first end and the second end of the first guide image in the display means coordinate system, and generates the first guide image in accordance with these coordinates and the first guide image rules. The projection control unit 14 then causes the projection device 3 to project the first guide image, expressed in coordinates of the display means coordinate system, generated by the guide image generation unit 13.
  • however, the projection control device 1 may generate the first guide image and cause the projection device 3 to project it using other methods.
  • for example, the guide image generation unit 13 may store an initial image (in this case, an arrow image) in advance and generate the first guide image by enlarging or reducing the initial image based on the calculated distance between the first road surface point and the second road surface point.
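That alternative generation path might look like the following scaling step; the function name and the stored initial arrow length are assumptions.

```python
import math

def scale_for_initial_image(e1, e2, initial_arrow_length_m):
    """Scale factor that stretches or shrinks a stored initial arrow image
    so that its length matches the distance between the first road surface
    point e1 and the second road surface point e2."""
    span = math.hypot(e2[0] - e1[0], e2[1] - e1[1])
    return span / initial_arrow_length_m
```

For instance, with an 8 m initial arrow and the two road surface points 4 m apart, the initial image would be reduced by half.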
  • as another example, the projection control unit 14 may cause the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 by changing the projection angle of the light emitted from the lamp of the projection device 3 so that the first end of the first guide image generated by the guide image generation unit 13 is projected onto the first road surface point calculated by the guide image generation unit 13 and the second end is projected onto the second road surface point calculated by the guide image generation unit 13.
  • FIG. 2 is a flowchart for explaining an example of the operation of the projection control device 1 according to the first embodiment. For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts running, the projection control device 1 repeatedly performs the operation shown in the flowchart of FIG. 2 until the power of the vehicle 100 is turned off.
  • the guide route information acquisition unit 11 acquires guide route information from the navigation device 2 (step ST1).
  • the guidance route information acquisition unit 11 outputs the acquired guidance route information to the branch determination unit 12.
  • the branch determination unit 12 determines whether the vehicle 100 is approaching a branch point based on the guidance route information acquired by the guidance route information acquisition unit 11 in step ST1 (step ST2).
  • if the branch determination unit 12 determines that the vehicle 100 is not approaching a branch point ("NO" in step ST2), the operation of the projection control device 1 returns to the processing in step ST1.
  • if the branch determination unit 12 determines that the vehicle 100 is approaching a branch point ("YES" in step ST2), the branch determination unit 12 outputs the branch determination result indicating that the vehicle 100 is approaching the branch point to the guide image generation unit 13.
  • here, it is assumed that the branch determination unit 12 has determined, for the first time after the power of the vehicle 100 was turned on and the vehicle 100 started traveling, that the vehicle 100 is approaching a branch point. In this case, the branch determination unit 12 outputs a branch determination result indicating that the vehicle 100 is approaching the branch point to the guide image generation unit 13.
  • the guide image generation unit 13 generates a first guide image based on the guide route information acquired by the guide route information acquisition unit 11 in step ST1 (step ST3).
  • when the guide image generation unit 13 generates the first guide image, it outputs the generated first guide image to the projection control unit 14.
  • the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
  • as a result, the first projection image is projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 3.
  • FIG. 3 is a diagram for explaining an example of how the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 in the first embodiment.
  • FIG. 3 is an overhead view of a branch point and the vehicle 100 approaching the branch point.
  • the branch point is indicated by "BP”
  • the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1".
  • the first road surface point is indicated by "E1"
  • the second road surface point is indicated by "E2". Since the direction from the first road surface point toward the second road surface point is the branch direction, it can be seen in FIG. 3 that the guide route for the vehicle 100 is a route that turns right at the branch point. In FIG. 3, it is assumed that the position of the first road surface point overlaps the position of the branch point. Here, it is assumed that the branch determination unit 12 has determined, for the first time after the power of the vehicle 100 was turned on and the vehicle 100 started traveling, that the vehicle 100 is approaching a certain branch point; that is, the distance from the vehicle 100 to the branch point equals the approach determination threshold. A first projection image in which the distance between the first road surface point and the second road surface point is the reference distance is projected onto the road surface in front of the vehicle 100.
  • in step ST4, after the projection control unit 14 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100, a control unit (not shown) of the projection control device 1 determines whether the power of the vehicle 100 has been turned off. If the power of the vehicle 100 has not been turned off, the operation of the projection control device 1 returns to the processing of step ST1 and the processing from step ST1 is performed again. Here, it is assumed that the power of the vehicle 100 is not turned off.
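The repeated ST1-to-ST4 flow described here can be sketched as a loop over injected callables. Everything below (the names and the dictionary shape of the route information) is an assumption standing in for the navigation device, the branch determination and guide image generation units, and projection device 3.

```python
def projection_loop(get_route_info, is_approaching, make_guide_image,
                    project, power_is_on):
    """Repeat steps ST1-ST4 until the vehicle power is turned off."""
    while power_is_on():
        info = get_route_info()         # ST1: acquire guidance route info
        if not is_approaching(info):    # ST2: branch determination
            continue                    # "NO": return to ST1
        image = make_guide_image(info)  # ST3: generate the first guide image
        project(image)                  # ST4: project onto the road ahead

# tiny dry run with stand-ins: power stays on for two cycles, then goes off
frames = []
power = iter([True, True, False])
projection_loop(lambda: {"distance_m": 50.0},
                lambda info: info["distance_m"] <= 100.0,
                lambda info: "first-guide-image",
                frames.append,
                lambda: next(power))
```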
  • if, in step ST2, the branch determination unit 12 again determines that the vehicle 100 is approaching the branch point ("YES" in step ST2), the guide image generation unit 13 generates a first guide image based on the guide route information acquired by the guide route information acquisition unit 11 in step ST1 (step ST3), and the projection control unit 14 causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 in step ST3 onto the road surface in front of the vehicle 100 (step ST4).
  • As a result, the first projection image is projected onto the road surface in front of the vehicle 100, for example, as shown in FIG. 4.
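The repetition of steps ST1 to ST4 described above can be sketched as a simple control loop. This is an illustrative Python sketch only; the callback names are hypothetical stand-ins for the units of the projection control device 1 and are not part of the embodiment.

```python
def run_projection_control(get_guide_route_info, is_power_off,
                           is_approaching_branch,
                           generate_first_guide_image, project):
    """Repeats ST1-ST4 until power-off; returns the number of projections."""
    projections = 0
    while not is_power_off():
        info = get_guide_route_info()                 # ST1: acquire route info
        if is_approaching_branch(info):               # ST2: approach judgment
            image = generate_first_guide_image(info)  # ST3: generate image
            project(image)                            # ST4: project on road
            projections += 1
    return projections
```

The loop terminates only on power-off, matching the description that the operation returns to step ST1 and repeats while the vehicle power stays on.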
  • FIG. 4 illustrates another example of how the projection control unit 14 of the projection control device 1 causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 in the first embodiment.
  • FIG. 4 is an overhead view of the branch point and the vehicle 100 when the vehicle 100, having approached the branch point as shown in FIG. 3, has moved still closer to the branch point.
  • In this case, the guide image generation unit 13 determines, as the position of the second road surface point, a position closer to the first road surface point than the position shown in FIG. 3 by an amount corresponding to the distance by which the vehicle 100 has approached the branch point. That is, the guide image generation unit 13 generates a first guide image in which the second end is brought closer to the first end by an amount corresponding to the distance by which the vehicle 100 has approached the branch point from the position shown in FIG. 3.
  • The projection control unit 14 causes the projection device 3 to project, onto the road surface in front of the vehicle 100, the first guide image generated by the guide image generation unit 13 with the second end brought closer to the first end.
  • As a result, the first projected image shown in FIG. 4 is shorter, toward the first road surface point, than the first projected image shown in FIG. 3.
  • In other words, of the arrows generated by the guide image generation unit 13 and projected onto the road surface in front of the vehicle 100 by the projection control unit 14, the arrow shown in FIG. 4 is an arrow whose end point is closer to its starting point than the arrow shown in FIG. 3.
  • In FIG. 4, for ease of understanding, the difference between the first projection image shown in FIG. 3 and the first projection image shown in FIG. 4 is indicated by a dotted line.
  • Note that the guide image generation unit 13 stores the branch determination result, as a branch approach flag, in a location where the projection control device 1 can refer to it.
  • When the branch approach flag is at its initial value of "0", the guide image generation unit 13 determines that it has been determined for the first time that the vehicle 100 is approaching a certain branch point.
  • When the guide image generation unit 13 determines, in accordance with the end point determination conditions, a point that is the reference distance away from the first road surface point as the second road surface point, it sets the branch approach flag to "1".
  • Further, the guide image generation unit 13 stores the coordinates of the determined first road surface point and the coordinates of the determined second road surface point in a storage unit provided at a location where the projection control device 1 can refer to them.
  • The guide image generation unit 13 refers to the branch approach flag, and if the branch approach flag is set to "1", determines that it has already been determined for the first time that the vehicle 100 is approaching a certain branch point, in other words, that the first projection image is already projected onto the road surface in front of the vehicle 100.
  • In this case, the guide image generation unit 13 determines, as the new second road surface point in accordance with the end point determination conditions, a point obtained by moving the position of the second road surface point stored in the storage unit closer to the first road surface point according to the distance from the vehicle 100 to the branch point. Then, the guide image generation unit 13 updates the coordinates of the second road surface point stored in the storage unit. As a result, the projection control device 1 generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and can project, in front of the vehicle 100, a first projected image in which the second road surface point approaches the first road surface point, in other words, an arrow whose end point approaches its starting point.
  • In step ST2, the branch determining unit 12 also determines whether the vehicle 100 has reached the branch point. If it determines that the vehicle 100 has reached the branch point, the branch determining unit 12 outputs branch point arrival information to the projection control unit 14. In this case, the projection control unit 14 ends the projection of the first guide image, and the operation of the projection control device 1 skips the processing of steps ST3 and ST4 and returns to the processing of step ST1. At this time, the control unit of the projection control device 1 clears the branch approach flag and the coordinates of the first road surface point and the second road surface point stored in the storage unit. Note that the control unit also clears the branch approach flag and the coordinates of the first road surface point and the second road surface point stored in the storage unit when the power of the vehicle 100 is turned off or turned on.
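The bookkeeping described above (branch approach flag, stored road surface points, shortening of the arrow as the vehicle approaches, and clearing on arrival) can be sketched as follows. This is a hedged Python illustration: positions are modeled as scalars along the branch direction and a linear shortening rule is chosen for simplicity; the actual end point determination conditions are not specified in the text, and all names are invented.

```python
class BranchGuidanceState:
    """Holds the branch approach flag and the two road surface points."""

    def __init__(self, reference_distance, approach_threshold):
        self.reference_distance = reference_distance
        self.approach_threshold = approach_threshold
        self.branch_approach_flag = 0   # initial value "0"
        self.first_point = None         # fixed near the branch point
        self.second_point = None        # slides toward first_point

    def update(self, distance_to_branch, first_point):
        """Returns (start, end) of the arrow, or None when not projecting."""
        if distance_to_branch <= 0:     # branch point reached: end projection
            self.clear()
            return None
        if distance_to_branch > self.approach_threshold:
            return None                 # not yet approaching the branch point
        if self.branch_approach_flag == 0:
            # First determination: second point a reference distance away.
            self.branch_approach_flag = 1
            self.first_point = first_point
            self.second_point = first_point + self.reference_distance
        else:
            # Later cycles: move the second point closer to the first point
            # in proportion to the remaining distance (assumed linear rule).
            ratio = distance_to_branch / self.approach_threshold
            self.second_point = self.first_point + self.reference_distance * ratio
        return (self.first_point, self.second_point)

    def clear(self):
        """Clears the flag and coordinates, as on arrival or power cycling."""
        self.branch_approach_flag = 0
        self.first_point = None
        self.second_point = None
```

With this state machine, repeated calls to `update` reproduce the behavior of an arrow whose end point approaches its starting point as the vehicle nears the branch point.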
  • the projection control device 1 repeats the above operations until, for example, the power of the vehicle 100 is turned off.
  • As described above, when the projection control device 1 determines, based on the guide route information acquired from the navigation device 2, that the vehicle 100 is approaching a branch point on the guide route, the projection control device 1 generates a first guide image indicating the existence of the branch point and the branch direction, and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100.
  • At that time, the projection control device 1 generates the first guide image based on the position of the vehicle 100, the position of the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface onto which the second end opposite to the first end of the first guide image is projected.
  • Thereby, the projection control device 1 can allow the occupants of the vehicle 100 to grasp the distance to the branch point and the branch direction without the branch point leaving the driver's field of view.
  • The length of the first guide image projected onto the road surface, that is, of the first projected image, toward the branch point becomes shorter as the vehicle 100 approaches the branch point.
  • Therefore, the left and right ends of the first projected image, in other words, the start point and end point of the arrow projected onto the road surface, do not fall out of the occupant's field of view.
  • As a result, the occupant of the vehicle 100 can grasp the distance to the branch point and the branch direction without having to move his or her line of sight in a direction in which the branch point falls out of sight.
  • In the first embodiment described above, the first guide image is an arrow image, but this is only an example.
  • The first guide image may be any image that starts near the branch point and extends in the branch direction, thereby indicating the presence of the branch point and the branch direction.
  • the projection control device 1 generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and projects the first guide image onto the road surface in front of the vehicle 100.
  • It is sufficient that the projection device 3 projects the first guide image such that, as viewed from the occupant of the vehicle 100, the first projected image becomes shorter in the direction of the branch point as the vehicle 100 approaches the branch point.
  • In the first embodiment described above, the position of the first road surface point is the position of the intersection between a straight line that passes through a point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane) and a straight line that passes through the widthwise center of the lane after branching and is parallel to the lane after branching, but this is only an example.
  • For example, the projection control device 1 may set the first road surface point to any point within a predetermined range from the branch point. Note that the projection control device 1 can calculate the coordinates of the position of a point within a predetermined range from the branch point from the guide route information.
  • However, the position of the intersection between a straight line that passes through a point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane) and a straight line that passes through the widthwise center of the lane after branching and is parallel to the lane after branching makes it easier, compared with any other point, to indicate the existence of the branch point to the occupants of the vehicle 100 and to let them gauge the distance to the branch point. This is because it is easier for the occupants of the vehicle 100 to confirm the starting point of the first projected image, that is, the vicinity of the branch point, if the starting point of the first projected image lies in the forward direction.
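As an illustration of the intersection described above, the first road surface point can be computed by ordinary 2D line intersection: one line through the vehicle position parallel to its lane, one line through the widthwise center of the post-branch lane parallel to that lane. The coordinates below are invented for the example.

```python
def intersect(p, d, q, e):
    """Intersection of line p + t*d with line q + s*e (2D tuples)."""
    det = d[0] * e[1] - d[1] * e[0]
    if abs(det) < 1e-12:
        return None  # parallel lines: no unique first road surface point
    t = ((q[0] - p[0]) * e[1] - (q[1] - p[1]) * e[0]) / det
    return (p[0] + t * d[0], p[1] + t * d[1])

# Vehicle heading north along x = 0; the post-branch lane centerline runs
# east along y = 100, so the first road surface point falls at (0, 100).
first_road_point = intersect((0.0, 0.0), (0.0, 1.0), (-20.0, 100.0), (1.0, 0.0))
```

The `None` branch corresponds to the degenerate case where the two lane directions are parallel, in which the fallback of choosing any point within a predetermined range from the branch point would apply.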
  • Further, in the first embodiment described above, the first projection image is an image on the road surface of the lane ahead of the vehicle 100 after the branch (the lane referred to here is a so-called traffic lane; see FIGS. 3 and 4), but this is only an example.
  • The projection control device 1 may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch (the lane referred to here is a so-called traffic lane). It is sufficient that the first road surface point is located near the branch point.
  • Further, in the first embodiment described above, the projection control device 1 is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) are assumed to be included in the in-vehicle device.
  • However, the configuration is not limited to this; some of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) may be provided in the in-vehicle device of the vehicle 100, and the others may be provided in a server connected to the in-vehicle device via a network.
  • Alternatively, the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit may all be included in the server.
  • FIGS. 5A and 5B are diagrams showing examples of the hardware configuration of the projection control device 1 according to the first embodiment.
  • In the first embodiment, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit are realized by a processing circuit 1001. That is, the projection control device 1 includes the processing circuit 1001 for performing control so that the projection device 3 projects the first guide image, generated based on the guide route information acquired from the navigation device 2, onto the road surface in front of the vehicle 100.
  • Processing circuit 1001 may be dedicated hardware as shown in FIG. 5A, or may be processor 1004 that executes a program stored in memory 1005 as shown in FIG. 5B.
  • When the processing circuit 1001 is dedicated hardware, the processing circuit 1001 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • When the processing circuit is the processor 1004, the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is written as a program and stored in memory 1005.
  • The processor 1004 reads and executes a program stored in the memory 1005, thereby realizing the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown). That is, the projection control device 1 includes the memory 1005 for storing a program that, when executed by the processor 1004, results in the execution of steps ST1 to ST4 in FIG. 2 described above.
  • It can also be said that the program stored in the memory 1005 causes a computer to execute the processing procedures or methods of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown).
  • The memory 1005 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read-Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • Note that the functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit may be realized partly by dedicated hardware and partly by software or firmware.
  • For example, the function of the guidance route information acquisition unit 11 can be realized by the processing circuit 1001 as dedicated hardware, and the functions of the branch determination unit 12, the guidance image generation unit 13, the projection control unit 14, and the control unit (not shown) can be realized by the processor 1004 reading and executing a program stored in the memory 1005.
  • the projection control device 1 also includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2 or the projection device 3.
  • The storage unit (not shown) includes the memory 1005 and the like.
  • As described above, the projection control device 1 according to the first embodiment includes: the guide route information acquisition unit 11 that acquires guide route information regarding the guide route of the vehicle 100; the branch determination unit 12 that determines, based on the guide route information acquired by the guide route information acquisition unit 11, whether the vehicle 100 is approaching a branch point on the guide route; the guide image generation unit 13 that, when the branch determination unit 12 determines that the vehicle 100 is approaching a branch point, generates a first guide image indicating the existence of the branch point and the branch direction based on the guide route information; and the projection control unit 14 that causes the projection device 3 to project the first guide image generated by the guide image generation unit 13 onto the road surface in front of the vehicle 100. The guide image generation unit 13 generates the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of the first road surface point on the road surface onto which the first end of the first guide image is projected, and the position of the second road surface point on the road surface, farther from the first road surface point in the branch direction with respect to the branch point, onto which the second end opposite to the first end of the first guide image is projected, and is configured to generate a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point. Therefore, in route guidance by projecting images onto the road surface, the projection control device 1 can allow the occupants of the vehicle 100 to grasp the distance to the branch point and the branch direction without the branch point leaving the driver's field of view.
  • In a second embodiment, the projection control device causes the projection device to project a first guide image indicating the presence of a branch point and the branch direction onto the road surface in front of the vehicle.
  • In the second embodiment, the projection control device determines whether or not there is a shielding object that blocks the region onto which the projection device projects the first guide image (hereinafter referred to as the "first guide image projection region") out of the projectable region of the projection device, and, if it determines that there is a shielding object, causes the projection device to project a guide image indicating the direction of the branch point as seen from the vehicle (hereinafter referred to as the "second guide image").
  • FIG. 6 is a diagram showing a configuration example of a projection control device 1a according to the second embodiment.
  • the projection control device 1a is mounted on the vehicle 100.
  • The projection control device 1a according to the second embodiment is connected to the sensor 4 in addition to the navigation device 2 and the projection device 3.
  • the sensor 4 is mounted on the vehicle 100 and detects objects present in front of the vehicle 100.
  • the sensor 4 outputs information regarding an object detected in front of the vehicle 100 (hereinafter referred to as "sensor information") to the projection control device 1a.
  • the sensor 4 is assumed to be an imaging device that captures an image in front of the vehicle 100.
  • the sensor information is a captured image of the front of the vehicle 100.
  • In the second embodiment, the sensor 4 will be described as an imaging device, and the sensor information as a captured image. Note that this is just an example, and the sensor 4 is not limited to an imaging device.
  • The sensor 4 may be, for example, a distance sensor such as a LiDAR or another device capable of detecting the distance to an object present in front of the vehicle 100.
  • The projection control device 1a according to the second embodiment differs from the projection control device 1 according to the first embodiment in that it includes a sensor information acquisition unit 15 and a shielding object detection unit 16. Further, the specific operations of the guide image generation unit 13a and the projection control unit 14a in the projection control device 1a according to the second embodiment differ from those of the guide image generation unit 13 and the projection control unit 14 in the projection control device 1 according to the first embodiment, respectively.
  • the sensor information acquisition unit 15 acquires a captured image captured by the image capture device from the image capture device. Details of the guide image generation unit 13a will be described later.
  • the sensor information acquisition unit 15 outputs the acquired captured image to the obstructing object detection unit 16 together with information regarding the first guide image projection area output from the guide image generation unit 13a.
  • The shielding object detection unit 16 detects whether or not there is a shielding object that blocks the first guide image projection area, based on the captured image acquired by the sensor information acquisition unit 15. Note that, since the installation position and detection range of the sensor 4, in this case the installation position and viewing angle of the imaging device, are known in advance, the shielding object detection unit 16 can grasp the relative positional relationship between the coordinate system of the captured image and the coordinate system of the first guide image projection area.
  • The coordinate system of the first guide image projection area is the target area coordinate system, which is a so-called "geographic coordinate system."
  • The shielding object detection unit 16 detects whether or not there is a shielding object that blocks the first guide image projection area by, for example, performing known image recognition processing or pattern matching on the captured image.
  • The shielding object detection unit 16 outputs a detection result as to whether or not there is a shielding object that blocks the first guide image projection area (hereinafter referred to as the "shielding object detection result") to the guide image generation unit 13a.
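A minimal sketch of the shielding object check described above, assuming the image recognition step has already produced object bounding boxes mapped into the same coordinate system as the first guide image projection area. The axis-aligned box representation `(xmin, ymin, xmax, ymax)` and the function names are illustrative assumptions; the recognition itself is out of scope here.

```python
def boxes_overlap(a, b):
    """True when two axis-aligned boxes (xmin, ymin, xmax, ymax) share area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def detect_shielding_object(projection_area, detected_boxes):
    """Shielding object detection result: True if any detected object box
    overlaps the first guide image projection area."""
    return any(boxes_overlap(projection_area, box) for box in detected_boxes)
```

In this sketch the detection result is a simple boolean, which is what the guide image generation unit needs to decide whether a second guide image is required.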
  • The guide image generation unit 13a calculates the coordinates of the first road surface point and the coordinates of the second road surface point.
  • The position of the first road surface point is, for example, the position of the intersection between a straight line that passes through the point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane) and a straight line that passes through the widthwise center of the lane after the branch and is parallel to the lane after the branch.
  • Based on the coordinates of the position of the first road surface point, the guide image generation unit 13a calculates, in accordance with the end point determination conditions, the coordinates of the second road surface point when the branch determination unit 12 first determines that the vehicle 100 approaches a certain branch point.
  • The guide image generation unit 13a may calculate the coordinates of the position of the second road surface point using the same method as that of the guide image generation unit 13, which has already been explained in the first embodiment.
  • the guide image generation unit 13a calculates a first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point.
  • the guide image generation unit 13a outputs information regarding the first guide image projection area to the sensor information acquisition unit 15.
  • the information regarding the first guide image projection area may be, for example, information indicating the coordinates of the entire circumference of the first projection image, or may be the coordinates of the four corners of the minimum rectangle surrounding the first projection image.
  • The guide image generation unit 13a generates a first guide image, or a first guide image and a second guide image, based on the shielding object detection result output from the shielding object detection unit 16. Specifically, when the shielding object detection unit 16 outputs a shielding object detection result indicating that there is a shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates a first guide image and a second guide image. When the shielding object detection unit 16 outputs a shielding object detection result indicating that there is no shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates a first guide image. In this case, the guide image generation unit 13a does not generate the second guide image.
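The case analysis above can be sketched as follows. This is hypothetical Python; the image "generators" are placeholders standing in for the rule-based image creation, not an implementation of it.

```python
def generate_guide_images(shielded, make_first, make_second):
    """Returns the list of guide images to hand to the projection control:
    always the first guide image, plus the second guide image only when the
    first guide image projection area is blocked by a shielding object."""
    images = [make_first()]
    if shielded:
        images.append(make_second())
    return images
```

The boolean `shielded` corresponds to the shielding object detection result output by the shielding object detection unit 16.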
  • The guide image generation unit 13a may generate the first guide image using the same method as that by which the guide image generation unit 13 generates the first guide image, which has already been explained in the first embodiment, so a detailed description thereof is omitted.
  • the second guide image is an image showing a rectangle.
  • Between a point located a predetermined distance in front of the vehicle 100 (hereinafter referred to as the "vehicle front point") and the first road surface point, a second guide image having a preset width, in this case an image representing a rectangle, is projected. In the second embodiment, the second guide image projected onto the road surface is also referred to as the "second projected image." Note that the predetermined distance in front of the vehicle 100 is set as appropriate by an administrator or the like.
  • An administrator or the like sets, as the predetermined distance in front of the vehicle 100, a distance smaller than at least the approach determination threshold, and stores it in a location that the projection control device 1a can refer to.
  • the end of the second guide image corresponding to the first road surface point is referred to as a third end.
  • the end of the second guide image corresponding to the point in front of the vehicle is referred to as a fourth end.
  • Specifically, the guide image generation unit 13a calculates the coordinates of the vehicle front point from the position of the vehicle 100. Then, the guide image generation unit 13a inputs the distance from the vehicle 100 to the branch point and the coordinates of the first road surface point to the first machine learning model, and obtains the coordinates of the third end of the second guide image. Note that, here, the coordinates of the third end of the second guide image are the same as the coordinates of the first end of the first guide image. Further, the guide image generation unit 13a converts the coordinates of the vehicle front point into coordinates in the display means coordinate system based on the information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system, and generates the second guide image.
  • The information regarding the relative positional relationship between the target area coordinate system and the display means coordinate system is, for example, a coordinate conversion parameter that can convert coordinates in the target area coordinate system into coordinates in the display means coordinate system.
  • For example, the administrator or the like causes the projection device 3 to project the second guide image in advance, such as before product shipment of the projection control device 1a, obtains a plurality of measurement points of the second projection image on the projection surface onto which the second guide image is projected (for example, the road surface) and a plurality of corresponding measurement points of the second guide image, and derives the positional correspondence relationship between the second projection image and the second guide image.
  • The administrator or the like then calculates, based on the correspondence between positions on the second projection image and positions on the second guide image, coordinate conversion parameters that can convert coordinates in the target area coordinate system into coordinates in the display means coordinate system. Then, the administrator or the like stores the calculated coordinate conversion parameters in a location where the projection control device 1a can refer to them.
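The derivation of coordinate conversion parameters from measured point pairs can be illustrated as below. This sketch fits a 2D affine transform (6 parameters) by least squares to pairs of (target area coordinate, display means coordinate); this is one possible form of such parameters under an assumption of near-planar geometry, and a real calibration might instead use a full homography. All names are illustrative, and no external libraries are used.

```python
def fit_affine(src_pts, dst_pts):
    """Fit dst ~ A @ src + b; returns ((a, b, c), (d, e, f)) such that
    x' = a*x + b*y + c and y' = d*x + e*y + f."""
    def solve(rows, rhs):
        # Normal equations for one 3-parameter least-squares problem,
        # solved by Gaussian elimination with partial pivoting.
        n = 3
        m = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
        v = [sum(r[i] * y for r, y in zip(rows, rhs)) for i in range(n)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(m[r][col]))
            m[col], m[piv] = m[piv], m[col]
            v[col], v[piv] = v[piv], v[col]
            for r in range(col + 1, n):
                f = m[r][col] / m[col][col]
                for c in range(col, n):
                    m[r][c] -= f * m[col][c]
                v[r] -= f * v[col]
        out = [0.0] * n
        for r in reversed(range(n)):
            out[r] = (v[r] - sum(m[r][c] * out[c] for c in range(r + 1, n))) / m[r][r]
        return tuple(out)

    rows = [(x, y, 1.0) for x, y in src_pts]
    xs = solve(rows, [x for x, _ in dst_pts])  # parameters for x'
    ys = solve(rows, [y for _, y in dst_pts])  # parameters for y'
    return xs, ys

def apply_affine(params, pt):
    """Convert a target-area coordinate into a display-means coordinate."""
    (a, b, c), (d, e, f) = params
    return (a * pt[0] + b * pt[1] + c, d * pt[0] + e * pt[1] + f)
```

Once fitted offline, `apply_affine` plays the role of the stored coordinate conversion parameters: the guide image generation unit would apply it to, for example, the vehicle front point before drawing.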
  • Upon obtaining the coordinates of the first end of the first guide image and the coordinates of the second end of the first guide image, the guide image generation unit 13a generates, based on these coordinates, a second guide image in accordance with rules stored in a storage unit (not shown) (hereinafter referred to as the "second guide image rules").
  • The storage unit stores, together with the first guide image rules, rules for use in creating the second guide image to be projected by the projection device 3, which are set in advance by an administrator or the like.
  • the guide image generation unit 13a generates a second guide image indicating the direction of the branch point as seen from the vehicle 100, in accordance with the second guide image rules stored in the storage unit.
  • The second guide image rules include, for example, a rule that the second guide image is painted blue with a width of 70 cm, extending in the direction along the lane in which the vehicle 100 is traveling.
  • In this case, the guide image generation unit 13a generates a second guide image showing a rectangle with a width of 70 cm extending from the fourth end toward the third end, that is, a rectangle on the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane) extending from the vehicle front point toward the branch point.
  • The second guide image rules define, for example, the color, pattern, width, and the like of the second guide image. It is sufficient that a rule is set such that, when the generated second guide image is projected onto the road surface in front of the vehicle 100, the resulting second projected image allows the occupant of the vehicle 100 to recognize the direction of the branch point as seen from the vehicle 100.
  • the rules for the second guide image are set in advance by the administrator or the like, but this is only an example. For example, the occupant of the vehicle 100 or the like may be able to set the rules for the second guide image as appropriate.
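Applying a rule such as the 70 cm blue band described above can be sketched as follows. This is illustrative Python: the along-lane/across-lane coordinate frame, the default values, and the function name are assumptions for the example, not part of the embodiment.

```python
def second_guide_rectangle(fourth_end, third_end, width=0.7, color="blue"):
    """Build the corner points of the second guide image rectangle in a
    lane-aligned frame: first coordinate runs along the lane, second across
    it. fourth_end is the vehicle front point, third_end the first road
    surface point, both measured along the lane centerline (meters)."""
    half = width / 2.0
    corners = [
        (fourth_end, -half), (fourth_end, half),  # near edge, at the vehicle
        (third_end, half), (third_end, -half),    # far edge, near the branch
    ]
    return {"corners": corners, "color": color}
```

A rendering step would then convert these lane-frame corners into the display means coordinate system before projection.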
  • After generating the first guide image, or the first guide image and the second guide image, the guide image generation unit 13a outputs the generated first guide image, or the generated first guide image and second guide image, to the projection control unit 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
  • The projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100.
  • Further, when the shielding object detection result indicates that there is a shielding object, the projection control unit 14a causes the projection device 3 to project the second guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100.
  • FIG. 7 is a flowchart for explaining an example of the operation of the projection control device 1a according to the second embodiment. For example, when the power of the vehicle 100 is turned on and the vehicle 100 starts traveling, the projection control device 1a repeatedly performs the operation shown in the flowchart of FIG. 7 until the power of the vehicle 100 is turned off.
  • When the branch determination unit 12 determines in step ST12 that the vehicle 100 is approaching a branch point ("YES" in step ST12), the guide image generation unit 13a calculates the coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point. For example, assume that the branch determination unit 12 has determined for the first time, after the power of the vehicle 100 is turned on and the vehicle 100 starts traveling, that the vehicle 100 is approaching a certain branch point. The guide image generation unit 13a calculates the first guide image projection area from the calculated coordinates of the position of the first road surface point and the coordinates of the position of the second road surface point (step ST13), and outputs information regarding the first guide image projection area to the sensor information acquisition unit 15.
• the sensor information acquisition unit 15 acquires the captured image captured by the imaging device (step ST14). Here, it is assumed that this is the first time the branch determining unit 12 has determined that the vehicle 100 is approaching a branch point since the power of the vehicle 100 was turned on and the vehicle 100 started traveling. Therefore, the projection device 3 has not yet projected the guide images (the first guide image and the second guide image) for this branch point onto the road surface in front of the vehicle 100.
  • the captured image acquired by the sensor information acquisition unit 15 is a captured image of the front of the vehicle 100 on which the first guide image and the second guide image are not projected (hereinafter referred to as "pre-projection captured image”).
  • the sensor information acquisition unit 15 outputs the acquired pre-projection captured image to the shielding object detection unit 16 together with the information regarding the first guide image projection area output from the guide image generation unit 13a.
• Based on the pre-projection captured image acquired by the sensor information acquisition unit 15 in step ST14, the shielding object detection unit 16 detects whether or not there is a shielding object that blocks the first guide image projection area (step ST15).
  • the shielding object detection section 16 outputs the shielding object detection result to the guide image generation section 13a.
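The decision in step ST15 can be illustrated with a minimal sketch. The rectangle representation of the projection area and of the detected objects, and all function and variable names, are assumptions made for illustration only; the patent does not specify a particular detection algorithm.

```python
# Minimal sketch of the step ST15 decision, assuming the first guide image
# projection area and each detected object are approximated by axis-aligned
# rectangles (x_min, y_min, x_max, y_max) on the road plane. The rectangle
# approximation and the names below are illustrative assumptions, not part
# of the patent.

def rects_overlap(a, b):
    # Two axis-aligned rectangles overlap unless one lies entirely
    # to one side of the other.
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def shielding_object_exists(projection_area, detected_objects):
    # True if any detected object (another vehicle, a building, ...)
    # blocks part of the first guide image projection area.
    return any(rects_overlap(projection_area, obj) for obj in detected_objects)
```

With a projection area computed from the first and second road surface points and object regions extracted from the pre-projection captured image, a single overlap would make the result of step ST15 "YES".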
• When the shielding object detection section 16 detects in step ST15 that there is a shielding object that blocks the first guide image projection area ("YES" in step ST15), the guide image generation section 13a generates a first guide image and a second guide image (step ST16).
  • the guide image generation unit 13a outputs the generated first guide image and second guide image to the projection control unit 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
• the projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image generated by the guide image generation unit 13a in step ST16 onto the road surface in front of the vehicle 100 (step ST17).
• The road surface in front of the vehicle 100 is then in a state where the first projection image and the second projection image are projected, for example, as shown in FIG. 8.
• FIG. 8 is a diagram for explaining an example of a state in which, in the second embodiment, the projection control unit 14a of the projection control device 1a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100. Note that FIG. 8 is an overhead view of a branch point and the vehicle 100 approaching the branch point. In FIG. 8, the branch point is indicated by "BP", the first projection image projected onto the road surface in front of the vehicle 100 is indicated by "Ai1", the first road surface point is indicated by "E1", and the second road surface point is indicated by "E2". Since the direction of the second road surface point is the branching direction, it can be seen in FIG. 8 that the guide route for the vehicle 100 is a route that turns left at the branch point.
• In FIG. 8, the position of the first road surface point overlaps the position of the branch point. Another vehicle, indicated by "V", exists in front of the vehicle 100, and a building, indicated by "BLDG", exists at the side of the vehicle 100.
• In the state shown in FIG. 8, the first projection image, which should be projected onto the road surface of the lane after the branch (the lane referred to here is a so-called traffic lane), is not projected onto that lane. In FIG. 8, the third road surface point is indicated by "E3", and the point in front of the vehicle is indicated by "E4". The branch point, the first road surface point, and the third road surface point are hidden by the other vehicle. That is, the first projected image is blocked by the other vehicle and the building and is not projected onto the lane after the branch, but the second projected image, which extends from the point in front of the vehicle toward the branch point, is projected without being blocked by the other vehicle or the building. Therefore, even if the occupant of the vehicle 100 cannot see the first projected image, he or she can recognize that there is a branch point ahead by looking at the second projected image.
• On the other hand, when the shielding object detection section 16 detects in step ST15 that there is no shielding object that blocks the first guide image projection area ("NO" in step ST15), the guide image generation unit 13a generates a first guide image (step ST18).
  • the guide image generation unit 13a outputs the generated first guide image to the projection control unit 14a.
  • the guide image generation section 13a also outputs the shielding object detection result outputted from the shielding object detection section 16 to the projection control section 14a.
  • the projection control unit 14a causes the projection device 3 to project the first guide image generated by the guide image generation unit 13a in step ST18 onto the road surface in front of the vehicle 100 (step ST19).
• After performing the process of step ST17 or the process of step ST19, the operation of the projection control device 1a returns to step ST11 and proceeds through the subsequent processes again.
• In step ST14 performed again, the sensor information acquisition unit 15 acquires a captured image in which the front of the vehicle 100, onto which the projection device 3 has already projected the first projection image, or the first projection image and the second projection image, is imaged (hereinafter referred to as "post-projection captured image").
  • the shielding object detection unit 16 detects whether or not there is a shielding object that shields the first guide image projection area based on the captured image after projection.
• When the shielding object detection unit 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates a first guide image and a second guide image (step ST16), and the projection control unit 14a causes the projection device 3 to project the first guide image and the second guide image onto the road surface in front of the vehicle 100 (step ST17). If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a switches control of the projection device 3 from projecting only the first guide image to projecting the first guide image and the second guide image. If the projection device 3 was previously projecting the first guide image and the second guide image, the projection control unit 14a continues to control the projection device 3 to project the first guide image and the second guide image.
• When the shielding object detection unit 16 detects that there is no shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19). If the projection device 3 was previously projecting the first guide image and the second guide image, the projection control unit 14a switches control of the projection device 3 from projecting the first guide image and the second guide image to projecting only the first guide image. If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a continues to control the projection device 3 to project only the first guide image.
  • the projection control device 1a repeats the above operations until, for example, the power of the vehicle 100 is turned off.
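The repeated operation of the flowchart of FIG. 7 can be summarized in a short sketch. The function and its boolean arguments are hypothetical stand-ins for the branch determination (step ST12) and the shielding object detection (step ST15); they are assumptions for illustration, not part of the patent.

```python
# Sketch of one pass through the flowchart of FIG. 7 (steps ST12-ST19),
# assuming the branch determination and shielding object detection results
# are already available as booleans. Returns the guide images the
# projection device 3 is instructed to project on this pass.

def one_pass(approaching_branch, shielding_object_detected):
    if not approaching_branch:            # step ST12 "NO"
        return []                         # no guide image is projected
    if shielding_object_detected:         # step ST15 "YES"
        return ["first", "second"]        # steps ST16 and ST17
    return ["first"]                      # steps ST18 and ST19
```

The device repeats this pass until the power of the vehicle 100 is turned off, so the projected images can change from pass to pass as shielding objects appear and disappear.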
• In the above description, the processing in step ST16 and the processing in step ST18 are performed after the processing in step ST15, but this is only an example. For example, the processing in step ST16 and the processing in step ST18 may be performed together with step ST12. That is, the guide image generation unit 13a may generate the first guide image and the second guide image in advance, when calculating the first guide image projection area, before the shielding object detection unit 16 detects the presence or absence of a shielding object. In that case, the projection control unit 14a only needs to control, based on the shielding object detection result, whether the projection device 3 projects the first guide image and the second guide image or projects only the first guide image.
• As described above, when the projection control device 1a according to the second embodiment detects that there is a shielding object that blocks the area on the road surface onto which the first guide image is projected (the first guide image projection area), it generates, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, and causes the projection device 3 to project the second guide image, in addition to the first guide image, onto the road surface ahead. Thereby, even when the projection control device 1a cannot present to the occupant of the vehicle 100, because of a shielding object on the road surface in front of the vehicle 100, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction as intended, it can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. Thereby, the projection control device 1a can make the occupant of the vehicle 100 aware that he or she is about to reach a fork in the road. In addition, when there is no shielding object on the road surface in front of the vehicle 100 and the first projected image, which allows the occupant of the vehicle 100 to recognize the presence of the branch point and the branch direction, can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. As a result, the projection device 3 can suppress the power consumed by unnecessary projection of the second guide image.
• In the above, when the shielding object detection section 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generation section 13a generates the first guide image and the second guide image, and the projection control section 14a causes the projection device 3 to project the first guide image and the second guide image generated by the guide image generation section 13a onto the road surface in front of the vehicle 100. In this case, the first projection image and the second projection image are projected onto the road surface in front of the vehicle 100, for example as shown in FIG. 8, and the first projected image is blocked by the other vehicle, the building, and the like, and is not projected onto the lane after the branch.
• In this way, when it is detected that there is a shielding object that blocks the first guide image projection area, the projection control device 1a may stop the control for causing the projection device 3 to project the first guide image, so that the first guide image is not projected from the projection device 3. Specifically, in the projection control device 1a, when the shielding object detection unit 16 detects that there is a shielding object that blocks the first guide image projection area, the projection control unit 14a causes the projection device 3 to project the second guide image generated by the guide image generation unit 13a onto the road surface in front of the vehicle 100 and stops the projection of the first guide image. A configuration example of the projection control device 1a in this case is as shown in FIG. 6.
• In step ST16, when the guide image generation section 13a generates the first guide image and the second guide image, the projection control section 14a stops the projection of the first guide image if it has been causing the projection device 3 to project it (step ST171). If the projection control unit 14a has not been causing the projection device 3 to project the first guide image, it maintains the state in which the first guide image is not projected. The projection control unit 14a then causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100 (step ST172).
• FIG. 10 is a diagram for explaining an example of a state in which, in the second embodiment, the projection control unit 14a of the projection control device 1a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 and to project the second guide image onto the road surface. Note that FIG. 10 is an overhead view of a branch point and the vehicle 100 approaching the branch point. The state shown in FIG. 10 differs from the state shown in FIG. 8 in that the projection device 3 does not project the first guide image; in other words, the first projection image is not projected onto the road surface in front of the vehicle 100. Note that in FIG. 10, the first projection image shown in FIG. 8 is shown by a dotted line for ease of understanding.
• After performing the process of step ST172 or the process of step ST19, the operation of the projection control device 1a returns to step ST11 and proceeds through the subsequent processes again.
• In step ST14, which is performed again, the sensor information acquisition unit 15 acquires a post-projection captured image.
  • the shielding object detection unit 16 detects whether or not there is a shielding object that shields the first guide image projection area based on the captured image after projection.
• When the shielding object detection unit 16 detects that there is a shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates a first guide image and a second guide image (step ST16), the projection control unit 14a causes the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100 (step ST171), and then causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100 (step ST172). If the projection device 3 was previously projecting the first guide image, the projection control unit 14a stops the projection of the first guide image. If the projection device 3 was previously projecting only the second guide image, the projection control unit 14a continues to control the projection device 3 to project only the second guide image.
• When the shielding object detection unit 16 detects that there is no shielding object that blocks the first guide image projection area, the guide image generation unit 13a generates the first guide image (step ST18), and the projection control unit 14a causes the projection device 3 to project the first guide image onto the road surface in front of the vehicle 100 (step ST19). If the projection device 3 was previously projecting only the second guide image, the projection control unit 14a switches control of the projection device 3 from projecting only the second guide image to projecting only the first guide image. If the projection device 3 was previously projecting only the first guide image, the projection control unit 14a continues to control the projection device 3 to project only the first guide image.
  • the projection control device 1a repeats the above operations until, for example, the power of the vehicle 100 is turned off.
• In the flowchart of FIG. 9, the processing is performed in the order of step ST171 and then step ST172, but this is only an example.
  • the order of the processing in step ST171 and the processing in step ST172 may be reversed, or the processing in step ST171 and the processing in step ST172 may be performed in parallel.
• Also, in the above description, the guide image generation unit 13a generates the first guide image in step ST16, but this is only an example. In this case, the guide image generation unit 13a does not need to generate the first guide image in step ST16. The guide image generation unit 13a can reduce the processing load by not generating the first guide image.
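The variant of the flowchart of FIG. 9 can also be summarized in a short sketch. The function and its boolean arguments are hypothetical stand-ins for the branch determination and shielding object detection results; they are illustrative assumptions, not part of the patent.

```python
# Sketch of one pass through the flowchart of FIG. 9: when a shielding
# object is detected, the projection of the first guide image is stopped
# (step ST171) and only the second guide image is projected (step ST172).
# The boolean-argument form is an illustrative assumption.

def one_pass_fig9(approaching_branch, shielding_object_detected):
    if not approaching_branch:
        return []                  # no guide image is projected
    if shielding_object_detected:
        return ["second"]          # steps ST171 and ST172
    return ["first"]               # steps ST18 and ST19
```

Compared with the operation of FIG. 7, the only difference is the set of images projected when a shielding object is detected: the second guide image alone, instead of both guide images.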
• As described above, when the projection control device 1a according to the second embodiment detects that there is a shielding object that blocks the area on the road surface onto which the first guide image is projected (the first guide image projection area), it may generate a second guide image indicating the direction of the branch point as seen from the vehicle 100, cause the projection device 3 to project the second guide image onto the road surface in front of the vehicle, and cause the projection device 3 to stop projecting the first guide image onto the road surface in front of the vehicle 100. Thereby, when a first projected image that allows the occupant of the vehicle 100 to recognize the existence of the branch point and the branch direction cannot be presented as intended because of a shielding object on the road surface in front of the vehicle 100, the first guide image is not projected at all, so the projection control device 1a can avoid causing confusion for the occupant of the vehicle 100.
  • a configuration example of the projection control device 1a in this case is as shown in FIG. 6.
• When the projection control unit 14a switches the projection control for the projection device 3, the projection control unit 14a sets a projection switching flag to "1". The projection switching flag is stored in a location that can be referenced by the projection control device 1a. The projection control section 14a also stores the control details for the projection device 3 after the switch in the storage section. Specifically, when the projection control unit 14a switches from control for causing the projection device 3 to project the first guide image to control for additionally projecting the second guide image, the projection control unit 14a sets the projection switching flag to "1".
  • the projection control section 14a stores in the storage section the fact that the projection device 3 is caused to project the first guide image and the second guide image.
• Similarly, when the projection control unit 14a switches from control for causing the projection device 3 to project the first guide image and the second guide image to control for projecting only the first guide image, the projection control unit 14a sets the projection switching flag to "1". In this case, the projection control section 14a stores in the storage section the fact that the projection device 3 is caused to project only the first guide image.
• The guide image generation unit 13a refers to the projection switching flag. When the projection switching flag is set to "1", the guide image generation unit 13a generates the first guide image, or the first guide image and the second guide image, according to the control details for the projection device 3 stored in the storage unit. For example, if the storage unit stores that the projection device 3 is to project the first guide image and the second guide image, the guide image generation unit 13a generates the first guide image and the second guide image. If the storage unit stores that the projection device 3 is to project only the first guide image, the guide image generation unit 13a generates the first guide image.
• Note that the projection switching flag and the control details for the projection device 3 after the switch, which are stored in the storage unit, are cleared at the same timing as the branch approach flag. If the storage unit does not store the control details for the projection device 3 after the switch, the projection control unit 14a determines that the projection control for the projection device 3 has not yet been switched.
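The projection switching flag and the stored post-switch control contents can be sketched as a small state object. The class, method, and attribute names are illustrative assumptions; the patent only specifies the flag value "1" and the stored control details.

```python
# Sketch of the projection switching flag mechanism: once projection
# control has been switched, the stored control contents are followed and
# later shielding object detection results are ignored, until the flag is
# cleared together with the branch approach flag. Names are assumptions.

class ProjectionSwitchState:
    def __init__(self):
        self.switch_flag = 0        # set to 1 when projection control switches
        self.stored_control = None  # control contents after the switch

    def record_switch(self, images_to_project):
        # Called when the projection control unit switches control.
        self.switch_flag = 1
        self.stored_control = list(images_to_project)

    def images_to_generate(self, images_from_detection):
        # If the flag is "1", follow the stored control details;
        # otherwise follow the current detection result.
        if self.switch_flag == 1:
            return self.stored_control
        return images_from_detection

    def clear(self):
        # Cleared at the same timing as the branch approach flag.
        self.switch_flag = 0
        self.stored_control = None
```

For example, after a switch to projecting both guide images is recorded, the state object keeps returning both images even if a later detection pass would, on its own, call for only the first guide image.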
• In this case, the operation of the projection control device 1a is, for example, as shown in the flowchart of FIG. 11 instead of the operation shown in the flowchart of FIG. 7.
  • the specific contents of steps ST101 to ST103, steps ST105 to ST107, and step ST110 in FIG. 11 are the same as the specific contents of steps ST11 to ST16 and step ST18 in FIG. 7, respectively. Therefore, duplicate explanations will be omitted.
• In step ST104, the guide image generation unit 13a determines whether or not the projection control for the projection device 3 has been switched (step ST104). Specifically, the guide image generation unit 13a determines whether the projection switching flag is set to "1". If it is determined that the projection control for the projection device 3 has not been switched, that is, if the projection switching flag is not set to "1" ("NO" in step ST104), the guide image generation unit 13a outputs the information regarding the first guide image projection area calculated in step ST103 to the sensor information acquisition section 15. The operation of the projection control device 1a then proceeds to step ST105.
• On the other hand, if it is determined that the projection control for the projection device 3 has been switched ("YES" in step ST104), the guide image generation unit 13a determines whether the projection control unit 14a has switched to control for causing the projection device 3 to project the second guide image (step ST109).
  • the guide image generation unit 13a can determine whether the projection control unit 14a has switched to controlling the projection device 3 to project the second guide image by referring to the storage unit.
• When the guide image generation unit 13a determines that the projection control unit 14a has switched to control for causing the projection device 3 to project the second guide image ("YES" in step ST109), the operation of the projection control device 1a proceeds to step ST107. On the other hand, if the guide image generation unit 13a determines that the projection control unit 14a has not switched to control for causing the projection device 3 to project the second guide image ("NO" in step ST109), in other words, if it determines that the control has been switched to causing the projection device 3 to project only the first guide image, the operation of the projection control device 1a proceeds to step ST110.
• As described above, once the projection control device 1a switches control of the projection device 3 from a state in which the first guide image is projected to a state in which the second guide image is additionally projected, or from a state in which the second guide image is projected to a state in which only the first guide image is projected, the control of the projection device 3 after the switch is maintained regardless of the presence or absence of a shielding object. Thereby, the projection control device 1a can reduce the trouble, for the occupant of the vehicle 100, that the projection state of the first projection image or the second projection image projected in front of the vehicle 100 frequently changes.
• Note that the projection control device 1a may perform the processing of steps ST171 and ST172 of the flowchart of FIG. 9 instead of the processing of step ST107.
• In the second embodiment described above, the projection control device 1a performs the first detection of whether or not there is a shielding object that blocks the first guide image projection area, after determining that the vehicle 100 is approaching a branch point, before the first guide image is projected onto the road surface in front of the vehicle 100, but this is only an example. When detecting for the first time, after determining that the vehicle 100 is approaching a branch point, whether or not there is a shielding object that blocks the first guide image projection area, the projection control device 1a may first project the first guide image once, and then detect whether or not there is a shielding object that blocks the first guide image projection area based on a post-projection captured image in which the front of the vehicle 100, onto which the first guide image is projected, is imaged.
• In this case, the projection control device 1a generates a first guide image in step ST13 of the flowchart in FIG. 7 and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle 100. That is, in step ST13 of the flowchart in FIG. 7, after the guide image generation unit 13a calculates the first guide image projection area, the projection device 3 projects the first guide image generated by the guide image generation unit 13a. Similarly, when the projection control device 1a performs the operation shown in the flowchart of FIG. 9, the projection control device 1a generates a first guide image in step ST13 of FIG. 9 and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle. When the projection control device 1a performs the operation shown in the flowchart of FIG. 11, the projection control device 1a generates a first guide image in step ST103 of FIG. 11 and causes the projection device 3 to project the generated first guide image onto the road surface in front of the vehicle.
• As described above, even when the projection control device 1a cannot present to the occupant of the vehicle 100, because of a shielding object on the road surface in front of the vehicle 100, a first projected image that allows the occupant to recognize the existence of the branch point and the branch direction as intended, it can present the direction of the branch point to the occupant of the vehicle 100 by causing the projection device 3 to project the second guide image. Thereby, the projection control device 1a can make the occupant of the vehicle 100 aware that he or she is about to reach a fork in the road. In addition, when there is no shielding object on the road surface in front of the vehicle 100 and the first projected image, which allows the occupant of the vehicle 100 to recognize the presence of the branch point and the branch direction, can be presented as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. As a result, the projection device 3 can suppress the power consumed by unnecessary projection of the second guide image. Moreover, by detecting the presence or absence of a shielding object each time, the projection control device 1a can accurately detect that a shielding object is blocking the first guide image projection area even if, for example, the shielding object is a moving object such as another vehicle.
• In the second embodiment described above, the first guide image is an arrow image as an example, but this is only an example. The first guide image may be any image that starts near the branch point and extends in the branch direction to indicate the branch direction, that is, an image that indicates the presence of the branch point and the branch direction. The projection control device 1a generates a first guide image in which the second end approaches the first end as the vehicle 100 approaches the branch point, and causes the projection device 3 to project the first guide image so that, as seen from the occupant of the vehicle 100, the first projected image projected in front of the vehicle 100 becomes shorter toward the fork as the vehicle 100 approaches the fork.
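One way to make the second end approach the first end as the vehicle 100 nears the branch point is a linear interpolation of the second end toward the first end. The parameterization, the function name, and its arguments below are illustrative assumptions, not the method specified in the patent.

```python
# Sketch: move the second end of the first guide image toward the first
# end in proportion to the remaining distance to the branch point, so the
# projected arrow appears to shorten as the vehicle approaches. Linear
# interpolation and the argument names are illustrative assumptions.

def second_end_position(first_end, initial_second_end,
                        distance_to_branch, initial_distance):
    # t = 1 far from the branch point, t = 0 at the branch point.
    t = max(0.0, min(1.0, distance_to_branch / initial_distance))
    return tuple(fe + t * (se - fe)
                 for fe, se in zip(first_end, initial_second_end))
```

At the initial distance the second end is at its original position; halfway to the branch point it has moved halfway toward the first end; at the branch point the two ends coincide and the arrow has shrunk to the first road surface point.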
• Further, in the second embodiment described above, the second guide image is an image showing a rectangle as an example, but this is only an example. The second guide image may be any image that shows the direction of the branch point as seen from the vehicle 100. For example, the second guide image may be an arrow image or an image indicating a message such as "There is a fork ahead."
• Further, in the second embodiment described above, the projection control device 1a generates the first guide image and the second guide image so that the first road surface point of the first projection image and the third road surface point of the second projection image are the same point, in other words, so that the first end of the first guide image and the third end of the second guide image are at the same point, but this is also just an example. It is not essential that the first guide image and the second guide image be connected.
• Further, in the second embodiment described above, the position of the first road surface point is the intersection between a straight line that passes through a point indicating the position of the vehicle 100 and is parallel to the lane in which the vehicle 100 is traveling (the lane referred to here is a so-called traffic lane), and a straight line that passes through the widthwise center of the lane after the branch (the lane referred to here is a so-called traffic lane) and is parallel to the lane after the branch, but this is only an example. For example, the projection control device 1a may set the first road surface point to a point within a predetermined range from the branch point. Note that the projection control device 1a can calculate the coordinates of the position of a point within a predetermined range from the branch point from the guide route information.
• Further, in the second embodiment described above, the first projection image is an image on the road surface of the lane ahead of the vehicle 100 after the branch (the lane referred to here is a so-called traffic lane; see FIGS. 3 and 4), but this is only an example. The projection control device 1a may cause the projection device 3 to project the first guide image onto a location other than the road surface of the lane after the branch (the lane referred to here is a so-called traffic lane). It is sufficient that the first road surface point is located near the branch point.
• Further, in the second embodiment described above, the projection control device 1a is an in-vehicle device mounted on the vehicle 100, and the guidance route information acquisition section 11, the branch determination section 12, the guide image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and a control section (not shown) are included in the in-vehicle device. However, the present invention is not limited to this; some of the guidance route information acquisition section 11, the branch determination section 12, the guide image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section (not shown) may be mounted in the in-vehicle device of the vehicle, and the rest may be provided in a server connected to the in-vehicle device via a network. Alternatively, all of the guidance route information acquisition section 11, the branch determination section 12, the guide image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section may be provided in the server.
  • the hardware configuration of the projection control device 1a according to the second embodiment is the same as the hardware configuration of the projection control device 1 described using FIGS. 5A and 5B in the first embodiment, and therefore illustration thereof is omitted.
• The functions of the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, the projection control unit 14a, the sensor information acquisition unit 15, the shielding object detection unit 16, and the control unit (not shown) are realized by the processing circuit 1001. That is, the projection control device 1a includes the processing circuit 1001 for performing control to cause the projection device 3 to project the first guide image or the second guide image, generated based on the guide route information acquired from the navigation device 2, onto the road surface in front of the vehicle 100. The processing circuit 1001 reads out and executes the program stored in the memory 1005, thereby executing the functions of the guidance route information acquisition section 11, the branch determination section 12, the guide image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, the shielding object detection section 16, and the control section (not shown). That is, the projection control device 1a includes the memory 1005 for storing a program which, when executed by the processing circuit 1001, results in the execution of steps ST11 to ST19 in FIG. 7 described above, steps ST11 to ST19 in FIG. 9 described above, or steps ST101 to ST111 in FIG. 11 described above.
  • the program stored in the memory 1005 includes the guidance route information acquisition section 11, the branch determination section 12, the guidance image generation section 13a, the projection control section 14a, the sensor information acquisition section 15, and the obstruction detection section 16. It can also be said that it causes a computer to execute a processing procedure or method of a control unit (not shown).
  • the projection control device 1a includes an input interface device 1002 and an output interface device 1003 that perform wired or wireless communication with devices such as the navigation device 2, the projection device 3, or the sensor 4.
  • a storage unit (not shown) includes a memory 1005 and the like.
As described above, the projection control device 1a according to Embodiment 2 includes, in addition to the guidance route information acquisition unit 11, the branch determination unit 12, the guide image generation unit 13a, and the projection control unit 14a, a sensor information acquisition unit 15 that acquires sensor information regarding objects detected in front of the vehicle 100, and an obstruction detection unit 16 that detects, based on the sensor information acquired by the sensor information acquisition unit 15, whether an obstruction blocking the area of the road surface onto which the first guide image is projected exists.

The guide image generation unit 13a generates the first guide image based on the position of the vehicle 100, the distance from the vehicle 100 to the branch point, the position of a first road surface point onto which a first end of the first guide image is projected, and the position of a second road surface point, located farther toward the branch point than the first road surface point, onto which a second end of the first guide image, opposite the first end, is projected. As the vehicle 100 approaches the branch point, the guide image generation unit 13a generates first guide images in which the second end is closer to the first end.

When the obstruction detection unit 16 detects that an obstruction exists, the guide image generation unit 13a generates, in addition to the first guide image, a second guide image indicating the direction of the branch point as seen from the vehicle 100, and the projection control unit 14a causes the projection device 3 to project the second guide image onto the road surface in front of the vehicle 100.
As a result, when the first projected image, which would allow an occupant of the vehicle 100 to recognize the existence of the branch point and the branch direction, cannot be presented as intended because of an obstruction on the road surface in front of the vehicle 100, the projection control device 1a can still present the direction of the branch point to the occupant by causing the projection device 3 to project the second guide image. The projection control device 1a can thus make the occupant of the vehicle 100 aware that the vehicle is about to reach a branch point.

When there is no obstruction on the road surface in front of the vehicle 100 and the occupant can recognize the existence of the branch point and the branch direction from the first projected image as intended, the projection control device 1a causes the projection device 3 to project only the first guide image and not the second guide image. The projection device 3 can therefore avoid the power consumption of an unnecessary projection of the second guide image.

In route guidance by projecting images onto the road surface, the projection control described above allows vehicle occupants to grasp the distance to a branch point and the branch direction without removing the branch point from their field of vision.
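The projection behavior summarized above reduces to a simple decision rule: the first guide image is projected whenever the vehicle is approaching a branch point, and the second guide image is added only while an obstruction blocks the first image's road-surface area. The following is a minimal illustrative sketch of that rule, not the patented implementation; all names (`ProjectionPlan`, `select_projections`) are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ProjectionPlan:
    project_first: bool   # road-surface image indicating the branch point and branch direction
    project_second: bool  # fallback image indicating the direction of the branch point


def select_projections(approaching_branch: bool,
                       obstruction_detected: bool) -> ProjectionPlan:
    """Decide which guide images to project, per the behavior described above.

    The second guide image is projected only while an obstruction blocks the
    road-surface area of the first image, which avoids the power consumption
    of an unnecessary second projection.
    """
    if not approaching_branch:
        # No branch point ahead: nothing is projected.
        return ProjectionPlan(project_first=False, project_second=False)
    return ProjectionPlan(project_first=True, project_second=obstruction_detected)
```

For example, `select_projections(True, False)` yields a plan with only the first guide image enabled, matching the no-obstruction case described above.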

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)

Abstract

A projection control device comprising: a branch determination unit (12) that determines whether or not a vehicle (100) is approaching a branch point; a guide image generation unit (13, 13a) that generates a first guide image based on guidance route information when the vehicle (100) is determined to be approaching a branch point; and a projection control unit (14, 14a) that causes a projection device (3) to project the first guide image onto the road surface in front of the vehicle (100). The guide image generation unit (13, 13a) generates the first guide image based on the position of the vehicle (100), the distance from the vehicle (100) to the branch point, the position on the road surface of a first road surface point onto which a first end of the first guide image is projected, and the position on the road surface of a second road surface point, located farther toward the branch point than the first road surface point, onto which a second end of the first guide image, opposite the first end, is projected; and the guide image generation unit (13, 13a) generates first guide images in which the second end is closer to the first end as the vehicle approaches the branch point.
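The shrinking behavior described in the abstract, with the second end of the first guide image converging on the first end as the vehicle nears the branch point, can be modeled for illustration as an interpolation over the remaining distance. The patent does not specify the exact law; the linear form, the function name, and the parameters below are hypothetical.

```python
def second_end_distance(first_point: float, full_second_point: float,
                        distance_to_branch: float, trigger_distance: float) -> float:
    """Distance ahead of the vehicle at which the second end is projected.

    first_point:        distance ahead of the vehicle of the first road surface
                        point (the fixed, near end of the first guide image)
    full_second_point:  distance of the second road surface point when the
                        image is at full length
    distance_to_branch: remaining distance from the vehicle to the branch point
    trigger_distance:   distance at which projection of the first image begins
    """
    if trigger_distance <= 0:
        raise ValueError("trigger_distance must be positive")
    # Clamp so the image never exceeds full length nor collapses past the near end.
    ratio = min(max(distance_to_branch / trigger_distance, 0.0), 1.0)
    return first_point + (full_second_point - first_point) * ratio
```

Under this sketch, an image spanning 5 m to 25 m ahead at the 100 m trigger distance has its far end pulled in to 15 m ahead when 50 m remain, and the two ends coincide at the branch point itself.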
PCT/JP2022/027962 2022-07-19 2022-07-19 Dispositif de commande de projection et procédé de commande de projection WO2024018497A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2024535590A JPWO2024018497A1 (fr) 2022-07-19 2022-07-19
PCT/JP2022/027962 WO2024018497A1 (fr) 2022-07-19 2022-07-19 Dispositif de commande de projection et procédé de commande de projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/027962 WO2024018497A1 (fr) 2022-07-19 2022-07-19 Dispositif de commande de projection et procédé de commande de projection

Publications (1)

Publication Number Publication Date
WO2024018497A1 true WO2024018497A1 (fr) 2024-01-25

Family

ID=89617431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/027962 WO2024018497A1 (fr) 2022-07-19 2022-07-19 Dispositif de commande de projection et procédé de commande de projection

Country Status (2)

Country Link
JP (1) JPWO2024018497A1 (fr)
WO (1) WO2024018497A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012247369A (ja) * 2011-05-30 2012-12-13 Honda Motor Co Ltd 車両用投影装置
WO2016114048A1 (fr) * 2015-01-13 2016-07-21 日立マクセル株式会社 Dispositif de projection d'image
WO2020208779A1 (fr) * 2019-04-11 2020-10-15 三菱電機株式会社 Dispositif de commande d'affichage et procédé de commande d'affichage
JP2021079907A (ja) * 2019-11-22 2021-05-27 株式会社小糸製作所 車両運転支援システム
JP2021079835A (ja) * 2019-11-19 2021-05-27 株式会社小糸製作所 路面描画装置

Also Published As

Publication number Publication date
JPWO2024018497A1 (fr) 2024-01-25

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22951888

Country of ref document: EP

Kind code of ref document: A1